Information Management Platform for Data Analytics and Aggregation (IMPALA) System Design Document
NASA Technical Reports Server (NTRS)
Carnell, Andrew; Akinyelu, Akinyele
2016-01-01
The System Design document tracks the design activities that are performed to guide the integration, installation, verification, and acceptance testing of the IMPALA Platform. The inputs to the design document are derived from the activities recorded in Tasks 1 through 6 of the Statement of Work (SOW), with the proposed technical solution being the completion of Phase 1-A. With the documentation of the architecture of the IMPALA Platform and the installation steps taken, the SDD will be a living document, capturing the details about capability enhancements and system improvements to the IMPALA Platform that support users in developing accurate and precise analytical models. The IMPALA Platform infrastructure team, data architecture team, system integration team, security management team, project manager, NASA data scientists, and users are the intended audience of this document. The IMPALA Platform is an assembly of commercial-off-the-shelf (COTS) products installed on an Apache-Hadoop platform. User interface details for the COTS products will be sourced from the COTS tools vendor documentation. The SDD is a focused explanation of the inputs, design steps, and projected outcomes of every design activity for the IMPALA Platform through installation and validation.
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.; Lavelle, Thomas M.
1995-01-01
Modifications made to the axial-flow compressor conceptual design code CSPAN are documented in this report. Endwall blockage and stall margin predictions were added. The loss-coefficient model was upgraded. Default correlations for rotor and stator solidity and aspect-ratio inputs and for stator-exit tangential velocity inputs were included in the code along with defaults for aerodynamic design limits. A complete description of input and output along with sample cases are included.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
...In this document, the Wireline Competition Bureau (the Bureau) seeks comment on a number of threshold decisions regarding the design of and data inputs to the forward looking cost model, and on other assumptions in the cost models currently in the record.
NASA Technical Reports Server (NTRS)
Lyons, J. T.; Borchers, William R.
1993-01-01
Documentation for the User Interface Program for the Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) is provided. The User Interface Program is a separate software package designed to ease the user input requirements when using the MASTRE Trajectory Program. This document supplements documentation on the MASTRE Program that consists of the MASTRE Engineering Manual and the MASTRE Programmers Guide. The User Interface Program provides a series of menus and tables using the VAX Screen Management Guideline (SMG) software. These menus and tables allow the user to modify the MASTRE Program input without the need for learning the various program dependent mnemonics. In addition, the User Interface Program allows the user to modify and/or review additional input Namelist and data files, to build and review command files, to formulate and calculate mass properties related data, and to have a plotting capability.
DDL: Digital systems design language
NASA Technical Reports Server (NTRS)
Shival, S. G.
1980-01-01
Hardware description languages are valuable tools in such applications as hardware design, system documentation, and logic design training. DDL is a convenient medium for inputting design details into a hardware-design automation system. It is suitable for describing digital systems at the gate, register-transfer, and major combinational block levels.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
.... All documents in the docket are listed on the http://www.regulations.gov Web site. Although listed in... boilers (i.e., with a design heat input capacity of 10 MMBtu/hr or more). A review of the data has... small coal-fired units (i.e., with a design heat input capacity of less than 10 MMBtu/hr) are subject to...
Culture Shock!! "Lesson" the Blow.
ERIC Educational Resources Information Center
Duffin, Ken
1996-01-01
Designing, developing, and implementing an electronic document management system involves preparation. Areas to consider when facilitating technological change include staff input and business and customer needs and wants. Further discussion addresses value assessment of document type, providing a pilot system for staff experiment and practice,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa
The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve these data as well as updating and inserting new input data.
Degraded character recognition based on gradient pattern
NASA Astrophysics Data System (ADS)
Babu, D. R. Ramesh; Ravishankar, M.; Kumar, Manish; Wadera, Kevin; Raj, Aakash
2010-02-01
Degraded character recognition is a challenging problem in the field of Optical Character Recognition (OCR). The performance of an OCR system depends on the print quality of the input documents. Many OCRs have been designed that correctly identify finely printed documents, but very little reported work addresses the recognition of degraded documents. The efficiency of an OCR system decreases when the input image is degraded. In this paper, a novel approach based on gradient patterns for recognizing degraded printed characters is proposed. The approach makes use of the gradient pattern of an individual character for recognition. Experiments were conducted on character images that were either digitally written or degraded characters extracted from historical documents, and the results are found to be satisfactory.
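The abstract does not spell out the gradient-pattern feature itself; as a rough illustration of the general idea only (an assumption, not the authors' exact method), a gradient-orientation histogram of a character image can be computed as follows. Bin count and normalization are placeholder choices.

    import numpy as np

    def gradient_pattern_features(char_img, n_bins=8):
        """Illustrative gradient-orientation histogram for one character image.

        char_img : 2-D float array (grayscale character bitmap).
        Returns a magnitude-weighted, normalized histogram of gradient
        orientations, a simple degradation-tolerant descriptor.
        """
        gy, gx = np.gradient(char_img.astype(float))   # per-pixel gradients
        magnitude = np.hypot(gx, gy)
        orientation = np.arctan2(gy, gx)                # range (-pi, pi]
        hist, _ = np.histogram(orientation, bins=n_bins,
                               range=(-np.pi, np.pi), weights=magnitude)
        total = hist.sum()
        return hist / total if total > 0 else hist

    def recognize(char_img, templates):
        """Match a degraded character against reference feature templates (dict: label -> feature vector)."""
        feats = gradient_pattern_features(char_img)
        return min(templates, key=lambda label: np.linalg.norm(feats - templates[label]))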
ATLAS, an integrated structural analysis and design system. Volume 5: System demonstration problems
NASA Technical Reports Server (NTRS)
Samuel, R. A. (Editor)
1979-01-01
One of a series of documents describing the ATLAS System for structural analysis and design is presented. A set of problems is described that demonstrate the various analysis and design capabilities of the ATLAS System proper as well as capabilities available by means of interfaces with other computer programs. Input data and results for each demonstration problem are discussed. Results are compared to theoretical solutions or experimental data where possible. Listings of all input data are included.
Development of Innovative Design Processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.S.; Park, C.O.
2004-07-01
The nuclear design analysis requires time-consuming and error-prone model-input preparation, code runs, output analysis, and a quality assurance process. To reduce human effort and improve design quality and productivity, the Innovative Design Processor (IDP) is being developed. The two basic principles of IDP are document-oriented design and web-based design. In document-oriented design, if the designer writes a design document called an active document and feeds it to a special program, the final document with the complete analysis, tables, and plots is produced automatically. Active documents can be written with ordinary HTML editors or created automatically on the web, which is the other framework of IDP. Using a proper mix of server-side and client-side programming under the LAMP (Linux/Apache/MySQL/PHP) environment, the design process on the web is modeled as a design wizard so that even a novice designer can produce the design document easily. This automation using the IDP is now being implemented for all reload designs of Korea Standard Nuclear Power Plant (KSNP) type PWRs. The introduction of this process will allow a large reduction in all KSNP reload design efforts and provide a platform for design and R and D tasks of KNFC. (authors)
Development of Souvenir Production Transaction Processing System
NASA Astrophysics Data System (ADS)
Rumambi, H.; Kaparang, R.; Lintong, J.
2018-01-01
This research aims to design a souvenir production transaction processing system for craftsmen in North Sulawesi. The craftsmen keep very simple records of souvenir production transactions and use documents that do not conform to generally accepted accounting practices. This research uses a qualitative method. The data are collected through interviews, observations, documents, and literature studies. The research stages are a preliminary study, data collection, data analysis, and system design. The system design is built from the chart of accounts, the accounting cycle, and documents used as inputs, which are processed through accounting records. The outputs are financial statements. The system design provides benefits for the craftsmen in assessing financial performance and obtaining financing from banks.
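The flow described (source documents, journal, ledger, financial statements) can be sketched minimally as below; account names and amounts are hypothetical and are not taken from the paper.

    from collections import defaultdict

    # Each journal entry: (account, debit, credit), taken from a source document.
    journal = [
        ("Cash", 500.0, 0.0), ("Sales Revenue", 0.0, 500.0),          # souvenir sale
        ("Raw Materials Expense", 120.0, 0.0), ("Cash", 0.0, 120.0),  # materials purchase
    ]

    # Post the journal to the ledger.
    ledger = defaultdict(lambda: {"debit": 0.0, "credit": 0.0})
    for account, debit, credit in journal:
        ledger[account]["debit"] += debit
        ledger[account]["credit"] += credit

    # Simple income statement: revenues (credit balance) minus expenses (debit balance).
    revenue = ledger["Sales Revenue"]["credit"] - ledger["Sales Revenue"]["debit"]
    expenses = ledger["Raw Materials Expense"]["debit"] - ledger["Raw Materials Expense"]["credit"]
    print("Net income:", revenue - expenses)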
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-18
... published in the Federal Register a document seeking public input on the design of a plan to use for periodic retrospective review of its regulations (76 FR 9988). This input is being solicited in response to...] Extension of Comment Period: EPA's Plan for Retrospective Review Under Executive Order 13563 AGENCY...
ERIC Educational Resources Information Center
Broussard, Shorna R.; Bliss, John C.
2007-01-01
Purpose: The purpose of this research is to determine institutional commitment to sustainability by examining Natural Resource Extension program inputs, activities, and participation. Design/methodology/approach: A document analysis of Natural Resource Extension planning and reporting documents was conducted to provide contextual and historical…
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, R. C.
1992-01-01
A user's manual is presented for MacPASCO, which is an interactive, graphic preprocessor for panel design. MacPASCO creates input for PASCO, an existing computer code for structural analysis and sizing of longitudinally stiffened composite panels. MacPASCO provides a graphical user interface which simplifies the specification of panel geometry and reduces user input errors. The user draws the initial structural geometry on the computer screen, then uses a combination of graphic and text inputs to: refine the structural geometry; specify information required for analysis such as panel load and boundary conditions; and define design variables and constraints for minimum mass optimization. Only the use of MacPASCO is described, since the use of PASCO has been documented elsewhere.
NASA Technical Reports Server (NTRS)
Buck, C. H.
1975-01-01
The program documentation for the PRF ARTWORK/AIDS conversion program, which serves as the interface between the outputs of the PRF ARTWORK and AIDS programs, was presented. The document has a two-fold purpose, the first of which is a description of the software design including flowcharts of the design at the functional level. The second purpose is to provide the user with a detailed description of the input parameters and formats necessary to execute the program and a description of the output produced when the program is executed.
Digital adaptive controllers for VTOL vehicles. Volume 2: Software documentation
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Stein, G.; Pratt, S. G.
1979-01-01
The VTOL approach and landing test (VALT) adaptive software is documented. Two self-adaptive algorithms, one based on an implicit model reference design and the other on an explicit parameter estimation technique were evaluated. The organization of the software, user options, and a nominal set of input data are presented along with a flow chart and program listing of each algorithm.
Computer program for design analysis of radial-inflow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1976-01-01
A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.
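As one illustrative preliminary-design relation linking the stated inputs (a standard radial-turbine sizing parameter, not necessarily the program's internal formulation), the dimensionless specific speed is

\[
N_s = \frac{\omega \sqrt{Q_{\text{exit}}}}{\Delta h_{id}^{3/4}},
\]

where omega is the rotational speed, Q_exit the rotor-exit volumetric flow rate (obtained from the mass flow rate and exit density), and Delta h_id the ideal specific work implied by the inlet state and pressure ratio; the specified power and mass flow rate fix the actual specific work through Delta h = P / mdot.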
The present status and problems in document retrieval system : document input type retrieval system
NASA Astrophysics Data System (ADS)
Inagaki, Hirohito
Office automation (OA) has brought many changes. Many documents are now maintained in electronic filing systems, so efficient document retrieval systems are needed to extract useful information. Current document retrieval systems use simple word matching, syntactic matching, or semantic matching to obtain high retrieval efficiency. On the other hand, document retrieval systems using special hardware devices, such as ISSP, have been developed for high-speed retrieval. Since these systems accept only a single sentence or keywords as input, it is difficult to express the searcher's request. We demonstrate a document-input retrieval system, which can directly accept a document as input and search for similar documents in a document database.
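As a rough sketch of what document-as-query retrieval involves (illustrative only; the paper's actual matching method is not specified here), a bag-of-words cosine similarity between the input document and each stored document could be computed as follows.

    import math
    from collections import Counter

    def cosine_similarity(doc_a, doc_b):
        """Cosine similarity between two documents as word-count vectors."""
        a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
        dot = sum(a[w] * b[w] for w in set(a) & set(b))
        norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def retrieve_similar(query_doc, database, top_k=5):
        """Rank stored documents by similarity to a full document given as the query."""
        return sorted(database, key=lambda d: cosine_similarity(query_doc, d), reverse=True)[:top_k]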
Correlator optical wavefront sensor COWS
NASA Astrophysics Data System (ADS)
1991-02-01
This report documents the significant upgrades and improvements made to the correlator optical wavefront sensor (COWS) optical bench during this phase of the program. Software for the experiment was reviewed and documented. Flowcharts showing the program flow are included, as well as documentation for programs which were written to calculate and display Zernike polynomials. The system was calibrated and aligned, and a series of experiments to determine the optimum settings for the input and output MOSLM polarizers was conducted. In addition, the design of a simple aberration generator is included.
The Role and Design of Screen Images in Software Documentation.
ERIC Educational Resources Information Center
van der Meij, Hans
2000-01-01
Discussion of learning a new computer software program focuses on how to support the joint handling of a manual, input devices, and screen display. Describes a study that examined three design styles for manuals that included screen images to reduce split-attention problems and discusses theory versus practice and cognitive load theory.…
Model documentation Renewable Fuels Module of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-01-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1996 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described.
Short-Term Energy Outlook Model Documentation: Hydrocarbon Gas Liquids Supply and Demand
2015-01-01
The hydrocarbon gas liquids (ethane, propane, butanes, and natural gasoline) module of the Short-Term Energy Outlook (STEO) model is designed to provide forecasts of U.S. production, consumption, refinery inputs, net imports, and inventories.
Turning Operational Lessons Learned into Design Reality
NASA Technical Reports Server (NTRS)
Brady, David A.
2009-01-01
The capabilities and limitations of a particular system design are well known by the people who operate it. Operational workarounds, operational notes and lessons learned are traditional methods for dealing with and documenting design shortcomings. The beginning of each new program brings the hope that hard-learned lessons will be incorporated into the next new system. But often operations personnel find their well-intentioned efforts frustrated by an inability to have their inputs considered by design personnel who have strictly-scoped requirements that are coupled with ambitious cost and schedule targets. There is a way for operational inputs to make it into the design, but the solution involves a combination of organizational culture and technical data. Any organization that utilizes this approach can realize significant benefits over the life cycle of their project.
DAKOTA JAGUAR 3.0 user's manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Bauman, Lara E; Chan, Ethan
2013-05-01
JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the features necessary to use JAGUAR.
Port Needs Study (Vessel Traffic Services Benefits) : Volume 2. Appendices, Part 1.
DOT National Transportation Integrated Search
1991-08-01
Volume II focuses on organization and presentation of information for each individual study zone. It contains the appendix tables of input data, output statistics and the documentation of the candidate Vessel Traffic Service (VTS) Design by NavCom Sy...
NASA Technical Reports Server (NTRS)
Pitts, E. R.
1976-01-01
The DJANAL (DisJunct ANALyzer) Program provides a means for the LSI designer to format output from the Mask Analysis Program (MAP) for input to the FETLOG (FETSIM/LOGSIM) processor. This document presents a brief description of the operation of DJANAL and provides comprehensive instruction for its use.
Port Needs Study (Vessel Traffic Services Benefits) : Volume 2. Appendices, Part 2.
DOT National Transportation Integrated Search
1991-01-01
Volume II focuses on organization and presentation of information for each individual study zone. It contains the appendix tables of input data, output statistics and the documentation of the candidate Vessel Traffic Services (VTS) Design by NavCom S...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, Ethan
2011-06-01
JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the technical background necessary for a developer to understand JAGUAR.
Software Manages Documentation in a Large Test Facility
NASA Technical Reports Server (NTRS)
Gurneck, Joseph M.
2001-01-01
The 3MCS computer program assists an instrumentation engineer in performing the three essential functions of design, documentation, and configuration management of measurement and control systems in a large test facility. Services provided by 3MCS are acceptance of input from multiple engineers and technicians working at multiple locations; standardization of drawings; automated cross-referencing; identification of errors; listing of components and resources; downloading of test settings; and provision of information to customers.
C2 Core and UCore Message Design Capstone: Interoperable Message Structure
2009-09-01
there are sufficient resources to carry out a mission. The Team used the Theatre Battle Management Command System (TBMCS) to generate sample CMD...System (TBMCS) was used to generate CMD messages as inputs for both use cases. These were programmatically transformed into the three-layer message...used for the experiment was generated from the TBMCS in the form of a CMD XML document. The Capstone experiment included transforming that document to
2017-08-03
...properties that provide information on a vehicle's mass distribution. The properties impact vehicle design and safety and are primary inputs to vehicle...also useful in the design and construction of vehicle safety outriggers needed during the conduct of dynamic handling tests. This document
Integral flange design program. [procedure for computing stresses
NASA Technical Reports Server (NTRS)
Wilson, J. F.
1974-01-01
An automated interactive flange design program utilizing an electronic desk top calculator is presented. The program calculates the operating and seating stresses for circular flanges of the integral or optional type subjected to internal pressure. The required input information is documented. The program provides an automated procedure for computing stresses in selected flange geometries for comparison to the allowable code values.
Design of a modular digital computer system, CDRL no. D001, final design plan
NASA Technical Reports Server (NTRS)
Easton, R. A.
1975-01-01
The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.
Advanced information processing system: Input/output network management software
NASA Technical Reports Server (NTRS)
Nagle, Gail; Alger, Linda; Kemp, Alexander
1988-01-01
The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.
NASA Technical Reports Server (NTRS)
Carlson, C. R.
1981-01-01
The user documentation of the SYSGEN model and its links with other simulations is described. SYSGEN is a production costing and reliability model of electric utility systems. Hydroelectric, storage, and time-dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and report formats are explained. SYSGEN can also be run interactively by using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables which is designed for use with FEPS is presented.
IGDS/TRAP Interface Program (ITIP). Software Design Document
NASA Technical Reports Server (NTRS)
Jefferys, Steve; Johnson, Wendell
1981-01-01
The preliminary design of the IGDS/TRAP Interface Program (ITIP) is described. The ITIP is implemented on the PDP 11/70 and interfaces directly with the Interactive Graphics Design System and the Data Management and Retrieval System. The program provides an efficient method for developing a network flow diagram. Performance requirements, operational requirements, and design requirements are discussed along with sources and types of input and destinations and types of output. Information processing functions and data base requirements are also covered.
Description of the IV + V System Software Package.
ERIC Educational Resources Information Center
Microcomputers for Information Management: An International Journal for Library and Information Services, 1984
1984-01-01
Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…
WIDOWAC (Wing Design Optimization With Aeroelastic Constraints): Program manual
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Starnes, J. H., Jr.
1974-01-01
User and programmer documentation for the WIDOWAC programs is given. WIDOWAC may be used for the design of minimum-mass wing structures subjected to flutter, strength, and minimum-gage constraints. The wing structure is modeled by finite elements, flutter conditions may be both subsonic and supersonic, and mathematical programming methods are used for the optimization procedure. The user documentation gives general directions on how the programs may be used and describes their limitations; in addition, program input and output are described, and example problems are presented. A discussion of computational algorithms and flow charts of the WIDOWAC programs and major subroutines is also given.
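A hedged sketch of the kind of mathematical-programming problem described (an illustrative formulation; the symbols are assumptions, not taken from the manual):

\[
\begin{aligned}
\min_{\mathbf{t}} \quad & m(\mathbf{t}) = \sum_i \rho_i A_i t_i && \text{(wing structural mass)}\\
\text{subject to} \quad & V_f(\mathbf{t}) \ge V_{\text{req}} && \text{(flutter-speed constraint, subsonic or supersonic)}\\
& \sigma_j(\mathbf{t}) \le \sigma_{\text{allow}} && \text{(strength constraints)}\\
& t_i \ge t_{\min} && \text{(minimum gage)},
\end{aligned}
\]

where t is the vector of finite-element thicknesses used as design variables.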
Development of An Intelligent Flight Propulsion Control System
NASA Technical Reports Server (NTRS)
Calise, A. J.; Rysdyk, R. T.; Leonhardt, B. K.
1999-01-01
The initial design and demonstration of an Intelligent Flight Propulsion and Control System (IFPCS) is documented. The design is based on the implementation of a nonlinear adaptive flight control architecture. This initial design of the IFPCS enhances flight safety by using propulsion sources to provide redundancy in flight control. The IFPCS enhances the conventional gain-scheduled approach in significant ways: (1) the IFPCS provides a backup flight control system that results in consistent responses over a wide range of unanticipated failures; (2) the IFPCS is applicable to a variety of aircraft models without redesign; and (3) it significantly reduces the laborious research and design necessary in a gain-scheduled approach. The control augmentation is detailed within an approximate Input-Output Linearization setting. The availability of propulsion only provides two control inputs: symmetric and differential thrust. Earlier Propulsion Control Augmentation (PCA) work performed by NASA provided for a trajectory controller with pilot command input of glidepath and heading. This work is aimed at demonstrating the flexibility of the IFPCS in providing consistency in flying qualities under a variety of failure scenarios. This report documents the initial design phase, where propulsion only is used. Results confirm that the engine dynamics and associated hard nonlinearities result in poor handling qualities at best. However, as demonstrated in simulation, the IFPCS is capable of results similar to the gain-scheduled designs of the NASA PCA work. The IFPCS design uses crude estimates of aircraft behaviour. The adaptive control architecture demonstrates robust stability and provides robust performance. In this work, robust stability means that all states, errors, and adaptive parameters remain bounded under a wide class of uncertainties and input and output disturbances. Robust performance is measured in the quality of the tracking. The results demonstrate the flexibility of the IFPCS architecture and the ability to provide robust performance under a broad range of uncertainty. Robust stability is proved using Lyapunov-like analysis. Future development of the IFPCS will include integration of conventional control surfaces with the use of propulsion augmentation, and utilization of available lift and drag devices, to demonstrate adaptive control capability under a greater variety of failure scenarios. Further work will specifically address the effects of actuator saturation.
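For context only, the textbook input-output linearization construction that such an approximate-inversion adaptive architecture builds on can be sketched as follows (standard form; this is not reproduced from the report). For a system with dynamics \(\dot{x} = f(x) + g(x)u\), output \(y = h(x)\), and relative degree r, the linearizing control is

\[
u = \frac{1}{L_g L_f^{\,r-1} h(x)}\left(v - L_f^{\,r} h(x)\right),
\]

which gives \(y^{(r)} = v\). When f and g are known only approximately (the "crude estimates" noted above), the commanded pseudo-control v is typically augmented with an adaptive term \(v_{\text{ad}}\) trained online to cancel the inversion error, with boundedness of states, errors, and adaptive parameters argued by Lyapunov-like analysis.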
CATS--Computer Assisted Teaching in Science.
ERIC Educational Resources Information Center
Barron, Marcelline A.
This document contains the listings for 46 computer programs which are designed to teach various concepts in chemistry and physics. Significant time was spent in writing programs in which students would input chemical and physical data from their laboratory experiments. No significant time was spent writing drill and practice programs other than…
Program Description: Financial Master File Processor-SWRL Financial System.
ERIC Educational Resources Information Center
Ideda, Masumi
Computer routines designed to produce various management and accounting reports required by the Southwest Regional Laboratory's (SWRL) Financial System are described. Input data requirements and output report formats are presented together with a discussion of the Financial Master File updating capabilities of the system. This document should be…
SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION
The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE (Sparse Matrix Operator Kernel Emissions)...
NASA Technical Reports Server (NTRS)
Miura, H.; Schmit, L. A., Jr.
1976-01-01
The program documentation and user's guide for the ACCESS-1 computer program is presented. ACCESS-1 is a research oriented program which implements a collection of approximation concepts to achieve excellent efficiency in structural synthesis. The finite element method is used for structural analysis and general mathematical programming algorithms are applied in the design optimization procedure. Implementation of the computer program, preparation of input data and basic program structure are described, and three illustrative examples are given.
Aero/structural tailoring of engine blades (AERO/STAEBL)
NASA Technical Reports Server (NTRS)
Brown, K. W.
1988-01-01
This report describes the Aero/Structural Tailoring of Engine Blades (AERO/STAEBL) program, which is a computer code used to perform engine fan and compressor blade aero/structural numerical optimizations. These optimizations seek a blade design of minimum operating cost that satisfies realistic blade design constraints. This report documents the overall program (i.e., input, optimization procedures, approximate analyses) and also provides a detailed description of the validation test cases.
NASA Astrophysics Data System (ADS)
Venkrbec, Vaclav; Bittnerova, Lucie
2017-12-01
Building information modeling (BIM) can support effectiveness during many activities in the AEC industry, including the preparation of a construction-technological project. This paper presents an approach to using a building information model in higher education, especially during work on a diploma thesis and its supervision. The diploma thesis is project-based work that aims to compile a construction-technological project for a selected construction. The paper describes the use of input data and working with them, and compares this process with standard input data such as printed design documentation. The effectiveness of using the building information model as input data for a construction-technological project is described in the conclusion.
Chinese-English Machine Translation System.
ERIC Educational Resources Information Center
Wang, William S-Y; And Others
The report documents results of a two-year R&D effort directed at the completion of a prototype system for Chinese-English machine translation of S&T literature. The system, designated QUINCE, accepts Chinese input exactly as printed, with no pre-editing of any kind, and produces English output on experimental basis. Coding of Chinese text via…
Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN
NASA Technical Reports Server (NTRS)
Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.
1996-01-01
A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.
A Ka-Band Wide-Bandgap Solid-State Power Amplifier: Architecture Performance Estimates
NASA Technical Reports Server (NTRS)
Epp, L.; Khan, P.; Silva, A.
2005-01-01
Motivated by recent advances in wide-bandgap (WBG) gallium nitride (GaN) semiconductor technology, there is considerable interest in developing efficient solid-state power amplifiers (SSPAs) as an alternative to the traveling-wave tube amplifier (TWTA) for space applications. This article documents the results of a study to investigate power-combining technology and SSPA architectures that can enable a 120-W, 40 percent power-added efficiency (PAE) SSPA. Results of the study indicate that architectures based on at least three power combiner designs are likely to enable the target SSPA. The proposed architectures can power combine 16 to 32 individual monolithic microwave integrated circuits (MMICs) with >80 percent combining efficiency. This corresponds to MMIC requirements of 5- to 10-W output power and >48 percent PAE. For the three proposed architectures [1], detailed analysis and design of the power combiner are presented. The first architecture studied is based on a 16-way septum combiner that offers low loss and high isolation over the design band of 31 to 36 GHz. Analysis of a 2-way prototype septum combiner showed an input match >25 dB, output match >30 dB, insertion loss <0.02 dB, and isolation >30 dB over the design band. A 16-way design, based on cascading this combiner in a binary fashion, is documented. The second architecture is based on a 24-way waveguide radial combiner. A prototype 24-way radial base was analyzed to have an input match >30 dB (under equal excitation of all input ports). The match of the mode transducer that forms the output of a radial combiner was found to be >27 dB. The functional bandwidth of the radial base and mode transducer, which together will form a radial combiner/divider, exceeded the design band. The third architecture employs a 32-way, parallel-plate radial combiner. Simulation results indicated an input match >24 dB, output match >22 dB, insertion loss <0.23 dB, and adjacent port isolation >20 dB over the design band. All three architectures utilize a low-loss MMIC amplifier module based on commercial MMIC packaging and a custom microstrip-to-rectangular-waveguide transition. The insertion loss of the module is expected to be 0.45 dB over the design band.
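The combining arithmetic implied by these figures can be checked directly (simple power bookkeeping based on the numbers quoted above, not additional analysis from the report):

\[
P_{\text{out}} = N \, P_{\text{MMIC}} \, \eta_{\text{comb}}, \qquad
16 \times 10\ \text{W} \times 0.80 = 128\ \text{W}, \qquad
32 \times 5\ \text{W} \times 0.80 = 128\ \text{W},
\]

either of which exceeds the 120-W target at the stated >80 percent combining efficiency.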
Nuclear Criticality Safety Data Book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollenbach, D. F.
The objective of this document is to support the revision of criticality safety process studies (CSPSs) for the Uranium Processing Facility (UPF) at the Y-12 National Security Complex (Y-12). This design analysis and calculation (DAC) document contains development and justification for generic inputs typically used in Nuclear Criticality Safety (NCS) DACs to model both normal and abnormal conditions of processes at UPF to support CSPSs. This will provide consistency between NCS DACs and efficiency in preparation and review of DACs, as frequently used data are provided in one reference source.
Romney, Wendy; Salbach, Nancy; Parrott, James Scott; Deutsch, Judith E
2018-04-16
Little is known about the process of engaging key stakeholders to select and design a knowledge translation (KT) intervention to increase the use of an outcome measure using audit and feedback. The purpose of this case report was to describe the development of a KT intervention designed with organizational support to increase physical therapists' (PTs) use of a selected outcome measure in an inpatient sub-acute rehabilitation hospital. Eleven PTs who worked at a sub-acute rehabilitation hospital participated. After determining organizational support, a mixed methods barrier assessment including a chart audit, questionnaire, and a focus group with audit and feedback was used to select an outcome measure and design a locally tailored intervention. The intervention was mapped using the Theoretical Domains Framework (TDF). One investigator acted as knowledge broker and co-designed the intervention with clinician and supervisor support. The 4-m walk test was selected through a group discussion facilitated by the knowledge broker. Support from the facility and input from the key stakeholders guided the design of a tailored KT intervention to increase use of gait speed. The intervention design included an interactive educational meeting, with documentation and environmental changes. Input from the clinicians on the educational meeting, documentation changes and placement of tracks, and support from the supervisor were used to design and locally adapt a KT intervention to change assessment practice among PTs in an inpatient sub-acute rehabilitation hospital. Implementation and evaluation of the intervention is underway.
Lilholt, Lars; Haubro, Camilla Dremstrup; Møller, Jørn Munkhof; Aarøe, Jens; Højen, Anne Randorff; Gøeg, Kirstine Rosenbeck
2013-01-01
It is well-established that to increase acceptance of electronic clinical documentation tools, such as electronic health record (EHR) systems, it is important to have a strong relationship between those who document the clinical encounters and those who reap the benefits of digitalized and more structured documentation [1]. Therefore, templates for EHR systems benefit from being closely related to clinical practice, with a strong focus on primarily solving clinical problems. Clinical use as a driver for structured documentation has been the focus of the acute-physical-examination template (APET) development in the North Denmark Region. The template was developed through a participatory design where precision and clarity of documentation were prioritized as well as fast registration. The resulting template has approximately 700 easily accessible input possibilities and will be evaluated in clinical practice in the first quarter of 2013.
Harbaugh, Arlen W.; Banta, Edward R.; Hill, Mary C.; McDonald, Michael G.
2000-01-01
MODFLOW is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium by using a finite-difference method. Although MODFLOW was designed to be easily enhanced, the design was oriented toward additions to the ground-water flow equation. Frequently there is a need to solve additional equations; for example, transport equations and equations for estimating parameter values that produce the closest match between model-calculated heads and flows and measured values. This report documents a new version of MODFLOW, called MODFLOW-2000, which is designed to accommodate the solution of equations in addition to the ground-water flow equation. This report is a user's manual. It contains an overview of the old and added design concepts, documents one new package, and contains input instructions for using the model to solve the ground-water flow equation.
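For reference, the governing equation that MODFLOW solves by finite differences is the standard form given in the MODFLOW documentation (the source/sink sign convention can vary):

\[
\frac{\partial}{\partial x}\!\left(K_{xx}\frac{\partial h}{\partial x}\right) +
\frac{\partial}{\partial y}\!\left(K_{yy}\frac{\partial h}{\partial y}\right) +
\frac{\partial}{\partial z}\!\left(K_{zz}\frac{\partial h}{\partial z}\right) + W
= S_s \frac{\partial h}{\partial t},
\]

where h is hydraulic head, K_xx, K_yy, and K_zz are hydraulic conductivities along the coordinate axes, W is a volumetric source/sink flux per unit volume, and S_s is specific storage.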
NASA Astrophysics Data System (ADS)
Manninen, L. M.
1993-12-01
The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included in the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of the different logistic simulation models developed by the project participants. TKKMOD cannot be run without this shell. The report describes only the simulation principles and model-specific parameters of TKKMOD and gives model-specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of the long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.
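A minimal sketch of the kind of hourly dispatch bookkeeping such a wind-diesel-battery simulation performs (illustrative logic and variable names only, not TKKMOD's actual algorithm):

    def simulate(wind_power, load, batt_capacity_kwh, diesel_rated_kw,
                 batt_eff=0.9, dt_h=1.0):
        """Hour-by-hour energy balance: wind first, then battery, then diesel."""
        soc = batt_capacity_kwh / 2.0        # start half charged
        fuel_kwh = 0.0                       # diesel energy delivered (fuel proxy)
        diesel_starts = 0
        diesel_on = False
        for p_wind, p_load in zip(wind_power, load):
            surplus = (p_wind - p_load) * dt_h
            if surplus >= 0:                 # charge the battery, spill the rest
                soc = min(batt_capacity_kwh, soc + surplus * batt_eff)
                diesel_on = False
            else:
                deficit = -surplus
                from_batt = min(soc * batt_eff, deficit)   # discharge battery first
                soc -= from_batt / batt_eff
                deficit -= from_batt
                if deficit > 0:              # run the diesel for the remainder
                    if not diesel_on:
                        diesel_starts += 1
                        diesel_on = True
                    fuel_kwh += min(deficit, diesel_rated_kw * dt_h)
                else:
                    diesel_on = False
        return fuel_kwh, diesel_starts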
Costs Associated With Compressed Natural Gas Vehicle Fueling Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, M.; Gonzales, J.
2014-09-01
This document is designed to help fleets understand the cost factors associated with fueling infrastructure for compressed natural gas (CNG) vehicles. It provides estimated cost ranges for various sizes and types of CNG fueling stations and an overview of factors that contribute to the total cost of an installed station. The information presented is based on input from professionals in the natural gas industry who design, sell equipment for, and/or own and operate CNG stations.
Wildcat5 for Windows, a rainfall-runoff hydrograph model: user manual and documentation
R. H. Hawkins; A. Barreto-Munoz
2016-01-01
Wildcat5 for Windows (Wildcat5) is an interactive Windows Excel-based software package designed to assist watershed specialists in analyzing rainfall runoff events to predict peak flow and runoff volumes generated by single-event rainstorms for a variety of watershed soil and vegetation conditions. Model inputs are: (1) rainstorm characteristics, (2) parameters related...
Advanced lighting guidelines: 1993. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eley, C.; Tolen, T.M.; Benya, J.R.
1993-12-31
The 1993 Advanced Lighting Guidelines document consists of twelve guidelines that provide an overview of specific lighting technologies and design application techniques utilizing energy-efficient lighting practice. Lighting Design Practice assesses energy-efficient lighting strategies, discusses lighting issues, and explains how to obtain quality lighting design and consulting services. Luminaires and Lighting Systems surveys luminaire equipment designed to take advantage of advanced technology lamp products and includes performance tables that allow for accurate estimation of luminaire light output and power input. The additional ten guidelines -- Computer-Aided Lighting Design, Energy-Efficient Fluorescent Ballasts, Full-Size Fluorescent Lamps, Compact Fluorescent Lamps, Tungsten-Halogen Lamps, Metal Halide and HPS Lamps, Daylighting and Lumen Maintenance, Occupant Sensors, Time Scheduling Systems, and Retrofit Control Technologies -- each provide a product technology overview, discuss current products on the lighting equipment market, and provide application techniques. This document is intended for use by electric utility personnel involved in lighting programs, lighting designers, electrical engineers, architects, lighting manufacturers' representatives, and other lighting professionals.
Productivity increase through implementation of CAD/CAE workstation
NASA Technical Reports Server (NTRS)
Bromley, L. K.
1985-01-01
The tracking and communication division computer aided design/computer aided engineering system is now operational. The system is utilized in an effort to automate certain tasks that were previously performed manually. These tasks include detailed test configuration diagrams of systems under certification test in the ESTL, floorplan layouts of future planned laboratory reconfigurations, and other graphical documentation of division activities. The significant time savings achieved with this CAD/CAE system are examined: (1) input of drawings and diagrams; (2) editing of initial drawings; (3) accessibility of the data; and (4) added versatility. It is shown that the Applicon CAD/CAE system, with its ease of input and editing, the accessibility of data, and its added versatility, has made more efficient many of the necessary but often time-consuming tasks associated with engineering design and testing.
NASA Technical Reports Server (NTRS)
West, R. S.
1975-01-01
The system is described as a computer-based system designed to track the status of problems and corrective actions pertinent to space shuttle hardware. The input, processing, output, and performance requirements of the system are presented along with standard display formats and examples. Operational requirements, hardware requirements, and test requirements are also included.
Waste Package Component Design Methodology Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.C. Mecham
2004-07-12
This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to "Quality Assurance Requirements and Description". The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others are interested to various levels of detail in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational requirements of the YMP. Four waste package configurations have been selected to illustrate the application of the methodology during the licensing process. These four configurations are the 21-pressurized water reactor absorber plate waste package (21-PWRAP), the 44-boiling water reactor waste package (44-BWR), the 5 defense high-level radioactive waste (HLW) DOE spent nuclear fuel (SNF) codisposal short waste package (5-DHLWDOE SNF Short), and the naval canistered SNF long waste package (Naval SNF Long). Design work for the other six waste packages will be completed at a later date using the same design methodology. These include the 24-boiling water reactor waste package (24-BWR), the 21-pressurized water reactor control rod waste package (21-PWRCR), the 12-pressurized water reactor waste package (12-PWR), the 5 defense HLW DOE SNF codisposal long waste package (5-DHLWDOE SNF Long), the 2 defense HLW DOE SNF codisposal waste package (2-MC012-DHLW), and the naval canistered SNF short waste package (Naval SNF Short). This report is only part of the complete design description. Other reports related to the design include the design reports, the waste package system description documents, manufacturing specifications, and numerous documents for the many detailed calculations. The relationships between this report and other design documents are shown in Figure 1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Charles R.; Weck, Philippe F.; Vaughn, Palmer
Report RWEV-REP-001, Analysis of Postclosure Groundwater Impacts for a Geologic Repository for the Disposal of Spent Nuclear Fuel and High Level Radioactive Waste at Yucca Mountain, Nye County, Nevada was issued by the DOE in 2009 and is currently being updated. Sandia National Laboratories (SNL) provided support for the original document, performing calculations and extracting data from the Yucca Mountain Performance Assessment Model that were used as inputs to the contaminant transport and dose calculations by Jason Associates Corporation, the primary developers of the DOE report. The inputs from SNL were documented in LSA-AR-037, Inputs to Jason Associates Corporation in Support of the Postclosure Repository Supplemental Environmental Impact Statement. To support the updating of the original Groundwater Impacts document, SNL has reviewed the inputs provided in LSA-AR-037 to verify that they are current and appropriate for use. The results of that assessment are documented here.
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
Appendices A-K of the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. These include installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state-of-the-art codes and procedures for the analysis of a liquid rocket engine combustor's steady-state combustion performance and combustion stability. ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady-state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.
The Barriers and Causes of Building Information Modelling Usage for Interior Design Industry
NASA Astrophysics Data System (ADS)
Hamid, A. B. Abd; Taib, M. Z. Mohd; Razak, A. H. N. Abdul; Embi, M. R.
2017-12-01
Building Information Modeling (BIM) has developed alongside improvements in the construction industry, purposely to simulate the design, management, construction, and documentation. It facilitates and monitors the construction through visualization and emphasizes various inputs to virtually design and construct a building using specific software. This study aims to identify and elaborate barriers to BIM usage in the interior design industry in Malaysia. The study begins with a pilot survey of sixteen randomly chosen respondents. Respondents are attached to interior design firms that are registered with Lembaga Arkitek Malaysia (LAM). The research findings are expected to provide significant information to encourage BIM adoption among interior design firms.
NASA Technical Reports Server (NTRS)
Izumi, K. H.; Thompson, J. L.; Groce, J. L.; Schwab, R. W.
1986-01-01
The design requirements for a 4D path definition algorithm are described. These requirements were developed for the NASA ATOPS as an extension of the Local Flow Management/Profile Descent algorithm. They specify the processing flow, functional and data architectures, and system input requirements, and recommend the addition of a broad path revision (reinitialization) function capability. The document also summarizes algorithm design enhancements and the implementation status of the algorithm on an in-house PDP-11/70 computer. Finally, the requirements for the pilot-computer interfaces, the lateral path processor, and the guidance and steering function are described.
Thermal/Structural Tailoring of Engine Blades (T/STAEBL) User's manual
NASA Technical Reports Server (NTRS)
Brown, K. W.
1994-01-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.
Thermal/Structural Tailoring of Engine Blades (T/STAEBL): User's manual
NASA Astrophysics Data System (ADS)
Brown, K. W.
1994-03-01
The Thermal/Structural Tailoring of Engine Blades (T/STAEBL) system is a computer code that is able to perform numerical optimizations of cooled jet engine turbine blades and vanes. These optimizations seek an airfoil design of minimum operating cost that satisfies realistic design constraints. This report documents the organization of the T/STAEBL computer program, its design and analysis procedure, its optimization procedure, and provides an overview of the input required to run the program, as well as the computer resources required for its effective use. Additionally, usage of the program is demonstrated through a validation test case.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neitzel, D.A.; McKenzie, D.H.
To minimize adverse impact on aquatic ecosystems resulting from the operation of water intake structures, design engineers must have relevant information on the behavior, physiology and ecology of local fish and shellfish. Identification of stimulus/response relationships and the environmental factors that influence them is the first step in incorporating biological information in the design, location or modification of water intake structures. A procedure is presented in this document for providing biological input to engineers who are designing, locating or modifying a water intake structure. The authors discuss sources of stimuli at water intakes, historical approaches in assessing potential/actual impact and review biological information needed for intake design.
2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Wesley; Mai, Trieu; Logan, Jeffrey
The National Renewable Energy Laboratory is conducting a study sponsored by the Office of Energy Efficiency and Renewable Energy (EERE) that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs), and a diverse set of potential futures (standard scenarios), initially for electric sector analysis.
Software Engineering Practices: Their Impact on the Design of a Program Maintenance Manual.
1982-12-01
listings. Myers (Ref. 18) states: Since we already have the code, why not let it serve as the logic documentation? ... documentation such as a flowchart ... Recommended Locations for Comments: 1. At the beginning of each module include the module name, the current date, the module's function, and its inputs and outputs
User manual for SPLASH (Single Panel Lamp and Shroud Helper).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, Marvin Elwood
2006-02-01
The radiant heat test facility develops test sets providing well-characterized thermal environments, often representing fires. Many of the components and procedures have become standardized to such an extent that the development of a specialized design tool to determine optimal configurations for radiant heat experiments was appropriate. SPLASH (Single Panel Lamp and Shroud Helper) is that tool. SPLASH is implemented as a user-friendly, Windows-based program that allows a designer to describe a test setup in terms of parameters such as number of lamps, power, position, and separation distance. This document is a user manual for that software. Any incidental descriptions of theory are only for the purpose of defining the model inputs. The theory for the underlying model is described in SAND2005-2947 (Ref. [1]). SPLASH provides a graphical user interface to define lamp panel and shroud designs parametrically, solves the resulting radiation enclosure problem for up to 2500 surfaces, and provides post-processing to facilitate understanding and documentation of analyzed designs.
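For orientation, a standard formulation of the gray, diffuse radiation enclosure problem relates net surface fluxes to view factors and temperatures as follows; this is a generic statement of the class of problem, not a quotation of the SPLASH model documented in SAND2005-2947.

```latex
% Net-radiation equations for a gray, diffuse enclosure of N surfaces:
% q_j is the net flux on surface j, F_ij the view factor from i to j,
% eps_j the emissivity, sigma the Stefan-Boltzmann constant.
\[
\sum_{j=1}^{N}\left(\frac{\delta_{ij}}{\varepsilon_j}
  - F_{ij}\,\frac{1-\varepsilon_j}{\varepsilon_j}\right) q_j
  \;=\;
  \sum_{j=1}^{N} F_{ij}\,\sigma\left(T_i^{4}-T_j^{4}\right),
  \qquad i = 1,\dots,N .
\]
```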
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.
2008-12-01
Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinguished from, and more difficult than, non-collaborative efforts because of a large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; and evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle for collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results recording and presentation components, plus ways to couple and link to other models and tools; 2) explicit structuring of both input data and the metadata that describes data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model; 3) in-line documentation of model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others; and 4) flexible programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway. The presentation concludes by identifying some future directions for collaborative modeling, including geo-spatial display and analysis, real-time operations, and internet-based tools, plus the design and programming needed to implement these capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This study is a requirements document that presents analysis for the functional description of the master pump shutdown system. This document identifies the sources of the requirements and/or how they were derived. Each requirement is validated either by quoting the source or by an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment. The requirements in this study apply to the first phase of the W314 Project. This document has been updated during the definitive design portion of the first phase of the W314 Project to capture additional software requirements and is planned to be updated during the second phase of the W314 Project to cover the second phase of the project's scope.
Praxis language reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, J.H.
1981-01-01
This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickens, J.K.
1988-03-01
This document provides a complete listing of the FORTRAN program SCINFUL, a program designed to provide the calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The incident design neutron energy range is 0.1 to 80 MeV. Preparation of input to the program is discussed, as are important features of the output. Also included is a FORTRAN listing of a subsidiary program applicable to the output of SCINFUL. This user-interactive program, named SCINSPEC, reformats the output of SCINFUL into a standard spectrum form involving either equal light-unit or equal proton-energy intervals. Examples of input to this program and corresponding output are given.
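As an illustration of the kind of reformatting SCINSPEC performs, the sketch below rebins a response histogram given on arbitrary bin edges onto equal-width intervals; the bin edges, counts, and function name are invented for the example and do not reflect SCINFUL's actual output format (and the sketch is Python rather than the report's FORTRAN).

```python
import numpy as np

# Rebin a histogram defined on arbitrary bin edges onto equal-width bins by
# distributing each original bin's counts in proportion to bin overlap.
def rebin_equal_width(edges, counts, n_new):
    new_edges = np.linspace(edges[0], edges[-1], n_new + 1)
    new_counts = np.zeros(n_new)
    for i, c in enumerate(counts):
        lo, hi = edges[i], edges[i + 1]
        for j in range(n_new):
            nlo, nhi = new_edges[j], new_edges[j + 1]
            overlap = max(0.0, min(hi, nhi) - max(lo, nlo))
            if overlap > 0.0:
                new_counts[j] += c * overlap / (hi - lo)
    return new_edges, new_counts

edges = np.array([0.1, 0.5, 1.2, 2.0, 5.0])   # MeV, uneven widths (illustrative)
counts = np.array([40.0, 25.0, 10.0, 5.0])
print(rebin_equal_width(edges, counts, 10))
```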
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goltz, G.; Weiner, H.
A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document provides all the information necessary to access the DSPA programs, to input required data, and to generate appropriate Design Synthesis or Performance Analysis output.
A rotor technology assessment of the advancing blade concept
NASA Technical Reports Server (NTRS)
Pleasants, W. A.
1983-01-01
A rotor technology assessment of the Advancing Blade Concept (ABC) was conducted in support of a preliminary design study. The analytical methodology modifications and inputs, the correlation, and the results of the assessment are documented. The primary emphasis was on the high-speed forward flight performance of the rotor. The correlation data base included both the wind tunnel and the flight test results. An advanced ABC rotor design was examined; the suitability of the ABC for a particular mission was not considered. The objective of this technology assessment was to provide estimates of the performance potential of an advanced ABC rotor designed for high speed forward flight.
Hollow cathode heater development for the Space Station plasma contactor
NASA Technical Reports Server (NTRS)
Soulas, George C.
1993-01-01
A hollow cathode-based plasma contactor has been selected for use on the Space Station. During the operation of the plasma contactor, the hollow cathode heater will endure approximately 12000 thermal cycles. Since a hollow cathode heater failure would result in a plasma contactor failure, a hollow cathode heater development program was established to produce a reliable heater design. The development program includes the heater design, process documents for both heater fabrication and assembly, and heater testing. The heater design was a modification of a sheathed ion thruster cathode heater. Three heaters have been tested to date using direct current power supplies. Performance testing was conducted to determine input current and power requirements for achieving activation and ignition temperatures, single unit operational repeatability, and unit-to-unit operational repeatability. Comparisons of performance testing data at the ignition input current level for the three heaters show the unit-to-unit repeatability of input power and tube temperature near the cathode tip to be within 3.5 W and 44 degrees C, respectively. Cyclic testing was then conducted to evaluate reliability under thermal cycling. The first heater, although damaged during assembly, completed 5985 ignition cycles before failing. Two additional heaters were subsequently fabricated and have completed 3178 cycles to date in an on-going test.
Timeline Resource Analysis Program (TRAP): User's manual and program document
NASA Technical Reports Server (NTRS)
Sessler, J. G.
1981-01-01
The Timeline Resource Analysis Program (TRAP), developed for scheduling and timelining problems, is described. Given an activity network, TRAP generates timeline plots, resource histograms, and tabular summaries of the network, schedules, and resource levels. It is written in ANSI FORTRAN for the Honeywell SIGMA 5 computer and operates in the interactive mode using the TEKTRONIX 4014-1 graphics terminal. The input network file may be a standard SIGMA 5 file or one generated using the Interactive Graphics Design System. The timeline plots can be displayed in two orderings: according to the sequence in which the tasks were read on input, and a waterfall sequence in which the tasks are ordered by start time. The input order is especially meaningful when the network consists of several interacting subnetworks. The waterfall sequence is helpful in assessing the project status at any point in time.
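As a small illustration of the two display orderings described above, the sketch below (in Python, with invented task data; TRAP itself is written in FORTRAN) contrasts the input order with the waterfall ordering by start time.

```python
# Minimal sketch of the two timeline orderings: as-read input order versus a
# "waterfall" ordering by task start time. Task data are placeholders.
tasks = [
    {"name": "integrate", "start": 12.0, "duration": 4.0},
    {"name": "design",    "start": 0.0,  "duration": 8.0},
    {"name": "test",      "start": 16.0, "duration": 6.0},
]

input_order = [t["name"] for t in tasks]                        # sequence as read on input
waterfall = [t["name"] for t in sorted(tasks, key=lambda t: t["start"])]
print(input_order, waterfall)
```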
ESCHER: An interactive mesh-generating editor for preparing finite-element input
NASA Technical Reports Server (NTRS)
Oakes, W. R., Jr.
1984-01-01
ESCHER is an interactive mesh generation and editing program designed to help the user create a finite-element mesh, create additional input for finite-element analysis, including initial conditions, boundary conditions, and slidelines, and generate a NEUTRAL FILE that can be postprocessed for input into several finite-element codes, including ADINA, ADINAT, DYNA, NIKE, TSAAS, and ABAQUS. Two important ESCHER capabilities, interactive geometry creation and mesh archival storage, are described in detail. Also described is the interactive command language and the use of interactive graphics. The archival storage and restart file is a modular, entity-based mesh data file. Modules of this file correspond to separate editing modes in the mesh editor, with data definition syntax preserved between the interactive commands and the archival storage file. Because ESCHER was expected to be highly interactive, extensive user documentation was provided in the form of an interactive HELP package.
Carey, A.E.; Prudic, David E.
1996-01-01
Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machine (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) operating system version 5.0 or greater.
Matsuo, Toshihiko; Gochi, Akira; Hirakawa, Tsuyoshi; Ito, Tadashi; Kohno, Yoshihisa
2010-10-01
General electronic medical records systems remain insufficient for ophthalmology outpatient clinics, which must handle many ophthalmic examinations and images for a large number of patients. Filing systems for documents and images by Yahgee Document View (Yahgee, Inc.) were introduced on the platform of a general electronic medical records system (Fujitsu, Inc.). An outpatient flow management system and an electronic medical records system for ophthalmology were constructed. All images from ophthalmic appliances were transported to Yahgee Image by the MaxFile gateway system (P4 Medic, Inc.). The flow of outpatients going through examinations such as visual acuity testing was monitored by the list "Ophthalmology Outpatients List" in Yahgee Workflow, in addition to the list "Patients Reception List" in Fujitsu. Patients' identification numbers were scanned with bar code readers attached to ophthalmic appliances. Dual monitors were placed in doctors' rooms to show Fujitsu Medical Records on the left-hand monitor and ophthalmic charts of Yahgee Document on the right-hand monitor. The data of manually input visual acuity and automatically exported autorefractometry and non-contact tonometry on a new template, MaxFile ED, were then automatically transported to designated boxes on the ophthalmic charts of Yahgee Document. Images such as fundus photographs, fluorescein angiograms, optical coherence tomographic and ultrasound scans were viewed in Yahgee Image and were copied and pasted to assigned boxes on the ophthalmic charts. Orders such as appointments, drug prescriptions, fee and diagnosis input, central laboratory tests, and surgical theater and ward room reservations were placed using functions of the Fujitsu electronic medical records system. The combination of the Fujitsu electronic medical records and Yahgee Document View systems enabled the University Hospital to examine the same number of outpatients as prior to the implementation of the computerized filing system.
Strategy for Nanotechnology-Related Environmental, Health, and Safety Research
2008-02-01
Facilitate wide dissemination of research results and other non-proprietary EHS information ... The strategy presented here is based on the state of science ... products and processes in which they are used. The needs were also informed by input from non-Federal experts on risk assessment issues and by relevant ...
Human factors aspects of control room design
NASA Technical Reports Server (NTRS)
Jenkins, J. P.
1983-01-01
A plan for the design and analysis of a multistation control room is reviewed. It is found that acceptance of the computer-based information system by the users in the control room is mandatory for mission and system success. Criteria to improve the computer/user interface include: match of system input/output with the user; reliability, compatibility and maintainability; easy to learn with little training needed; self-descriptive system; system under user control; transparent language, format and organization; correspondence with user expectations; adaptability to user experience level; fault tolerance; dialog capability; user communication needs reflected in flexibility, complexity, power and information load; integrated system; and documentation.
Unwin, Ian; Jansen-van der Vliet, Martine; Westenbrink, Susanne; Presser, Karl; Infanger, Esther; Porubska, Janka; Roe, Mark; Finglas, Paul
2016-02-15
The EuroFIR Document and Data Repositories are being developed as accessible collections of source documents, including grey literature, and the food composition data reported in them. These Repositories will contain source information available to food composition database compilers when selecting their nutritional data. The Document Repository was implemented as searchable bibliographic records in the Europe PubMed Central database, which links to the documents online. The Data Repository will contain original data from source documents in the Document Repository. Testing confirmed the FoodCASE food database management system as a suitable tool for the input, documentation and quality assessment of Data Repository information. Data management requirements for the input and documentation of reported analytical results were established, including record identification and method documentation specifications. Document access and data preparation using the Repositories will provide information resources for compilers, eliminating duplicated work and supporting unambiguous referencing of data contributing to their compiled data. Copyright © 2014 Elsevier Ltd. All rights reserved.
Center-TRACON Automation System (CTAS) En Route Trajectory Predictor Requirements and Capabilities
NASA Technical Reports Server (NTRS)
Vivona, Robert; Cate, Karen Tung
2013-01-01
This requirements framework document is designed to support the capture of requirements and capabilities for state-of-the-art trajectory predictors (TPs). This framework has been developed to assist TP experts in capturing a clear, consistent, and cross-comparable set of requirements and capabilities. The goal is to capture capabilities (types of trajectories that can be built), functional requirements (including inputs and outputs), non-functional requirements (including prediction accuracy and computational performance), approaches for constraint relaxation, and input uncertainties. The sections of this framework are based on the Common Trajectory Predictor structure developed by the FAA/Eurocontrol Cooperative R&D Action Plan 16 Committee on Common Trajectory Prediction. It is assumed that the reader is familiar with the Common TP Structure. This initial draft is intended as a first-cut capture of the En Route TP Capabilities and Requirements. As such, it contains many annotations indicating possible logic errors in the CTAS code or in the description provided. The details of the annotations are intended to be worked out with NASA, and this document is to be updated at a later time.
Rurkhamet, Busagarin; Nanthavanij, Suebsak
2004-12-01
One important factor that leads to the development of musculoskeletal disorders (MSD) and cumulative trauma disorders (CTD) among visual display terminal (VDT) users is their work posture. While operating a VDT, a user's body posture is strongly influenced by the task, VDT workstation settings, and layout of computer accessories. This paper presents an analytic and rule-based decision support tool called EQ-DeX (an ergonomics and quantitative design expert system) that is developed to provide valid and practical recommendations regarding the adjustment of a VDT workstation and the arrangement of computer accessories. The paper explains the structure and components of EQ-DeX, input data, rules, and adjustment and arrangement algorithms. From input information such as gender, age, body height, task, etc., EQ-DeX uses analytic and rule-based algorithms to estimate quantitative settings of a computer table and a chair, as well as locations of computer accessories such as monitor, document holder, keyboard, and mouse. With the input and output screens that are designed using the concept of usability, the interactions between the user and EQ-DeX are convenient. Examples are also presented to demonstrate the recommendations generated by EQ-DeX.
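To make the idea of an analytic, rule-based recommendation concrete, here is a minimal sketch in the spirit of EQ-DeX. The anthropometric coefficients, task rules, and field names are illustrative assumptions, not the system's actual knowledge base or output.

```python
# Illustrative rule-based workstation recommendation. The coefficients and
# task rules below are assumptions for demonstration only.
def recommend_settings(body_height_cm: float, task: str) -> dict:
    seat_height = round(0.25 * body_height_cm, 1)     # assumed fraction of stature
    table_height = round(0.42 * body_height_cm, 1)    # assumed fraction of stature
    monitor_distance = 60.0 if task == "data entry" else 50.0  # assumed task rule
    return {
        "seat_height_cm": seat_height,
        "table_height_cm": table_height,
        "monitor_distance_cm": monitor_distance,
        "document_holder": "beside monitor" if task == "data entry" else "optional",
    }

print(recommend_settings(170.0, "data entry"))
```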
NASA Astrophysics Data System (ADS)
Foster, K.
1994-09-01
This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.
CTF Preprocessor User's Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria; Salko, Robert K.
2016-05-26
This document describes how a user should go about using the CTF pre-processor tool to create an input deck for modeling rod-bundle geometry in CTF. The tool was designed to generate input decks in a quick and less error-prone manner for CTF. The pre-processor is a completely independent utility, written in Fortran, that takes a reduced amount of input from the user. The information that the user must supply is basic information on bundle geometry, such as rod pitch, clad thickness, and axial location of spacer grids--the pre-processor takes this basic information and determines channel placement and connection information to be written to the input deck, which is the most time-consuming and error-prone segment of creating a deck. Creation of the model is also more intuitive, as the user can specify assembly and water-tube placement using visual maps instead of having to place them by determining channel/channel and rod/channel connections. As an example of the benefit of the pre-processor, a quarter-core model that contains 500,000 scalar-mesh cells was read into CTF from an input deck containing 200,000 lines of data. This 200,000 line input deck was produced automatically from a set of pre-processor decks that contained only 300 lines of data.
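A minimal sketch of the bookkeeping such a pre-processor automates is shown below: for a square rod lattice it enumerates the subchannels and records which four channels surround each rod. The numbering convention and function names are assumptions for illustration and do not reproduce CTF's actual deck format (and the sketch is Python rather than the tool's Fortran).

```python
# For an N x N square rod lattice there are (N+1) x (N+1) subchannels, and each
# rod touches the four channels at its corners. Numbering is illustrative only.
def subchannel_map(n_rods_per_side: int) -> dict:
    n_ch = n_rods_per_side + 1                      # channels per side
    def channel_id(i, j):
        return i * n_ch + j + 1                     # 1-based channel numbering
    rod_to_channels = {}
    for ri in range(n_rods_per_side):
        for rj in range(n_rods_per_side):
            rod = ri * n_rods_per_side + rj + 1     # 1-based rod numbering
            rod_to_channels[rod] = [
                channel_id(ri, rj), channel_id(ri, rj + 1),
                channel_id(ri + 1, rj), channel_id(ri + 1, rj + 1),
            ]
    return rod_to_channels

print(subchannel_map(2))   # 2 x 2 bundle: 4 rods, 9 subchannels
```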
Hybrid automated reliability predictor integrated work station (HiREL)
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1991-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated reliability (HiREL) workstation tool system marks another step toward the goal of producing a totally integrated computer aided design (CAD) workstation design capability. Since a reliability engineer must generally graphically represent a reliability model before he can solve it, the use of a graphical input description language increases productivity and decreases the incidence of error. The captured image displayed on a cathode ray tube (CRT) screen serves as a documented copy of the model and provides the data for automatic input to the HARP reliability model solver. The introduction of dependency gates to a fault tree notation allows the modeling of very large fault tolerant system models using a concise and visually recognizable and familiar graphical language. In addition to aiding in the validation of the reliability model, the concise graphical representation presents company management, regulatory agencies, and company customers a means of expressing a complex model that is readily understandable. The graphical postprocessor computer program HARPO (HARP Output) makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes.
Creating an AI modeling application for designers and developers
NASA Astrophysics Data System (ADS)
Houlette, Ryan; Fu, Daniel; Jensen, Randy
2003-09-01
Simulation developers often realize an entity's AI by writing a program that exhibits the intended behavior. These behaviors are often the product of design documents written by designers. These individuals, while possessing a vast knowledge of the subject matter, might not have any programming knowledge whatsoever. To address this disconnect between design and subsequent development, we have created an AI application whereby a designer or developer sketches an entity's AI using a graphical "drag and drop" interface to quickly articulate behavior using a UML-like representation of state charts. Aside from the design-level benefits, the application also features a runtime engine that takes the application's data as input along with a simulation or game interface, and makes the AI operational. We discuss our experience in creating such an application for both designer and developer.
A Report on Applying EEGnet to Discriminate Human State Effects on Task Performance
2018-01-01
whether we could identify what task the participant was performing from differences in the recorded brain time series. We modeled the relationship ... between input data (brain time series) and output labels (task A and task B) as an unknown function, and we found an optimal approximation of that ...
An automated program for reinforcement requirements for openings in cylindrical pressure vessels
NASA Technical Reports Server (NTRS)
Wilson, J. F.; Taylor, J. T.
1975-01-01
An automated interactive program for calculating the reinforcement requirements for openings in cylindrical pressure vessels subjected to internal pressure is described. The program is written for an electronic desk top calculator. The program calculates the required area of reinforcement for a given opening and compares this value with the area of reinforcement provided by a proposed design. All program steps, operating instructions, and example problems with input and sample output are documented.
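For context, a commonly used area-replacement check for an opening in a cylindrical shell under internal pressure has the following form; this is stated generically as an assumption about the class of method, not a quotation of this program's exact equations.

```latex
% Generic area-replacement check for an opening of diameter d in a cylindrical
% shell of inside radius R under internal pressure P, with allowable stress S
% and joint efficiency E (illustrative notation, not the program's listing):
\[
t_r = \frac{P R}{S E - 0.6 P}, \qquad
A_{\text{required}} = d\, t_r , \qquad
\text{design acceptable if } A_{\text{provided}} \ge A_{\text{required}} .
\]
```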
NASA Astrophysics Data System (ADS)
Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat @
2014-02-01
Deterministic Safety Analysis (DSA) is one of the mandatory requirements in the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analysis under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a prerequisite reference document for developing input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the LOBI-MOD2 Integral Test Facility and the associated Test A1-83. Data and information from various reports and drawings were consulted in preparing the RDS. The results showed that developing an RDS makes it possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and assists in developing the thermal-hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limitation of resources. Some possible improvements are suggested to overcome these challenges.
Full value documentation in the Czech Food Composition Database.
Machackova, M; Holasova, M; Maskova, E
2010-11-01
The aim of this project was to launch a new Food Composition Database (FCDB) Programme in the Czech Republic; to implement a methodology for food description and value documentation according to the standards designed by the European Food Information Resource (EuroFIR) Network of Excellence; and to start the compilation of a pilot FCDB. Foods for the initial data set were selected from the list of foods included in the Czech Food Consumption Basket. Selection of 24 priority components was based on the range of components used in former Czech tables. The priority list was extended with components for which original Czech analytical data or calculated data were available. Values that were input into the compiled database were documented according to the EuroFIR standards within the entities FOOD, COMPONENT, VALUE and REFERENCE using Excel sheets. Foods were described using the LanguaL Thesaurus. A template for documentation of data according to the EuroFIR standards was designed. The initial data set comprised documented data for 162 foods. Values were based on original Czech analytical data (available for traditional and fast foods, milk and milk products, wheat flour types), data derived from literature (for example, fruits, vegetables, nuts, legumes, eggs) and calculated data. The Czech FCDB programme has been successfully relaunched. Inclusion of the Czech data set into the EuroFIR eSearch facility confirmed compliance of the database format with the EuroFIR standards. Excel spreadsheets are applicable for full value documentation in the FCDB.
Flight Test Validation of Optimal Input Design and Comparison to Conventional Inputs
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1997-01-01
A technique for designing optimal inputs for aerodynamic parameter estimation was flight tested on the F-18 High Angle of Attack Research Vehicle (HARV). Model parameter accuracies calculated from flight test data were compared on an equal basis for optimal input designs and conventional inputs at the same flight condition. In spite of errors in the a priori input design models and distortions of the input form by the feedback control system, the optimal inputs increased estimated parameter accuracies compared to conventional 3-2-1-1 and doublet inputs. In addition, the tests using optimal input designs demonstrated enhanced design flexibility, allowing the optimal input design technique to use a larger input amplitude to achieve further increases in estimated parameter accuracy without departing from the desired flight test condition. This work validated the analysis used to develop the optimal input designs, and demonstrated the feasibility and practical utility of the optimal input design technique.
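For readers unfamiliar with the conventional inputs mentioned above, a 3-2-1-1 multistep is a sequence of alternating pulses whose durations are in the ratio 3:2:1:1 of a base pulse width. The sketch below simply constructs such a signal; the amplitude and pulse width are illustrative values, not those flown on the HARV.

```python
import numpy as np

# Build a 3-2-1-1 multistep input: alternating pulses whose durations are in
# the ratio 3:2:1:1. Amplitude and pulse width are illustrative values.
def multistep_3211(pulse_width=1.0, amplitude=1.0, dt=0.01):
    durations = np.array([3, 2, 1, 1]) * pulse_width
    signs = np.array([+1.0, -1.0, +1.0, -1.0]) * amplitude
    t = np.arange(0.0, durations.sum(), dt)
    u = np.zeros_like(t)
    start = 0.0
    for d, s in zip(durations, signs):
        u[(t >= start) & (t < start + d)] = s
        start += d
    return t, u

t, u = multistep_3211()
```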
Robust input design for nonlinear dynamic modeling of AUV.
Nouri, Nowrouz Mohammad; Valadi, Mehrdad
2017-09-01
Input design has a dominant role in developing the dynamic model of autonomous underwater vehicles (AUVs) through system identification. Optimal input design is the process of generating informative inputs that can be used to produce a good-quality dynamic model of an AUV. In optimal input design, the desired input signal depends on the unknown system that is to be identified. In this paper, an input design approach that is robust to uncertainties in model parameters is used. The Bayesian robust design strategy is applied to design input signals for dynamic modeling of AUVs. The employed approach can design multiple inputs and apply constraints on an AUV system's inputs and outputs. Particle swarm optimization (PSO) is employed to solve the constrained robust optimization problem. The presented algorithm is used for designing the input signals for an AUV, and the estimate obtained by robust input design is compared with that of the optimal input design. According to the results, the proposed input design satisfies both constraint robustness and optimality. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
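As an illustration of the optimization machinery mentioned above, the following is a bare-bones particle swarm optimization loop with simple box constraints. The objective used here is a placeholder (a sphere function); the paper's actual criterion, which averages an information measure over sampled model parameters, is not reproduced.

```python
import numpy as np

# Minimal PSO loop with box constraints. Inertia and acceleration constants
# are common textbook values, not the paper's settings.
def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                     # enforce box constraints
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, val = pso(lambda p: np.sum(p ** 2), dim=4)
```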
Automated Generation of Technical Documentation and Provenance for Reproducible Research
NASA Astrophysics Data System (ADS)
Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.
2017-12-01
Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand, particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
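A minimal sketch of the general pattern described, applying declarative classification rules to input records while emitting a provenance record alongside the derived result, is shown below. The rule structure, field names, and provenance fields are assumptions for illustration, not the tool's actual implementation.

```python
import datetime
import json

# Apply a declarative rule set to input records and emit a provenance record
# alongside the classified output. Rules and fields are illustrative only.
rules = [
    {"name": "steep_and_wet",
     "when": lambda rec: rec["slope"] > 15 and rec["rain_mm"] > 1200,
     "value": "high_risk"},
    {"name": "default", "when": lambda rec: True, "value": "low_risk"},
]

def classify(records):
    provenance = {
        "generated": datetime.datetime.utcnow().isoformat(),
        "rules_applied": [r["name"] for r in rules],
        "inputs": ["slope", "rain_mm"],
    }
    output = []
    for rec in records:
        fired = next(r for r in rules if r["when"](rec))   # first matching rule wins
        output.append({"id": rec["id"], "class": fired["value"], "rule": fired["name"]})
    return output, provenance

result, provenance = classify([{"id": 1, "slope": 20, "rain_mm": 1500}])
print(json.dumps(provenance, indent=2))
```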
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This document has been updated during the definitive design portion of the first phase of the W-314 Project to capture additional software requirements and is planned to be updated during the second phase of the W-314 Project to cover the second phase of the Project's scope. The objective is to provide requirement traceability by recording the analysis/basis for the functional descriptions of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment.
Spam comments prediction using stacking with ensemble learning
NASA Astrophysics Data System (ADS)
Mehmood, Arif; On, Byung-Won; Lee, Ingyu; Ashraf, Imran; Choi, Gyu Sang
2018-01-01
Deceptive comments about products or services mislead people in decision making. Current methodologies for predicting deceptive comments rely on feature engineering with a single training model. Such hand-designed features can capture some linguistic phenomena but cannot easily reveal the latent semantic meaning of the comments. We propose a prediction model based on general document features using stacking with ensemble learning. Term Frequency/Inverse Document Frequency (TF/IDF) features are inputs to a stack of Random Forest and Gradient Boosted Trees, and the outputs of the base learners are combined by a decision tree for the final training of the model. The results show that our approach achieves an accuracy of 92.19%, which outperforms the state-of-the-art method.
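A sketch of the described pipeline using scikit-learn is shown below: TF/IDF features feed a stack of Random Forest and Gradient Boosted Trees, with a decision tree as the meta-learner. The toy texts, labels, and hyperparameters are placeholders, not the paper's data or settings.

```python
# TF/IDF features feeding a stacked ensemble with a decision-tree meta-learner.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import (StackingClassifier, RandomForestClassifier,
                              GradientBoostingClassifier)
from sklearn.tree import DecisionTreeClassifier

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gbt", GradientBoostingClassifier(random_state=0)),
    ],
    final_estimator=DecisionTreeClassifier(max_depth=3),
    cv=2,  # tiny toy data; a real run would use a larger cv
)
model = make_pipeline(TfidfVectorizer(), stack)

texts = ["amazing product, best ever, buy now",
         "works as described, decent quality",
         "unbelievable deal, click my link",
         "arrived late but functions fine",
         "five stars, totally life changing, trust me",
         "average performance for the price"]
labels = [1, 0, 1, 0, 1, 0]   # 1 = deceptive/spam, 0 = genuine (toy labels)
model.fit(texts, labels)
print(model.predict(["incredible bargain, visit my site"]))
```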
Enhanced analysis and users manual for radial-inflow turbine conceptual design code RTD
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1995-01-01
Modeling enhancements made to a radial-inflow turbine conceptual design code are documented in this report. A stator-endwall clearance-flow model was added for use with pivoting vanes. The rotor calculations were modified to account for swept blades and splitter blades. Stator and rotor trailing-edge losses and a vaneless-space loss were added to the loss model. Changes were made to the disk-friction and rotor-clearance loss calculations. The loss model was then calibrated based on experimental turbine performance. A complete description of code input and output along with sample cases are included in the report.
The software system development for the TAMU real-time fan beam scatterometer data processors
NASA Technical Reports Server (NTRS)
Clark, B. V.; Jean, B. R.
1980-01-01
A software package was designed and written to process in real time any one quadrature channel pair of radar scatterometer signals from the NASA L- or C-Band radar scatterometer systems. The software was successfully tested in the C-Band processor breadboard hardware using recorded radar and NERDAS (NASA Earth Resources Data Annotation System) signals as the input data sources. The processor development program and the overall processor theory of operation and design are described. The real-time processor software system is documented, and the results of the laboratory software tests and recommendations for the efficient application of the data processing capabilities are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trent, D.S.; Eyler, L.L.; Budden, M.J.
This document describes the numerical methods, current capabilities, and the use of the TEMPEST (Version L, MOD 2) computer program. TEMPEST is a transient, three-dimensional, hydrothermal computer program that is designed to analyze a broad range of coupled fluid dynamic and heat transfer systems of particular interest to the Fast Breeder Reactor thermal-hydraulic design community. The full three-dimensional, time-dependent equations of motion, continuity, and heat transport are solved for either laminar or turbulent fluid flow, including heat diffusion and generation in both solid and liquid materials. 10 refs., 22 figs., 2 tabs.
1991-09-05
DATA ITEM DISCREPANCY WORKSHEET. ORIGINATOR CONTROL NUMBER: SDD-0002. CDRL NUMBER: DCO0-0005d. DATE: 09/05/91. ORIGINATOR NAME: Vivian L. Martin. OFFICE SYMBOL: SAIC. TELEPHONE NUMBER: 271-2999. SUBSTANTIVE: x. PAGE NUMBER: CUlAO025-3. PARA NUMBER: c. COMMENT OR RECOMMENDED CHANGE: Provide the input, output, and local data elements for this CSU. RATIONALE: Paragraph c. states that data
NASA Technical Reports Server (NTRS)
Harrison, B. A.; Richard, M.
1979-01-01
The information necessary for execution of the digital computer program L216 on the CDC 6600 is described. L216 characteristics are based on the doublet lattice method. Arbitrary aerodynamic configurations may be represented with combinations of nonplanar lifting surfaces composed of finite constant-pressure panel elements, and axially symmetric slender bodies composed of constant-pressure line elements. Program input consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.
ERIC Educational Resources Information Center
Weinberger, Elizabeth
The document contains optical scannable forms for some of the instruments in the Input and Process Batteries, and guidelines for administration of the instruments in the Input Batteries of the Management Information System for Occupational Education (MISOE) Sample Data Systems. Input information describes the characteristics of the students at…
NASA Technical Reports Server (NTRS)
Khorram, S.
1977-01-01
Results are presented of a study intended to develop a general location-specific remote-sensing procedure for watershed-wide estimation of water loss to the atmosphere by evaporation and transpiration. The general approach involves a stepwise sequence of required information definition (input data), appropriate sample design, mathematical modeling, and evaluation of results. More specifically, the remote sensing-aided system developed to evaluate evapotranspiration employs a basic two-stage two-phase sample of three information resolution levels. Based on the discussed design, documentation, and feasibility analysis to yield timely, relatively accurate, and cost-effective evapotranspiration estimates on a watershed or subwatershed basis, work is now proceeding to implement this remote sensing-aided system.
1977-03-01
Input Layout for Each Card Type; Input Sequence; Sample Problem; Sample Data Form Used for Documenting MSD Effectiveness Attribute Data; Sample Form Used for Documenting WMS ... from commodes, urinals and garbage grinder) and gray (galley and turbid, i.e., output from sinks, showers, laundry, deck, drains, etc.) wastewaters
Data Quality Objectives for Tank Farms Waste Compatibility Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING, D.L.
1999-07-02
There are 177 waste storage tanks containing over 210,000 m³ (55 million gal) of mixed waste at the Hanford Site. The River Protection Project (RPP) has adopted the data quality objective (DQO) process used by the U.S. Environmental Protection Agency (EPA) (EPA 1994a) and implemented by RPP internal procedure (Banning 1999a) to identify the information and data needed to address safety issues. This DQO document is based on several documents that provide the technical basis for inputs and decision/action levels used to develop the decision rules that evaluate the transfer of wastes. A number of these documents are presently in the process of being revised. This document will need to be revised if there are changes to the technical criteria in these supporting documents. This DQO process supports various documents, such as sampling and analysis plans and double-shell tank (DST) waste analysis plans. This document identifies the type, quality, and quantity of data needed to determine whether transfer of supernatant can be performed safely. The requirements in this document are designed to prevent the mixing of incompatible waste as defined in Washington Administrative Code (WAC) 173-303-040. Waste transfers which meet the requirements contained in this document and the Double-Shell Tank Waste Analysis Plan (Mulkey 1998) are considered to be compatible, and prevent the mixing of incompatible waste.
User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs
Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.
2008-01-01
This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are * CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows; * DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file; * MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and * MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
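As an illustration of the kind of conversion the four raster utilities perform, the sketch below writes a 2-D model array as an ESRI ASCII grid that a GIS can ingest as a raster; the array, origin, and cell size are invented for the example, whereas the real utilities obtain them from the model files (and require square cells, as noted above).

```python
import numpy as np

# Write a 2-D array as an ESRI ASCII grid. Origin and cell size are
# illustrative placeholders.
def write_ascii_raster(array, path, xll=0.0, yll=0.0, cellsize=100.0, nodata=-9999.0):
    nrows, ncols = array.shape
    header = (f"ncols {ncols}\nnrows {nrows}\n"
              f"xllcorner {xll}\nyllcorner {yll}\n"
              f"cellsize {cellsize}\nNODATA_value {nodata}\n")
    with open(path, "w") as f:
        f.write(header)
        np.savetxt(f, array, fmt="%.4f")   # one text row per grid row

write_ascii_raster(np.random.rand(3, 4), "head_layer1.asc")
```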
MODFLOW-NWT, A Newton formulation for MODFLOW-2005
Niswonger, Richard G.; Panday, Sorab; Ibaraki, Motomu
2011-01-01
This report documents a Newton formulation of MODFLOW-2005, called MODFLOW-NWT. MODFLOW-NWT is a standalone program that is intended for solving problems involving drying and rewetting nonlinearities of the unconfined groundwater-flow equation. MODFLOW-NWT must be used with the Upstream-Weighting (UPW) Package for calculating intercell conductances in a different manner than is done in the Block-Centered Flow (BCF), Layer Property Flow (LPF), or Hydrogeologic-Unit Flow (HUF; Anderman and Hill, 2000) Packages. The UPW Package treats nonlinearities of cell drying and rewetting by use of a continuous function of groundwater head, rather than the discrete approach of drying and rewetting that is used by the BCF, LPF, and HUF Packages. This further enables application of the Newton formulation for unconfined groundwater-flow problems because conductance derivatives required by the Newton method are smooth over the full range of head for a model cell. The NWT linearization approach generates an asymmetric matrix, which is different from the standard MODFLOW formulation that generates a symmetric matrix. Because all linear solvers presently available for use with MODFLOW-2005 solve only symmetric matrices, MODFLOW-NWT includes two previously developed asymmetric matrix-solver options. The matrix-solver options include a generalized-minimum-residual (GMRES) Solver and an Orthomin / stabilized conjugate-gradient (CGSTAB) Solver. The GMRES Solver is documented in a previously published report, such that only a brief description and input instructions are provided in this report. However, the CGSTAB Solver (called XMD) is documented in this report. Flow-property input for the UPW Package is designed based on the LPF Package and material-property input is identical to that for the LPF Package except that the rewetting and vertical-conductance correction options of the LPF Package are not available with the UPW Package. Input files constructed for the LPF Package can be used with slight modification as input for the UPW Package. This report presents the theory and methods used by MODFLOW-NWT, including the UPW Package. Additionally, this report provides comparisons of the new methodology to analytical solutions of groundwater flow and to standard MODFLOW-2005 results by use of an unconfined aquifer MODFLOW example problem. The standard MODFLOW-2005 simulation uses the LPF Package with the wet/dry option active. A new example problem also is presented to demonstrate MODFLOW-NWT's ability to provide a solution for a difficult unconfined groundwater-flow problem.
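For reference, the Newton step for the discretized nonlinear flow equations can be written generically as follows (the notation is illustrative and not taken from the report); the Jacobian is generally asymmetric, which is why the asymmetric-matrix solver options mentioned above are needed.

```latex
% Generic Newton update for the discretized nonlinear flow equations R(h) = 0,
% where h is the vector of heads and J the (generally asymmetric) Jacobian.
\[
J\!\left(h^{k}\right)\,\Delta h^{k} = -R\!\left(h^{k}\right),
\qquad
h^{k+1} = h^{k} + \Delta h^{k},
\qquad
J_{ij} = \frac{\partial R_i}{\partial h_j}.
\]
```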
Jaton, Florian
2017-01-01
This article documents the practical efforts of a group of scientists designing an image-processing algorithm for saliency detection. By following the actors of this computer science project, the article shows that the problems often considered to be the starting points of computational models are in fact provisional results of time-consuming, collective and highly material processes that engage habits, desires, skills and values. In the project being studied, problematization processes lead to the constitution of referential databases called ‘ground truths’ that enable both the effective shaping of algorithms and the evaluation of their performances. Working as important common touchstones for research communities in image processing, the ground truths are inherited from prior problematization processes and may be imparted to subsequent ones. The ethnographic results of this study suggest two complementary analytical perspectives on algorithms: (1) an ‘axiomatic’ perspective that understands algorithms as sets of instructions designed to solve given problems computationally in the best possible way, and (2) a ‘problem-oriented’ perspective that understands algorithms as sets of instructions designed to computationally retrieve outputs designed and designated during specific problematization processes. If the axiomatic perspective on algorithms puts the emphasis on the numerical transformations of inputs into outputs, the problem-oriented perspective puts the emphasis on the definition of both inputs and outputs. PMID:28950802
PROCESS DOCUMENTATION: A MODEL FOR KNOWLEDGE MANAGEMENT IN ORGANIZATIONS.
Haddadpoor, Asefeh; Taheri, Behjat; Nasri, Mehran; Heydari, Kamal; Bahrami, Gholamreza
2015-10-01
Continuous and interconnected processes are a chain of activities that turn the inputs of an organization into its outputs and help achieve the partial and overall goals of the organization. These activities are carried out by means of two types of knowledge in the organization, called explicit and implicit knowledge. Of these, implicit knowledge is the knowledge that controls a major part of the activities of an organization, controls these activities internally, and will not be transferred to the process owners unless they are present during the organization's work. Therefore, the goal of this study is the identification of implicit knowledge and its integration with explicit knowledge in order to improve human resources management, physical resource management, information resource management, training of new employees and other activities of Isfahan University of Medical Science. The project for documentation of activities in the department of health of Isfahan University of Medical Science was carried out in several stages. First, the main processes and related sub-processes were identified and categorized with the help of a planning expert. The categorization was carried out from smaller processes to larger ones. In this stage the experts of each process wrote down all their daily activities and organized them into general categories based on logical and physical relations between different activities. Then each activity was assigned a specific code. The computer software was designed after understanding the different parts of the processes, including main and sub-processes, and categorization, which will be explained in the following sections. The findings of this study showed that documentation of activities can help expose implicit knowledge, because all of the inputs and outputs of a process, along with the length, location, tools and different stages of the process, exchanged information, storage location of the information and information flow, can be identified using proper documentation. A documentation program can create a complete identifier for every process of an organization and also acts as the main tool for establishment of information technology as the basis of the organization, helping achieve the goal of having electronic and information-technology-based organizations. In other words, documentation is the starting step in creating an organizational architecture. Afterwards, in order to reach the desired goal of documentation, computer software containing all tools, methods, instructions and guidelines and implicit knowledge of the organization was designed. This software links all relevant knowledge to the main text of the documentation and identification of a process, provides the users with electronic versions of all documentation, and helps use the explicit and implicit knowledge of the organization to facilitate the reengineering of the processes in the organization.
NASA Astrophysics Data System (ADS)
Wang, Yongli; Wang, Gang; Zuo, Yi; Fan, Lisha; Wei, Jiaxiang
2017-03-01
On March 15, 2015, the central office issued the "Opinions on Further Deepening the Reform of Electric Power System" ([2015] No. 9). This policy marks the official opening by the central government of a new round of electricity reform. As the programmatic document for comprehensively promoting the reform of the power system under the new situation, the No. 9 document identifies the separate approval of transmission and distribution electricity prices as the first task of the reform. Grid tariff reform is not only the separate approval of transmission and distribution prices, but also a deep adjustment of the grid company's input-output relationship and many other aspects. Under the background of the reform of transmission and distribution prices, the main factors affecting the input-output relationship, such as the main business, electricity pricing, investment approval, and financial accounting, have changed significantly. This paper designs a comprehensive evaluation index system for power grid enterprises' credit ratings under the reform of transmission and distribution prices, to reduce the impact of the reform on a company's international rating results and its ability to raise funds.
NASA Astrophysics Data System (ADS)
Wang, Yongli; Wang, Gang; Zuo, Yi; Fan, Lisha; Ling, Yunpeng
2017-03-01
On March 15, 2015, the Central Office issued the "Opinions on Further Deepening the Reform of Electric Power System" (Zhong Fa No. 9). This policy marks the official opening by the central government of a new round of electricity reform. As the programmatic document for comprehensively promoting the reform of the power system under the new situation, the No. 9 document identifies the separate approval of transmission and distribution electricity prices as the first task of the reform. Grid tariff reform is not only the separate approval of transmission and distribution prices, but also a deep adjustment of the grid company's input-output relationship and many other aspects. Under the background of the reform of transmission and distribution prices, the main factors affecting the input-output relationship, such as the main business, electricity pricing, investment approval, and financial accounting, have changed significantly. This paper designs a comprehensive evaluation index system for the investment benefits of power grid projects under the reform of transmission and distribution prices, to improve the investment efficiency of power grid projects after the power reform in China.
Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.
1999-01-01
A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report-review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow-monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.
ALARA radiation considerations for the AP600 reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lau, F.L.
1995-03-01
The radiation design of the AP600 reactor plant is based on an average annual occupational radiation exposure (ORE) of 100 man-rem. As a design goal we have established a lower value of 70 man-rem per year. And, with our current design process, we expect to achieve annual exposures which are well below this goal. To accomplish our goal we have established a process that provides criteria, guidelines and customer involvement to achieve the desired result. The criteria and guidelines provide the shield designer, as well as the systems and plant layout designers, with information that will lead to an integrated plant design that minimizes personnel exposure and yet is not burdened with complicated shielding or unnecessary component access limitations. Customer involvement is provided in the form of utility input, design reviews and information exchange. Cooperative programs with utilities in the development of specific systems or processes also provide for an ALARA design. The results are features which include ALARA radiation considerations as an integral part of the plant design and a lower plant ORE. It is anticipated that a further reduction in plant personnel exposures will result through good radiological practices by the plant operators. The information in place to support and direct the plant designers includes the Utility Requirements Document (URD), Federal Regulations, ALARA guidelines, radiation design information and radiation and shielding design criteria. This information, along with the utility input, design reviews and information feedback, will contribute to the reduction of plant radiation exposure levels such that they will be less than the stated goals.
Orzol, Leonard L.; McGrath, Timothy S.
1992-01-01
This report documents modifications to the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model, commonly called MODFLOW, so that it can read and write files used by a geographic information system (GIS). The modified model program is called MODFLOWARC. Simulation programs such as MODFLOW generally require large amounts of input data and produce large amounts of output data. Viewing data graphically, generating head contours, and creating or editing model data arrays such as hydraulic conductivity are examples of tasks that currently are performed either by the use of independent software packages or by tedious manual editing, manipulating, and transferring data. GIS programs are commonly used to facilitate preparation of the model input data and analyze model output data; however, auxiliary programs are frequently required to translate data between programs. Data translations are required when different programs use different data formats. Thus, the user might use GIS techniques to create model input data, run a translation program to convert input data into a format compatible with the ground-water flow model, run the model, run a translation program to convert the model output into the correct format for GIS, and use GIS to display and analyze this output. MODFLOWARC avoids the two translation steps and transfers data directly to and from the ground-water-flow model. This report documents the design and use of MODFLOWARC and includes instructions for data input/output of the Basic, Block-centered flow, River, Recharge, Well, Drain, Evapotranspiration, General-head boundary, and Streamflow-routing packages. The modification to MODFLOW and the Streamflow-Routing package was minimized. Flow charts and computer-program code describe the modifications to the original computer codes for each of these packages. Appendix A contains a discussion on the operation of MODFLOWARC using a sample problem.
ERIC Educational Resources Information Center
Silver, Steven S.
FMS/3 is a system for producing hard copy documentation at high speed from free format text and command input. The system was originally written in assembler language for a 12K IBM 360 model 20 using a high speed 1403 printer with the UCS-TN chain option (upper and lower case). Input was from an IBM 2560 Multi-function Card Machine. The model 20…
Method of Characteristic (MOC) Nozzle Flowfield Solver - User’s Guide and Input Manual Version 2.0
2018-01-01
Technical Report RDMR-SS-17-13, Method of Characteristic (MOC) Nozzle Flowfield Solver - User's Guide and Input Manual, Version 2.0, by Kevin D. Kennedy, System Simulation and Development Directorate, Aviation and Missile Research, Development, and Engineering Center, January 2018.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
Staff guidance, "Ensuring Hazard-Consistent Seismic Input for Site Response and Soil Structure Interaction Analyses" (Agencywide Documents Access and Management System (ADAMS) Accession No. ML092230455), was made available to solicit public and industry comment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria; Blyth, Taylor S.; Salko, Robert K.
This document describes how to make a CTF input deck. A CTF input deck is organized into Card Groups and Cards. A Card Group is a collection of Cards. A Card is defined as a line of input. Each Card may contain multiple data. A Card is terminated by making a new line.
Instruction in Documentation for Computer Programming
ERIC Educational Resources Information Center
Westley, John W.
1976-01-01
In addition to the input/output record format, the program flowchart, the program listing, and the program test output, eight documentation items are suggested in order that they may serve as a base from which to start teaching program documentation. (Author/AG)
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
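The module-reordering idea behind DeMAID can be pictured with a short sketch. The Python code below is not DeMAID (which is CLIPS-based); it is a minimal, hypothetical illustration in which a handful of made-up modules are reordered so that feedback links, that is, dependencies on modules appearing later in the sequence, are minimized.

```python
from itertools import permutations

# Minimal sketch (not DeMAID): links[m] is the set of modules whose output
# module m needs as input.  Module names and couplings are hypothetical.
links = {
    "geometry":  set(),
    "aero":      {"geometry"},
    "structure": {"aero", "geometry"},
    "controls":  {"structure", "aero"},
}

def feedback_count(order):
    """Count links that point 'upward' (feedback) for a given module ordering."""
    position = {m: i for i, m in enumerate(order)}
    return sum(1 for m in order for src in links[m] if position[src] > position[m])

# Brute force is fine for a handful of modules; DeMAID itself relies on
# knowledge-based reordering rules rather than exhaustive search.
best = min(permutations(links), key=feedback_count)
print("ordering:", best, "feedback links:", feedback_count(best))
```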
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
CubeSat mission design software tool for risk estimating relationships
NASA Astrophysics Data System (ADS)
Gamble, Katharine Brumbaugh; Lightsey, E. Glenn
2014-09-01
In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
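As a rough picture of how a risk estimating relationship can be built from historical records, the sketch below fits hypothetical likelihood and consequence scores to mission characteristics with ordinary least squares and snaps the prediction onto the 5x5 likelihood-consequence grid. The data, the two input factors, and the linear form are invented for illustration; the paper's RERs are developed with general error regression models.

```python
import numpy as np

# Hypothetical historical CubeSat records: [mass_kg, development_months]
X = np.array([[1.3, 12], [4.0, 24], [2.6, 18], [1.1, 9], [5.5, 30]], float)
X = np.column_stack([np.ones(len(X)), X])          # add an intercept column
likelihood  = np.array([4, 2, 3, 5, 2], float)     # 1 (low) .. 5 (high)
consequence = np.array([3, 4, 3, 2, 5], float)

# Ordinary least squares fit of each score against the mission inputs
beta_L, *_ = np.linalg.lstsq(X, likelihood,  rcond=None)
beta_C, *_ = np.linalg.lstsq(X, consequence, rcond=None)

def rer(mass_kg, dev_months):
    """Predict (likelihood, consequence), clipped to the 5x5 L-C chart."""
    x = np.array([1.0, mass_kg, dev_months])
    L = int(np.clip(round(x @ beta_L), 1, 5))
    C = int(np.clip(round(x @ beta_C), 1, 5))
    return L, C

print(rer(3.0, 15))   # a cell on the 5x5 likelihood-consequence chart
```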
NASA Astrophysics Data System (ADS)
Duan, Haoran
1997-12-01
This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to build high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other less- loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Using Field Programmable Gate Array (FPGA) devices, SRAM modules, and integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves the output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a heuristic strategy that leads to 'socially optimal' solutions, yielding a maximum number of contention-free cells being scheduled. A novel mixed digital-analog circuit has been designed to implement the MUCS core functionality. The MUCS circuit maps the cell scheduling computation to the capacitor charging and discharging procedures that are conducted fully in parallel. The design has a uniform circuit structure, low interconnect counts, and low chip I/O counts. Using 2 μm CMOS technology, the design operates on a 100 MHz clock and finds a near-optimal solution within a linear processing time. The circuit has been verified at the transistor level by HSPICE simulation. During this research, a five-port IQ-based optoelectronic iPOINT ATM switch has been developed and demonstrated. It has been fully functional with an aggregate throughput of 800 Mb/s. The second-generation IQ-based switch is currently under development. Equipped with iiQueue modules and MUCS module, the new switch system will deliver a multi-gigabit aggregate throughput, eliminate HOL blocking, provide per-VC QoS, and achieve near-100% link bandwidth utilization. Complete documentation of input modules and trunk module for the existing testbed, and complete documentation of 3DQ, iiQueue, and MUCS for the second-generation testbed are given in this dissertation.
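The "heaviest candidate first" contention resolution attributed to MUCS can be sketched in a few lines. The code below is only a sequential software analogue of the idea, not the mixed digital-analog circuit: it repeatedly picks the largest weight in a request matrix and retires the corresponding input and output ports.

```python
import numpy as np

def schedule(weights):
    """Greedy 'heaviest first' matching of input ports to output ports.

    weights[i, j] is the weight index of the head cell at input i destined
    for output j (0 means no request).  Returns contention-free (in, out) pairs.
    """
    w = weights.astype(float)
    matches = []
    while w.max() > 0:
        i, j = np.unravel_index(np.argmax(w), w.shape)
        matches.append((int(i), int(j)))
        w[i, :] = 0          # input i is now busy
        w[:, j] = 0          # output j is now busy
    return matches

# Hypothetical 4x4 request matrix
requests = np.array([[3, 0, 1, 0],
                     [0, 2, 0, 4],
                     [5, 0, 0, 1],
                     [0, 1, 2, 0]])
print(schedule(requests))   # [(2, 0), (1, 3), (3, 2)] for this matrix
```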
User guide for MODPATH Version 7—A particle-tracking model for MODFLOW
Pollock, David W.
2016-09-26
MODPATH is a particle-tracking post-processing program designed to work with MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. MODPATH version 7 is the fourth major release since its original publication. Previous versions were documented in USGS Open-File Reports 89–381 and 94–464 and in USGS Techniques and Methods 6–A41. MODPATH version 7 works with MODFLOW-2005 and MODFLOW–USG. Support for unstructured grids in MODFLOW–USG is limited to smoothed, rectangular-based quadtree and quadpatch grids. A software distribution package containing the computer program and supporting documentation, such as input instructions, output file descriptions, and example problems, is available from the USGS over the Internet (http://water.usgs.gov/ogw/modpath/).
Documentation of the Benson Diesel Engine Simulation Program
NASA Technical Reports Server (NTRS)
Vangerpen, Jon
1988-01-01
This report documents the Benson Diesel Engine Simulation Program and explains how it can be used to predict the performance of diesel engines. The program was obtained from the Garrett Turbine Engine Company but has been extensively modified since. The program is a thermodynamic simulation of the diesel engine cycle which uses a single zone combustion model. It can be used to predict the effect of changes in engine design and operating parameters such as valve timing, speed and boost pressure. The most significant change made to this program is the addition of a more detailed heat transfer model to predict metal part temperatures. This report contains a description of the sub-models used in the Benson program, a description of the input parameters and sample program runs.
Web Based Tool for Mission Operations Scenarios
NASA Technical Reports Server (NTRS)
Boyles, Carole A.; Bindschadler, Duane L.
2008-01-01
A conventional practice for spaceflight projects is to document scenarios in a monolithic Operations Concept document. Such documents can be hundreds of pages long and may require laborious updates. Software development practice utilizes scenarios in the form of smaller, individual use cases, which are often structured and managed using UML. We have developed a process and a web-based scenario tool that utilizes a similar philosophy of smaller, more compact scenarios (but avoids the formality of UML). The need for a scenario process and tool became apparent during the authors' work on a large astrophysics mission. It was noted that every phase of the Mission (e.g., formulation, design, verification and validation, and operations) looked back to scenarios to assess completeness of requirements and design. It was also noted that terminology needed to be clarified and structured to assure communication across all levels of the project. Attempts to manage, communicate, and evolve scenarios at all levels of a project using conventional tools (e.g., Excel) and methods (Scenario Working Group meetings) were not effective given limitations on budget and staffing. The objective of this paper is to document the scenario process and tool created to offer projects a low-cost capability to create, communicate, manage, and evolve scenarios throughout project development. The process and tool have the further benefit of allowing the association of requirements with particular scenarios, establishing and viewing relationships between higher- and lower-level scenarios, and the ability to place all scenarios in a shared context. The resulting structured set of scenarios is widely visible (using a web browser), easily updated, and can be searched according to various criteria including the level (e.g., Project, System, and Team) and Mission Phase. Scenarios are maintained in a web-accessible environment that provides a structured set of scenario fields and allows for maximum visibility across the project. One key aspect is that the tool was built for a scenario process that accounts for stakeholder input, review, comment, and concurrence. By creating well-designed opportunities for stakeholder input and concurrence and by making the scenario content easily accessible to all project personnel, we maximize the opportunities for stakeholders to both understand and agree on the concepts for how their mission is to be carried out.
NASA Astrophysics Data System (ADS)
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
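A stripped-down version of the workflow described here, Monte Carlo sampling of uncertain inputs pushed through a quadratic response surface and a simple search over operating points, might look like the sketch below. The surface coefficients, uncertainty levels, and scoring rule are placeholders, not values from the dissertation's mixture studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic response surface for fuel regression rate r(x1, x2),
# with single-factor and two-factor interaction terms.
def regression_rate(x1, x2):
    return 1.2 + 0.4 * x1 + 0.25 * x2 - 0.10 * x1**2 - 0.05 * x2**2 + 0.08 * x1 * x2

def dispersed_rate(x1_nom, x2_nom, n=20_000):
    """Monte Carlo dispersion of the response under assumed input uncertainty."""
    x1 = rng.normal(x1_nom, 0.05, n)          # assumed 1-sigma input uncertainties
    x2 = rng.normal(x2_nom, 0.08, n)
    model_error = rng.normal(0.0, 0.03, n)    # assumed response-surface uncertainty
    return regression_rate(x1, x2) + model_error

# Crude "optimization under uncertainty": grid over a normalized operating box,
# ranking candidates by mean rate minus a penalty on its spread.
candidates = [(x1, x2) for x1 in np.linspace(0, 1, 11) for x2 in np.linspace(0, 1, 11)]
def score(c):
    r = dispersed_rate(*c)
    return r.mean() - 2.0 * r.std()

best = max(candidates, key=score)
print("best operating point:", best)
```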
Description of CASCOMP Comprehensive Airship Sizing and Performance Computer Program, Volume 2
NASA Technical Reports Server (NTRS)
Davis, J.
1975-01-01
The computer program CASCOMP, which may be used in comparative design studies of lighter than air vehicles by rapidly providing airship size and mission performance data, was prepared and documented. The program can be used to define design requirements such as weight breakdown, required propulsive power, and physical dimensions of airships which are designed to meet specified mission requirements. The program is also useful in sensitivity studies involving both design trade-offs and performance trade-offs. The input to the program primarily consists of a series of single point values such as hull overall fineness ratio, number of engines, airship hull and empennage drag coefficients, description of the mission profile, and weights of fixed equipment, fixed useful load and payload. In order to minimize computation time, the program makes ample use of optional computation paths.
Wind turbine design codes: A preliminary comparison of the aerodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buhl, M.L. Jr.; Wright, A.D.; Tangler, J.L.
1997-12-01
The National Wind Technology Center of the National Renewable Energy Laboratory is comparing several computer codes used to design and analyze wind turbines. The first part of this comparison is to determine how well the programs predict the aerodynamic behavior of turbines with no structural degrees of freedom. Without general agreement on the aerodynamics, it is futile to try to compare the structural response due to the aerodynamic input. In this paper, the authors compare the aerodynamic loads for three programs: Garrad Hassan's BLADED, their own WT-PERF, and the University of Utah's YawDyn. This report documents a work in progress and compares only two-bladed, downwind turbines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria; Toptan, Aysenur; Porter, Nathan
This document describes how to make a CTF input deck. A CTF input deck is organized into Card Groups and Cards. A Card Group is a collection of Cards. A Card is defined as a line of input. Each Card may contain multiple data. A Card is terminated by making a new line. This document has been organized so that each Card Group is discussed in its own dedicated chapter. Each Card is discussed in its own dedicated section. Each data item in the Card is discussed in its own block. The block gives information about the data, including the number of the input, the title, a description of the meaning of the data, units, data type, and so on. An example block is shown below to discuss the meaning of each entry in the block.
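To picture the deck organization described above, the sketch below parses a tiny, hypothetical card-based deck into card groups and cards. The GROUP keyword and the field layout are invented for illustration and are not actual CTF card syntax.

```python
# Hypothetical card-based deck, for illustration only (not real CTF syntax):
# a line starting with GROUP opens a card group; every other non-blank line
# is a card whose whitespace-separated fields are the card's data.
deck_text = """\
GROUP 1
3  0.5  2.2
GROUP 2
1  100.0
2  250.0
"""

def read_deck(text):
    groups, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("GROUP"):
            current = int(line.split()[1])
            groups[current] = []                    # open a new card group
        else:
            groups[current].append(line.split())    # one card = one line of data
    return groups

print(read_deck(deck_text))
# {1: [['3', '0.5', '2.2']], 2: [['1', '100.0'], ['2', '250.0']]}
```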
Redundancy-Aware Topic Modeling for Patient Record Notes
Cohen, Raphael; Aviram, Iddo; Elhadad, Michael; Elhadad, Noémie
2014-01-01
The clinical notes in a given patient record contain much redundancy, in large part due to clinicians’ documentation habit of copying from previous notes in the record and pasting into a new note. Previous work has shown that this redundancy has a negative impact on the quality of text mining and topic modeling in particular. In this paper we describe a novel variant of Latent Dirichlet Allocation (LDA) topic modeling, Red-LDA, which takes into account the inherent redundancy of patient records when modeling content of clinical notes. To assess the value of Red-LDA, we experiment with three baselines and our novel redundancy-aware topic modeling method: given a large collection of patient records, (i) apply vanilla LDA to all documents in all input records; (ii) identify and remove all redundancy by choosing a single representative document for each record as input to LDA; (iii) identify and remove all redundant paragraphs in each record, leaving partial, non-redundant documents as input to LDA; and (iv) apply Red-LDA to all documents in all input records. Both quantitative evaluation carried out through log-likelihood on held-out data and topic coherence of produced topics and qualitative assessment of topics carried out by physicians show that Red-LDA produces superior models to all three baseline strategies. This research contributes to the emerging field of understanding the characteristics of the electronic health record and how to account for them in the framework of data mining. The code for the two redundancy-elimination baselines and Red-LDA is made publicly available to the community. PMID:24551060
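The redundancy-removal baselines (ii) and (iii) are easy to sketch with an off-the-shelf LDA implementation; Red-LDA itself changes the generative model and is not reproduced here. The snippet below uses scikit-learn and a simple first-occurrence paragraph filter as a hypothetical stand-in for baseline (iii).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical patient record: a later note copies paragraphs from an earlier one.
notes = [
    "chest pain resolved\nplan: continue aspirin",
    "chest pain resolved\nplan: continue aspirin\nnew onset cough, start antibiotics",
]

def drop_redundant_paragraphs(notes):
    """Baseline (iii): keep only the first occurrence of each paragraph."""
    seen, cleaned = set(), []
    for note in notes:
        kept = [p for p in note.split("\n") if p not in seen and not seen.add(p)]
        cleaned.append("\n".join(kept))
    return cleaned

docs = drop_redundant_paragraphs(notes)
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(lda.components_.shape)   # (topics, vocabulary terms)
```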
Xyce parallel electronic simulator : reference guide.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.
2011-05-01
This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users Guide. The focus of this document is to list, as exhaustively as possible, the device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users Guide. The Xyce Parallel Electronic Simulator has been written to support, in a rigorous manner, the simulation needs of the Sandia National Laboratories electrical designers. It is targeted specifically to run on large-scale parallel computing platforms but also runs well on a variety of architectures including single processor workstations. It also aims to support a variety of devices and models specific to Sandia needs. This document is intended to complement the Xyce Users Guide. It contains comprehensive, detailed information about a number of topics pertinent to the usage of Xyce. Included in this document is a netlist reference for the input-file commands and elements supported within Xyce; a command line reference, which describes the available command line arguments for Xyce; and quick-references for users of other circuit codes, such as Orcad's PSpice and Sandia's ChileSPICE.
Model documentation: Renewable Fuels Module of the National Energy Modeling System
NASA Astrophysics Data System (ADS)
1994-04-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it related to the production of the 1994 Annual Energy Outlook (AEO94) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves two purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. Of these six, four are documented in the following chapters: municipal solid waste, wind, solar and biofuels. Geothermal and wood are not currently working components of NEMS. The purpose of the RFM is to define the technological and cost characteristics of renewable energy technologies, and to pass these characteristics to other NEMS modules for the determination of mid-term forecasted renewable energy demand.
Toward Scientific Numerical Modeling
NASA Technical Reports Server (NTRS)
Kleb, Bil
2007-01-01
Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
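One way an unobtrusive, in-place record of input-parameter uncertainty might look in code is sketched below. This is an illustration of the idea with made-up parameters, not the mechanism proposed in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Uncertain:
    """A model input that carries its documentation and uncertainty with it."""
    value: float
    plus_minus: float          # assumed 1-sigma uncertainty
    source: str                # where the number and its uncertainty came from

# The uncertainty is documented in situ, right where the parameter is defined.
wall_temperature = Uncertain(300.0, 15.0, "facility test report (hypothetical)")
viscosity_exponent = Uncertain(0.7, 0.05, "curve fit to tabulated data (assumed)")

def report(params):
    for name, p in params.items():
        print(f"{name}: {p.value} +/- {p.plus_minus}  ({p.source})")

report({"wall_temperature": wall_temperature,
        "viscosity_exponent": viscosity_exponent})
```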
Design tradeoff studies and sensitivity analysis, appendices B1 - B4. [hybrid electric vehicles
NASA Technical Reports Server (NTRS)
1979-01-01
Documentation is presented for a program which separately computes fuel and energy consumption for the two modes of operation of a hybrid electric vehicle. The distribution of daily travel is specified as input data as well as the weights which the component driving cycles are given in each of the composite cycles. The possibility of weight reduction through the substitution of various materials is considered as well as the market potential for hybrid vehicles. Data relating to battery compartment weight distribution and vehicle handling analysis is tabulated.
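The two-mode bookkeeping, with daily travel split across weighted component driving cycles, reduces to a short calculation. All consumption rates, weights, and mileages below are placeholders rather than values from the study.

```python
# Hypothetical per-mile consumption for each operating mode of a hybrid vehicle.
ELECTRIC_KWH_PER_MILE = 0.30
ENGINE_GAL_PER_MILE = 0.025

# Weights of the component driving cycles in the composite cycle (placeholders),
# and the fraction of each cycle's miles assumed to be driven in electric mode.
cycles = [
    {"name": "urban",   "weight": 0.55, "miles": 20.0, "electric_fraction": 0.9},
    {"name": "highway", "weight": 0.45, "miles": 20.0, "electric_fraction": 0.3},
]

energy_kwh = fuel_gal = 0.0
for c in cycles:
    miles = c["weight"] * c["miles"]
    energy_kwh += miles * c["electric_fraction"] * ELECTRIC_KWH_PER_MILE
    fuel_gal   += miles * (1.0 - c["electric_fraction"]) * ENGINE_GAL_PER_MILE

print(f"daily electricity: {energy_kwh:.2f} kWh, daily fuel: {fuel_gal:.3f} gal")
```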
DITTY - a computer program for calculating population dose integrated over ten thousand years
DOE Office of Scientific and Technical Information (OSTI.GOV)
Napier, B.A.; Peloquin, R.A.; Strenge, D.L.
The computer program DITTY (Dose Integrated Over Ten Thousand Years) was developed to determine the collective dose from long term nuclear waste disposal sites resulting from the ground-water pathways. DITTY estimates the time integral of collective dose over a ten-thousand-year period for time-variant radionuclide releases to surface waters, wells, or the atmosphere. This document includes the following information on DITTY: a description of the mathematical models, program designs, data file requirements, input preparation, output interpretations, sample problems, and program-generated diagnostic messages.
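The core DITTY quantity, a time integral of collective dose over ten thousand years for a time-variant release, can be sketched as a simple numerical quadrature. The release history, decay constant, and dose factor below are placeholders; DITTY's pathway and dosimetry models are far more detailed.

```python
import numpy as np

years = np.linspace(0.0, 10_000.0, 10_001)          # annual time grid

# Hypothetical time-variant release to groundwater (Ci/yr) and a single
# nuclide with a 30-year half-life (both invented for illustration).
release = 1.0e-3 * np.exp(-years / 2_000.0)
decay = np.exp(-np.log(2.0) / 30.0 * years)
dose_factor = 50.0                                   # person-rem per Ci (assumed)

collective_dose_rate = release * decay * dose_factor        # person-rem per year
integrated_dose = np.trapz(collective_dose_rate, years)     # person-rem over 10,000 years
print(f"{integrated_dose:.2f} person-rem")
```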
FEQinput—An editor for the full equations (FEQ) hydraulic modeling system
Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.
2017-10-30
The Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow. Input files for FEQ are composed of text files that contain large amounts of parameters, data, and instructions that are written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in order to understand the specific format and language of the files. FEQinput provides a set of tools to help a new user overcome the steep learning curve associated with creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).
Documentation of a dissolved-solids model of the Tongue River, southeastern Montana
Woods, Paul F.
1981-01-01
A model has been developed for assessing potential increases in dissolved solids of the Tongue River as a result of leaching of overburden materials used to backfill pits in surface coal-mining operations. The model allows spatial and temporal simulation of streamflow and dissolved-solids loads and concentrations under user-defined scenarios of surface coal mining and agricultural development. The model routes an input quantity of streamflow and dissolved solids from the upstream end to the downstream end of a stream reach while algebraically accounting for gains and losses of streamflow and dissolved solids within the stream reach. Input data needed to operate the model include the following: simulation number, designation of hydrologic conditions for each simulated month, either user-defined or regression-defined concentrations of dissolved solids input by the Tongue River Reservoir, number of irrigated acres, number of mined acres, dissolved-solids concentration of mine leachates and quantity of other water losses. A listing of the Fortran computer program, definitions of all variables in the model, and an example output permit use of the model by interested persons. (USGS)
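The reach-by-reach accounting described here amounts to a running mass balance on streamflow and dissolved-solids load. The sketch below is a generic illustration with made-up gains and losses, not the documented Fortran program.

```python
def route_reach(flow_in_cfs, load_in_tons, gains, losses):
    """Route streamflow and dissolved solids through one reach.

    gains/losses are dicts with 'flow' (cfs) and 'load' (tons/day) entries,
    e.g. tributary inflow, irrigation withdrawal, or mine leachate.
    """
    flow_out = flow_in_cfs + gains["flow"] - losses["flow"]
    load_out = load_in_tons + gains["load"] - losses["load"]
    # 1 cfs carrying 1 mg/L of dissolved solids is about 0.0027 tons/day
    concentration_mg_l = load_out / (0.0027 * flow_out)
    return flow_out, load_out, concentration_mg_l

# Hypothetical reach: reservoir release plus mine leachate, minus irrigation.
print(route_reach(
    flow_in_cfs=400.0, load_in_tons=500.0,
    gains={"flow": 5.0, "load": 40.0},      # mine leachate, high dissolved solids
    losses={"flow": 60.0, "load": 70.0},    # irrigation diversion
))
```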
Scientific and technical advisory committee review of the nutrient inputs to the watershed model
USDA-ARS?s Scientific Manuscript database
The following is a report by a STAC Review Team concerning the methods and documentation used by the Chesapeake Bay Partnership for evaluation of nutrient inputs to Phase 6 of the Chesapeake Bay Watershed Model. The “STAC Review of the Nutrient Inputs to the Watershed Model” (previously referred to...
2014 Version 7.0 Technical Support Document (TSD)
The 2014 Version 7 document describes the processing of emission inventories into inputs for the Community Multiscale Air Quality model for use in the 2014 National Air Toxics Assessment initial modeling.
Thermal Transport Model for Heat Sink Design
NASA Technical Reports Server (NTRS)
Chervenak, James A.; Kelley, Richard L.; Brown, Ari D.; Smith, Stephen J.; Kilbourne, Caroline A.
2009-01-01
A document discusses the development of a finite element model for describing thermal transport through microcalorimeter arrays in order to assist in heat-sinking design. A fabricated multi-absorber transition edge sensor (PoST) was designed in order to reduce device wiring density by a factor of four. The finite element model consists of breaking the microcalorimeter array into separate elements, including the transition edge sensor (TES) and the silicon substrate on which the sensor is deposited. Each element is then broken up into subelements, whose surface area subtends 10 × 10 microns. The heat capacity per unit temperature, thermal conductance, and thermal diffusivity of each subelement are the model inputs, as are the temperatures of each subelement. Numerical integration using the Finite in Time Centered in Space algorithm of the thermal diffusion equation is then performed in order to obtain a temporal evolution of the subelement temperature. Thermal transport across interfaces is modeled using a thermal boundary resistance obtained using the acoustic mismatch model. The document concludes with a discussion of the PoST fabrication. PoSTs are novel because they enable incident x-ray position sensitivity with good energy resolution and low wiring density.
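An explicit forward-in-time, centered-in-space update of the thermal diffusion equation, the kind of step the model's numerical integration performs, is sketched below for a one-dimensional chain of subelements. The geometry, material values, and boundary handling are placeholders rather than the PoST model itself.

```python
import numpy as np

n = 50                       # subelements along a 1-D strip (placeholder geometry)
dx = 10e-6                   # 10-micron subelement pitch, as in the model description
alpha = 1e-6                 # assumed thermal diffusivity, m^2/s
dt = 0.2 * dx**2 / alpha     # respects the explicit stability limit dt <= dx^2 / (2*alpha)

T = np.full(n, 0.050)        # start everything at an assumed 50 mK bath temperature
T[n // 2] += 0.005           # deposit heat in the middle (an "x-ray hit")

for _ in range(2_000):
    # Centered-in-space Laplacian, forward-in-time update (interior points only).
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0] = T[-1] = 0.050     # ends pinned to the heat sink

print(f"peak temperature after relaxation: {T.max() * 1e3:.3f} mK")
```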
Unesco Integrated Documentation Network; Computerized Documentation System (CDS).
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Paris (France). Dept. of Documentation, Libraries, and Archives.
Intended for use by the Computerized Documentation System (CDS), the Unesco version of ISIS (Integrated Set of Information Systems)--originally developed by the International Labour Organization--was developed in 1975 and named CDS/ISIS. This system has a comprehensive collection of programs for input, management, and output, running in batch or…
The multi-disciplinary design study: A life cycle cost algorithm
NASA Technical Reports Server (NTRS)
Harding, R. R.; Pichi, F. J.
1988-01-01
The approach and results of a Life Cycle Cost (LCC) analysis of the Space Station Solar Dynamic Power Subsystem (SDPS) including gimbal pointing and power output performance are documented. The Multi-Discipline Design Tool (MDDT) computer program developed during the 1986 study has been modified to include the design, performance, and cost algorithms for the SDPS as described. As with the Space Station structural and control subsystems, the LCC of the SDPS can be computed within the MDDT program as a function of the engineering design variables. Two simple examples of MDDT's capability to evaluate cost sensitivity and design based on LCC are included. MDDT was designed to accept NASA's IMAT computer program data as input so that IMAT's detailed structural and controls design capability can be assessed with expected system LCC as computed by MDDT. No changes to IMAT were required. Detailed knowledge of IMAT is not required to perform the LCC analyses as the interface with IMAT is noninteractive.
Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1993-01-01
The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint that only one control move at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications of the new technique which demonstrate the quality and expanded capabilities of the input designs produced by the new technique. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.
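To make the flavor of the approach concrete, the toy sketch below enumerates amplitude-constrained square-wave inputs for a first-order model, rejects those that violate an output amplitude limit, and scores the rest with a crude sensitivity-based information measure. It only illustrates "optimize over square waves under practical constraints"; it is not the dynamic-programming algorithm developed in the thesis.

```python
import numpy as np
from itertools import product

dt, steps = 0.1, 20
AMPLITUDE = 2.0                      # input amplitude constraint (assumed units)
Y_LIMIT = 5.0                        # output amplitude constraint (model validity / safety)

def simulate(u, a=-1.5, b=0.8):
    """First-order model y' = a*y + b*u and its sensitivity dy/da (Euler steps)."""
    y = s = 0.0
    ys, ss = [], []
    for uk in u:
        y += dt * (a * y + b * uk)
        s += dt * (a * s + y)        # sensitivity equation for parameter a
        ys.append(y); ss.append(s)
    return np.array(ys), np.array(ss)

best, best_info = None, -1.0
# Square-wave candidates: each of 4 segments (5 steps long) is -A, 0, or +A.
for levels in product((-AMPLITUDE, 0.0, AMPLITUDE), repeat=4):
    u = np.repeat(levels, steps // 4)
    y, s = simulate(u)
    if np.abs(y).max() > Y_LIMIT:    # reject inputs violating the output constraint
        continue
    info = float(np.sum(s**2))       # scalar stand-in for the Fisher information
    if info > best_info:
        best, best_info = levels, info

print("best segment levels:", best, "information score:", round(best_info, 3))
```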
The reform of home care services in Ontario: opportunity lost or lesson learned?
Randall, Glen
2007-06-01
With the release of the Romanow Commission report, Canadian governments are poised to consider the creation of a national home care program. If occupational and physical therapists are to have input in shaping such a program, they will need to learn from lost opportunities of the past. This paper provides an overview of recent reforms to home care in Ontario with an emphasis on rehabilitation services. Data were collected from documents and 28 key informant interviews with rehabilitation professionals. Home care in Ontario has evolved in a piecemeal manner without rehabilitation professionals playing a prominent role in program design. Rehabilitation services play a critical role in facilitating hospital discharges, minimizing readmissions, and improving the quality of people's lives. Canadians will benefit if occupational and physical therapists seize the unique opportunity before them to provide meaningful input into creating a national home care program.
The Phoretic Motion Experiment (PME) definition phase
NASA Technical Reports Server (NTRS)
Eaton, L. R.; Neste, S. L. (Editor)
1982-01-01
The aerosol generator and the charge flow device (CFD) chamber, which were designed for zero-gravity operation, were analyzed. Characteristics of the CFD chamber and aerosol generator that would be useful for cloud physics experimentation in a one-g as well as a zero-g environment are documented. The collision-type aerosol generator is addressed. Relationships among the various input and output parameters are derived and subsequently used to determine the requirements on the controls of the input parameters to assure a given error budget of an output parameter. The CFD chamber operation in a zero-g environment is assessed utilizing a computer simulation program. Low nuclei critical supersaturation and high experiment accuracies are emphasized, which lead to droplet growth times extending into hundreds of seconds. The analysis was extended to assess the performance constraints of the CFD chamber in a one-g environment operating in the horizontal mode.
Advanced information processing system: Local system services
NASA Technical Reports Server (NTRS)
Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter
1989-01-01
The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.
NASA Technical Reports Server (NTRS)
1981-01-01
The set of computer programs described allows for data definition, data input, and data transfer between the LSI-11 microcomputers and the VAX-11/780 minicomputer. Program VAXCOM allows for a simple method of textual file transfer from the LSI to the VAX. Program LSICOM allows for easy file transfer from the VAX to the LSI. Program TTY changes the LSI-11 operator's console to the LSI's printing device. Program DICTIN provides a means for defining a data set for input to either computer. Program DATAIN is a simple-to-operate data entry program which is capable of building data files on either machine. Program LEDITV is an extremely powerful, easy-to-use, line-oriented text editor. Program COPYSBF is designed to print out textual files on the line printer without character loss from FORTRAN carriage control or wide record transfer.
TADS: A CFD-based turbomachinery analysis and design system with GUI. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Myers, R. A.; Topp, D. A.; Delaney, R. A.
1995-01-01
The primary objective of this study was the development of a computational fluid dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a graphical user interface (GUI). The computer codes resulting from this effort are referred to as the Turbomachinery Analysis and Design System (TADS). This document is intended to serve as a user's manual for the computer programs which comprise the TADS system. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of various programs was done in a way that alternative solvers or grid generators could be easily incorporated into the TADS framework.
NASA Technical Reports Server (NTRS)
1993-01-01
The information required by a programmer using the Minimum Hamiltonian Ascent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert the program to computers other than the VAX computer. Documentation for each subroutine or function, consisting of the definitions of the variables and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included which provide a listing of the Root-Sum-Square (RSS) program, a listing of subroutine names and definitions used in the MASTRE User Friendly Interface Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS Program is used to aid in the performance of dispersion analyses. The RSS program reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.
39 CFR 3050.2 - Documentation of periodic reports.
Code of Federal Regulations, 2010 CFR
2010-07-01
... traced back to public documents or to primary data sources; and (3) Be submitted in a form, and be... Postal Service shall identify any input data that have changed, list any quantification techniques that...
1980-02-08
hours 0 Input Format: Integer b. Creating Resource Allocation Blocks The creation of a specific resource allocation block as a directive component is...is directed. 0 Range: N/A. Input Format: INT/NUC/CHM b. Creating Employment Packages An employment package block has the structure portrayed in Figure
Conceptual design of an advanced Stirling conversion system for terrestrial power generation
NASA Technical Reports Server (NTRS)
1988-01-01
A free-piston Stirling engine coupled to an electric generator or alternator with a nominal kWe power output, absorbing thermal energy from a nominal 100 square meter parabolic solar collector and supplying electric power to a utility grid, was identified. The results of the conceptual design study of an Advanced Stirling Conversion System (ASCS) were documented. The objectives are as follows: define the ASCS configuration; provide a manufacturability and cost evaluation; predict ASCS performance over the range of solar input required to produce power; estimate system and major component weights; define engine and electrical power conditioning control requirements; and define key technology needs not ready by the late 1980s for meeting efficiency, life, cost, and weight goals for the ASCS.
Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman
1993-01-01
This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in Dec. 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.
Search and Graph Database Technologies for Biomedical Semantic Indexing: Experimental Analysis.
Segura Bedmar, Isabel; Martínez, Paloma; Carruana Martín, Adrián
2017-12-01
Biomedical semantic indexing is a very useful support tool for human curators in their efforts for indexing and cataloging the biomedical literature. The aim of this study was to describe a system to automatically assign Medical Subject Headings (MeSH) to biomedical articles from MEDLINE. Our approach relies on the assumption that similar documents should be classified by similar MeSH terms. Although previous work has already exploited the document similarity by using a k-nearest neighbors algorithm, we represent documents as document vectors by search engine indexing and then compute the similarity between documents using cosine similarity. Once the most similar documents for a given input document are retrieved, we rank their MeSH terms to choose the most suitable set for the input document. To do this, we define a scoring function that takes into account the frequency of the term within the set of retrieved documents and the similarity between the input document and each retrieved document. In addition, we implement guidelines proposed by human curators to annotate MEDLINE articles; in particular, the heuristic that says if 3 MeSH terms are proposed to classify an article and they share the same ancestor, they should be replaced by this ancestor. The representation of the MeSH thesaurus as a graph database allows us to employ graph search algorithms to quickly and easily capture hierarchical relationships such as the lowest common ancestor between terms. Our experiments show promising results with an F1 of 69% on the test dataset. To the best of our knowledge, this is the first work that combines search and graph database technologies for the task of biomedical semantic indexing. Due to its horizontal scalability, ElasticSearch becomes a real solution to index large collections of documents (such as the bibliographic database MEDLINE). Moreover, the use of graph search algorithms for accessing MeSH information could provide a support tool for cataloging MEDLINE abstracts in real time.
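As a rough illustration of the similarity-weighted scoring idea described above (not the authors' implementation; the function name and example data are assumed):

```python
# Hedged sketch of the neighbor-based MeSH scoring idea: given the most similar
# retrieved documents and their cosine similarities to the input document, rank each
# candidate MeSH term by the similarity-weighted frequency with which it annotates
# the neighbors. Names and values are illustrative assumptions.
from collections import defaultdict

def score_mesh_terms(neighbors):
    """neighbors: list of (similarity, mesh_terms) pairs for retrieved documents."""
    scores = defaultdict(float)
    for similarity, mesh_terms in neighbors:
        for term in mesh_terms:
            scores[term] += similarity          # term frequency weighted by similarity
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: three retrieved MEDLINE neighbors with their MeSH annotations
neighbors = [
    (0.92, {"Humans", "Neoplasms", "Mutation"}),
    (0.85, {"Humans", "Neoplasms", "Genomics"}),
    (0.71, {"Humans", "Mutation"}),
]
print(score_mesh_terms(neighbors)[:3])
```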
INDES User's guide multistep input design with nonlinear rotorcraft modeling
NASA Technical Reports Server (NTRS)
1979-01-01
The INDES computer program, a multistep input design program used as part of a data processing technique for rotorcraft systems identification, is described. Flight test inputs based on INDES improve the accuracy of parameter estimates. The input design algorithm, program input, and program output are presented.
Machine Aided Indexing and the NASA Thesaurus
NASA Technical Reports Server (NTRS)
vonOfenheim, Bill
2007-01-01
Machine Aided Indexing (MAI) is a Web-based application program for aiding the indexing of literature in the NASA Scientific and Technical Information (STI) Database. MAI was designed to be a convenient, fully interactive tool for determining the subject matter of documents and identifying keywords. The heart of MAI is a natural-language processor that accepts, as input, any user-supplied text, including abstracts, full documents, and Web pages. Within seconds, the text is analyzed and a ranked list of terms is generated. The 17,800 terms of the NASA Thesaurus serve as the foundation of the knowledge base used by MAI. The NASA Thesaurus defines a standard vocabulary, the use of which enables MAI to assist in ensuring that STI documents are uniformly and consistently accessible. Of particular interest to traditional users of the NASA Thesaurus, MAI incorporates a fully searchable thesaurus display module that affords word-search and hierarchy-navigation capabilities that make it much easier and less time-consuming to look up terms and browse, relative to lookup and browsing in older print and Portable Document Format (PDF) digital versions of the Thesaurus. In addition, because MAI is centrally hosted, the Thesaurus data are always current.
Minimizing structural vibrations with Input Shaping (TM)
NASA Technical Reports Server (NTRS)
Singhose, Bill; Singer, Neil
1995-01-01
A new method for commanding machines to move with increased dynamic performance was developed. This method is an enhanced version of input shaping, a patented vibration suppression algorithm. This technique intercepts a command input to a system and reshapes it into a command that moves the mechanical system with increased performance and reduced residual vibration. This document describes many advanced methods for generating highly optimized shaping sequences which are tuned to particular systems. The shaping sequence is important because it determines the trade-off between the move/settle time of the system and the insensitivity of the input shaping algorithm to variations or uncertainties in the machine being controlled. For example, a system with a 5 Hz resonance that takes 1 second to settle can be improved to settle in 0.2 second using a 0.2 second shaping sequence (thus improving settle time by a factor of 5). This system could vary by plus or minus 15% in its natural frequency and still have no apparent vibration. However, the same system shaped with a 0.3 second shaping sequence could tolerate plus or minus 40% or more variation in natural frequency. This document describes how to generate sequences that maximize performance, sequences that maximize insensitivity, and sequences that trade off between the two. Several software tools are documented and included.
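As an illustration of the simplest member of this family of shaping sequences, the sketch below computes the classic two-impulse zero-vibration (ZV) shaper and convolves it with a step command; the 5 Hz, lightly damped example system and the sample time are assumed, not taken from the report.

```python
# Minimal sketch of a classic two-impulse zero-vibration (ZV) input shaper of the
# general family described above; the 5 Hz, lightly damped system is an assumed
# example, not a system from the report.
import numpy as np

def zv_shaper(freq_hz, zeta):
    """Return impulse amplitudes and times for a ZV shaper."""
    wd = 2 * np.pi * freq_hz * np.sqrt(1 - zeta**2)   # damped natural frequency
    K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))
    amps = np.array([1.0, K]) / (1.0 + K)             # amplitudes sum to 1
    times = np.array([0.0, np.pi / wd])               # second impulse half a period later
    return amps, times

def shape_command(command, dt, amps, times):
    """Convolve a sampled command with the shaper impulse sequence."""
    shaper = np.zeros(int(round(times[-1] / dt)) + 1)
    for a, t in zip(amps, times):
        shaper[int(round(t / dt))] += a
    return np.convolve(command, shaper)

amps, times = zv_shaper(freq_hz=5.0, zeta=0.02)
step = np.ones(500)                                   # a unit step command, dt = 1 ms
shaped = shape_command(step, dt=0.001, amps=amps, times=times)
```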
The Impact of Input Quality on Early Sign Development in Native and Non-Native Language Learners
ERIC Educational Resources Information Center
Lu, Jenny; Jones, Anna; Morgan, Gary
2016-01-01
There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the…
Design and Analysis of Precise Pointing Systems
NASA Technical Reports Server (NTRS)
Kim, Young K.
2000-01-01
The mathematical models of the Glovebox Integrated Microgravity Isolation Technology (g-LIMIT) dynamics/control system, which include six-degree-of-freedom (DOF) equations of motion, mathematical models of position sensors, accelerometers, and actuators, and the acceleration and position controller, were developed using MATLAB and TREETOPS simulations. Optimal control parameters of the g-LIMIT control system were determined through sensitivity studies, and its performance was evaluated with the TREETOPS model of the g-LIMIT dynamics and control system. The functional operation and performance of the Tektronix DTM920 digital thermometer were studied, and the inputs to the crew procedures and training for the DTM920 were documented.
[Technology and progress in the use of information systems in the dental office].
Walther, K
1989-09-01
The numerous DP systems used in dental offices are designed for administrative work. Data storage and management is limited to accountancy applications, and the advantages of the flow of information are restricted to operational purposes. Data of medical use are available only to a moderate extent. It should be possible, however, to use these information systems for processing purely medical data, for the structured input of comprehensive diagnostic information, and to have these data available for specific decisions. The use of a "decision-supporting system" has been tested in the documentation of dental diagnostic findings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. Harrington
2004-10-25
The purpose of this model report is to provide documentation of the conceptual and mathematical model (Ashplume) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. These aspects of volcanism-related dose calculation are described in the context of the entire igneous disruptive events conceptual model in ''Characterize Framework for Igneous Activity'' (BSC 2004 [DIRS 169989], Section 6.1.1). The Ashplume conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The Ashplume mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report update the previous documentation of the Ashplume mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model. In this report, ''Ashplume'' is used when referring to the atmospheric dispersal model and ''ASHPLUME'' is used when referencing the code of that model. Two analysis and model reports provide direct inputs to this model report, namely ''Characterize Eruptive Processes at Yucca Mountain, Nevada and Number of Waste Packages Hit by Igneous Intrusion''. This model report provides direct inputs to the TSPA, which uses the ASHPLUME software described and used in this model report. Thus, ASHPLUME software inputs are inputs to this model report for ASHPLUME runs in this model report. However, ASHPLUME software inputs are outputs of this model report for ASHPLUME runs by TSPA.
SIRU utilization. Volume 2: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.; Whittredge, R.
1973-01-01
A complete description of the additional analysis, development and evaluation provided for the SIRU system as identified in the requirements for the SIRU utilization program is presented. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault tolerant operational capabilities. The SIRU redundant hardware design is formulated about a six gyro and six accelerometer instrument module package. The modules are mounted in this package so that their measurement input axes form a unique symmetrical pattern that corresponds to the array of perpendiculars to the faces of a regular dodecahedron. This six-axis array provides redundant independent sensing and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. Documentation of the additional software and software modifications required to implement the utilization capabilities includes assembly listings and flow charts.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
Dooley, Katherine L; Arain, Muzammil A; Feldbaum, David; Frolov, Valery V; Heintze, Matthew; Hoak, Daniel; Khazanov, Efim A; Lucianetti, Antonio; Martin, Rodica M; Mueller, Guido; Palashov, Oleg; Quetschke, Volker; Reitze, David H; Savage, R L; Tanner, D B; Williams, Luke F; Wu, Wan
2012-03-01
We present the design and performance of the LIGO Input Optics subsystem as implemented for the sixth science run of the LIGO interferometers. The Initial LIGO Input Optics experienced thermal side effects when operating with 7 W input power. We designed, built, and implemented improved versions of the Input Optics for Enhanced LIGO, an incremental upgrade to the Initial LIGO interferometers, designed to run with 30 W input power. At four times the power of Initial LIGO, the Enhanced LIGO Input Optics demonstrated improved performance including better optical isolation, less thermal drift, minimal thermal lensing, and higher optical efficiency. The success of the Input Optics design fosters confidence for its ability to perform well in Advanced LIGO.
Graphics and composite material computer program enhancements for SPAR
NASA Technical Reports Server (NTRS)
Farley, G. L.; Baker, D. J.
1980-01-01
User documentation is provided for additional computer programs developed for use in conjunction with SPAR. These programs plot digital data, simplify input for composite material section properties, and compute lamina stresses and strains. Sample problems are presented including execution procedures, program input, and graphical output.
The guide to Design For On-orbit Spacecraft Servicing (DFOSS) manual: Producing a consensus document
NASA Technical Reports Server (NTRS)
Nyman, Janice
1993-01-01
Increasing interaction and changing economies at the national and international levels have accelerated the call for standardization in space systems design. The benefits of standardization--compatibility, interchangeability, and lower costs--are maximized when achieved through consensus. Reaching consensus in standardization means giving everyone who will be affected by a standard an opportunity to have input into creating that standard. The DFOSS manual was initiated with the goal of developing standards through consensus. The present Proposed Guide derives from work begun by the Space Automation and Robotics Center (SpARC), a NASA Center for the Commercial Development of Space, and has continued as a standards project through the American Institute of Aeronautics and Astronautics (AIAA). The Proposed Guide was released by AIAA in Jan. 1992 for sale during a one-year, trial-use period. DFOSS is a response to the need for one document that contains all the guidelines required by on-orbit spacecraft servicing designers for astronaut extravehicular activity and/or telerobotic servicing. The manual's content is driven by spacecraft design considerations, and its composition has been achieved by interaction and cooperation among government, industry, and research organizations. While much work lies ahead to maximize the potential of DFOSS, the Proposed Guide represents evidence of the benefits of industry-wide consensus, points the way for broader application, and provides an example for similar projects.
Lin, Tzu-Yung; Green, Roger J.; O'Connor, Peter B.
2011-01-01
The nature of the ion signal from a 12-T Fourier-transform ion cyclotron resonance mass spectrometer and the electronic noise were studied to further understand the electronic detection limit. At minimal cost, a new transimpedance preamplifier was designed, computer simulated, built, and tested. The preamplifier design pushes the electronic signal-to-noise performance at room temperature to the limit, because of its enhanced tolerance of the capacitance of the detection device, lower intrinsic noise, and larger flat mid-band gain (input current noise spectral density of around 1 pA/√Hz when the transimpedance is about 85 dBΩ). The designed preamplifier has a bandwidth of ∼3 kHz to 10 MHz, which corresponds to the mass-to-charge ratio, m/z, of approximately 18 to 61 k at 12 T. The transimpedance and the bandwidth can be easily adjusted by changing the value of passive components. The feedback limitation of the circuit is discussed. With the maximum possible transimpedance of 5.3 MΩ when using an 0402 surface mount resistor, the preamplifier was estimated to be able to detect ∼110 charges in a single scan. PMID:22225232
Auble, Gregor T.; Wondzell, Mark; Talbert, Colin
2009-01-01
This report describes and documents a decision support system for the Gunnison River in Black Canyon of the Gunnison National Park. It is a macro-embedded EXCEL program that calculates and displays indicators representing valued characteristics or processes in the Black Canyon based on daily flows of the Gunnison River. The program is designed to easily accept input from downloaded stream gage records or output from the RIVERWARE reservoir operations model being used for the upstream Aspinall Unit. The decision support system is structured to compare as many as eight alternative flow regimes, where each alternative is represented by a daily sequence of at least 20 calendar years of streamflow. Indicators include selected flow statistics, riparian plant community distribution, clearing of box elder by inundation and scour, several measures of sediment mobilization, trout fry habitat, and federal reserved water rights. Calculation of variables representing National Park Service federal reserved water rights requires additional secondary input files pertaining to forecast and actual basin inflows and storage levels in Blue Mesa reservoir. Example input files representing a range of situations including historical, reconstructed natural, and simulated alternative reservoir operations are provided with the software.
Image quality assessment for video stream recognition systems
NASA Astrophysics Data System (ADS)
Chernov, Timofey S.; Razumnuy, Nikita P.; Kozharinov, Alexander S.; Nikolaev, Dmitry P.; Arlazarov, Vladimir V.
2018-04-01
Recognition and machine vision systems have long been widely used in many disciplines to automate various processes of life and industry. Input images of optical recognition systems can be subjected to a large number of different distortions, especially in uncontrolled or natural shooting conditions, which leads to unpredictable results of recognition systems, making it impossible to assess their reliability. For this reason, it is necessary to perform quality control of the input data of recognition systems, which is facilitated by modern progress in the field of image quality evaluation. In this paper, we investigate the approach to designing optical recognition systems with built-in input image quality estimation modules and feedback, for which the necessary definitions are introduced and a model for describing such systems is constructed. The efficiency of this approach is illustrated by the example of solving the problem of selecting the best frames for recognition in a video stream for a system with limited resources. Experimental results are presented for the system for identity documents recognition, showing a significant increase in the accuracy and speed of the system under simulated conditions of automatic camera focusing, leading to blurring of frames.
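One simple stand-in for such an input-quality gate, not the quality model proposed in the paper, is to score frames with a sharpness proxy and pass only frames above a threshold to the recognizer; the threshold value and the use of OpenCV below are assumptions.

```python
# Generic illustration of frame-quality gating for a video-stream recognizer: score
# each frame by the variance of its Laplacian (a common sharpness/blur proxy) and
# keep only frames above a threshold. This is a simple stand-in, not the quality
# model proposed in the paper; the threshold value is an assumption.
import cv2

def sharpness(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def frames_worth_recognizing(video_path, threshold=100.0):
    cap = cv2.VideoCapture(video_path)
    selected = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if sharpness(frame) >= threshold:
            selected.append(frame)      # pass only sharp frames to the recognizer
    cap.release()
    return selected
```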
Banta, Edward R.; Provost, Alden M.
2008-01-01
This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.
NASA Technical Reports Server (NTRS)
Sorenson, Reese L.; Alter, Stephen J.
1995-01-01
This document is a users' manual for a new three-dimensional structured multiple-block volume grid generator called 3DGRAPE/AL. It is a significantly improved version of the previously released and widely distributed programs 3DGRAPE and 3DMAGGS. It generates volume grids by iteratively solving the Poisson equations in three dimensions. The right-hand-side terms are designed so that user-specified grid cell heights and user-specified grid cell skewness near boundary surfaces result automatically, with little user intervention. The code is written in Fortran-77, and can be installed with or without a simple graphical user interface which allows the user to watch as the grid is generated. An introduction describing the improvements over the antecedent 3DGRAPE code is presented first. Then follows a chapter on the basic grid generator program itself, and comments on installing it. The input is then described in detail. After that is a description of the graphical user interface. Five example cases are shown next, with plots of the results. Following that is a chapter on two input filters which allow use of input data generated elsewhere. Last is a treatment of the theory embodied in the code.
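As a rough two-dimensional analogue of this elliptic approach, the sketch below relaxes interior grid points between fixed boundaries using the Winslow (zero-forcing, Laplace-limit) form of the equations rather than the full Poisson system with automatically designed right-hand-side terms; it is illustrative only and is not the 3DGRAPE/AL algorithm.

```python
# Two-dimensional analogue (zero forcing terms, i.e. the Winslow/Laplace limit of the
# Poisson system) of the elliptic grid generation that 3DGRAPE/AL performs in three
# dimensions. Boundary points are held fixed; interior points are relaxed by
# Gauss-Seidel sweeps. Purely illustrative; not the 3DGRAPE/AL algorithm or inputs.
import numpy as np

def winslow_grid(x, y, sweeps=200):
    """x, y: (ni, nj) arrays with boundary values set; interior values are a guess."""
    ni, nj = x.shape
    for _ in range(sweeps):
        for i in range(1, ni - 1):
            for j in range(1, nj - 1):
                x_xi,  y_xi  = (x[i+1, j] - x[i-1, j]) / 2, (y[i+1, j] - y[i-1, j]) / 2
                x_eta, y_eta = (x[i, j+1] - x[i, j-1]) / 2, (y[i, j+1] - y[i, j-1]) / 2
                a = x_eta**2 + y_eta**2
                g = x_xi**2 + y_xi**2
                b = x_xi * x_eta + y_xi * y_eta
                for f in (x, y):
                    cross = f[i+1, j+1] - f[i-1, j+1] - f[i+1, j-1] + f[i-1, j-1]
                    f[i, j] = (a * (f[i+1, j] + f[i-1, j])
                               + g * (f[i, j+1] + f[i, j-1])
                               - 0.5 * b * cross) / (2.0 * (a + g))
    return x, y

# Example: crude algebraic initial guess between boundaries, then elliptic smoothing
ni, nj = 21, 11
xi, eta = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
x0 = xi.copy()
y0 = eta * (1.0 + 0.3 * np.sin(np.pi * xi))   # curved upper boundary
x, y = winslow_grid(x0, y0)
```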
User's Guide for Monthly Vector Wind Profile Model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1999-01-01
The background, theoretical concepts, and methodology for construction of vector wind profiles based on a statistical model are presented. The derived monthly vector wind profiles are to be applied by the launch vehicle design community for establishing realistic estimates of critical vehicle design parameter dispersions related to wind profile dispersions. During initial studies a number of months are used to establish the model profiles that produce the largest monthly dispersions of ascent vehicle aerodynamic load indicators. The largest monthly dispersions for wind, which occur during the winter high-wind months, are used for establishing the design reference dispersions for the aerodynamic load indicators. This document includes a description of the computational process for the vector wind model including specification of input data, parameter settings, and output data formats. Sample output data listings are provided to aid the user in the verification of test output.
The design and analysis of single flank transmission error tester for loaded gears
NASA Technical Reports Server (NTRS)
Houser, D. R.; Bassett, D. E.
1985-01-01
Due to geometrical imperfections in gears and finite tooth stiffnesses, the motion transmitted from an input gear shaft to an output gear shaft will not have conjugate action. In order to strengthen the understanding of transmission error and to verify mathematical models of gear transmission error, a test stand that will measure the transmission error of a gear pair at operating loads, but at reduced speeds would be desirable. This document describes the design and development of a loaded transmission error tester. For a gear box with a gear ratio of one, few tooth meshing combinations will occur during a single test. In order to observe the effects of different tooth mesh combinations and to increase the ability to load test gear pairs with higher gear ratios, the system was designed around a gear box with a gear ratio of two.
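A quick calculation illustrates why a unity gear ratio exercises few tooth-pair combinations: the number of distinct pairings encountered in a test is lcm(z1, z2) out of the z1 x z2 possible pairings. The tooth counts below are assumed for illustration.

```python
# Quick illustration of why a 1:1 gear ratio exercises few tooth-pair combinations:
# a pinion tooth returns to the same mating gear tooth after lcm(z1, z2)/z1
# revolutions, so the number of distinct tooth pairings seen in a test is
# lcm(z1, z2) out of the z1*z2 possible. Tooth counts below are illustrative.
from math import lcm

def mesh_combinations(z1, z2):
    distinct = lcm(z1, z2)          # tooth pairings actually exercised
    possible = z1 * z2              # all pinion-tooth / gear-tooth pairings
    return distinct, possible

for z1, z2 in [(40, 40), (40, 80), (40, 81)]:   # ratio 1:1, ratio 1:2, a "hunting" pair
    d, p = mesh_combinations(z1, z2)
    print(f"z1={z1}, z2={z2}: {d} of {p} tooth pairings exercised")
```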
NASA Technical Reports Server (NTRS)
Pilkey, W. D.; Wang, B. P.; Yoo, Y.; Clark, B.
1973-01-01
A description and applications of a computer capability for determining the ultimate optimal behavior of a dynamically loaded structural-mechanical system are presented. This capability provides characteristics of the theoretically best, or limiting, design concept according to response criteria dictated by design requirements. Equations of motion of the system in first or second order form include incompletely specified elements whose characteristics are determined in the optimization of one or more performance indices subject to the response criteria in the form of constraints. The system is subject to deterministic transient inputs, and the computer capability is designed to operate with a large off-the-shelf linear programming software package which performs the desired optimization. The report contains user-oriented program documentation in engineering, problem-oriented form. Applications cover a wide variety of dynamics problems including those associated with such diverse configurations as a missile-silo system, impacting freight cars, and an aircraft ride control system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staller, G.E.; Westmoreland, J.J.; Whitlow, G.L.
1998-03-01
Lost circulation, which is the loss of well drilling fluids to the formation while drilling, is a common problem encountered while drilling geothermal wells. The rapid detection of the loss of well drilling fluids is critical to the successful and cost-effective treatment of the wellbore to stop or minimize lost circulation. Sandia National Laboratories has developed an instrument to accurately measure the outflow rate of drilling fluids while drilling. This instrument, the Rolling Float Meter, has been under development at Sandia since 1991 and is now available for utilization by interested industry users. This report documents recent Rolling Float Meter design upgrades resulting from field testing and industry input, the effects of ongoing testing and evaluation both in the laboratory and in the field, and the final design package that is available to transfer this technology to industry users.
Electrophysiology Tool Construction
Ide, David
2016-01-01
This protocol documents the construction of a custom microscope stage system currently in widespread use by a wide variety of investigators. The current design and construction of this stage is the result of multiple iterations, integrating input from a number of electrophysiologists working with a variety of preparations. Thus, this tool is a generally applicable solution, suitable for a wide array of end-user requirements; its flexible design facilitates rapid and easy configuration, making it useful for multi-user microscopes, as individual researchers can reconfigure the stage system or have their own readily replaceable stage plates. Furthermore, the stage can be manufactured using equipment typically found in small research machine shops, and by keeping the various parts on hand, machinists can quickly satisfy new requests and/or modifications for a wide variety of applications. PMID:23315946
Model Documentation of Base Case Data | Regional Energy Deployment System Model | Energy Analysis | NREL
Documentation of the base case data for the Regional Energy Deployment System model. The base case was developed simply as a point of departure for other analyses. The Base Case derives many of its inputs from the Energy Information Administration's (EIA's) Annual Energy
NASA Technical Reports Server (NTRS)
Brauer, G. L.; Cornick, D. E.; Stevenson, R.
1977-01-01
The capabilities and applications of the three-degree-of-freedom (3DOF) version and the six-degree-of-freedom (6DOF) version of the Program to Optimize Simulated Trajectories (POST) are summarized. The document supplements the detailed program manuals by providing additional information that motivates and clarifies basic capabilities, input procedures, applications and computer requirements of these programs. The information will enable prospective users to evaluate the programs, and to determine if they are applicable to their problems. Enough information is given to enable managerial personnel to evaluate the capabilities of the programs. The document describes the POST structure, formulation, input and output procedures, sample cases, and computer requirements. The report also provides answers to basic questions concerning planet and vehicle modeling, simulation accuracy, optimization capabilities, and general input rules. Several sample cases are presented.
Hierarchic Agglomerative Clustering Methods for Automatic Document Classification.
ERIC Educational Resources Information Center
Griffiths, Alan; And Others
1984-01-01
Considers classifications produced by application of single linkage, complete linkage, group average, and word clustering methods to Keen and Cranfield document test collections, and studies structure of hierarchies produced, extent to which methods distort input similarity matrices during classification generation, and retrieval effectiveness…
Airport Performance Model : Volume 2 - User's Manual and Program Documentation
DOT National Transportation Integrated Search
1978-10-01
Volume II contains a User's manual and program documentation for the Airport Performance Model. This computer-based model is written in FORTRAN IV for the DEC-10. The user's manual describes the user inputs to the interactive program and gives sample...
USDA-ARS?s Scientific Manuscript database
Grass filter strips are a widely used conservation practice in the Midwestern United States for reducing nutrient, pesticide, and sediment inputs into agricultural streams. Previous studies have documented the effectiveness of grass filter strips in reducing the input of agricultural pollutants, bu...
Guidelines for the Selection of Near-Earth Thermal Environment Parameters for Spacecraft Design
NASA Technical Reports Server (NTRS)
Anderson, B. J.; Justus, C. G.; Batts, G. W.
2001-01-01
Thermal analysis and design of Earth orbiting systems requires specification of three environmental thermal parameters: the direct solar irradiance, Earth's local albedo, and outgoing longwave radiation (OLR). In the early 1990s data sets from the Earth Radiation Budget Experiment were analyzed on behalf of the Space Station Program to provide an accurate description of these parameters as a function of averaging time along the orbital path. This information, documented in SSP 30425 and, in more generic form, in NASA/TM-4527, enabled the specification of the proper thermal parameters for systems of various thermal response time constants. However, working with the engineering community and SSP-30425 and TM-4527 products over a number of years revealed difficulties in interpretation and application of this material. For this reason it was decided to develop this guidelines document to help resolve these issues of practical application. In the process, the data were extensively reprocessed and a new computer code, the Simple Thermal Environment Model (STEM), was developed to simplify the process of selecting the parameters for input into extreme hot and cold thermal analyses and design specifications. In the process, greatly improved values for the cold-case OLR for high-inclination orbits were derived. Thermal parameters for satellites in low, medium, and high inclination low-Earth orbit and with various system thermal time constants are recommended for analysis of extreme hot and cold conditions. Practical information as to the interpretation and application of the information and an introduction to the STEM are included. Complete documentation for STEM is found in the user's manual, in preparation.
Flight-testing and frequency-domain analysis for rotorcraft handling qualities
NASA Technical Reports Server (NTRS)
Ham, Johnnie A.; Gardner, Charles K.; Tischler, Mark B.
1995-01-01
A demonstration of frequency-domain flight-testing techniques and analysis was performed on a U.S. Army OH-58D helicopter in support of the OH-58D Airworthiness and Flight Characteristics Evaluation and of the Army's development and ongoing review of Aeronautical Design Standard 33C, Handling Qualities Requirements for Military Rotorcraft. Hover and forward flight (60 kn) tests were conducted in 1 flight hour by Army experimental test pilots. Further processing of the hover data generated a complete database of velocity, angular-rate, and acceleration-frequency responses to control inputs. A joint effort was then undertaken by the Airworthiness Qualification Test Directorate and the U.S. Army Aeroflightdynamics Directorate to derive handling-quality information from the frequency-domain database using a variety of approaches. This report documents numerous results that have been obtained from the simple frequency-domain tests; in many areas, these results provide more insight into the aircraft dynamics that affect handling qualities than do traditional flight tests. The handling-quality results include ADS-33C bandwidth and phase-delay calculations, vibration spectral determinations, transfer-function models to examine single-axis results, and a six-degree-of-freedom fully coupled state-space model. The ability of this model to accurately predict responses was verified using data from pulse inputs. This report also documents the frequency-sweep flight-test technique and data analysis used to support the tests.
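For illustration, the sketch below computes a phase bandwidth and phase delay of the kind called for by ADS-33C from a frequency-response database; the synthetic second-order-plus-delay response is assumed rather than flight data, and the full ADS-33C procedure includes gain-margin checks not shown here.

```python
# Hedged sketch of ADS-33-style handling-qualities metrics computed from a
# frequency-response database (frequency in rad/s, unwrapped phase in deg). Phase
# bandwidth is taken where the phase reaches -135 deg; phase delay uses the phase
# lag beyond -180 deg at twice the -180 deg crossover frequency. The synthetic
# second-order-plus-delay response below is an assumed example, not flight data.
import numpy as np

def phase_bandwidth_and_delay(omega, phase_deg):
    """omega: ascending rad/s grid; phase_deg: monotonically decreasing phase."""
    w_bw  = np.interp(-135.0, phase_deg[::-1], omega[::-1])   # phase crosses -135 deg
    w_180 = np.interp(-180.0, phase_deg[::-1], omega[::-1])   # phase crosses -180 deg
    phi_2w180 = np.interp(2.0 * w_180, omega, phase_deg)      # phase at twice w_180
    tau_p = -(phi_2w180 + 180.0) / (57.3 * 2.0 * w_180)       # phase delay, seconds
    return w_bw, tau_p

# Synthetic example: second-order attitude response with a pure time delay
omega = np.linspace(0.1, 40.0, 2000)
wn, zeta, tau = 4.0, 0.7, 0.05
phase = np.degrees(np.arctan2(-2 * zeta * wn * omega, wn**2 - omega**2)) \
        - np.degrees(omega * tau)
w_bw, tau_p = phase_bandwidth_and_delay(omega, phase)
print(f"phase bandwidth = {w_bw:.2f} rad/s, phase delay = {tau_p:.3f} s")
```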
DRI Model of the U.S. Economy -- Model Documentation
1993-01-01
Provides documentation on Data Resources, Inc., DRI Model of the U.S. Economy and the DRI Personal Computer Input/Output Model. It also describes the theoretical basis, structure and functions of both DRI models; and contains brief descriptions of the models and their equations.
Outcomes that Define Successful Advance Care Planning: A Delphi Panel Consensus
Sudore, Rebecca L.; Heyland, Daren K.; Lum, Hillary D.; Rietjens, Judith A.C.; Korfage, Ida J.; Ritchie, Christine S.; Hanson, Laura C.; Meier, Diane E.; Pantilat, Steven Z.; Lorenz, Karl; Howard, Michelle; Green, Michael J.; Simon, Jessica E.; Feuz, Mariko A.; You, John J.
2017-01-01
Context Standardized outcomes that define successful advance care planning (ACP) are lacking. Objective To create an Organizing Framework of ACP outcome constructs and rate the importance of these outcomes. Methods This study convened a Delphi panel consisting of 52 multidisciplinary, international ACP experts including clinicians, researchers, and policy leaders from four countries. We conducted literature reviews and solicited attendee input from 5 international ACP conferences to identify initial ACP outcome constructs. In 5 Delphi rounds, we asked panelists to rate patient-centered outcomes on a 7-point “not-at-all” to “extremely important” scale. We calculated means and analyzed panelists’ input to finalize an Organizing Framework and outcome rankings. Results Organizing Framework outcome domains included process (e.g., attitudes), actions (e.g., discussions), quality of care (e.g., satisfaction), and healthcare (e.g., utilization). The top 5 outcomes included (1) care consistent with goals, mean 6.71 (±SD 0.04); (2) surrogate designation, 6.55 (0.45); (3) surrogate documentation, 6.50 (0.11); (4) discussions with surrogates, 6.40 (0.19); and (5) documents and recorded wishes are accessible when needed, 6.27 (0.11). Advance directive documentation was ranked 10th, 6.01 (0.21). Panelists raised caution about whether “care consistent with goals” can be reliably measured. Conclusion A large, multidisciplinary Delphi panel developed an Organizing Framework and rated the importance of ACP outcome constructs. Top rated outcomes should be used to evaluate the success of ACP initiatives. More research is needed to create reliable and valid measurement tools for the highest rated outcomes, particularly “care consistent with goals.” PMID:28865870
NASA Technical Reports Server (NTRS)
Jovic, Srba
2015-01-01
This Interface Control Document (ICD) documents and tracks the necessary information required for the Live Virtual and Constructive (LVC) systems components as well as protocols for communicating with them in order to achieve all research objectives captured by the experiment requirements. The purpose of this ICD is to clearly communicate all inputs and outputs from the subsystem components.
NASA Technical Reports Server (NTRS)
Boyd, D. Douglas, Jr.; Brooks, Thomas F.; Burley, Casey L.; Jolly, J. Ralph, Jr.
1998-01-01
This document details the methodology and use of the CAMRAD.Mod1/HIRES codes, which were developed at NASA Langley Research Center for the prediction of helicopter harmonic and Blade-Vortex Interaction (BVI) noise. CAMRAD.Mod1 is a substantially modified version of the performance/trim/wake code CAMRAD. High resolution blade loading is determined in post-processing by HIRES and an associated indicial aerodynamics code. Extensive capabilities of importance to noise prediction accuracy are documented, including a new multi-core tip vortex roll-up wake model, higher harmonic and individual blade control, tunnel and fuselage correction input, diagnostic blade motion input, and interfaces for acoustic and CFD aerodynamics codes. Modifications and new code capabilities are documented with examples. A users' job preparation guide and listings of variables and namelists are given.
NASA Technical Reports Server (NTRS)
Kuhlman, J. M.; Shu, J. Y.
1981-01-01
A subsonic, linearized aerodynamic theory, wing design program for one or two planforms was developed which uses a vortex lattice near field model and a higher order panel method in the far field. The theoretical development of the wake model and its implementation in the vortex lattice design code are summarized and sample results are given. Detailed program usage instructions, sample input and output data, and a program listing are presented in the Appendixes. The far field wake model assumes a wake vortex sheet whose strength varies piecewise linearly in the spanwise direction. From this model analytical expressions for lift coefficient, induced drag coefficient, pitching moment coefficient, and bending moment coefficient were developed. From these relationships a direct optimization scheme is used to determine the optimum wake vorticity distribution for minimum induced drag, subject to constraints on lift, and pitching or bending moment. Integration spanwise yields the bound circulation, which is interpolated in the near field vortex lattice to obtain the design camber surface(s).
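A discrete analogue of this constrained optimization, using the classical lifting-line Fourier-series drag model rather than the report's piecewise-linear wake formulation, shows the Lagrange-multiplier (KKT) solution recovering the elliptic-loading minimum; the aspect ratio and lift constraint below are assumed.

```python
# Discrete analogue of the constrained spanload optimization: minimize induced drag
# (a quadratic form in the Fourier loading coefficients) subject to a linear lift
# constraint, via the Lagrange-multiplier (KKT) linear system. Uses the classical
# lifting-line relations CL = pi*AR*A1 and CDi = pi*AR*sum(n*An^2); values assumed.
import numpy as np

AR, CL_target, n_modes = 8.0, 0.5, 6

D = np.diag(np.pi * AR * np.arange(1, n_modes + 1))   # CDi = A^T D A   (quadratic)
l = np.zeros(n_modes)
l[0] = np.pi * AR                                     # CL  = l^T A     (linear)

# KKT system for: minimize A^T D A  subject to  l^T A = CL_target
K = np.block([[2.0 * D, l[:, None]],
              [l[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(n_modes), [CL_target]])
A = np.linalg.solve(K, rhs)[:n_modes]

CDi = A @ D @ A
print("Fourier coefficients:", np.round(A, 6))        # only A1 nonzero -> elliptic load
print("CDi:", CDi, "  theoretical minimum CL^2/(pi*AR):", CL_target**2 / (np.pi * AR))
```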
AU-FREDI - AUTONOMOUS FREQUENCY DOMAIN IDENTIFICATION
NASA Technical Reports Server (NTRS)
Yam, Y.
1994-01-01
The Autonomous Frequency Domain Identification program, AU-FREDI, is a system of methods, algorithms and software that was developed for the identification of structural dynamic parameters and system transfer function characterization for control of large space platforms and flexible spacecraft. It was validated in the CALTECH/Jet Propulsion Laboratory's Large Spacecraft Control Laboratory. Due to the unique characteristics of this laboratory environment, and the environment-specific nature of many of the software's routines, AU-FREDI should be considered to be a collection of routines which can be modified and reassembled to suit system identification and control experiments on large flexible structures. The AU-FREDI software was originally designed to command plant excitation and handle subsequent input/output data transfer, and to conduct system identification based on the I/O data. Key features of the AU-FREDI methodology are as follows: 1. AU-FREDI has on-line digital filter design to support on-orbit optimal input design and data composition. 2. Data composition of experimental data in overlapping frequency bands overcomes finite actuator power constraints. 3. Recursive least squares sine-dwell estimation accurately handles digitized sinusoids and low frequency modes. 4. The system also includes automated estimation of model order using a product moment matrix. 5. A sample-data transfer function parametrization supports digital control design. 6. Minimum variance estimation is assured with a curve fitting algorithm with iterative reweighting. 7. Robust root solvers accurately factorize high order polynomials to determine frequency and damping estimates. 8. Output error characterization of model additive uncertainty supports robustness analysis. The research objectives associated with AU-FREDI were particularly useful in focusing the identification methodology for realistic on-orbit testing conditions. Rather than estimating the entire structure, as is typically done in ground structural testing, AU-FREDI identifies only the key transfer function parameters and uncertainty bounds that are necessary for on-line design and tuning of robust controllers. AU-FREDI's system identification algorithms are independent of the JPL-LSCL environment, and can easily be extracted and modified for use with input/output data files. The basic approach of AU-FREDI's system identification algorithms is to non-parametrically identify the sampled data in the frequency domain using either stochastic or sine-dwell input, and then to obtain a parametric model of the transfer function by curve-fitting techniques. A cross-spectral analysis of the output error is used to determine the additive uncertainty in the estimated transfer function. The nominal transfer function estimate and the estimate of the associated additive uncertainty can be used for robust control analysis and design. AU-FREDI's I/O data transfer routines are tailored to the environment of the CALTECH/ JPL-LSCL which included a special operating system to interface with the testbed. Input commands for a particular experiment (wideband, narrowband, or sine-dwell) were computed on-line and then issued to respective actuators by the operating system. The operating system also took measurements through displacement sensors and passed them back to the software for storage and off-line processing. 
In order to make use of AU-FREDI's I/O data transfer routines, a user would need to provide an operating system capable of overseeing such functions between the software and the experimental setup at hand. The program documentation contains information designed to support users in either providing such an operating system or modifying the system identification algorithms for use with input/output data files. It provides a history of the theoretical, algorithmic and software development efforts including operating system requirements and listings of some of the various special purpose subroutines which were developed and optimized for Lahey FORTRAN compilers on IBM PC-AT computers before the subroutines were integrated into the system software. Potential purchasers are encouraged to purchase and review the documentation before purchasing the AU-FREDI software. AU-FREDI is distributed in DEC VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. AU-FREDI was developed in 1989 and is a copyrighted work with all copyright vested in NASA.
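A compact off-line analogue of the non-parametric identification step is sketched below using standard Welch auto- and cross-spectral estimates on simulated input/output records; this is not the AU-FREDI code, and the sample rate, simulated mode, and noise level are assumed.

```python
# Off-line analogue of the non-parametric step: estimate a frequency response from
# recorded input/output data with Welch spectral estimates, H(f) = Pxy/Pxx, and use
# the coherence to judge where the estimate (and any additive-uncertainty bound) is
# trustworthy. Simulated data and standard SciPy routines; not the AU-FREDI code.
import numpy as np
from scipy import signal

fs = 200.0                                    # assumed sample rate, Hz
rng = np.random.default_rng(0)
u = rng.standard_normal(int(60 * fs))         # wideband excitation ("input file")

b, a = signal.iirpeak(1.5, Q=25, fs=fs)       # stand-in lightly damped mode at 1.5 Hz
y = signal.lfilter(b, a, u)
y = y + 0.05 * rng.standard_normal(y.size)    # sensor noise ("output file")

f, Pxx = signal.welch(u, fs=fs, nperseg=1024)
_, Pxy = signal.csd(u, y, fs=fs, nperseg=1024)
_, coh = signal.coherence(u, y, fs=fs, nperseg=1024)

H = Pxy / Pxx                                 # non-parametric frequency response
k = np.argmax(np.abs(H))
print(f"estimated modal frequency ~ {f[k]:.2f} Hz, coherence there = {coh[k]:.2f}")
```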
NASA Astrophysics Data System (ADS)
Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre; Bopp, Laurent; Brovkin, Victor; Dunne, John; Graven, Heather; Hoffman, Forrest; Ilyina, Tatiana; John, Jasmin G.; Jung, Martin; Kawamiya, Michio; Koven, Charlie; Pongratz, Julia; Raddatz, Thomas; Randerson, James T.; Zaehle, Sönke
2016-08-01
Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate-Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate-carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate-carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This paper documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre
Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks aremore » potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO 2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate–carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate–carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO 2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This study documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.« less
Jones, Chris D.; Arora, Vivek; Friedlingstein, Pierre; ...
2016-08-25
Coordinated experimental design and implementation has become a cornerstone of global climate modelling. Model Intercomparison Projects (MIPs) enable systematic and robust analysis of results across many models, by reducing the influence of ad hoc differences in model set-up or experimental boundary conditions. As it enters its 6th phase, the Coupled Model Intercomparison Project (CMIP6) has grown significantly in scope with the design and documentation of individual simulations delegated to individual climate science communities. The Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP) takes responsibility for design, documentation, and analysis of carbon cycle feedbacks and interactions in climate simulations. These feedbacks are potentially large and play a leading-order contribution in determining the atmospheric composition in response to human emissions of CO2 and in the setting of emissions targets to stabilize climate or avoid dangerous climate change. For over a decade, C4MIP has coordinated coupled climate–carbon cycle simulations, and in this paper we describe the C4MIP simulations that will be formally part of CMIP6. While the climate–carbon cycle community has created this experimental design, the simulations also fit within the wider CMIP activity, conform to some common standards including documentation and diagnostic requests, and are designed to complement the CMIP core experiments known as the Diagnostic, Evaluation and Characterization of Klima (DECK). C4MIP has three key strands of scientific motivation and the requested simulations are designed to satisfy their needs: (1) pre-industrial and historical simulations (formally part of the common set of CMIP6 experiments) to enable model evaluation, (2) idealized coupled and partially coupled simulations with 1 % per year increases in CO2 to enable diagnosis of feedback strength and its components, (3) future scenario simulations to project how the Earth system will respond to anthropogenic activity over the 21st century and beyond. This study documents in detail these simulations, explains their rationale and planned analysis, and describes how to set up and run the simulations. Particular attention is paid to boundary conditions, input data, and requested output diagnostics. It is important that modelling groups participating in C4MIP adhere as closely as possible to this experimental design.
NASA Technical Reports Server (NTRS)
Craft, R.; Dunn, C.; Mccord, J.; Simeone, L.
1980-01-01
A user guide and programmer documentation are provided for a system of PRIME 400 minicomputer programs. The system was designed to support loading analyses on the Tracking Data Relay Satellite System (TDRSS). The system is a scheduler for various types of data relays (including tape recorder dumps and real time relays) from orbiting payloads to the TDRSS. Several model options are available to statistically generate data relay requirements. TDRSS time lines (representing resources available for scheduling) and payload/TDRSS acquisition and loss of sight time lines are input to the scheduler from disk. Tabulated output from the interactive system includes a summary of the scheduler activities over time intervals specified by the user and an overall summary of scheduler input and output information. A history file, which records every event generated by the scheduler, is written to disk to allow further scheduling on remaining resources and to provide data for graphic displays or additional statistical analysis.
Manual for obscuration code with space station applications
NASA Technical Reports Server (NTRS)
Marhefka, R. J.; Takacs, L.
1986-01-01
The Obscuration Code, referred to as SHADOW, is a user-oriented computer code to determine the shadow cast by an antenna in a complex environment onto the far-zone sphere. The surrounding structure can be composed of multiple composite cone frustums and multiple-sided flat plates. These structural pieces are ideal for modeling space station configurations. The means of describing the geometry input is compatible with the NEC-BASIC Scattering Code. In addition, an interactive mode of operation has been provided for DEC VAX computers. The first part of this document is a user's manual designed to give a description of the method used to obtain the shadow map, to provide an overall view of the operation of the computer code, to instruct a user in how to model structures, and to give examples of inputs and outputs. The second part is a code manual that details how to set up the interactive and non-interactive modes of the code and provides a listing and brief description of each of the subroutines.
The National Transport Code Collaboration Module Library
NASA Astrophysics Data System (ADS)
Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.
2004-12-01
This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to ensure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviewers to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website http://w3.pppl.gov/NTCC. The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.
Dickinson, Jesse; Hanson, R.T.; Mehl, Steffen W.; Hill, Mary C.
2011-01-01
The computer program described in this report, MODPATH-LGR, is designed to allow simulation of particle tracking in locally refined grids. The locally refined grids are simulated by using MODFLOW-LGR, which is based on MODFLOW-2005, the three-dimensional groundwater-flow model published by the U.S. Geological Survey. The documentation includes brief descriptions of the methods used and detailed descriptions of the required input files and how the output files are typically used. The code for this model is available for downloading from the World Wide Web from a U.S. Geological Survey software repository. The repository is accessible from the U.S. Geological Survey Water Resources Information Web page at http://water.usgs.gov/software/ground_water.html. The performance of the MODPATH-LGR program has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program by using the email address available on the Web site. Updates might occasionally be made to this document and to the MODPATH-LGR program, and users should check the Web site periodically.
Eigensystem realization algorithm user's guide for VAX/VMS computers: Version 931216
NASA Technical Reports Server (NTRS)
Pappa, Richard S.
1994-01-01
The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
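As a rough illustration of the kind of computation the ERA performs (this is not the NASA FORTRAN software or its interface), the Python sketch below forms block-Hankel matrices from impulse-response (Markov) parameters, truncates their singular value decomposition, and recovers a low-order discrete-time state-space realization whose eigenvalues yield modal frequencies and damping. The function name, argument names, and default sizes are illustrative assumptions.

    import numpy as np

    def era(markov, n_states, rows=20, cols=20, dt=0.01):
        # Minimal Eigensystem Realization Algorithm sketch (illustrative only).
        # markov: sequence of impulse-response (Markov) parameter matrices,
        # each of shape (n_outputs, n_inputs); len(markov) must be >= rows + cols.
        p, m = markov[0].shape
        H0 = np.block([[markov[i + j] for j in range(cols)] for i in range(rows)])
        H1 = np.block([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])

        U, s, Vt = np.linalg.svd(H0, full_matrices=False)
        U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states, :]
        S_half = np.diag(np.sqrt(s))
        S_half_inv = np.diag(1.0 / np.sqrt(s))

        A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv   # state matrix
        B = (S_half @ Vt)[:, :m]                        # input matrix
        C = (U @ S_half)[:p, :]                         # output matrix

        z = np.linalg.eigvals(A)          # discrete-time poles
        lam = np.log(z) / dt              # continuous-time poles
        freq_hz = np.abs(lam) / (2.0 * np.pi)
        damping = -lam.real / np.abs(lam)
        return A, B, C, freq_hz, damping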
Virtual reality welder training
NASA Astrophysics Data System (ADS)
White, Steven A.; Reiners, Dirk; Prachyabrued, Mores; Borst, Christoph W.; Chambers, Terrence L.
2010-01-01
This document describes the Virtual Reality Simulated MIG Lab (sMIG), a system for Virtual Reality welder training. It is designed to reproduce the experience of metal inert gas (MIG) welding faithfully enough to be used as a teaching tool for beginning welding students. To make the experience as realistic as possible it employs physically accurate and tracked input devices, a real-time welding simulation, real-time sound generation and a 3D display for output. Thanks to being a fully digital system it can go beyond providing just a realistic welding experience by giving interactive and immediate feedback to the student to avoid learning wrong movements from day 1.
Computer program documentation for the dynamic analysis of a noncontacting mechanical face seal
NASA Technical Reports Server (NTRS)
Auer, B. M.; Etsion, I.
1980-01-01
A computer program is presented which achieves a numerical solution for the equations of motion of a noncontacting mechanical face seal. The flexibly-mounted primary seal ring motion is expressed by a set of second order differential equations for three degrees of freedom. These equations are reduced to a set of first order equations and the GEAR software package is used to solve the set of first order equations. Program input includes seal design parameters and seal operating conditions. Output from the program includes velocities and displacements of the seal ring about the axis of an inertial reference system. One example problem is described.
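The solution strategy the abstract describes, rewriting the second-order equations of motion as a first-order system and integrating them with the GEAR package, can be sketched with SciPy's BDF integrator, which is a Gear-type stiff method. The 3-degree-of-freedom mass, damping, and stiffness matrices and the forcing function below are placeholders, not actual seal design parameters or operating conditions.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Placeholder 3-degree-of-freedom system: M q'' + C q' + K q = f(t).
    M = np.diag([1.0, 1.0, 0.5])
    C = np.diag([0.05, 0.05, 0.02])
    K = np.array([[200.0, -20.0,   0.0],
                  [-20.0, 200.0, -20.0],
                  [  0.0, -20.0, 100.0]])

    def forcing(t):
        # Illustrative harmonic excitation standing in for seal operating loads.
        return np.array([0.0, 0.0, 1.0]) * np.sin(50.0 * t)

    def rhs(t, y):
        # y = [q, qdot]: the second-order equations rewritten as first-order ones.
        q, qdot = y[:3], y[3:]
        qddot = np.linalg.solve(M, forcing(t) - C @ qdot - K @ q)
        return np.concatenate([qdot, qddot])

    # method="BDF" is a Gear-type stiff integrator, analogous in spirit to GEAR.
    sol = solve_ivp(rhs, (0.0, 2.0), np.zeros(6), method="BDF", max_step=1e-3)
    print(sol.y[:3, -1])   # displacements at the final time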
DOE Office of Scientific and Technical Information (OSTI.GOV)
SWENSON JA; CROWE RD; APTHORPE R
2010-03-09
The purpose of this document is to present conceptual design phase thermal process calculations that support the process design and process safety basis for the cold vacuum drying of K Basin KOP material. This document is intended to demonstrate that the conceptual approach: (1) represents a workable process design that is suitable for development in preliminary design; and (2) will support formal safety documentation to be prepared during the definitive design phase to establish an acceptable safety basis. The Sludge Treatment Project (STP) is responsible for the disposition of Knock Out Pot (KOP) sludge within the 105-K West (KW) Basin. KOP sludge consists of size-segregated material (primarily canister particulate) from the fuel and scrap cleaning process used in the Spent Nuclear Fuel process at K Basin. The KOP sludge will be pre-treated to remove fines and some of the constituents containing chemically bound water, after which it is referred to as KOP material. The KOP material will then be loaded into a Multi-Canister Overpack (MCO), dried at the Cold Vacuum Drying Facility (CVDF), and stored in the Canister Storage Building (CSB). This process is patterned after the successful drying of 2100 metric tons of spent fuel, and uses the same facilities and much of the same equipment that was used for drying fuel and scrap. Table ES-1 presents similarities and differences between KOP material and fuel and between MCOs loaded with these materials. The potential content of bound-water-bearing constituents limits the mass of KOP material in an MCO load to a fraction of that in an MCO containing fuel and scrap; however, the small particle size of the KOP material causes the surface area to be significantly higher. This relatively large reactive surface area represents an input to the KOP thermal calculations that is significantly different from the calculations for fuel MCOs. The conceptual design provides for a copper insert block that limits the volume available to receive KOP material, enhances heat conduction, and functions as a heat source and sink during drying operations. This use of the copper insert represents a significant change to the thermal model compared to that used for the fuel calculations. A number of cases were run representing a spectrum of normal and upset conditions for the drying process. Dozens of cases have been run on cold vacuum drying of fuel MCOs. Analysis of these previous calculations identified four cases that provide a solid basis for judgments on the behavior of MCOs in drying operations. These four cases are: (1) normal process; (2) degraded vacuum pumping; (3) open MCO with loss of annulus water; and (4) cool down after vacuum drying. The four cases were run for two sets of input parameters for KOP MCOs: (1) a set of parameters drawn from safety basis values in the technical data book and (2) a sensitivity set using parameters selected to evaluate the impact of lower void volume and smaller particle size on MCO behavior. Results of the calculations for the drying phase cases are shown in Table ES-2. Cases using data book safety basis values showed dry out in 9.7 hours and heat rejection sufficient to hold the temperature rise to less than 25 C. Sensitivity cases, which included unrealistically small particle sizes and correspondingly high reactive surface area, showed higher temperature increases that were limited by water consumption. In this document and in the attachment (Apthorpe, R. and M. G. Plys, 2010), cases using Technical Databook safety basis values are referred to as nominal cases. In future calculations such cases will be called safety basis cases. Also in these documents, cases using parameters that are less favorable to acceptable performance than databook safety values are referred to as safety cases. In future calculations such cases will be called sensitivity cases or sensitivity evaluations. Calculations to be performed in support of the detailed design and formal safety basis documentation will expand the calculations presented in this document to include: additional features of the drying cycle, more realistic treatment of uranium metal consumption during oxidation, larger water inventory, longer time scales, and graphing of results of hydrogen gas concentration.
Input Files and Procedures for Analysis of SMA Hybrid Composite Beams in MSC.Nastran and ABAQUS
NASA Technical Reports Server (NTRS)
Turner, Travis L.; Patel, Hemant D.
2005-01-01
A thermoelastic constitutive model for shape memory alloys (SMAs) and SMA hybrid composites (SMAHCs) was recently implemented in the commercial codes MSC.Nastran and ABAQUS. The model is implemented and supported within the core of the commercial codes, so no user subroutines or external calculations are necessary. The model and resulting structural analysis has been previously demonstrated and experimentally verified for thermoelastic, vibration and acoustic, and structural shape control applications. The commercial implementations are described in related documents cited in the references, where various results are also shown that validate the commercial implementations relative to a research code. This paper is a companion to those documents in that it provides additional detail on the actual input files and solution procedures and serves as a repository for ASCII text versions of the input files necessary for duplication of the available results.
AN OPTICAL CHARACTER RECOGNITION RESEARCH AND DEMONSTRATION PROJECT.
ERIC Educational Resources Information Center
1968
RESEARCH AND DEVELOPMENT OF PROTOTYPE LIBRARY SYSTEMS WHICH UTILIZE OPTICAL CHARACTER RECOGNITION INPUT HAS CENTERED AROUND OPTICAL PAGE READERS AND DOCUMENT READERS. THE STATE-OF-THE-ART OF BOTH THESE OPTICAL SCANNERS IS SUCH THAT BOTH ARE ACCEPTABLE FOR LIBRARY INPUT PREPARATION. A DEMONSTRATION PROJECT UTILIZING THE TWO TYPES OF READERS, SINCE…
DOT National Transportation Integrated Search
2012-04-01
The purpose of this report is to document the stakeholder input received at the February 8, 2012, stakeholder workshop at the Hall of States in Washington, D.C. on goals, performance measures, transformative performance targets, and high-level user n...
User input verification and test driven development in the NJOY21 nuclear data processing code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trainer, Amelia Jo; Conlin, Jeremy Lloyd; McCartney, Austin Paul
Before physically-meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely-trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 is quickly providing input validation to check user input. By providing rapid and helpful responses to users while writing input files, NJOY21 will prove to be more intuitive and easy to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
AIR QUALITY CRITERIA FOR PARTICULATE MATTER DOCUMENT
A Planning Document was produced by NCEA/RTP and reviewed by the Clean Air Scientific Advisory Committee (CASAC) (62 FR 55201, October 23, 1997). In FY99, a workshop draft of the PM AQCD was completed, a peer input workshop held, and an External Review Draft released for public ...
Multiple Input Design for Real-Time Parameter Estimation in the Frequency Domain
NASA Technical Reports Server (NTRS)
Morelli, Eugene
2003-01-01
A method for designing multiple inputs for real-time dynamic system identification in the frequency domain was developed and demonstrated. The designed inputs are mutually orthogonal in both the time and frequency domains, with reduced peak factors to provide good information content for relatively small amplitude excursions. The inputs are designed for selected frequency ranges, and therefore do not require a priori models. The experiment design approach was applied to identify linear dynamic models for the F-15 ACTIVE aircraft, which has multiple control effectors.
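A minimal sketch of the underlying idea, mutually orthogonal multisine excitations built from disjoint harmonic sets with quadratic (Schroeder-type) phasing to keep peak factors low, is given below. The frequency range, record length, and normalization are assumed values for illustration, not the specific design used for the F-15 ACTIVE tests.

    import numpy as np

    def orthogonal_multisines(n_inputs, T=10.0, dt=0.02, f_min=0.2, f_max=2.0):
        # Each input uses a disjoint set of harmonics of 1/T, so the signals are
        # orthogonal over the record length; parameter names are illustrative.
        t = np.arange(0.0, T, dt)
        f0 = 1.0 / T
        k_all = np.arange(int(np.ceil(f_min / f0)), int(np.floor(f_max / f0)) + 1)
        inputs = []
        for j in range(n_inputs):
            k = k_all[j::n_inputs]                 # interleaved, disjoint harmonics
            ph = -np.pi * np.arange(len(k)) * (np.arange(len(k)) + 1) / len(k)
            u = np.sum([np.cos(2 * np.pi * ki * f0 * t + p) for ki, p in zip(k, ph)],
                       axis=0)
            inputs.append(u / np.max(np.abs(u)))   # scale to unit peak amplitude
        return t, np.array(inputs)

    t, U = orthogonal_multisines(n_inputs=3)
    # Orthogonality check: off-diagonal inner products are (numerically) zero.
    print(np.round(U @ U.T * (t[1] - t[0]), 3))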
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
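One small ingredient of such a methodology, combining several experts' judgments about one uncertain input into a single distribution, can be sketched as an equal-weight linear opinion pool over hypothetical triangular distributions; the expert inputs, weights, and sample size below are invented for the example and do not reflect the calibration and aggregation techniques of the report.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical expert judgments for one input parameter, each given as
    # (minimum, most likely, maximum) of a triangular distribution.
    experts = [(0.8, 1.00, 1.3),
               (0.9, 1.10, 1.5),
               (0.7, 0.95, 1.2)]
    weights = np.full(len(experts), 1.0 / len(experts))   # equal weights here

    n = 20_000
    # Linear opinion pool: sample each expert's distribution per its weight.
    pick = rng.choice(len(experts), size=n, p=weights)
    samples = np.array([rng.triangular(*experts[i]) for i in pick])

    print("pooled mean :", samples.mean())
    print("pooled std  :", samples.std())
    print("90% interval:", np.percentile(samples, [5, 95]))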
2010-09-01
differentiated between source codes and input/output files. The text makes references to a REMChlor-GoldSim model. The text also refers to the REMChlor... To the extent possible, the instructions should be accurate and precise. The documentation should differentiate between describing what is actually... Windows XP operating system. Model Input Parameters: the input parameters were identical to those utilized and reported by CDM (See Table .I .from
Input from Key Stakeholders in the National Security Technology Incubator
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This report documents the input from key stakeholders of the National Security Technology Incubator (NSTI) in developing a new technology incubator and related programs for southern New Mexico. The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report includes identification of key stakeholders as well as a description and analysis of their input for the development of an incubator.
Narrow pH Range of Surface Water Bodies Receiving Pesticide Input in Europe.
Bundschuh, Mirco; Weyers, Arnd; Ebeling, Markus; Elsaesser, David; Schulz, Ralf
2016-01-01
Fate and toxicity of the active ingredients (AIs) of plant protection products in surface waters are often influenced by pH. Although a general range of pH values is reported in the literature, an evaluation targeting aquatic ecosystems with documented AI inputs is lacking at the larger scale. Results show 95% of European surface waters (n = 3075) with a documented history of AI exposure fall within a rather narrow pH range, between 7.0 and 8.5. Spatial and temporal variability in the data may at least be partly explained by the calcareous characteristics of parental rock material, the affiliation of the sampling site to a freshwater ecoregion, and the photosynthetic activity of macrophytes (i.e., higher pH values with photosynthesis). Nonetheless, the documented pH range fits well with the standard pH of most ecotoxicological test guidelines, confirming that the fate and ecotoxicity of AIs are usually adequately addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khivsara, Sagar
Recent studies have evaluated closed-loop supercritical carbon dioxide (s-CO2) Brayton cycles to be higher energy-density systems in comparison to conventional superheated steam Rankine systems. At turbine inlet conditions of 923 K and 25 MPa, high thermal efficiency (~50%) can be achieved. Achieving these high efficiencies will make concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. To incorporate a s-CO2 Brayton power cycle in a solar power tower system, the development of a solar receiver capable of providing an outlet temperature of 923 K (at 25 MPa) is necessary. To satisfy the temperature requirements of a s-CO2 Brayton cycle with recuperation and recompression, it is required to heat s-CO2 by a temperature of ~200 K as it passes through the solar receiver. Our objective was to develop an optical-thermal-fluid model to design and evaluate a tubular receiver that will receive a heat input of ~1 MWth from a heliostat field. We also undertook the documentation of design requirements for the development, testing, and safe operation of a direct s-CO2 solar receiver. The main purpose of this document is to serve as a reference and guideline for design and testing requirements, as well as to address the technical challenges and provide initial parameters for the computational models that will be employed for the development of s-CO2 receivers.
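A back-of-the-envelope check of the stated duty, raising the s-CO2 temperature by about 200 K with roughly 1 MWth of absorbed heat, is sketched below; the specific heat is an assumed representative value for CO2 near these conditions rather than a property-table lookup, so the resulting mass flow is only indicative.

    # Rough receiver energy balance (assumed average cp, not a property lookup).
    q_thermal = 1.0e6      # receiver thermal input, W
    delta_t = 200.0        # required temperature rise, K
    cp_assumed = 1200.0    # J/(kg K), assumed average specific heat of s-CO2

    m_dot = q_thermal / (cp_assumed * delta_t)
    print("indicative s-CO2 mass flow: %.2f kg/s" % m_dot)   # about 4 kg/s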
ERIC Educational Resources Information Center
Mahapatra, M.; Biswas, S. C.
1985-01-01
Two hundred journal articles related to the fields of taxation, genetic psychology, and Shakespearean drama published from 1970-1980 were analyzed and PRECIS input strings were drawn. Occasions when input strings and index entries appeared incomplete and unexpressive after losing the context of the document are identified and provided with solutions. Role operator schema is…
ERIC Educational Resources Information Center
Noughabi, Mostafa Azari
2017-01-01
Vocabulary as a significant component of language learning has been widely researched. As well, it is well documented that vocabulary could be learned through listening and reading. In addition, measuring productive vocabulary has been a chief concern among scholars. However, few studies have focused on meaning-focused listening input and its…
Code of Federal Regulations, 2014 CFR
2014-07-01
... the updated factors and input data sets from the supporting data systems used, including: (1) The In... Determination. (b) The CRA report, including relevant data on international mail services; (c) The Cost Segments and Components (CSC) report; (d) All input data and processing programs used to produce the CRA report...
Code of Federal Regulations, 2011 CFR
2011-07-01
... the updated factors and input data sets from the supporting data systems used, including: (1) The In... Determination. (b) The CRA report, including relevant data on international mail services; (c) The Cost Segments and Components (CSC) report; (d) All input data and processing programs used to produce the CRA report...
Code of Federal Regulations, 2013 CFR
2013-07-01
... the updated factors and input data sets from the supporting data systems used, including: (1) The In... Determination. (b) The CRA report, including relevant data on international mail services; (c) The Cost Segments and Components (CSC) report; (d) All input data and processing programs used to produce the CRA report...
Code of Federal Regulations, 2012 CFR
2012-07-01
... the updated factors and input data sets from the supporting data systems used, including: (1) The In... Determination. (b) The CRA report, including relevant data on international mail services; (c) The Cost Segments and Components (CSC) report; (d) All input data and processing programs used to produce the CRA report...
Pedagogical Ethical Dilemmas in a Responsive Evaluation of a Leadership Program for Youth
ERIC Educational Resources Information Center
Freeman, Melissa; Preissle, Judith
2010-01-01
How do responsive evaluators provide input to program planners when competing ethical principles point to different choices of effective feedback? A team of three evaluators used participant observation, individual and focus group interviews, and analysis of documents to provide input on the development and outcome of a summer program for high…
Input filter compensation for switching regulators
NASA Technical Reports Server (NTRS)
Lee, F. C.; Kelkar, S. S.
1982-01-01
The problems caused by the interaction between the input filter, output filter, and the control loop are discussed. The input filter design is made more complicated because of the need to avoid performance degradation and also stay within the weight and loss limitations. Conventional input filter design techniques are then discussed. The concept of pole-zero cancellation is reviewed; this concept is the basis for an approach to control the peaking of the output impedance of the input filter and thus mitigate some of the problems caused by the input filter. The proposed approach for control of the peaking of the output impedance of the input filter is to use a feedforward loop working in conjunction with feedback loops, thus forming a total state control scheme. The design of the feedforward loop for a buck regulator is described. A possible implementation of the feedforward loop design is suggested.
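To see why peaking of the input filter's output impedance matters, the sketch below evaluates the output impedance of a single-stage LC input filter with and without a series R-C damping branch; all component values are illustrative, and the feedforward compensation that is the subject of the report is not modeled here.

    import numpy as np

    # Illustrative single-stage LC input filter (placeholder component values).
    L, R_L = 100e-6, 0.05        # filter inductor and its series resistance
    C = 50e-6                    # main filter capacitor
    Rd, Cd = 0.5, 200e-6         # damping branch: resistor in series with a capacitor

    f = np.logspace(2, 5, 400)
    s = 1j * 2 * np.pi * f

    def par(a, b):
        # Impedance of two branches in parallel.
        return a * b / (a + b)

    Z_undamped = par(s * L + R_L, 1 / (s * C))
    Z_damped = par(Z_undamped, Rd + 1 / (s * Cd))

    k = np.argmax(np.abs(Z_undamped))
    print("undamped peak |Zout| = %.2f ohm at %.0f Hz" % (abs(Z_undamped[k]), f[k]))
    print("damped   peak |Zout| = %.2f ohm" % np.abs(Z_damped).max())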
Modeling Aircraft Wing Loads from Flight Data Using Neural Networks
NASA Technical Reports Server (NTRS)
Allen, Michael J.; Dibley, Ryan P.
2003-01-01
Neural networks were used to model wing bending-moment loads, torsion loads, and control surface hinge-moments of the Active Aeroelastic Wing (AAW) aircraft. Accurate loads models are required for the development of control laws designed to increase roll performance through wing twist while not exceeding load limits. Inputs to the model include aircraft rates, accelerations, and control surface positions. Neural networks were chosen to model aircraft loads because they can account for uncharacterized nonlinear effects while retaining the capability to generalize. The accuracy of the neural network models was improved by first developing linear loads models to use as starting points for network training. Neural networks were then trained with flight data for rolls, loaded reversals, wind-up-turns, and individual control surface doublets for load excitation. Generalization was improved by using gain weighting and early stopping. Results are presented for neural network loads models of four wing loads and four control surface hinge moments at Mach 0.90 and an altitude of 15,000 ft. An average model prediction error reduction of 18.6 percent was calculated for the neural network models when compared to the linear models. This paper documents the input data conditioning, input parameter selection, structure, training, and validation of the neural network models.
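A minimal sketch of the modeling approach follows, with synthetic data standing in for AAW flight records: fit a linear least-squares loads model as a baseline, then train a small feed-forward network with early stopping and compare residual errors. The input variables, network size, and data-generating function are invented for illustration.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)

    # Synthetic stand-in for flight data: columns mimic rates, accelerations,
    # and control-surface positions; the "load" is a nonlinear function of them.
    X = rng.normal(size=(5000, 6))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] * X[:, 3] \
        + 0.1 * rng.normal(size=5000)

    # Linear least-squares baseline, analogous to the linear starting models.
    A = np.c_[X, np.ones(len(X))]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    lin_err = np.std(y - A @ beta)

    # Small feed-forward network with early stopping to improve generalization.
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(20, 20), early_stopping=True,
                                     max_iter=2000, random_state=0))
    net.fit(X, y)
    nn_err = np.std(y - net.predict(X))
    print("linear error: %.3f   neural network error: %.3f" % (lin_err, nn_err))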
Optimization Under Uncertainty for Electronics Cooling Design
NASA Astrophysics Data System (ADS)
Bodla, Karthik K.; Murthy, Jayathi Y.; Garimella, Suresh V.
Optimization under uncertainty is a powerful methodology used in design and optimization to produce robust, reliable designs. Such an optimization methodology, employed when the input quantities of interest are uncertain, produces output uncertainties, helping the designer choose input parameters that would result in satisfactory thermal solutions. Apart from providing basic statistical information such as mean and standard deviation in the output quantities, auxiliary data from an uncertainty based optimization, such as local and global sensitivities, help the designer decide the input parameter(s) to which the output quantity of interest is most sensitive. This helps the design of experiments based on the most sensitive input parameter(s). A further crucial output of such a methodology is the solution to the inverse problem - finding the allowable uncertainty range in the input parameter(s), given an acceptable uncertainty range in the output quantity of interest...
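The workflow described above, propagating input uncertainties through a thermal model and ranking the sensitivity of the output to each input, can be illustrated with a toy convective-cooling model; the distributions, parameter values, and correlation-based sensitivity measure are assumptions made for this example only.

    import numpy as np

    rng = np.random.default_rng(2)

    def junction_temperature(q, h, area, t_ambient):
        # Toy thermal model: ambient temperature plus a convective temperature rise.
        return t_ambient + q / (h * area)

    n = 50_000
    # Uncertain inputs (illustrative distributions, not data from the chapter).
    q = rng.normal(20.0, 1.0, n)          # heat dissipation, W
    h = rng.normal(250.0, 25.0, n)        # heat transfer coefficient, W/(m^2 K)
    area = rng.normal(4e-3, 2e-4, n)      # cooled area, m^2
    t_amb = rng.normal(35.0, 2.0, n)      # ambient temperature, deg C

    t_j = junction_temperature(q, h, area, t_amb)
    print("mean = %.1f C, std = %.2f C" % (t_j.mean(), t_j.std()))

    # Crude global sensitivity indicator: correlation of each input with the output.
    for name, x in [("q", q), ("h", h), ("area", area), ("t_amb", t_amb)]:
        print("sensitivity to %-6s: %+.2f" % (name, np.corrcoef(x, t_j)[0, 1]))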
Minnesota Department of Education Special Education Primer for Charter Schools
ERIC Educational Resources Information Center
Minnesota Department of Education, 2009
2009-01-01
The purpose of this document is to provide information and resources on special education for charter school sponsors and charter school directors. This document is the result of collaborative input from individuals who work in and with charter schools in Minnesota. It also represents the collaborative efforts of the following divisions of the…
International directory of documentation services concerning forestry and forest products
Peter A. Evans; Gary L. Skupa
1981-01-01
This directory lists 120 documentation services concerned with forestry, forest products, or related fields in 28 countries. The entry for each service includes title of service, cost, publisher, subject coverage, formatting data, input sources, indexing and data-handling methods, and availability of special services other than the primary ones of indexing and...
Nurses' Perceptions of Nursing Care Documentation in the Electronic Health Record
ERIC Educational Resources Information Center
Jensen, Tracey A.
2013-01-01
Electronic health records (EHRs) will soon become the standard for documenting nursing care. The EHR holds the promise of rapid access to complete records of a patient's encounter with the healthcare system. It is the expectation that healthcare providers input essential data that communicates important patient information to support quality…
77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... the process. The Code Revision Process contains four basic steps that are followed for developing new documents as well as revising existing documents. Step 1: Public Input Stage, which results in the First Draft Report (formerly ROP); Step 2: Comment Stage, which results in the Second Draft Report (formerly...
NASA Astrophysics Data System (ADS)
Sun, Jincheng; Zou, Xiaodong; Matsuura, Hiroyuki; Wang, Cong
2018-03-01
The effects of heat input parameters on inclusion and microstructure characteristics have been investigated using welding thermal simulations. Inclusion features from heat-affected zones (HAZs) were profiled. It was found that, under heat input of 120 kJ/cm, Al-Mg-Ti-O-(Mn-S) composite inclusions can act effectively as nucleation sites for acicular ferrites. However, this ability disappears when the heat input is increased to 210 kJ/cm. In addition, confocal scanning laser microscopy (CSLM) was used to document possible inclusion-microstructure interactions, shedding light on how inclusions assist beneficial transformations toward property enhancement.
The problems and perspectives for the introduction of high-rise construction in Russian cities
NASA Astrophysics Data System (ADS)
Pershina, Anna; Radzhabov, Mehman; Dormidontova, Tatyana
2018-03-01
The purpose of this study is to identify the principal areas of concern for high-rise construction in Russia. Examples of modern Russian and foreign high-rise construction are considered, and the most important problems and their solutions for Russia are identified on that basis. Each area of concern is considered separately, with particular attention to ecological problems and the effects of high-rise construction on the health and psychology of people. High-rise construction has both negative and positive influences on the urban environment in the cities of Moscow and Samara. Lack of experience, defects in the requirements documents (which do not cover all the specifics of high-rise construction), systemic problems in the construction industry, and the frequent absence of proper control over compliance with existing requirements documents all add to the complexity of designing, constructing, and operating high-rise buildings. At present, the pace of high-rise construction is increasing in Moscow, and the feasibility of high-rise buildings is being raised in the regions of Russia. The obstacles include high material inputs, inadequacies of the road network and utility lines, and maintenance problems. The research concludes with conclusions and recommendations for the development of high-rise construction in Russia. The drivers of high-rise building are urbanization and the need to concentrate the labor supply. The important organizational tasks are creating a compact urban environment, decreasing the urban area taken by development, using innovative construction technologies, and ensuring proper maintenance. These tasks require balancing the advantages of high-rise construction against its costs and its influence on the environment.
Trescott, Peter C.; Pinder, George Francis; Larson, S.P.
1976-01-01
The model will simulate ground-water flow in an artesian aquifer, a water-table aquifer, or a combined artesian and water-table aquifer. The aquifer may be heterogeneous and anisotropic and have irregular boundaries. The source term in the flow equation may include well discharge, constant recharge, leakage from confining beds in which the effects of storage are considered, and evapotranspiration as a linear function of depth to water. The theoretical development includes presentation of the appropriate flow equations and derivation of the finite-difference approximations (written for a variable grid). The documentation emphasizes the numerical techniques that can be used for solving the simultaneous equations and describes the results of numerical experiments using these techniques. Of the three numerical techniques available in the model, the strongly implicit procedure, in general, requires less computer time and has fewer numerical difficulties than do the iterative alternating direction implicit procedure and line successive overrelaxation (which includes a two-dimensional correction procedure to accelerate convergence). The documentation includes a flow chart, program listing, an example simulation, and sections on designing an aquifer model and requirements for data input. It illustrates how model results can be presented on the line printer and pen plotters with a program that utilizes the graphical display software available from the Geological Survey Computer Center Division. In addition the model includes options for reading input data from a disk and writing intermediate results on a disk.
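The finite-difference formulation described in the documentation can be illustrated with a small steady-state example: a homogeneous confined aquifer with fixed-head boundaries and a single pumping well, assembled as a five-point operator and solved directly (a production code would use an iterative scheme such as the strongly implicit procedure discussed above). The grid size, transmissivity, boundary head, and pumping rate are illustrative values only.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Steady confined aquifer: T * (d2h/dx2 + d2h/dy2) = -W, h fixed on the boundary.
    nx, ny, dx = 40, 40, 100.0     # grid dimensions and spacing (m)
    T = 500.0                      # transmissivity (m^2/day), homogeneous here
    W = np.zeros((ny, nx))
    W[20, 20] = -800.0 / dx**2     # a pumping well expressed as a sink per unit area

    # Five-point finite-difference operator on a row-major grid.
    N = nx * ny
    main = -4.0 * np.ones(N)
    off1 = np.ones(N - 1)
    off1[np.arange(1, N) % nx == 0] = 0.0   # no coupling across row ends
    offn = np.ones(N - nx)
    A = sp.diags([main, off1, off1, offn, offn], [0, 1, -1, nx, -nx]) * (T / dx**2)
    b = -W.ravel()

    # Dirichlet boundary: fixed head of 100 m along the grid edges.
    edge = np.zeros((ny, nx), dtype=bool)
    edge[0, :] = edge[-1, :] = edge[:, 0] = edge[:, -1] = True
    A = A.tolil()
    for i in np.flatnonzero(edge.ravel()):
        A.rows[i], A.data[i] = [i], [1.0]
        b[i] = 100.0

    h = spla.spsolve(A.tocsc(), b).reshape(ny, nx)
    print("minimum head near the well: %.2f m" % h.min())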
Flight testing and frequency domain analysis for rotorcraft handling qualities characteristics
NASA Technical Reports Server (NTRS)
Ham, Johnnie A.; Gardner, Charles K.; Tischler, Mark B.
1993-01-01
A demonstration of frequency domain flight testing techniques and analyses was performed on a U.S. Army OH-58D helicopter in support of the OH-58D Airworthiness and Flight Characteristics Evaluation and the Army's development and ongoing review of Aeronautical Design Standard 33C, Handling Qualities Requirements for Military Rotorcraft. Hover and forward flight (60 knots) tests were conducted in 1 flight hour by Army experimental test pilots. Further processing of the hover data generated a complete database of velocity, angular rate, and acceleration frequency responses to control inputs. A joint effort was then undertaken by the Airworthiness Qualification Test Directorate (AQTD) and the U.S. Army Aeroflightdynamics Directorate (AFDD) to derive handling qualities information from the frequency response database. A significant amount of information could be extracted from the frequency domain database using a variety of approaches. This report documents numerous results that have been obtained from the simple frequency domain tests; in many areas, these results provide more insight into the aircraft dynamics that affect handling qualities than do traditional flight tests. The handling qualities results include ADS-33C bandwidth and phase delay calculations, vibration spectral determinations, transfer function models to examine single-axis results, and a six degree of freedom fully coupled state space model. The ability of this model to accurately predict aircraft responses was verified using data from pulse inputs. This report also documents the frequency-sweep flight test technique and data analysis used to support the tests.
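A minimal sketch of the kind of frequency-response identification involved: excite a system with a frequency sweep, then estimate the frequency response and coherence from the input/output records using spectral-density estimates. The second-order "aircraft" model, noise level, and sweep parameters below are stand-ins, not OH-58D data or the AFDD processing chain.

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(3)
    fs = 100.0                                  # sample rate, Hz
    t = np.arange(0.0, 60.0, 1.0 / fs)

    # Frequency-sweep input and a synthetic response: a lightly damped
    # second-order system plus measurement noise.
    u = signal.chirp(t, f0=0.1, f1=5.0, t1=t[-1])
    wn, zeta = 2 * np.pi * 1.5, 0.15
    plant = signal.TransferFunction([wn**2], [1.0, 2 * zeta * wn, wn**2])
    _, y, _ = signal.lsim(plant, u, t)
    y = y + 0.02 * rng.normal(size=len(t))

    # Frequency-response and coherence estimates from the records.
    f, Puu = signal.welch(u, fs=fs, nperseg=1024)
    _, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
    _, coh = signal.coherence(u, y, fs=fs, nperseg=1024)
    H = Puy / Puu

    k = np.argmax(np.abs(H))
    print("peak response near %.2f Hz (model natural frequency 1.50 Hz)" % f[k])
    print("coherence there: %.2f" % coh[k])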
Merrill, Matthew D.; Slucher, Ernie R.; Roberts-Ashby, Tina L.; Warwick, Peter D.; Blondes, Madalyn S.; Freeman, P.A.; Cahan, Steven M.; DeVera, Christina A.; Lohr, Celeste D.; Warwick, Peter D.; Corum, Margo D.
2015-01-01
The U.S. Geological Survey has completed an assessment of the potential geologic carbon dioxide storage resource in the onshore areas of the United States. To provide geological context and input data sources for the resources numbers, framework documents are being prepared for all areas that were investigated as part of the national assessment. This report is the geologic framework document for the Permian and Palo Duro Basins, the combined Bend arch-Fort Worth Basin area, and subbasins therein of Texas, New Mexico, and Oklahoma. In addition to a summarization of the geology and petroleum resources of studied basins, the individual storage assessment units (SAUs) within the basins are described and explanations for their selection are presented. Though appendixes in the national assessment publications include the input values used to calculate the available storage resource, this framework document provides only the context and source of inputs selected by the assessment geologists. Spatial files of boundaries for the SAUs herein, as well as maps of the density of known well bores that penetrate the SAU seal, are available for download with the release of this report.
Merrill, Matthew D.; Drake, Ronald M.; Buursink, Marc L.; Craddock, William H.; East, Joseph A.; Slucher, Ernie R.; Warwick, Peter D.; Brennan, Sean T.; Blondes, Madalyn S.; Freeman, Philip A.; Cahan, Steven M.; DeVera, Christina A.; Lohr, Celeste D.; Warwick, Peter D.; Corum, Margo D.
2016-06-02
The U.S. Geological Survey has completed an assessment of the potential geologic carbon dioxide storage resources in the onshore areas of the United States. To provide geological context and input data sources for the resources numbers, framework documents are being prepared for all areas that were investigated as part of the national assessment. This report, chapter M, is the geologic framework document for the Uinta and Piceance, San Juan, Paradox, Raton, Eastern Great, and Black Mesa Basins, and subbasins therein of Arizona, Colorado, Idaho, Nevada, New Mexico, and Utah. In addition to a summary of the geology and petroleum resources of studied basins, the individual storage assessment units (SAUs) within the basins are described and explanations for their selection are presented. Although appendixes in the national assessment publications include the input values used to calculate the available storage resource, this framework document provides only the context and source of the input values selected by the assessment geologists. Spatial-data files of the boundaries for the SAUs, and the well-penetration density of known well bores that penetrate the SAU seal, are available for download with the release of this report.
Statistics & Input-Output Measures for School Libraries in Colorado, 2002.
ERIC Educational Resources Information Center
Colorado State Library, Denver.
This document presents statistics and input-output measures for K-12 school libraries in Colorado for 2002. Data are presented by type and size of school, i.e., high schools (six categories ranging from 2,000 and over to under 300), junior high/middle schools (five categories ranging from 1,000-1,999 to under 300), elementary schools (four…
Optical mass memory system (AMM-13). AMM/DBMS interface control document
NASA Technical Reports Server (NTRS)
Bailey, G. A.
1980-01-01
The baseline for external interfaces of a 10 to the 13th power bit, optical archival mass memory system (AMM-13) is established. The types of interfaces addressed include data transfer; AMM-13, Data Base Management System, NASA End-to-End Data System computer interconnect; data/control input and output interfaces; test input data source; file management; and facilities interface.
User Manual for SAHM package for VisTrails
Talbert, C.B.; Talbert, M.K.
2012-01-01
The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model. The four main advantages to using the combined VisTrails: SAHM package for species distribution modeling are: 1. formalization and tractable recording of the entire modeling process 2. easier collaboration through a common modeling framework 3. a user-friendly graphical interface to manage file input, model runs, and output 4. extensibility to incorporate future and additional modeling routines and tools. This user manual provides detailed information on each module within the SAHM package, their input, output, common connections, optional arguments, and default settings. This information can also be accessed for individual modules by right clicking on the documentation button for any module in VisTrails or by right clicking on any input or output for a module and selecting view documentation. This user manual is intended to accompany the user guide which provides detailed instructions on how to install the SAHM package within VisTrails and then presents information on the use of the package.
NASA Astrophysics Data System (ADS)
Trujillo, Eddie J.; Ellersick, Steven D.
2006-05-01
The Boeing Electronic Flight Bag (EFB) is a key element in the evolutionary process of an "e-enabled" flight deck. The EFB is designed to improve the overall safety, efficiency, and operation of the flight deck and corresponding airline operations by providing the flight crew with better information and enhanced functionality in a user-friendly digital format. The EFB is intended to increase the pilots' situational awareness of the airplane and systems, as well as improve the efficiency of information management. The system will replace documents and forms that are currently stored or carried onto the flight deck and put them, in digital format, at the crew's fingertips. This paper describes what the Boeing EFB is and the significant human factors and interface design issues, trade-offs, and decisions made during development of the display system. In addition, EFB formats, graphics, input control methods, challenges using COTS (commercial-off-the-shelf)-leveraged glass and formatting technology are discussed. The optical design requirements, display technology utilized, brightness control system, reflection challenge, and the resulting optical performance are presented.
XAPiir: A recursive digital filtering package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, D.
1990-09-21
XAPiir is a basic recursive digital filtering package, containing both design and implementation subroutines. XAPiir was developed for the experimental array processor (XAP) software package, and is written in FORTRAN. However, it is intended to be incorporated into any general- or special-purpose signal analysis program. It replaces the older package RECFIL, offering several enhancements. RECFIL is used in several large analysis programs developed at LLNL, including the seismic analysis package SAC, several expert systems (NORSEA and NETSEA), and two general purpose signal analysis packages (SIG and VIEW). This report is divided into two sections: the first describes the use of the subroutine package, and the second, its internal organization. In the first section, the filter design problem is briefly reviewed, along with the definitions of the filter design parameters and their relationship to the subroutine input parameters. In the second section, the internal organization is documented to simplify maintenance and extensions to the package. 5 refs., 9 figs.
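For readers more used to modern tooling, the same class of task, designing a recursive (IIR) filter from a handful of design parameters and applying it to a signal, looks roughly like the SciPy sketch below. The filter order, band edges, and test signal are arbitrary choices, and this is not a description of XAPiir's actual subroutine interface.

    import numpy as np
    from scipy import signal

    fs = 100.0                 # sampling rate, Hz
    # Four-pole Butterworth band-pass design, the kind of recursive filter
    # a package like XAPiir produces (illustrative parameters only).
    b, a = signal.butter(N=4, Wn=[1.0, 5.0], btype="bandpass", fs=fs)

    t = np.arange(0.0, 10.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 20.0 * t)

    y = signal.lfilter(b, a, x)               # causal, single-pass filtering
    y_zero_phase = signal.filtfilt(b, a, x)   # two-pass, zero-phase variant

    print("rms in : %.3f" % np.sqrt(np.mean(x**2)))
    print("rms out: %.3f" % np.sqrt(np.mean(y**2)))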
Steidle, Ernest F.
1983-01-01
This paper describes the design of a functional assessment system, a component of a management information system (MIS) that supports a comprehensive rehabilitation facility. Products of the subsystem document the functional status of rehabilitation clients through process evaluation reporting and outcomes reporting. The purpose of this paper is to describe the design of this MIS component. The environment supported, the integration requirements, and the needed development approach are unique, requiring significant input from health care professionals, medical informatics specialists, statisticians, and program evaluators. Strategies for the implementation of the functional assessment system are the major results reported in this paper. They are most useful to the systems designer or management engineer in a human service delivery setting. MIS plan development, computer file structure and access methods, and approaches to scheduling applications are described. Finally, the development of functional status measures is discussed. Application of the methodologies described will facilitate similar efforts towards systems development in other human service delivery settings.
Simulator for multilevel optimization research
NASA Technical Reports Server (NTRS)
Padula, S. L.; Young, K. C.
1986-01-01
A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.
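In the same spirit as the simulator, replacing expensive engineering analyses with simple analytic functions, the sketch below sets up a toy two-level decomposition: a system-level optimizer adjusts a shared variable while each "subsystem" optimizes its own local variable for the current shared value. The objective functions, bounds, and variable names are invented for illustration.

    import numpy as np
    from scipy.optimize import minimize, minimize_scalar

    def subsystem_1(z):
        # Given the shared (system-level) variable z, optimize local variable x1.
        res = minimize_scalar(lambda x1: (x1 - z) ** 2 + 0.1 * x1**2,
                              bounds=(-5.0, 5.0), method="bounded")
        return res.fun

    def subsystem_2(z):
        res = minimize_scalar(lambda x2: (x2 + 2.0 * z) ** 2 + x2**4,
                              bounds=(-5.0, 5.0), method="bounded")
        return res.fun

    def system_objective(zv):
        z = zv[0]
        # Top-level objective combines the subsystem optima and a system-level term.
        return subsystem_1(z) + subsystem_2(z) + 0.5 * (z - 1.0) ** 2

    result = minimize(system_objective, x0=[0.0], method="Nelder-Mead")
    print("shared variable z = %.3f, objective = %.4f" % (result.x[0], result.fun))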
Cleaning Genesis Sample Return Canister for Flight: Lessons for Planetary Sample Return
NASA Technical Reports Server (NTRS)
Allton, J. H.; Hittle, J. D.; Mickelson, E. T.; Stansbery, Eileen K.
2016-01-01
Sample return missions require chemical contamination to be minimized and potential sources of contamination to be documented and preserved for future use. Genesis focused on and successfully accomplished the following: - Early involvement provided input to mission design: a) cleanable materials and cleanable design; b) mission operation parameters to minimize contamination during flight. - Established contamination control authority at a high level and developed knowledge and respect for contamination control across all institutions at the working level. - Provided state-of-the-art spacecraft assembly cleanroom facilities for science canister assembly and function testing. Both particulate and airborne molecular contamination was minimized. - Using ultrapure water, cleaned spacecraft components to a very high level. Stainless steel components were cleaned to carbon monolayer levels (10^15 carbon atoms per square centimeter). - Established a long-term curation facility. Lessons learned and areas for improvement include: - Bare aluminum is not a cleanable surface and should not be used for components requiring extreme levels of cleanliness. The problem is formation of oxides during rigorous cleaning. - Representative coupons of relevant spacecraft components (cut from the same block at the same time with identical surface finish and cleaning history) should be acquired, documented, and preserved. Genesis experience suggests that creation of these coupons would be facilitated by specification on the engineering component drawings. - Component handling history is critical for interpretation of analytical results on returned samples. This set of relevant documents is not the same as typical documentation for one-way missions and does include data from several institutions, which need to be unified. Dedicated resources need to be provided for acquiring and archiving appropriate documents in one location with easy access for decades. - Dedicated, knowledgeable contamination control oversight should be provided at sites of fabrication and integration. Numerous excellent Genesis chemists and analytical facilities participated in the contamination oversight; however, additional oversight at fabrication sites would have been helpful.
SLS-SPEC-159 Cross-Program Design Specification for Natural Environments (DSNE) Revision E
NASA Technical Reports Server (NTRS)
Roberts, Barry C.
2017-01-01
The DSNE completes environment-related specifications for architecture, system-level, and lower-tier documents by specifying the ranges of environmental conditions that must be accounted for by NASA ESD Programs. To assure clarity and consistency, and to prevent requirements documents from becoming cluttered with extensive amounts of technical material, natural environment specifications have been compiled into this document. The intent is to keep a unified specification for natural environments that each Program calls out for appropriate application. This document defines the natural environments parameter limits (maximum and minimum values, energy spectra, or precise model inputs, assumptions, model options, etc.), for all ESD Programs. These environments are developed by the NASA Marshall Space Flight Center (MSFC) Natural Environments Branch (MSFC organization code: EV44). Many of the parameter limits are based on experience with previous programs, such as the Space Shuttle Program. The parameter limits contain no margin and are meant to be evaluated individually to ensure they are reasonable (i.e., do not apply unrealistic extreme-on-extreme conditions). The natural environments specifications in this document should be accounted for by robust design of the flight vehicle and support systems. However, it is understood that in some cases the Programs will find it more effective to account for portions of the environment ranges by operational mitigation or acceptance of risk in accordance with an appropriate program risk management plan and/or hazard analysis process. The DSNE is not intended as a definition of operational models or operational constraints, nor is it adequate, alone, for ground facilities which may have additional requirements (for example, building codes and local environmental constraints). "Natural environments," as the term is used here, refers to the environments that are not the result of intended human activity or intervention. It consists of a variety of external environmental factors (most of natural origin and a few of human origin) which impose restrictions or otherwise impact the development or operation of flight vehicles and destination surface systems.
The sequence measurement system of the IR camera
NASA Astrophysics Data System (ADS)
Geng, Ai-hui; Han, Hong-xia; Zhang, Hai-bo
2011-08-01
Currently, IR cameras are broadly used in electro-optical tracking, electro-optical measurement, fire control, and electro-optical countermeasure applications, but the output timing sequence of most IR cameras applied in practice is complex, and the sequence documents supplied by the manufacturer are often not detailed. Because continuous image transmission and image processing systems need the detailed timing sequence of the IR camera, a sequence measurement system for IR cameras was designed, and a detailed sequence measurement method for the applied IR camera was carried out. FPGA programming combined with SignalTap online observation is applied in the sequence measurement system, the precise timing of the IR camera's output signal is obtained, and the resulting detailed sequence document is supplied to the downstream image transmission system, image processing system, and other components. The sequence measurement system consists of a Camera Link input interface, an LVDS input interface, an FPGA, a Camera Link output interface, and other parts, among which the FPGA is the key component. Video signals in both Camera Link and LVDS formats can be accepted by the sequence measurement system, and because image processing cards and image memory cards commonly use Camera Link as their input interface, the output of the sequence measurement system is also designed as a Camera Link interface. The sequence measurement system thus performs the sequence measurement of the IR camera and, at the same time, provides interface conversion for some cameras. Inside the FPGA of the sequence measurement system, the sequence measurement program, the pixel clock adjustment, the SignalTap file configuration, and the SignalTap online observation are integrated to realize precise measurement of the IR camera. The sequence measurement program, written in Verilog and combined with online observation in the SignalTap tool, counts the number of lines in one frame and the number of pixels in one line, and also determines the line offset and row offset of the image. Aimed at the complex timing of the IR camera's output signal, the sequence measurement system accurately measures the timing of the camera used in the project, supplies the detailed sequence document to downstream systems such as the image processing and image transmission systems, and gives the concrete parameters of fval, lval, pixclk, line offset, and row offset. Experiments show that the sequence measurement system of the IR camera obtains precise sequence measurement results and works stably, laying a foundation for the downstream systems.
Model documentation, Coal Market Module of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System's (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 1998 (AEO98). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of CMM's two submodules. These are the Coal Production Submodule (CPS) and the Coal Distribution Submodule (CDS). CMM provides annual forecasts of prices, production, and consumption of coal for NEMS. In general, the CDS integrates the supply inputs from the CPS to satisfy demands for coal from exogenous demand models. The international area of the CDS forecasts annual world coal trade flows from major supply to major demand regions and provides annual forecasts of US coal exports for input to NEMS. Specifically, the CDS receives minemouth prices produced by the CPS, demand and other exogenous inputs from other NEMS components, and provides delivered coal prices and quantities to the NEMS economic sectors and regions.
Thrust Chamber Modeling Using Navier-Stokes Equations: Code Documentation and Listings. Volume 2
NASA Technical Reports Server (NTRS)
Daley, P. L.; Owens, S. F.
1988-01-01
A copy of the PHOENICS input files and FORTRAN code developed for the modeling of thrust chambers is given. These copies are contained in the Appendices. The listings are contained in Appendices A through E. Appendix A describes the input statements relevant to thrust chamber modeling as well as the FORTRAN code developed for the Satellite program. Appendix B describes the FORTRAN code developed for the Ground program. Appendices C through E contain copies of the Q1 (input) file, the Satellite program, and the Ground program respectively.
Turbomachinery Forced Response Prediction System (FREPS): User's Manual
NASA Technical Reports Server (NTRS)
Morel, M. R.; Murthy, D. V.
1994-01-01
The turbomachinery forced response prediction system (FREPS), version 1.2, is capable of predicting the aeroelastic behavior of axial-flow turbomachinery blades. This document is meant to serve as a guide in the use of the FREPS code with specific emphasis on its use at NASA Lewis Research Center (LeRC). A detailed explanation of the aeroelastic analysis and its development is beyond the scope of this document, and may be found in the references. FREPS has been developed by the NASA LeRC Structural Dynamics Branch. The manual is divided into three major parts: an introduction, the preparation of input, and the procedure to execute FREPS. Part 1 includes a brief background on the necessity of FREPS, a description of the FREPS system, the steps needed to be taken before FREPS is executed, an example input file with instructions, presentation of the geometric conventions used, and the input/output files employed and produced by FREPS. Part 2 contains a detailed description of the command names needed to create the primary input file that is required to execute the FREPS code. Also, Part 2 has an example data file to aid the user in creating their own input files. Part 3 explains the procedures required to execute the FREPS code on the Cray Y-MP, a computer system available at the NASA LeRC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
NASA Technical Reports Server (NTRS)
Hairr, John W.; Huang, Jui-Ten; Ingram, J. Edward; Shah, Bharat M.
1992-01-01
The ISPAN Program (Interactive Stiffened Panel Analysis) is an interactive design tool intended to provide a means of performing simple and self-contained preliminary analysis of aircraft primary structures made of composite materials. The program combines a series of modules with the finite element code DIAL as its backbone. Four ISPAN modules were developed and are documented. These include: (1) flat stiffened panel; (2) curved stiffened panel; (3) flat tubular panel; and (4) curved geodesic panel. Users input geometric and material properties, load information, and the type of analysis (linear, bifurcation buckling, or post-buckling) interactively. Using this information, the program generates the finite element mesh and performs the analysis. The output, in the form of summary tables of stress or margins of safety, contour plots of loads or stress, and deflected shape plots, may be generated and used to evaluate a specific design.
Design requirements for ubiquitous computing environments for healthcare professionals.
Bång, Magnus; Larsson, Anders; Eriksson, Henrik
2004-01-01
Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work; thus, it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-12
... suggestions and information on the scope of issues to include in the environmental documents. Special mailings, newspaper articles, and other media announcements will inform people of the opportunities for input... public that the U.S. Fish and Wildlife Service (Service) intends to gather information necessary to...
TADS: A CFD-based turbomachinery and analysis design system with GUI. Volume 1: Method and results
NASA Technical Reports Server (NTRS)
Topp, D. A.; Myers, R. A.; Delaney, R. A.
1995-01-01
The primary objective of this study was the development of a computational fluid dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a graphical user interface (GUI). The computer codes resulting from this effort are referred to as the Turbomachinery Analysis and Design System (TADS). This document describes the theoretical basis and analytical results from the TADS system. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of various programs was done in a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low-speed turbine blade, and a transonic turbine vane.
Photovoltaic performance models - A report card
NASA Technical Reports Server (NTRS)
Smith, J. H.; Reiter, L. R.
1985-01-01
Models for the analysis of photovoltaic (PV) system designs, implementation policies, and economic performance have proliferated while keeping pace with rapid changes in basic PV technology and the extensive empirical data compiled for such systems' performance. Attention is presently given to the results of a comparative assessment of ten well documented and widely used models, which range in complexity from first-order approximations of PV system performance to in-depth, circuit-level characterizations. The comparisons were made on the basis of the performance of their subsystem as well as system elements. The models fall into three categories in light of their degree of aggregation into subsystems: (1) simplified models for first-order calculation of system performance, with easily met input requirements but limited capability to address more than a small variety of design considerations; (2) models simulating PV systems in greater detail, encompassing types primarily intended for either concentrator-incorporating or flat-plate-collector PV systems; and (3) models not specifically designed for PV system performance modeling, but applicable to aspects of electrical system design. Models ignoring subsystem failure or degradation are noted to exclude operating and maintenance characteristics as well.
Energy Return On Investment of Engineered Geothermal Systems Data
Mansure, Chip
2012-01-01
The project provides an updated Energy Return on Investment (EROI) for Enhanced Geothermal Systems (EGS). Results incorporate Argonne National Laboratory's Life Cycle Assessment and base case assumptions consistent with other projects in the Analysis subprogram. EROI is the ratio of the energy delivered to the consumer to the energy consumed to build, operate, and decommission the facility, and it is important in assessing the viability of energy alternatives. Current EROI analyses of geothermal energy are either out-of-date, of uncertain methodology, or presented online with little supporting documentation. This data set is a collection of files documenting data used to calculate the EROI of Engineered Geothermal Systems (EGS) and errata to publications prior to the final report. The final report is available from the OSTI web site (http://www.osti.gov/geothermal/). Data in this collection include the well designs used, input parameters for GETEM, a discussion of the energy needed to haul materials to the drill site, the baseline mud program, and a summary of the energy needed to drill each of the well designs. Note that EROI is the ratio of the energy delivered to the customer to the energy consumed to construct, operate, and decommission the facility, whereas efficiency is the ratio of the energy delivered to the customer to the energy extracted from the reservoir.
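In symbols (notation introduced here only for clarity; the abstract defines both ratios in words):

\[
\mathrm{EROI} = \frac{E_{\text{delivered to customer}}}{E_{\text{construct}} + E_{\text{operate}} + E_{\text{decommission}}},
\qquad
\eta_{\text{conversion}} = \frac{E_{\text{delivered to customer}}}{E_{\text{extracted from reservoir}}}.
\]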
Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.
1996-01-01
A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data--in particular, hydrologic data. Such data include various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models. Such data can also include time sequences of results generated by numerical simulation models. Specifically, TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype, time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, and graphical displays, as well as line-printer plots of single- or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple yet efficient data management and access. A single, easily used program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems. This interface, together with each major functional utility of the TDDS, is described and documented in this report.
Managing Input during Assistive Technology Product Design
ERIC Educational Resources Information Center
Choi, Young Mi
2011-01-01
Many different sources of input are available to assistive technology innovators during the course of designing products. However, there is little information on which ones may be most effective or how they may be efficiently utilized within the design process. The aim of this project was to compare how three types of input--from simulation tools,…
a Digital Pre-Inventory of Architectural Heritage in Kosovo Using DOCU-TOOLS®
NASA Astrophysics Data System (ADS)
Jäger-Klein, C.; Kryeziu, A.; Ymeri Hoxha, V.; Rant, M.
2017-08-01
Kosovo is one of the new states in transition in the Western Balkans, and its state institutions are not yet fully functional. Although the territory has a rich architectural heritage, the documentation and inventory of this cultural legacy by the national monument protection institutions is insufficiently structured and incomplete. Civil society has collected far more material than the state, but people are largely untrained in the terminology and categories of professional cultural inventories and in database systems and their international standards. What is missing is an efficient, user-friendly, low-threshold tool to gather together and integrate the various materials, archive them appropriately, and make all the information suitably accessible to the public. Multiple groups of information-holders should be able to feed this open-access platform in an easy and self-explanatory way. In this case, existing systems such as the Arches Heritage Inventory and Management System would seem to be too complex, as they presuppose a certain understanding of the standard terminology and internationally used categories. Also, the platform, as an archive, must be able to guarantee the integrity and authenticity of the inputted material to avoid abuse by unauthorized users with nationalistic views. Such an open-access lay inventory would enable Kosovo to meet the urgent need for a national heritage inventory, which the state institutions have thus far been unable to establish. The situation is time-sensitive, as Kosovo will soon repeat its attempt to join UNESCO, having failed to do so in 2015, receiving only a minimum number of votes in favour. In Austria, a program called docu-tools® was recently developed to tackle a similar problem. It can be used by non-professionals to document complicated and multi-structured cases within the building process. Its cloud and app design allows archiving enormous numbers of images and documents in whatever format. Additionally, it allows parallel access by authorized users and avoids any hierarchy of structure or prerequisites for its users. The archived documents cannot be changed after input, which has given this documentation tool recognized relevance in court. The following article is an attempt to explore the potential for this tool to prepare Kosovo for a comprehensive heritage inventory.
Approximation of Quantum Stochastic Differential Equations for Input-Output Model Reduction
2016-02-25
Approximation of Quantum Stochastic Differential Equations for Input-Output Model Reduction. We have completed a short program of theoretical research ... on dimensional reduction and approximation of models based on quantum stochastic differential equations. Our primary results lie in the area of ... quantum probability, quantum stochastic differential equations.
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.
Apollo experience report: Mission evaluation team postflight documentation
NASA Technical Reports Server (NTRS)
Dodson, J. W.; Cordiner, D. H.
1975-01-01
The various postflight reports prepared by the mission evaluation team, including the final mission evaluation report, report supplements, anomaly reports, and the 5-day mission report, are described. The procedures for preparing each report from the inputs of the various disciplines are explained, and the general method of reporting postflight results is discussed. Recommendations for postflight documentation in future space programs are included. The official requirements for postflight documentation and a typical example of an anomaly report are provided as appendixes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pattison, Morgan
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
FPGA-Based Networked Phasemeter for a Heterodyne Interferometer
NASA Technical Reports Server (NTRS)
Rao, Shanti
2009-01-01
A document discusses a component of a laser metrology system designed to measure displacements along the line of sight with precision on the order of a tenth the diameter of an atom. This component, the phasemeter, measures the relative phase of two electrical signals and transfers that information to a computer. Because the metrology system measures the differences between two optical paths, the phasemeter has two inputs, called measure and reference. The reference signal is nominally a perfect square wave with a 50-percent duty cycle (though only rising edges are used). As the metrology system detects motion, the difference between the reference and measure signal phases is proportional to the displacement of the motion. The phasemeter, therefore, counts the elapsed time between rising edges in the two signals and converts the time into an estimate of phase delay. The hardware consists of a circuit board that plugs into a COTS (commercial, off-the-shelf) Spartan-III FPGA (field-programmable gate array) evaluation board. It has two BNC inputs (reference and measure), a CMOS logic chip to buffer the inputs, and an Ethernet jack for transmitting reduced data to a PC. Two extra BNC connectors can be attached for future expandability, such as external synchronization. Each phasemeter handles one metrology channel. A bank of six phasemeters (and two zero-crossing detector cards) with an Ethernet switch can monitor the rigid-body motion of an object. This device is smaller and cheaper than existing zero-crossing phasemeters. Also, because it uses Ethernet for communication with a computer, instead of a VME bridge, it is much easier to use. The phasemeter is a key part of the Precision Deployable Apertures and Structures strategic R&D effort to design large, deployable, segmented space telescopes.
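As a minimal sketch of the conversion just described (edge times to phase), the following offline illustration assumes lists of rising-edge timestamps for the reference and measure signals; the names and the software setting are assumptions, since the actual device performs this counting in the FPGA.

from bisect import bisect_left
from math import pi

def phase_estimates(ref_edges, meas_edges):
    """ref_edges, meas_edges: sorted lists of rising-edge times in seconds."""
    phases = []
    for i in range(len(ref_edges) - 1):
        t0, t1 = ref_edges[i], ref_edges[i + 1]        # one reference period
        j = bisect_left(meas_edges, t0)                # first measure edge at or after t0
        if j < len(meas_edges) and meas_edges[j] < t1:
            phases.append(2.0 * pi * (meas_edges[j] - t0) / (t1 - t0))  # phase delay in radians
    return phases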
Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios
Banta, Edward R.
2014-01-01
Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.
SLS-SPEC-159 Cross-Program Design Specification for Natural Environments (DSNE) Revision D
NASA Technical Reports Server (NTRS)
Roberts, Barry C.
2015-01-01
This document is derived from the former National Aeronautics and Space Administration (NASA) Constellation Program (CxP) document CxP 70023, titled "The Design Specification for Natural Environments (DSNE), Revision C." The original document has been modified to represent updated Design Reference Missions (DRMs) for the NASA Exploration Systems Development (ESD) Programs. The DSNE completes environment-related specifications for architecture, system-level, and lower-tier documents by specifying the ranges of environmental conditions that must be accounted for by NASA ESD Programs. To assure clarity and consistency, and to prevent requirements documents from becoming cluttered with extensive amounts of technical material, natural environment specifications have been compiled into this document. The intent is to keep a unified specification for natural environments that each Program calls out for appropriate application. This document defines the natural environments parameter limits (maximum and minimum values, energy spectra, or precise model inputs, assumptions, model options, etc.), for all ESD Programs. These environments are developed by the NASA Marshall Space Flight Center (MSFC) Natural Environments Branch (MSFC organization code: EV44). Many of the parameter limits are based on experience with previous programs, such as the Space Shuttle Program. The parameter limits contain no margin and are meant to be evaluated individually to ensure they are reasonable (i.e., do not apply unrealistic extreme-on-extreme conditions). The natural environments specifications in this document should be accounted for by robust design of the flight vehicle and support systems. However, it is understood that in some cases the Programs will find it more effective to account for portions of the environment ranges by operational mitigation or acceptance of risk in accordance with an appropriate program risk management plan and/or hazard analysis process. The DSNE is not intended as a definition of operational models or operational constraints, nor is it adequate, alone, for ground facilities which may have additional requirements (for example, building codes and local environmental constraints). "Natural environments," as the term is used here, refers to the environments that are not the result of intended human activity or intervention. It consists of a variety of external environmental factors (most of natural origin and a few of human origin) which impose restrictions or otherwise impact the development or operation of flight vehicles and destination surface systems. These natural environments include the following types of environments: Terrestrial environments at launch, abort, and normal landing sites (winds, temperatures, pressures, surface roughness, sea conditions, etc.); Space environments (ionizing radiation, orbital debris, meteoroids, thermosphere density, plasma, solar, Earth, and lunar-emitted thermal radiation, etc.); Destination environments (Lunar surface and orbital, Mars atmosphere and surface, near Earth asteroids, etc.). Many of the environmental specifications in this document are based on models, data, and environment descriptions contained in the CxP 70044, Constellation Program Natural Environment Definition for Design (NEDD). The NEDD provides additional detailed environment data and model descriptions to support analytical studies for ESD Programs. 
For background information on specific environments and their effects on spacecraft design and operations, the environment models, and the data used to generate the specifications contained in the DSNE, the reader is referred to the NEDD paragraphs listed in each section of the DSNE. Also, most of the environmental specifications in this document are tied specifically to the ESD DRMs in ESD-10012, Revision B, Exploration Systems Development Concept of Operations (ConOps). Coordination between these environment specifications and the DRMs must be maintained. This document should be compatible with the current ESD DRMs, but updates to the mission definitions and variations in interpretation may require adjustments to the environment specifications.
The impact of input quality on early sign development in native and non-native language learners.
Lu, Jenny; Jones, Anna; Morgan, Gary
2016-05-01
There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the impact of quality of input on early sign acquisition. The current study explores the outcomes of differential input in two groups of children aged two to five years: deaf children of hearing parents (DCHP) and deaf children of deaf parents (DCDP). Analysis of child sign language revealed DCDP had a more developed vocabulary and more phonological handshape types compared with DCHP. In naturalistic conversations deaf parents used more sign tokens and more phonological types than hearing parents. Results are discussed in terms of the effects of early input on subsequent language abilities.
Input Range Testing for the General Mission Analysis Tool (GMAT)
NASA Technical Reports Server (NTRS)
Hughes, Steven P.
2007-01-01
This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid input. It is very important to note that the tests below must be performed for both the graphical user interface and the script. The examples are illustrated from a scripting perspective because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.
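The table-driven idea in the plan can be pictured with a small, hypothetical harness: for each object field, attempt a list of allowed and disallowed values and confirm that the tool accepts or rejects them as expected. The field name, the example values, and the set_field callable below are illustrative assumptions, not GMAT API or actual plan entries.

FIELD_TESTS = {
    # field name: (values that should be accepted, values that should be rejected)
    "Spacecraft.DryMass": ([1.0, 850.0, 1.0e6], [-1.0, "not-a-number"]),
}

def run_field_tests(set_field):
    """set_field(field, value) -> True if the tool accepted the value."""
    failures = []
    for field, (allowed, disallowed) in FIELD_TESTS.items():
        for value in allowed:
            if not set_field(field, value):
                failures.append((field, value, "valid input was rejected"))
        for value in disallowed:
            if set_field(field, value):
                failures.append((field, value, "invalid input was accepted"))
    return failures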
NASA Astrophysics Data System (ADS)
Fume, Kosei; Ishitani, Yasuto
2008-01-01
We propose a document categorization method based on a document model that can be defined externally for each task and that categorizes Web content or business documents into a target category according to their similarity to the model. The main feature of the proposed method consists of two aspects of semantics extraction from an input document: the semantics of terms are extracted by semantic pattern analysis, and implicit meanings of the document substructure are identified by a bottom-up text clustering technique focusing on the similarity of text-line attributes. We have constructed a system based on the proposed method for trial purposes. The experimental results show that the system achieves more than 80% classification accuracy in categorizing Web content and business documents into 15 or 70 categories.
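The similarity-based assignment step can be illustrated with a generic bag-of-terms sketch; this is not the authors' implementation (which relies on semantic patterns and substructure clustering), and the cosine measure and names below are assumptions for illustration only.

from collections import Counter
from math import sqrt

def cosine(a, b):
    # cosine similarity between two term-count vectors held in Counters
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def categorize(doc_terms, category_models):
    """doc_terms: list of terms; category_models: dict of category name -> Counter of terms."""
    d = Counter(doc_terms)
    return max(category_models, key=lambda name: cosine(d, category_models[name]))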
Gstruct: a system for extracting schemas from GML documents
NASA Astrophysics Data System (ADS)
Chen, Hui; Zhu, Fubao; Guan, Jihong; Zhou, Shuigeng
2008-10-01
Geography Markup Language (GML) has become the de facto standard for geographic information representation on the Internet. A GML schema provides a way to define the structure, content, and semantics of GML documents. It contains useful structural information about GML documents and plays an important role in storing, querying, and analyzing GML data. However, a GML schema is not mandatory, and it is common for a GML document to contain no schema. In this paper, we present Gstruct, a tool for GML schema extraction. Gstruct finds the features in the input GML documents, identifies geometry datatypes as well as simple datatypes, and then integrates all these features and eliminates improper components to output the optimal schema. Experiments demonstrate that Gstruct is effective in extracting semantically meaningful schemas from GML documents.
Teachers as Human Capital or Human Beings? USAID's Perspective on Teachers
ERIC Educational Resources Information Center
Ginsburg, Mark
2017-01-01
This article analyzes three USAID education strategy documents (1998, 2005, and 2011) as well as USAID's requests for proposals for three projects to assess how teachers are represented. The main findings indicate that USAID education strategy documents a) treat teachers as human capital, a human resource input, rather than as human beings and b)…
Tonkin, M.J.; Hill, Mary C.; Doherty, John
2003-01-01
This document describes the MOD-PREDICT program, which helps evaluate user-defined sets of observations, prior information, and predictions, using the ground-water model MODFLOW-2000. MOD-PREDICT takes advantage of the existing Observation and Sensitivity Processes (Hill and others, 2000) by initiating runs of MODFLOW-2000 and using the output files produced. The names and formats of the MODFLOW-2000 input files are unchanged, such that full backward compatibility is maintained. A new name file and input files are required for MOD-PREDICT. The performance of MOD-PREDICT has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program using the email address available at the web address below. Updates might occasionally be made to this document, to the MOD-PREDICT program, and to MODFLOW-2000. Users can check for updates on the Internet at URL http://water.usgs.gov/software/ground water.html/.
Language and Program for Documenting Software Design
NASA Technical Reports Server (NTRS)
Kleine, H.; Zepko, T. M.
1986-01-01
Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.
DOT National Transportation Integrated Search
2011-03-01
Each design input in the Mechanistic-Empirical Design Guide (MEPDG) required for the design of Jointed Plain Concrete Pavements (JPCPs) is introduced and discussed in this report. Best values for Pennsylvania conditions were established and recom...
DOT National Transportation Integrated Search
2011-03-01
Each design input in the Mechanistic-Empirical Design Guide (MEPDG) required for the design of Jointed Plain Concrete Pavements (JPCPs) is introduced and discussed in this report. Best values for Pennsylvania conditions were established and recommend...
Antenna for Measuring Electric Fields Within the Inner Heliosphere
NASA Technical Reports Server (NTRS)
Sittler, Edward Charles
2007-01-01
A document discusses concepts for the design of an antenna to be deployed from a spacecraft for measuring the ambient electric field associated with plasma waves at a location within 3 solar radii of the solar photosphere. The antenna must be long enough to extend beyond the photoelectron and plasma sheaths of the spacecraft (expected to be of the order of meters thick) and to enable measurements at frequencies from 20 Hz to 10 MHz without contamination by spacecraft electric-field noise. The antenna must, therefore, extend beyond the thermal protection system (TPS) of the main body of the spacecraft and must withstand solar heating to a temperature as high as 2,000 C while not conducting excessive heat to the interior of the spacecraft. The TPS would be conical and its axis would be pointed toward the Sun. The antenna would include monopole halves of dipoles that would be deployed from within the shadow of the TPS. The outer portion of each monopole would be composed of a carbon-carbon (C-C) composite surface exposed to direct sunlight (hot side) and a C-C side in shadow (cold side), with yttria-stabilized zirconia spacers in between. The hot side cannot view the spacecraft bus, while the cold side can. The booms also can be tilted to minimize heat input to the spacecraft bus. This design allows one to reduce heat input to the spacecraft bus to acceptable levels.
Three-input majority logic gate and multiple input logic circuit based on DNA strand displacement.
Li, Wei; Yang, Yang; Yan, Hao; Liu, Yan
2013-06-12
In biomolecular programming, the properties of biomolecules such as proteins and nucleic acids are harnessed for computational purposes. The field has gained considerable attention due to the possibility of exploiting the massive parallelism that is inherent in natural systems to solve computational problems. DNA has already been used to build complex molecular circuits, where the basic building blocks are logic gates that produce single outputs from one or more logical inputs. We designed and experimentally realized a three-input majority gate based on DNA strand displacement. One of the key features of a three-input majority gate is that the three inputs have equal priority, and the output will be true if at least two of the inputs are true. Our design consists of a central, circular DNA strand with three unique domains, between which are identical joint sequences. Before inputs are introduced to the system, each domain and half of each joint is protected by one complementary ssDNA that displays a toehold for subsequent displacement by the corresponding input. With this design the relationship between any two domains is analogous to the relationship between inputs in a majority gate. Displacing two or more of the protection strands will expose at least one complete joint and return a true output; displacing none or only one of the protection strands will not expose a complete joint and will return a false output. Further, we designed and realized a complex five-input logic gate based on the majority gate described here. By controlling two of the five inputs, the complex gate can realize every combination of OR and AND gates of the other three inputs.
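The logical behaviour realized chemically above is simply the Boolean majority function; a short sketch of that function (illustration only, unrelated to the wet-lab realization):

def majority3(a, b, c):
    # True whenever at least two of the three inputs are true
    return (a and b) or (a and c) or (b and c)

# Truth-table check: exactly the combinations with two or more true inputs give True.
assert all(majority3(a, b, c) == ((a + b + c) >= 2)
           for a in (False, True) for b in (False, True) for c in (False, True))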
Pre- and postprocessing for reservoir simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, W.L.; Ingalls, L.J.; Prasad, S.J.
1991-05-01
This paper describes the functionality and underlying programming paradigms of Shell's simulator-related reservoir-engineering graphics system. This system includes the simulation postprocessing programs Reservoir Display System (RDS) and Fast Reservoir Engineering Displays (FRED), a hypertext-like on-line documentation system (DOC), and a simulator input preprocessor (SIMPLSIM). RDS creates displays of reservoir simulation results. These displays represent the areal or cross-section distribution of computed reservoir parameters, such as pressure, phase saturation, or temperature. Generation of these images at real-time animation rates is discussed. FRED facilitates the creation of plot files from reservoir simulation output. The use of dynamic memory allocation, asynchronous I/O, a table-driven screen manager, and mixed-language (FORTRAN and C) programming are detailed. DOC is used to create and access on-line documentation for the pre- and post-processing programs and the reservoir simulators. DOC can be run by itself or can be accessed from within any other graphics or nongraphics application program. DOC includes a text editor, which is the basis for a reservoir simulation tutorial and greatly simplifies the preparation of simulator input. The use of sharable images, graphics, and the documentation file network are described. Finally, SIMPLSIM is a suite of programs that uses interactive graphics in the preparation of reservoir description data for input into reservoir simulators. The SIMPLSIM user-interface manager (UIM) and its graphic interface for reservoir description are discussed.
NASA Astrophysics Data System (ADS)
Kessel, Kerstin A.; Bougatf, Nina; Bohn, Christian; Engelmann, Uwe; Oetzel, Dieter; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E.
2012-02-01
Conducting clinical studies is rather difficult because of the large variety of voluminous datasets, different documentation styles, and various information systems, especially in radiation oncology. In this paper, we describe our development of a web-based documentation system, with first approaches to automatic statistical analysis, for transnational and multicenter clinical studies in particle therapy. It is possible to have immediate access to all patient information and to exchange, store, process, and visualize text data, all types of DICOM images, especially DICOM RT, and any other multimedia data. Accessing the documentation system and submitting clinical data is possible for internal and external users (e.g., referring physicians from abroad who are seeking the new technique of particle therapy for their patients). Security and privacy protection are ensured with the encrypted https protocol, client certificates, and an application gateway. Furthermore, all data can be pseudonymized. Integrated into the existing hospital environment, patient data is imported via various interfaces over HL7 messages and DICOM. Several further features replace manual input wherever possible and ensure data quality and completeness. With a form generator, studies can be individually designed to fit specific needs. By including all treated patients (also non-study patients), we gain the possibility of overall large-scale, retrospective analyses. Having recently begun documentation of our first six clinical studies, it has become apparent that the benefits lie in the simplification of research work, better quality of study analyses, and ultimately the improvement of treatment concepts by evaluating the effectiveness of particle therapy.
TADS: A CFD-based turbomachinery and analysis design system with GUI. Volume 1: Method and results
NASA Technical Reports Server (NTRS)
Topp, D. A.; Myers, R. A.; Delaney, R. A.
1995-01-01
The primary objective of this study was the development of a CFD (Computational Fluid Dynamics) based turbomachinery airfoil analysis and design system, controlled by a GUI (Graphical User Interface). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is the Final Report describing the theoretical basis and analytical results from the TADS system, developed under Task 18 of NASA Contract NAS3-25950, ADPAC System Coupling to Blade Analysis & Design System GUI. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low speed turbine blade and a transonic turbine vane.
Sierra Structural Dynamics User's Notes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reese, Garth M.
2015-10-19
Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munday, Lynn Brendon; Day, David M.; Bunting, Gregory
Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of weapons systems. This document provides a users guide to the input for Sierra/SD. Details of input specifications for the different solution types, output options, element types and parameters are included. The appendices contain detailed examples, and instructions for running the software on parallel platforms.
User guide for MODPATH version 6 - A particle-tracking model for MODFLOW
Pollock, David W.
2012-01-01
MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.
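The semianalytical scheme mentioned above can be illustrated in one dimension: if the velocity within a cell is assumed to vary linearly between the two face velocities, the travel time from the particle's position to the downstream face has a closed form. The sketch below is an illustration of that idea under those assumptions, not MODPATH's full three-dimensional implementation.

from math import log

def time_to_exit(xp, x1, x2, v1, v2):
    """Particle at xp inside cell [x1, x2]; v1, v2 are the (positive) face velocities."""
    A = (v2 - v1) / (x2 - x1)        # linear velocity gradient across the cell
    vp = v1 + A * (xp - x1)          # interpolated velocity at the particle position
    if abs(A) < 1e-12:               # essentially uniform velocity in the cell
        return (x2 - xp) / vp
    return log(v2 / vp) / A          # analytical travel time to the downstream face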
Documentation of a deep percolation model for estimating ground-water recharge
Bauer, H.H.; Vaccaro, J.J.
1987-01-01
A deep percolation model, which operates on a daily basis, was developed to estimate long-term average groundwater recharge from precipitation. It has been designed primarily to simulate recharge in large areas with variable weather, soils, and land uses, but it can also be used at any scale. The physical and mathematical concepts of the deep percolation model, its subroutines and data requirements, and input data sequence and formats are documented. The physical processes simulated are soil moisture accumulation, evaporation from bare soil, plant transpiration, surface water runoff, snow accumulation and melt, and accumulation and evaporation of intercepted precipitation. The minimum data sets for the operation of the model are daily values of precipitation and maximum and minimum air temperature, soil thickness and available water capacity, soil texture, and land use. Long-term average annual precipitation, actual daily stream discharge, monthly estimates of base flow, Soil Conservation Service surface runoff curve numbers, land surface altitude-slope-aspect, and temperature lapse rates are optional. The program is written in the FORTRAN 77 language with no enhancements and should run on most computer systems without modifications. Documentation has been prepared so that program modifications may be made for inclusion of additional physical processes or deletion of ones not considered important. (Author's abstract)
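A drastically simplified single-day bookkeeping step, in the spirit of the daily soil-moisture accounting described above, might look like the following; the variable names and the simple ordering of processes are assumptions, and the actual model also handles snow, interception, runoff curve numbers, and other processes.

def daily_step(soil_moisture, capacity, precip, pet):
    """All quantities in consistent depth units (e.g., millimetres of water)."""
    et = min(pet, soil_moisture + precip)       # actual ET limited by available water
    water = soil_moisture + precip - et
    recharge = max(0.0, water - capacity)       # excess over capacity percolates below the root zone
    soil_moisture = min(water, capacity)
    return soil_moisture, recharge, et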
Reliability and Maintainability model (RAM) user and maintenance manual. Part 2
NASA Technical Reports Server (NTRS)
Ebeling, Charles E.
1995-01-01
This report documents the procedures for utilizing and maintaining the Reliability and Maintainability Model (RAM) developed by the University of Dayton for the NASA Langley Research Center (LaRC). The RAM model predicts reliability and maintainability (R&M) parameters for conceptual space vehicles using parametric relationships between vehicle design and performance characteristics and subsystem mean time between maintenance actions (MTBM) and manhours per maintenance action (MH/MA). These parametric relationships were developed using aircraft R&M data from over thirty different military aircraft of all types. This report describes the general methodology used within the model, the execution and computational sequence, the input screens and data, the output displays and reports, and study analyses and procedures. A source listing is provided.
CATPAC -- Catalogue Applications Package on UNIX
NASA Astrophysics Data System (ADS)
Wood, A. R.
CATPAC is the STARLINK Catalogue and Table Package. This document describes the CATPAC applications available on UNIX. These include applications for inputting, processing, and reporting tabular data, including astronomical catalogues.
Sanders, Michael J.; Markstrom, Steven L.; Regan, R. Steven; Atkinson, R. Dwight
2017-09-15
A module for simulation of daily mean water temperature in a network of stream segments has been developed as an enhancement to the U.S. Geological Survey Precipitation Runoff Modeling System (PRMS). This new module is based on the U.S. Fish and Wildlife Service Stream Network Temperature model, a mechanistic, one-dimensional heat transport model. The new module is integrated in PRMS. Stream-water temperature simulation is activated by selection of the appropriate input flags in the PRMS Control File and by providing the necessary additional inputs in standard PRMS input files. This report includes a comprehensive discussion of the methods relevant to the stream temperature calculations and detailed instructions for model input preparation.
Building accurate historic and future climate MEPDG input files for Louisiana DOTD : tech summary.
DOT National Transportation Integrated Search
2017-02-01
The new pavement design process (originally MEPDG, then DARWin-ME, and now Pavement ME Design) requires two types of inputs to influence the prediction of pavement distress for a selected set of pavement materials and structure. One input is tra...
Liquid cooled data center design selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chainer, Timothy J.; Iyengar, Madhusudan K.; Parida, Pritish R.
Input data specifying aspects of a thermal design of a liquid cooled data center is obtained. The input data includes data indicative of ambient outdoor temperature for a location of the data center and/or data representing workload power dissipation for the data center. The input data is evaluated to obtain the performance of the data center thermal design. The performance includes cooling energy usage and/or at least one pertinent temperature associated with the data center. The performance of the data center thermal design is output.
Evans, William D [Cupertino, CA
2009-02-24
A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header, and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents, or a document and annotations, may be protected by the secure content object.
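The marker check described above can be sketched as follows. This is an illustration only: the "derivable variation" is assumed here to be the bitwise complement of the random number, and the cipher and key-combination details are omitted.

import os

def make_marker():
    r = os.urandom(8)
    return r + bytes(b ^ 0xFF for b in r)     # random number followed by its derivable variation

def marker_is_valid(decrypted_header):
    """True if the first 16 bytes of a decrypted header form a consistent marker."""
    r, v = decrypted_header[:8], decrypted_header[8:16]
    return v == bytes(b ^ 0xFF for b in r)    # the trial key decrypted the header correctly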
Econ's optimal decision model of wheat production and distribution-documentation
NASA Technical Reports Server (NTRS)
1977-01-01
The report documents the computer programs written to implement the ECON optimal decision model. The programs were written in APL, an extremely compact and powerful language particularly well suited to this model, which makes extensive use of matrix manipulations. The algorithms used are presented, and listings of and descriptive information on the APL programs used are given. Possible changes in input data are also given.
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-10-01
Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
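IDOCCS performs its matching with optical correlation hardware; a rough digital analogue of correlation-based keyword or logo spotting is FFT cross-correlation of a small template image against a page image. The numpy sketch below illustrates that analogue only and is not the IDOCCS implementation.

import numpy as np

def correlation_peaks(page, template, threshold):
    """page, template: 2-D float arrays (page larger); returns candidate match coordinates."""
    t = np.zeros_like(page)
    t[:template.shape[0], :template.shape[1]] = template - template.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(page - page.mean()) * np.conj(np.fft.fft2(t))))
    peak = corr.max()
    if peak > 0:
        corr = corr / peak                     # normalize so the strongest response is 1
    return np.argwhere(corr > threshold)       # locations exceeding the detection threshold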
GEN-IV Benchmarking of Triso Fuel Performance Models under accident conditions modeling input data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collin, Blaise Paul
This document presents the benchmark plan for the calculation of particle fuel performance on safety testing experiments that are representative of operational accidental transients. The benchmark is dedicated to the modeling of fission product release under accident conditions by fuel performance codes from around the world, and the subsequent comparison to post-irradiation examination (PIE) data from the modeled heating tests. The accident condition benchmark is divided into three parts: • The modeling of a simplified benchmark problem to assess potential numerical calculation issues at low fission product release. • The modeling of the AGR-1 and HFR-EU1bis safety testing experiments. • The comparison of the AGR-1 and HFR-EU1bis modeling results with PIE data. The simplified benchmark case, hereafter named NCC (Numerical Calculation Case), is derived from "Case 5" of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on coated particle fuel technology [IAEA 2012]. It is included so participants can evaluate their codes at low fission product release. "Case 5" of the IAEA CRP-6 showed large code-to-code discrepancies in the release of fission products, which were attributed to "effects of the numerical calculation method rather than the physical model" [IAEA 2012]. The NCC is therefore intended to check whether these numerical effects persist. The first two steps imply the involvement of the benchmark participants with a modeling effort following the guidelines and recommendations provided by this document. The third step involves the collection of the modeling results by Idaho National Laboratory (INL) and the comparison of these results with the available PIE data. The objective of this document is to provide all necessary input data to model the benchmark cases, and to give some methodology guidelines and recommendations in order to make all results suitable for comparison with each other. The participants should read this document thoroughly to make sure all the data needed for their calculations is provided in the document. Missing data will be added to a revision of the document if necessary. 09/2016: Tables 6 and 8 updated; AGR-2 input data added.
The advanced LIGO input optics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, Chris L., E-mail: cmueller@phys.ufl.edu; Arain, Muzammil A.; Ciani, Giacomo
The advanced LIGO gravitational wave detectors are nearing their design sensitivity and should begin taking meaningful astrophysical data in the fall of 2015. These resonant optical interferometers will have unprecedented sensitivity to the strains caused by passing gravitational waves. The input optics play a significant part in allowing these devices to reach such sensitivities. Residing between the pre-stabilized laser and the main interferometer, the input optics subsystem is tasked with preparing the laser beam for interferometry at the sub-attometer level while operating at continuous wave input power levels ranging from 100 mW to 150 W. These extreme operating conditions required every major component to be custom designed. These designs draw heavily on the experience and understanding gained during the operation of Initial LIGO and Enhanced LIGO. In this article, we report on how the components of the input optics were designed to meet their stringent requirements and present measurements showing how well they have lived up to their design.
Solid-State Lighting 2017 Suggested Research Topics
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2017-09-29
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
Experiment Document for 01-E077 Microgravity Investigation of Crew Reactions in 0-G (MICRO-G)
NASA Technical Reports Server (NTRS)
Newman, Dava J.
2003-01-01
The Experiment Document (ED) serves the following purposes: a) It provides a vehicle for Principal Investigators (PIs) to formally specify the requirements for performing their experiments. b) It provides a technical Statement of Work (SOW). c) It provides experiment investigators and hardware developers with a convenient source of information about Human Life Sciences (HLS) requirements for the development and/or integration of flight experiment projects. d) It is the primary source of experiment specifications for the HLS Research Program Office (RPO). Inputs from this document will be placed into a controlled database that will be used to generate other documents.
MERRA-2 Input Observations: Summary and Assessment
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); McCarty, Will; Coy, Lawrence; Gelaro, Ronald; Huang, Albert; Merkova, Dagmar; Smith, Edmond B.; Sienkiewicz, Meta; Wargan, Krzysztof
2016-01-01
The Modern-Era Retrospective Analysis for Research and Applications, Version 2 (MERRA-2) is an atmospheric reanalysis, spanning 1980 through near-realtime, that uses state-of-the-art processing of observations from the continually evolving global observing system. The effectiveness of any reanalysis is a function not only of the input observations themselves, but also of how the observations are handled in the assimilation procedure. Relevant issues to consider include, but are not limited to, data selection, data preprocessing, quality control, bias correction procedures, and blacklisting. As the assimilation algorithm and earth system models are fundamentally fixed in a reanalysis, it is often a change in the character of the observations, and their feedbacks on the system, that cause changes in the character of the reanalysis. It is therefore important to provide documentation of the observing system so that its discontinuities and transitions can be readily linked to discontinuities seen in the gridded atmospheric fields of the reanalysis. With this in mind, this document provides an exhaustive list of the input observations, the context under which they are assimilated, and an initial assessment of selected core observations fundamental to the reanalysis.
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
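GetData's core operation, pulling selected signals over a chosen time window out of one file and writing them to another, can be illustrated with a brief sketch. The CSV layout and column names below are hypothetical; the actual program reads its time history files only through format-specific interface subroutines rather than CSV.

```python
# Minimal sketch of GetData-style signal/time-window extraction.
# The CSV layout and the "time" column name are assumptions for illustration;
# the real program accesses files through its interface subroutines.
import csv

def extract(in_path, out_path, signals, t_start, t_end):
    """Copy the selected signals within [t_start, t_end] to a new file."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=["time"] + list(signals))
        writer.writeheader()
        for row in reader:
            t = float(row["time"])
            if t_start <= t <= t_end:
                writer.writerow({"time": t, **{s: row[s] for s in signals}})

# Example: keep only altitude and airspeed between 10 s and 20 s.
# extract("flight.csv", "subset.csv", ["altitude", "airspeed"], 10.0, 20.0)
```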
Trusted Computing Technologies, Intel Trusted Execution Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guise, Max Joseph; Wendt, Jeremy Daniel
2011-01-01
We describe the current state-of-the-art in Trusted Computing Technologies - focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release, and unauthorized alteration: Unauthorized users should not access the sensitive input and sensitive output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that the unauthorized should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.
NASA Technical Reports Server (NTRS)
Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Neukom, Christian; Nishimura, Sayuri; Prevost, Michael; Shankar, Renuka; Staveland, Lowell; Smith, Greg
1992-01-01
This is the Software Concept Document for the Man-machine Integration Design and Analysis System (MIDAS) being developed as part of Phase V of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The approach taken in this program since its inception in 1984 is that of incremental development with clearly defined phases. Phase 1 began in 1984 and subsequent phases have progressed at approximately 10-16 month intervals. Each phase of development consists of planning, setting requirements, preliminary design, detailed design, implementation, testing, demonstration and documentation. Phase 5 began with an off-site planning meeting in November, 1990. It is expected that Phase 5 development will be complete and ready for demonstration to invited visitors from industry, government and academia in May, 1992. This document, produced during the preliminary design period of Phase 5, is intended to record the top level design concept for MIDAS as it is currently conceived. This document has two main objectives: (1) to inform interested readers of the goals of the MIDAS Phase 5 development period, and (2) to serve as the initial version of the MIDAS design document which will be continuously updated as the design evolves. Since this document is written fairly early in the design period, many design issues still remain unresolved. Some of the unresolved issues are mentioned later in this document in the sections on specific components. Readers are cautioned that this is not a final design document and that, as the design of MIDAS matures, some of the design ideas recorded in this document will change. The final design will be documented in a detailed design document published after the demonstrations.
DOT National Transportation Integrated Search
2009-01-01
In the Mechanistic-Empirical Pavement Design Guide (M-EPDG), prediction of flexible pavement response and performance needs an input of dynamic modulus of hot-mix asphalt (HMA) at all three levels of hierarchical inputs. This study was intended to ...
Section 4. The GIS Weasel User's Manual
Viger, Roland J.; Leavesley, George H.
2007-01-01
INTRODUCTION The GIS Weasel was designed to aid in the preparation of spatial information for input to lumped and distributed parameter hydrologic or other environmental models. The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to a user's model and to generate parameters from those maps. The operation of the GIS Weasel does not require the user to be a GIS expert, only that the user have an understanding of the spatial information requirements of the environmental simulation model being used. The GIS Weasel software system uses a GIS-based graphical user interface (GUI), the C programming language, and external scripting languages. The software will run on any computing platform where ArcInfo Workstation (version 8.0.2 or later) and the GRID extension are accessible. The user controls the processing of the GIS Weasel by interacting with menus, maps, and tables. The purpose of this document is to describe the operation of the software. This document is not intended to describe the usage of this software in support of any particular environmental simulation model. Such guides are published separately.
Surgical quality assessment. A simplified approach.
DeLong, D L
1991-10-01
The current approach to QA primarily involves taking action when problems are discovered and designing a documentation system that records the deliverance of quality care. Involving the entire staff helps eliminate problems before they occur. By keeping abreast of current problems and soliciting input from staff members, the QA at our hospital has improved dramatically. The cross-referencing of JCAHO and AORN standards on the assessment form and the single-sheet reporting form expedite the evaluation process and simplify record keeping. The bulletin board increases staff members' understanding of QA and boosts morale and participation. A sound and effective QA program does not require reorganizing an entire department, nor should it invoke negative connotations. Developing an effective QA program merely requires rethinking current processes. The program must meet the department's specific needs, and although many departments concentrate on documentation, auditing charts does not give a complete picture of the quality of care delivered. The QA committee must employ a variety of data collection methods on multiple indicators to ensure an accurate representation of the care delivered, and they must not overlook any issues that directly affect patient outcomes.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1995-01-01
Flight test maneuvers are specified for the F-18 High Alpha Research Vehicle (HARV). The maneuvers were designed for open loop parameter identification purposes, specifically for optimal input design validation at 5 degrees angle of attack, identification of individual strake effectiveness at 40 and 50 degrees angle of attack, and study of lateral dynamics and lateral control effectiveness at 40 and 50 degrees angle of attack. Each maneuver is to be realized by applying square wave inputs to specific control effectors using the On-Board Excitation System (OBES). Maneuver descriptions and complete specifications of the time/amplitude points that define each input are included, along with plots of the input time histories.
Bruno Garza, J L; Young, J G
2015-01-01
Extended use of conventional computer input devices is associated with negative musculoskeletal outcomes. While many alternative designs have been proposed, it is unclear whether these devices reduce biomechanical loading and musculoskeletal outcomes. The objective was to review studies describing and evaluating the biomechanical loading and musculoskeletal outcomes associated with conventional and alternative input devices. Included studies evaluated biomechanical loading and/or musculoskeletal outcomes of users' distal or proximal upper extremity regions associated with the operation of alternative input devices (pointing devices, mice, other devices) that could be used in a desktop personal computing environment during typical office work. Some alternative pointing device designs (e.g. rollerbar) were consistently associated with decreased biomechanical loading, while other designs had inconsistent results across studies. Most alternative keyboards evaluated in the literature reduce biomechanical loading and musculoskeletal outcomes. Studies of other input devices (e.g. touchscreen and gestural controls) were rare; however, those reported to date indicate that these devices are currently unsuitable as replacements for traditional devices. Alternative input devices that reduce biomechanical loading may be better choices for preventing or alleviating musculoskeletal outcomes during computer use; however, it is unclear whether many existing designs are effective.
Analysis and Simple Circuit Design of Double Differential EMG Active Electrode.
Guerrero, Federico Nicolás; Spinelli, Enrique Mario; Haberman, Marcelo Alejandro
2016-06-01
In this paper we present an analysis of the voltage amplifier needed for double differential (DD) sEMG measurements and a novel, very simple circuit for implementing DD active electrodes. The three-input amplifier that standalone DD active electrodes require is inherently different from a differential amplifier, and general knowledge about its design is scarce in the literature. First, the figures of merit of the amplifier are defined through a decomposition of its input signal into three orthogonal modes. This analysis reveals a mode containing EMG crosstalk components that the DD electrode should reject. Then, the effect of finite input impedance is analyzed. Because there are three terminals, minimum bounds for interference rejection ratios due to electrode and input impedance unbalances with two degrees of freedom are obtained. Finally, a novel circuit design is presented, including only a quadruple operational amplifier and a few passive components. This design is nearly as simple as the branched electrode and much simpler than the three instrumentation amplifier design, while providing robust EMG crosstalk rejection and better input impedance using unity gain buffers for each electrode input. The interference rejection limits of this input stage are analyzed. An easily replicable implementation of the proposed circuit is described, together with a parameter design guideline to adjust it to specific needs. The electrode is compared with the established alternatives, and sample sEMG signals are obtained, acquired on different body locations with dry contacts, successfully rejecting interference sources.
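As a sketch of the three-mode decomposition mentioned above, the three electrode potentials can be split into a common-mode component, the double-differential signal of interest, and a residual mode; the specific weights shown are a common choice and an assumption here, not necessarily the exact decomposition used in the paper.

```latex
% Illustrative three-input mode decomposition (weights assumed):
% v_1, v_2, v_3 are the potentials at the three electrode contacts.
\begin{aligned}
v_{\mathrm{cm}} &= \tfrac{1}{3}\,(v_1 + v_2 + v_3) && \text{common mode}\\
v_{\mathrm{dd}} &= v_1 - 2\,v_2 + v_3               && \text{double-differential signal}\\
v_{\mathrm{x}}  &= v_1 - v_3                        && \text{residual mode carrying crosstalk components}
\end{aligned}
```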
Input filter compensation for switching regulators
NASA Technical Reports Server (NTRS)
Kelkar, S. S.; Lee, F. C.
1983-01-01
A novel input filter compensation scheme for a buck regulator that eliminates the interaction between the input filter output impedance and the regulator control loop is presented. The scheme is implemented using a feedforward loop that senses the input filter state variables and uses this information to modulate the duty cycle signal. The feedforward design process presented is seen to be straightforward and the feedforward easy to implement. Extensive experimental data supported by analytical results show that significant performance improvement is achieved with the use of feedforward in the following performance categories: loop stability, audiosusceptibility, output impedance and transient response. The use of feedforward results in isolating the switching regulator from its power source thus eliminating all interaction between the regulator and equipment upstream. In addition the use of feedforward removes some of the input filter design constraints and makes the input filter design process simpler thus making it possible to optimize the input filter. The concept of feedforward compensation can also be extended to other types of switching regulators.
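The feedforward idea can be summarized with a hedged relation: the duty-cycle command from the regulator voltage loop is augmented with terms proportional to the sensed input-filter states, so that the filter dynamics are cancelled from the control loop. The gains and the choice of sensed states below are illustrative assumptions, not the authors' exact law.

```latex
% Illustrative feedforward modulation of the duty cycle (gains k_v, k_i assumed):
d(t) = d_c(t) + k_v\,\tilde{v}_f(t) + k_i\,\tilde{i}_f(t)
```

Here $d_c$ is the duty cycle commanded by the regulator control loop, and $\tilde{v}_f$, $\tilde{i}_f$ denote perturbations of the input-filter capacitor voltage and inductor current.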
NASA Technical Reports Server (NTRS)
Tinoco, E. N.; Lu, P.; Johnson, F. T.
1980-01-01
A computer program developed for solving the subsonic, three dimensional flow over wing-body configurations with leading edge vortex separation is presented. Instructions are given for the proper set up and input of a problem into the computer code. Program input formats and output are described, as well as the overlay structure of the program. The program is written in FORTRAN.
The NBS Energy Model Assessment project: Summary and overview
NASA Astrophysics Data System (ADS)
Gass, S. I.; Hoffman, K. L.; Jackson, R. H. F.; Joel, L. S.; Saunders, P. B.
1980-09-01
The activities and technical reports for the project are summarized. The reports cover: assessment of the documentation of Midterm Oil and Gas Supply Modeling System; analysis of the model methodology characteristics of the input and other supporting data; statistical procedures undergirding construction of the model and sensitivity of the outputs to variations in input, as well as guidelines and recommendations for the role of these in model building and developing procedures for their evaluation.
Trick Simulation Environment 07
NASA Technical Reports Server (NTRS)
Lin, Alexander S.; Penn, John M.
2012-01-01
The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top-level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.
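The Monte Carlo driver described above can be sketched generically. The run_simulation function and the input dictionary below are hypothetical stand-ins for Trick's input processor and simulation executable; only the overall pattern (generate inputs, run, collect outputs, optionally adapt inputs) reflects the description.

```python
# Generic sketch of a Monte Carlo driver of the kind described above.
# run_simulation() and its input dictionary are hypothetical placeholders.
import random

def run_simulation(inputs):
    # Placeholder for dispatching one simulation run with the given inputs.
    return {"peak_load": inputs["mass"] * 0.1 + inputs["thrust"] * 0.01}

def monte_carlo(n_runs, seed=0):
    random.seed(seed)
    results = []
    for _ in range(n_runs):
        inputs = {
            "mass": random.gauss(1000.0, 25.0),     # statistically generated input
            "thrust": random.gauss(9800.0, 150.0),
        }
        results.append((inputs, run_simulation(inputs)))
    return results

# For the optimization/solution-finding use case, the inputs for each run
# would instead be chosen from an analysis of the previous runs' outputs.
```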
Input design for identification of aircraft stability and control derivatives
NASA Technical Reports Server (NTRS)
Gupta, N. K.; Hall, W. E., Jr.
1975-01-01
An approach for designing inputs to identify stability and control derivatives from flight test data is presented. This approach is based on finding inputs which provide the maximum possible accuracy of derivative estimates. Two techniques of input specification are implemented for this objective - a time domain technique and a frequency domain technique. The time domain technique gives the control input time history and can be used for any allowable duration of test maneuver, including those where data lengths can only be of short duration. The frequency domain technique specifies the input frequency spectrum, and is best applied for tests where extended data lengths, much longer than the time constants of the modes of interest, are possible. These techniques are used to design inputs to identify parameters in longitudinal and lateral linear models of conventional aircraft. The constraints of aircraft response limits, such as on structural loads, are realized indirectly through a total energy constraint on the input. Tests with simulated data and theoretical predictions show that the new approaches give input signals which can provide more accurate parameter estimates than can conventional inputs of the same total energy. Results obtained indicate that the approach has been brought to the point where it should be used on flight tests for further evaluation.
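One standard way to make "maximum possible accuracy of derivative estimates" concrete is to evaluate the Fisher information produced by a candidate input; larger information means tighter achievable parameter error bounds. The scalar discrete-time model and noise level below are illustrative assumptions, not the aircraft models or the exact criterion used in the report.

```python
# Illustrative Fisher-information evaluation for candidate inputs.
# The scalar model x[k+1] = a*x[k] + b*u[k] and the noise level sigma are
# assumptions for illustration only.
import numpy as np

def fisher_information(u, a=0.9, b=0.5, sigma=0.1):
    """Fisher information matrix for (a, b) from noisy observations of x."""
    x = 0.0
    dx_da = 0.0   # sensitivity of the state to parameter a
    dx_db = 0.0   # sensitivity of the state to parameter b
    F = np.zeros((2, 2))
    for uk in u:
        # Propagate the sensitivities before updating the state.
        dx_da, dx_db = a * dx_da + x, a * dx_db + uk
        x = a * x + b * uk
        g = np.array([dx_da, dx_db])
        F += np.outer(g, g) / sigma**2
    return F

# Compare two equal-energy inputs: a doublet versus a constant input.
doublet = np.array([1.0] * 10 + [-1.0] * 10)
constant = np.ones(20)
for name, u in [("doublet", doublet), ("constant", constant)]:
    print(name, "det(F) =", np.linalg.det(fisher_information(u)))  # larger is better
```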
Computational Predictions of the Performance of Wright 'Bent End' Propellers
NASA Technical Reports Server (NTRS)
Wang, Xiang-Yu; Ash, Robert L.; Bobbitt, Percy J.; Prior, Edwin (Technical Monitor)
2002-01-01
Computational analysis of two 1911 Wright brothers 'Bent End' wooden propeller reproductions has been performed and compared with experimental test results from the Langley Full Scale Wind Tunnel. The purpose of the analysis was to check the consistency of the experimental results and to validate the reliability of the tests. This report is one part of the project on propeller performance research of the Wright 'Bent End' propellers, intended to document the Wright brothers' pioneering propeller design contributions. Two computer codes were used in the computational predictions. The FLO-MG Navier-Stokes code is a CFD (Computational Fluid Dynamics) code based on the Navier-Stokes equations. It is mainly used to compute the lift coefficient and the drag coefficient at specified angles of attack at different radii. Those calculated data are the intermediate results of the computation and a part of the necessary input for the Propeller Design Analysis Code (based on the Adkins and Libeck method), which is a propeller design code used to compute the propeller thrust coefficient, the propeller power coefficient, and the propeller propulsive efficiency.
The Effect of Amount and Timing of Human Resources Data on Subsystem Design.
ERIC Educational Resources Information Center
Meister, David; And Others
Human resources data (HRD) inputs often fail to influence system development. This study investigated the possibility that these inputs are sometimes deficient in quantity or timing. In addition, the effect upon design of different personnel quality and quantity requirements was analyzed. Equipment and HRD inputs which were produced during actual…
Divide and control: split design of multi-input DNA logic gates.
Gerasimova, Yulia V; Kolpashchikov, Dmitry M
2015-01-18
Logic gates made of DNA have received significant attention as biocompatible building blocks for molecular circuits. The majority of DNA logic gates, however, are controlled by the minimum number of inputs: one, two or three. Here we report a strategy to design a multi-input logic gate by splitting a DNA construct.
39 CFR 3050.60 - Miscellaneous reports and documents.
Code of Federal Regulations, 2011 CFR
2011-07-01
... copy form, and in electronic form, if available; (d) Household Diary Study (when completed); (e) Input... each year); (f) Succinct narrative explanations of how the estimates in the most recent Annual...
39 CFR 3050.60 - Miscellaneous reports and documents.
Code of Federal Regulations, 2010 CFR
2010-07-01
... copy form, and in electronic form, if available; (d) Household Diary Study (when completed); (e) Input... each year); (f) Succinct narrative explanations of how the estimates in the most recent Annual...
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A user's manual for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in 'An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application' is presented. The manual describes the computer program's mode of operation, requirements, input structure, input data requirements, and program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printout of the program output for one of the sample setups.
Rotor Wake/Stator Interaction Noise Prediction Code Technical Documentation and User's Manual
NASA Technical Reports Server (NTRS)
Topol, David A.; Mathews, Douglas C.
2010-01-01
This report documents the improvements and enhancements made by Pratt & Whitney to two NASA programs which together will calculate noise from a rotor wake/stator interaction. The code is a combination of subroutines from two NASA programs with many new features added by Pratt & Whitney. To do a calculation, V072 first uses a semi-empirical wake prediction to calculate the rotor wake characteristics at the stator leading edge. Results from the wake model are then automatically input into a rotor wake/stator interaction analytical noise prediction routine which calculates inlet and aft sound power levels for the blade-passage-frequency tones and their harmonics, along with the complex radial mode amplitudes. The code allows for a noise calculation to be performed for a compressor rotor wake/stator interaction, a fan wake/FEGV interaction, or a fan wake/core stator interaction. This report is split into two parts: the first part discusses the technical documentation of the program as improved by Pratt & Whitney, and the second part is a user's manual which describes how input files are created and how the code is run.
ERIC Educational Resources Information Center
Brandhorst, Ted, Ed.; And Others
This loose-leaf manual provides the detailed rules, guidelines, and examples to be used by the components of the Educational Resources Information Center (ERIC) Network in acquiring and selecting documents and in processing them (i.e., cataloging, indexing, abstracting) for input to the ERIC computer system and subsequent announcement in…
ASTROS Enhancements. Volume I- ASTRO User’s Manual
1993-03-01
alternative functions. The mechanisms by which these more advanced features are invoked are included in this manual but no attempt is made to provide... advanced topics are treated in the Programmer’s and Application Manuals which document the individual modules in the system and their interactions...of the more advanced features of the system without cluttering the discussion with details of the input structures. The detailed documentation of the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gartling, D.K.
User instructions are given for the finite element electromagnetics program TORO II. The theoretical background and numerical methods used in the program are documented in SAND95-2472. The present document also describes a number of example problems that have been analyzed with the code and provides sample input files for typical simulations. 20 refs., 34 figs., 3 tabs.
The DELTA PREP Initiative: Accelerating Coalition Capacity for Intimate Partner Violence Prevention
Zakocs, Ronda; Freire, Kimberley E.
2018-01-01
Background The DELTA PREP Project aimed to build the prevention capacity of 19 state domestic violence coalitions by offering eight supports designed to promote prevention integration over a 3-year period: modest grant awards, training events, technical assistance, action planning, coaching hubs, the Coalition Prevention Capacity Assessment, an online workstation, and the online documentation support system. Objectives Using quantitative and qualitative data, we sought to explain how coalitions integrated prevention within their structures and functions and document how DELTA PREP supports contributed to coalitions’ integration process. Results We found that coalitions followed a common pathway to integrate prevention. First, coalitions exhibited precursors of organizational readiness, especially having prevention champions. Second, coalitions engaged in five critical actions: engaging in dialogue, learning about prevention, forming teams, soliciting input from the coalition, and action planning. Last, by engaging in these critical actions, coalitions enhanced two key organizational readiness factors—developing a common understanding of prevention and an organizational commitment to prevention. We also found that DELTA PREP supports contributed to coalitions’ abilities to integrate prevention by supporting learning about prevention, fostering a prevention team, and engaging in action planning by leveraging existing opportunities. Two DELTA PREP supports—coaching hubs and the workstation—did not work as initially intended. From the DELTA PREP experience, we offer several lessons to consider when designing future prevention capacity-building initiatives. PMID:26245934
The DELTA PREP Initiative: Accelerating Coalition Capacity for Intimate Partner Violence Prevention.
Zakocs, Ronda; Freire, Kimberley E
2015-08-01
The DELTA PREP Project aimed to build the prevention capacity of 19 state domestic violence coalitions by offering eight supports designed to promote prevention integration over a 3-year period: modest grant awards, training events, technical assistance, action planning, coaching hubs, the Coalition Prevention Capacity Assessment, an online workstation, and the online documentation support system. Using quantitative and qualitative data, we sought to explain how coalitions integrated prevention within their structures and functions and document how DELTA PREP supports contributed to coalitions' integration process. We found that coalitions followed a common pathway to integrate prevention. First, coalitions exhibited precursors of organizational readiness, especially having prevention champions. Second, coalitions engaged in five critical actions: engaging in dialogue, learning about prevention, forming teams, soliciting input from the coalition, and action planning. Last, by engaging in these critical actions, coalitions enhanced two key organizational readiness factors-developing a common understanding of prevention and an organizational commitment to prevention. We also found that DELTA PREP supports contributed to coalitions' abilities to integrate prevention by supporting learning about prevention, fostering a prevention team, and engaging in action planning by leveraging existing opportunities. Two DELTA PREP supports-coaching hubs and the workstation-did not work as initially intended. From the DELTA PREP experience, we offer several lessons to consider when designing future prevention capacity-building initiatives. © 2015 Society for Public Health Education.
MELCOR/CONTAIN LMR Implementation Report. FY14 Progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphries, Larry L; Louie, David L.Y.
2014-10-01
This report describes the preliminary implementation of the sodium thermophysical properties and the design documentation for the sodium models of CONTAIN-LMR to be implemented into MELCOR 2.1. In the past year, the implementation included two separate sets of sodium properties from two different sources. The first source is based on previous work done by Idaho National Laboratory, which modified MELCOR to include a liquid lithium equation of state as a working fluid for nuclear fusion safety research. To minimize the impact on MELCOR, the implementation of the fusion safety database (FSD) was done by using detection of the data input file as a way of invoking the FSD. The FSD methodology has been adopted for the current work, but it may be subject to modification as the project continues. The second source uses properties generated for the SIMMER code. Preliminary testing and results from this implementation of sodium properties are given. This year, the design document for the CONTAIN-LMR sodium models, such as the two-condensable option, sodium spray fire, and sodium pool fire, is being developed. This design document is intended to serve as a guide for the MELCOR implementation. In addition, the CONTAIN-LMR code used was based on an earlier version of the CONTAIN code, so many physical models developed since that version may not be captured by the code. Although CONTAIN 2, which represents the latest development of CONTAIN, contains some sodium-specific models, these are not complete; CONTAIN 2, with all sodium models implemented from CONTAIN-LMR, should therefore be used as a comparison code for MELCOR. This implementation should be completed early next year, while the sodium models from CONTAIN-LMR are being integrated into MELCOR. For testing, CONTAIN decks have been developed for verification and validation use.
Huff, G.F.
2004-01-01
The tendency of solutes in input water to precipitate efficiency-lowering scale deposits on the membranes of reverse osmosis (RO) desalination systems is an important factor in determining the suitability of input water for desalination. Simulated input water evaporation can be used as a technique to quantitatively assess the potential for scale formation in RO desalination systems. The technique was demonstrated by simulating the increase in solute concentrations required to form calcite, gypsum, and amorphous silica scales at 25 °C and 40 °C from 23 desalination input waters taken from the literature. Simulation results could be used to quantitatively assess the potential of a given input water to form scale or to compare the potential of a number of input waters to form scale during RO desalination. Simulated evaporation of input waters cannot accurately predict the conditions under which scale will form owing to the effects of potentially stable supersaturated solutions, solution velocity, and residence time inside RO systems. However, the simulated scale-forming potential of proposed input waters could be compared with the simulated scale-forming potentials and actual scale-forming properties of input waters having documented operational histories in RO systems. This may provide a technique to estimate the actual performance and suitability of proposed input waters during RO.
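The simulated-evaporation idea can be illustrated with a deliberately simplified sketch: concentrate the input water by successive factors and check a saturation index against a solubility product. The single mineral, the constant, and the neglect of activity corrections and temperature effects are all assumptions made for illustration; the study itself relied on full geochemical simulation.

```python
# Simplified sketch of simulated-evaporation scale screening.
# The ion concentrations (mol/L), the approximate gypsum solubility product,
# and the neglect of activity corrections are illustrative assumptions.
import math

KSP_GYPSUM = 10 ** -4.58   # approximate Ksp for CaSO4·2H2O near 25 °C

def saturation_index(ca, so4):
    """log10 of the ion activity product over Ksp (activities ~ concentrations)."""
    return math.log10((ca * so4) / KSP_GYPSUM)

def concentration_factor_to_saturation(ca0, so40, max_factor=50):
    """Smallest evaporation (concentration) factor at which gypsum saturates."""
    for cf in range(1, max_factor + 1):
        if saturation_index(ca0 * cf, so40 * cf) >= 0.0:
            return cf
    return None

# Example input water: 2 mmol/L Ca and 3 mmol/L SO4.
print(concentration_factor_to_saturation(2e-3, 3e-3))
```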
NASA Technical Reports Server (NTRS)
Johnson, D. L. (Editor)
2008-01-01
This document provides guidelines for the terrestrial environment that are specifically applicable in the development of design requirements/specifications for NASA aerospace vehicles, payloads, and associated ground support equipment. The primary geographic areas encompassed are the John F. Kennedy Space Center, FL; Vandenberg AFB, CA; Edwards AFB, CA; Michoud Assembly Facility, New Orleans, LA; John C. Stennis Space Center, MS; Lyndon B. Johnson Space Center, Houston, TX; George C. Marshall Space Flight Center, Huntsville, AL; and the White Sands Missile Range, NM. This document presents the latest available information on the terrestrial environment applicable to the design and operations of aerospace vehicles and supersedes information presented in NASA-HDBK-1001 and TM X-64589, TM X-64757, TM-78118, TM-82473, and TM-4511. Information is included on winds, atmospheric thermodynamic models, radiation, humidity, precipitation, severe weather, sea state, lightning, atmospheric chemistry, seismic criteria, and a model to predict atmospheric dispersion of aerospace engine exhaust cloud rise and growth. In addition, a section has been included to provide information on the general distribution of natural environmental extremes in the conterminous United States, and world-wide, that may be needed to specify design criteria in the transportation of space vehicle subsystems and components. A section on atmospheric attenuation has been added since measurements by sensors on certain Earth orbital experiment missions are influenced by the Earth's atmosphere. There is also a section on mission analysis, prelaunch monitoring, and flight evaluation as related to the terrestrial environment inputs. The information in these guidelines is recommended for use in the development of aerospace vehicle and related equipment design and associated operational criteria, unless otherwise stated in contract work specifications. The terrestrial environmental data in these guidelines are primarily limited to information below 90 km altitude.
New ergonomic and functional design of digital conferencing rooms in a clinical environment
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Amato, Carlos L.; McGill, D. Ric; Liu, Brent J.; Balbona, Joseph A.; McCoy, J. Michael
2003-05-01
Clinical conferences and multidisciplinary medical rounds play a major role in patient management and decision-making, relying on the presentation of a variety of documents: films, charts, videotapes, graphs, etc. These conferences and clinical rounds are often carried out in conferencing rooms or department libraries that are usually not suitable for presentation of the data in electronic format. In most instances digital projection equipment is added to existing rooms without proper consideration of functional, ergonomic, acoustical, spatial, and environmental requirements. Also, in large academic institutions, the conference rooms serve multiple purposes, including as classrooms for teaching and education of students and for administrative meetings among managers and staff. In the migration toward a fully digital hospital we elected to analyze the functional requirements and optimize the ergonomic design of conferencing rooms that can accommodate clinical rounds, multidisciplinary reviews, seminars, formal lectures, and department meetings. 3D computer simulation was used for better evaluation and analysis of spatial and ergonomic parameters and for gathering opinions and input from users on different design options. A critical component of the design is the understanding of the different workflow and requirements of different types of conferences and presentations that can be carried out in these conference rooms.
Park, Sun Young; Lee, So Young; Chen, Yunan
2012-03-01
The goal of this study was to examine the effects of medical notes (MD) in an electronic medical records (EMR) system on doctors' work practices at an Emergency Department (ED). We conducted a six-month qualitative study, including in situ field observations and semi-structured interviews, in an ED affiliated with a large teaching hospital during the time periods of before, after, and during the paper-to-electronic transition of the rollout of an EMR system. Data were analyzed using open coding method and various visual representations of workflow diagrams. The use of the EMR in the ED resulted in both direct and indirect effects on ED doctors' work practices. It directly influenced the ED doctors' documentation process: (i) increasing documentation time four to five fold, which in turn significantly increased the number of incomplete charts, (ii) obscuring the distinction between residents' charting inputs and those of attendings, shifting more documentation responsibilities to the residents, and (iii) leading to the use of paper notes as documentation aids to transfer information from the patient bedside to the charting room. EMR use also had indirect consequences: it increased the cognitive burden of doctors, since they had to remember multiple patients' data; it aggravated doctors' multi-tasking due to flexibility in the system use allowing more interruptions; and it caused ED doctors' work to become largely stationary in the charting room, which further contributed to reducing doctors' time with patients and their interaction with nurses. We suggest three guidelines for designing future EMR systems to be used in teaching hospitals. First, the design of documentation tools in EMR needs to take into account what we called "note-intensive tasks" to support the collaborative nature of medical work. Second, it should clearly define roles and responsibilities. Lastly, the system should provide a balance between flexibility and interruption to better manage the complex nature of medical work and to facilitate necessary interactions among ED staff and patients in the work environment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Banning standard cell engineering notebook
NASA Technical Reports Server (NTRS)
1976-01-01
A family of standardized thick-oxide P-MOS building blocks (standard cells) is described. The information is presented in a form useful for systems design, logic design, and the preparation of inputs to both sets of Design Automation programs for array design and analysis. A data sheet is provided for each cell and gives the cell name, the cell number, its logic symbol, Boolean equation, truth table, circuit schematic, circuit composite, input-output capacitances, and revision date. The circuit type file, also given for each cell, together with the logic drawing contained on the data sheet, provides all the information required to prepare input data files for the Design Automation Systems. A detailed description of the electrical design procedure is included.
FLIS Procedures Manual. Document Identifier Code Input/Output Formats (Fixed Length). Volume 8.
1997-04-01
DATA ELEMENTS. SEGMENT R MAY BE REPEATED A MAXIMUM OF THREE (3) TIMES IN ORDER TO ACQUIRE THE REQUIRED MIX OF SEGMENTS OR INDIVIDUAL DATA ELEMENTS TO...preceding record. Marketing input DICs. QI Next DRN of appropriate segment will be reflected in accordance with Table... QF The assigned NSN or PSCN being can-... Classified KFC Notification of Possible Duplicate (Submitter), KRP Characteristics Data, Follow-Up Interrogation, LFU Notification of Return, SSR Transaction
Computer Description of the M561 Utility Truck
1984-10-01
GIFT Computer Code Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC)...used as input to the GIFT computer code to generate target vulnerability data....The analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom
A Trajectory Algorithm to Support En Route and Terminal Area Self-Spacing Concepts
NASA Technical Reports Server (NTRS)
Abbott, Terence S.
2007-01-01
This document describes an algorithm for the generation of a four dimensional aircraft trajectory. Input data for this algorithm are similar to an augmented Standard Terminal Arrival Route (STAR) with the augmentation in the form of altitude or speed crossing restrictions at waypoints on the route. Wind data at each waypoint are also inputs into this algorithm. The algorithm calculates the altitude, speed, along path distance, and along path time for each waypoint.
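A minimal sketch of the along-path bookkeeping such an algorithm performs is shown below. The waypoint fields, flat-earth leg distances, and constant ground speed per leg are simplifying assumptions; the documented algorithm additionally honors altitude and speed crossing restrictions and the wind data at each waypoint.

```python
# Simplified along-path distance/time bookkeeping for a waypoint route.
# Flat-earth leg distances and a constant ground speed per leg are assumptions;
# crossing restrictions and winds are not modeled here.
from dataclasses import dataclass
from math import hypot

@dataclass
class Waypoint:
    name: str
    x_nm: float          # east position, nautical miles (hypothetical frame)
    y_nm: float          # north position, nautical miles
    gs_kt: float         # ground speed flown to the *next* waypoint, knots

def along_path(route):
    """Return (name, cumulative distance nm, cumulative time s) per waypoint."""
    dist = 0.0
    time = 0.0
    out = [(route[0].name, 0.0, 0.0)]
    for a, b in zip(route, route[1:]):
        leg = hypot(b.x_nm - a.x_nm, b.y_nm - a.y_nm)
        dist += leg
        time += leg / a.gs_kt * 3600.0      # hours -> seconds
        out.append((b.name, dist, time))
    return out

route = [Waypoint("FIX1", 0, 0, 280), Waypoint("FIX2", 20, 5, 250), Waypoint("FIX3", 35, 15, 210)]
for row in along_path(route):
    print(row)
```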
Kisely, Steve; Wyder, Marianne; Dietrich, Josie; Robinson, Gail; Siskind, Dan; Crompton, David
2017-02-01
Improving the input of people with mental illness into their recovery plans can potentially lead to better outcomes. In the present study, we evaluated the introduction of motivational aftercare planning (MAP) into the discharge planning of psychiatric inpatients. MAP is a manualized intervention combining motivational interviewing with advance directives. We measured changes in the level of patient input into discharge planning following training staff in the use of MAP. This included the following: (i) documentation of early relapse signs along with successful past responses; (ii) evidence of aftercare planning; and (iii) the use of the patients' own words in the plan. We used a ward-level controlled before-and-after design comparing one intervention ward with two control wards. We used anonymized recovery plans, with a goal of 50 plans per ward before and after the intervention, to look for evidence of patient input into care planning with a standardized checklist. There were also qualitative interviews with individuals discharged from the unit. We reviewed 100 intervention ward plans and 197 control ones (total n = 297). There were no significant differences in recovery plans from intervention and control wards at baseline. Following MAP training, the intervention ward improved significantly (e.g. identification of triggers increased from 52 to 94%, χ 2 = 23.3, d.f. =1, P < 0.001). This did not occur in the control wards. The qualitative data (n = 20 interviews) showed improvements in participants' experiences of discharge planning. MAP increased inpatient input into discharge planning and was valued by participants. The effect on subsequent health service use needs evaluation. © 2016 Australian College of Mental Health Nurses Inc.
Solid-State Lighting 2017 Suggested Research Topics Supplement: Technology and Market Context
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
TADS--A CFD-Based Turbomachinery Analysis and Design System with GUI: User's Manual. 2.0
NASA Technical Reports Server (NTRS)
Koiro, M. J.; Myers, R. A.; Delaney, R. A.
1999-01-01
The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is intended to serve as a User's Manual for the computer programs which comprise the TADS system, developed under Task 18 of NASA Contract NAS3-27350, ADPAC System Coupling to Blade Analysis & Design System GUI, and Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II - Loss, Design, and Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a highly loaded fan, a compressor stator, a low-speed turbine blade, and a transonic turbine vane.
A Secure and Reliable High-Performance Field Programmable Gate Array for Information Processing
2012-03-01
receives a data token from its control input (shown as a horizontal arrow above). The value of this data token is used to select an input port. The input...dual of a merge. It receives a data token from its control input (shown as a horizontal arrow above). The value of this data token is used to select...Transactions on Computer-Aided Design of Integrated Circuits and Systems, Vol. 26, No. 2, February 2007. [12] Cadence Design Systems, “Clock Domain
Flight Demonstration of Integrated Airport Surface Movement Technologies
NASA Technical Reports Server (NTRS)
Young, Steven D.; Jones, Denise R.
1998-01-01
This document describes operations associated with a set of flight experiments and demonstrations using a Boeing-757-200 research aircraft as part of low visibility landing and surface operations (LVLASO) research activities. To support this experiment, the B-757 performed flight and taxi operations at the Atlanta Hartsfield International Airport in Atlanta, GA. The test aircraft was equipped with experimental displays that were designed to provide flight crews with sufficient information to enable safe, expedient surface operations in any weather condition down to a runway visual range of 300 feet. In addition to flight deck displays and supporting equipment onboard the B-757, there was also a ground-based component of the system that provided for ground controller inputs and surveillance of airport surface movements. Qualitative and quantitative results are discussed.
Development of an Aircraft Approach and Departure Atmospheric Profile Generation Algorithm
NASA Technical Reports Server (NTRS)
Buck, Bill K.; Velotas, Steven G.; Rutishauser, David K. (Technical Monitor)
2004-01-01
In support of the NASA Virtual Airspace Modeling and Simulation (VAMS) project, an effort was initiated to develop and test techniques for extracting meteorological data from landing and departing aircraft, and for building altitude-based profiles for key meteorological parameters from these data. The generated atmospheric profiles will be used as inputs to NASA's Aircraft Vortex Spacing System (AVOLSS) Prediction Algorithm (APA) for benefits and trade analysis. A Wake Vortex Advisory System (WakeVAS) is being developed to apply weather and wake prediction and sensing technologies with procedures to reduce current wake separation criteria when safe and appropriate to increase airport operational efficiency. The purpose of this report is to document the initial theory and design of the Aircraft Approach and Departure Atmospheric Profile Generation Algorithm.
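One simple way to turn scattered aircraft reports into altitude-based profiles is to bin the observations by altitude and average within each bin. The report format and the plain bin average below are assumptions for illustration; they are not the profile-generation algorithm documented in the report.

```python
# Illustrative altitude-binned profile builder for aircraft met reports.
# The (altitude_ft, value) report format and the simple bin average are
# assumptions; the documented algorithm is more elaborate.
from collections import defaultdict

def build_profile(reports, bin_ft=1000):
    """reports: iterable of (altitude_ft, value). Returns sorted (bin_center, mean)."""
    bins = defaultdict(list)
    for alt, value in reports:
        bins[int(alt // bin_ft)].append(value)
    return [(k * bin_ft + bin_ft / 2, sum(v) / len(v)) for k, v in sorted(bins.items())]

# Example: temperature (deg C) reports at various altitudes (ft).
reports = [(350, 14.9), (900, 12.8), (1450, 10.1), (1600, 9.7), (2300, 6.5)]
print(build_profile(reports))   # [(500.0, 13.85), (1500.0, 9.9), (2500.0, 6.5)]
```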
Dynamic Analysis of Spur Gear Transmissions (DANST). PC Version 3.00 User Manual
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Lin, Hsiang Hsi; Delgado, Irebert R.
1996-01-01
DANST is a FORTRAN computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the static transmission error, dynamic load, tooth bending stress and other properties of spur gears as they are influenced by operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratios ranging from one to three. It was designed to be easy to use and it is extensively documented in several previous reports and by comments in the source code. This report describes installing and using a new PC version of DANST, covers input data requirements and presents examples.
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
VLBI2010 Receiver Back End Comparison
NASA Technical Reports Server (NTRS)
Petrachenko, Bill
2013-01-01
VLBI2010 requires a receiver back-end to convert analog RF signals from the receiver front end into channelized digital data streams to be recorded or transmitted electronically. The back end functions are typically performed in two steps: conversion of analog RF inputs into IF bands (see Table 2), and conversion of IF bands into channelized digital data streams (see Tables 1a, 1b and 1c). The latter IF systems are now completely digital and generically referred to as digital back ends (DBEs). In Table 2 two RF conversion systems are compared, and in Tables 1a, 1b, and 1c nine DBE systems are compared. Since DBE designs are advancing rapidly, the data in these tables are only guaranteed to be current near the update date of this document.
Microgravity science experiment integration - When the PI and the PED differ
NASA Technical Reports Server (NTRS)
Baer-Peckham, M. S.; Mccarley, K. S.
1991-01-01
This paper addresses issues related to the integration of principal investigators (PIs) and payload-element developers (PEDs) for conducting effective microgravity experiments. The Crystal Growth Furnace (CGF) is used as an example to demonstrate the key issues related to the integration of a PI's sample into a facility run by a different organization. Attention is given to the typical preflight timeline, documentation required for experimental implementation, and hardware deliverables. A flow chart delineates the payload-integration process flow, and PI inputs required for an experiment include equipment and procedure definitions, detailed design and fabrication of the experiment-specific equipment, and specifications of the contract-end item. The present analysis is of interest to the coordination of effective microgravity experiments on the Space Station Freedom that incorporate PIs and PEDs from different organizations.
NASA Technical Reports Server (NTRS)
Rozendaal, Rodger A.; Behbehani, Roxanna
1990-01-01
NASA initiated the Variable Sweep Transition Flight Experiment (VSTFE) to establish a boundary layer transition database for laminar flow wing design. For this experiment, full-span upper surface gloves were fitted to a variable sweep F-14 aircraft. The development of an improved laminar boundary layer stability analysis system called the Unified Stability System (USS) is documented, and results of its use on the VSTFE flight data are shown. The USS consists of eight computer codes. The theoretical background of the system is described, as are the input, output, and usage hints. The USS is capable of analyzing boundary layer stability over a wide range of disturbance frequencies and orientations, making it possible to use different philosophies in calculating the growth of disturbances on swept wings.
STREAM Table Program: User's manual and program document
NASA Technical Reports Server (NTRS)
Hiles, K. H.
1981-01-01
This program was designed to be an editor for the Lewis Chemical Equilibrium program input files and is used for storage, manipulation and retrieval of the large amount of data required. The files are based on the facility name, case number, and table number. The data is easily recalled by supplying the sheet number to be displayed. The retrieval basis is a sheet defined to be all of the individual flow streams which comprise a given portion of a coal gasification system. A sheet may cover more than one page of output tables. The program allows for the insertion of a new table, revision of existing tables, deletion of existing tables, or the printing of selected tables. No calculations are performed. Only pointers are used to keep track of the data.
DOT National Transportation Integrated Search
2000-03-01
Seven Key ITS Application Goals emerged from the document review, key contact interviews and input from attendees at ITS committee meetings. They were to use the ITS applications to improve the overall safety of the transportation network, to improve...
Guide for Commenting on NEEDS and IPM
Find on this page a document intended to provide guidance on submitting clear, concise, and impactful comments on NEEDS (National Electric Energy Data System), other inputs to the Integrated Planning Model (IPM), or outputs from IPM.
SMM-UVSP ozone profile inversion programs
NASA Technical Reports Server (NTRS)
Smith, H. J. P.
1983-01-01
The documentation and user manual for the software used to invert the UVSP aeronomy data taken by the SMM are provided. The programs are described together with their interfaces and what inputs are required from the user.
Computer program documentation user information for the RSO-tape print program (RSOPRNT)
NASA Technical Reports Server (NTRS)
Gibbs, P. M. (Principal Investigator)
1980-01-01
A user's guide for RSOPRNT, a TRASYS Master Restart Output Tape (RSO) reader, is presented. Background information and sample runstreams, as well as references, input requirements, and options, are included.
IVHS Architecture Development, Regional Forum Results
DOT National Transportation Integrated Search
1994-06-01
This document summarizes stakeholder feedback from ten regional architecture forums conducted from April 21 through May 11, 1994. A written form was the primary means for obtaining input. Each architecture forum also provided the opportunity for part...
Rapid replacement of bridge deck expansion joints study - phase I : [tech transfer summary].
DOT National Transportation Integrated Search
2014-12-01
This initial research phase focused on documenting the current means and methods of bridge expansion joint deterioration, maintenance, and replacement and on identifying improvements through all of the input gathered.
Design of vaccination and fumigation on Host-Vector Model by input-output linearization method
NASA Astrophysics Data System (ADS)
Nugraha, Edwin Setiawan; Naiborhu, Janson; Nuraini, Nuning
2017-03-01
Here, we analyze the Host-Vector Model and propose a design of vaccination and fumigation to control the infectious population by using feedback control, specifically the input-output linearization method. The host population is divided into three compartments: susceptible, infectious, and recovered. The vector population is divided into two compartments: susceptible and infectious. In this system, vaccination and fumigation are treated as inputs and the infectious population as the output. The objective of the design is to stabilize the output so that it asymptotically tends to zero. We also present examples to illustrate the design.
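To give a concrete sense of the kind of system being controlled, the sketch below integrates a minimal host-vector model (susceptible/infectious/recovered hosts, susceptible/infectious vectors) with vaccination and fumigation entering as inputs. All parameter values and the simple proportional feedback are assumptions for illustration only; this is not the paper's exact model or its input-output linearizing controller.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative host-vector model (not the paper's exact equations).
# States: Sh, Ih, Rh (hosts), Sv, Iv (vectors).
def host_vector(t, x, beta_h=0.4, beta_v=0.3, gamma=0.1, mu_v=0.05,
                k_vacc=0.2, k_fumi=0.3):
    Sh, Ih, Rh, Sv, Iv = x
    Nh = Sh + Ih + Rh
    u1 = k_vacc * Sh          # vaccination input (assumed proportional feedback)
    u2 = k_fumi * Iv          # fumigation input (assumed proportional feedback)
    dSh = -beta_h * Sh * Iv / Nh - u1
    dIh = beta_h * Sh * Iv / Nh - gamma * Ih
    dRh = gamma * Ih + u1
    dSv = mu_v * Iv - beta_v * Sv * Ih / Nh      # births balance natural deaths
    dIv = beta_v * Sv * Ih / Nh - mu_v * Iv - u2
    return [dSh, dIh, dRh, dSv, dIv]

sol = solve_ivp(host_vector, (0, 200), [990, 10, 0, 500, 20], max_step=0.5)
print("final infectious hosts (output):", sol.y[1, -1])
```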
ERIC Educational Resources Information Center
Ding, Daniel D.
2000-01-01
Presents historical roots of page design principles, arguing that current theories and practices of document design have their roots in gender-related theories of images. Claims visual design should be evaluated regarding the rhetorical situation in which the design is used. Focuses on visual images of documents in professional communication,…
Status of Hollow Cathode Heater Development for the Space Station Plasma Contactor
NASA Technical Reports Server (NTRS)
Soulas, George C.
1994-01-01
A hollow cathode-based plasma contactor has been selected for use on the Space Station. During the operation of the plasma contactor, the hollow cathode heater will endure approximately 12000 thermal cycles. Since a hollow cathode heater failure would result in a plasma contactor failure, a hollow cathode heater development program was established to produce a reliable heater. The development program includes the heater design, process documents for both heater fabrication and assembly, and heater testing. The heater design was a modification of a sheathed ion thruster cathode heater. Heater tests included testing of the heater unit alone and plasma contactor and ion thruster testing. To date, eight heaters have been or are being processed through heater unit testing, two through plasma contactor testing and three through ion thruster testing, all using direct current power supplies. Comparisons of data from heater unit performance tests before cyclic testing, plasma contactor tests, and ion thruster tests at the ignition input current level show the average deviation of input power and tube temperature near the cathode tip to be +/-0.9 W and +/- 21 C, respectively. Heater unit testing included cyclic testing to evaluate reliability under thermal cycling. The first heater, although damaged during assembly, completed 5985 ignition cycles before failing. Four additional heaters successfully completed 6300, 6300, 700, and 700 cycles. Heater unit testing is currently ongoing for three heaters which have to date accumulated greater than 7250, greater than 5500, and greater than 5500 cycles, respectively.
The Assay Development Working Group (ADWG) of the CPTAC Program is currently drafting a document to propose best practices for generation, quantification, storage, and handling of peptide standards used for mass spectrometry-based assays, as well as interpretation of quantitative proteomic data based on peptide standards. The ADWG is seeking input from commercial entities that provide peptide standards for mass spectrometry-based assays or that perform amino acid analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is placed on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. The documentation includes a review of the method, the structure of the code, input formats, and examples.
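The core idea described above, combining a prior evaluation with correlated measurements by generalized least squares, can be sketched as follows; the matrices below are invented toy numbers, not FERRET input or its exact formulation.

```python
import numpy as np

# Toy generalized least-squares update: prior x0 with covariance C0 is
# combined with measurements y = A x + e, where e has covariance Cy.
x0 = np.array([1.0, 2.0])
C0 = np.diag([0.04, 0.09])
A  = np.array([[1.0, 0.0],
               [1.0, 1.0]])
y  = np.array([1.1, 3.3])
Cy = np.array([[0.02, 0.01],
               [0.01, 0.05]])   # correlated measurement uncertainties

# Posterior covariance and estimate (standard GLS/Bayesian update).
C1 = np.linalg.inv(np.linalg.inv(C0) + A.T @ np.linalg.inv(Cy) @ A)
x1 = C1 @ (np.linalg.inv(C0) @ x0 + A.T @ np.linalg.inv(Cy) @ y)
print("updated estimate:", x1)
print("updated uncertainties:", np.sqrt(np.diag(C1)))
```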
Leake, S.A.; Prudic, David E.
1991-01-01
Removal of ground water by pumping from aquifers may result in compaction of compressible fine-grained beds that are within or adjacent to the aquifers. Compaction of the sediments and resulting land subsidence may be permanent if the head declines result in vertical stresses beyond the previous maximum stress. The process of permanent compaction is not routinely included in simulations of ground-water flow. To simulate storage changes from both elastic and inelastic compaction, a computer program was written for use with the U.S. Geological Survey modular finite-difference ground-water flow model. The new program, the Interbed-Storage Package, is designed to be incorporated into this model. In the Interbed-Storage Package, elastic compaction or expansion is assumed to be proportional to change in head. The constant of proportionality is the product of the skeletal component of elastic specific storage and the thickness of the sediments. Similarly, inelastic compaction is assumed to be proportional to decline in head. The constant of proportionality is the product of the skeletal component of inelastic specific storage and the thickness of the sediments. Storage changes are incorporated into the ground-water flow model by adding an additional term to the right-hand side of the flow equation. Within a model time step, the package appropriately apportions storage changes between elastic and inelastic components on the basis of the relation of simulated head to the previous minimum (preconsolidation) head. Two tests were performed to verify that the package works correctly. The first test compared model-calculated storage and compaction changes to hand-calculated values for a three-dimensional simulation. Model and hand-calculated values were essentially equal. The second test was performed to compare the results of the Interbed-Storage Package with results of the one-dimensional Helm compaction model. This test problem simulated compaction in doubly draining confining beds stressed by head changes in adjacent aquifers. The Interbed-Storage Package and the Helm model computed essentially equal values of compaction. Documentation of the Interbed-Storage Package includes data input instructions, flow charts, narratives, and listings for each of the five modules included in the package. The documentation also includes an appendix describing input instructions and a listing of a computer program for time-variant specified-head boundaries. That package was developed to reduce the amount of data input and output associated with one of the Interbed-Storage Package test problems.
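A minimal sketch of the elastic/inelastic apportionment logic described in the abstract above is given below. Variable names and values are assumptions for illustration; the real package applies this logic cell by cell inside the finite-difference flow equation.

```python
def interbed_storage_change(h_new, h_old, h_precon, Sske, Sskv, b):
    """Return (storage_change, new_preconsolidation_head).

    Elastic changes are proportional to head change (Sske * b);
    inelastic compaction occurs only when head falls below the
    previous minimum (preconsolidation) head (Sskv * b).
    Storage change is negative when water is released from storage.
    """
    if h_new >= h_precon:
        # Fully elastic: expansion or compaction, preconsolidation head unchanged.
        return Sske * b * (h_new - h_old), h_precon
    # Head drops below the preconsolidation head: elastic down to h_precon,
    # inelastic below it, and the preconsolidation head is updated.
    dS = Sske * b * (h_precon - h_old) + Sskv * b * (h_new - h_precon)
    return dS, h_new

dS, h_pc = interbed_storage_change(h_new=95.0, h_old=100.0, h_precon=98.0,
                                   Sske=1e-5, Sskv=1e-3, b=20.0)
print(dS, h_pc)
```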
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria N.; Salko, Robert K.
Coolant-Boiling in Rod Arrays|Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of nine conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code, as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates that have been made when transitioning from COBRA-TF to CTF. Further documentation, which focuses on code input deck generation and source code global variable and module listings, is also available at RDFMG.
The Pollution Hazard Assessment System: Documentation and Users Manual
1989-10-01
The term f(Ki,Si) is a measure of the ability of a pollutant to be transmitted from soil to the ingested item (if the item ingested is soil itself, this...). A limiting dose for cattle is based on extrapolation from either the long-term human no-observed-effect dose level (NOEL), a mammalian lifetime NOEL, or a... The program prompts interactively for these values, for example: "Enter Mammalian Lifetime NOEL in mg/kg-day".
1994-07-01
Required mix of segments or individual data elements to be extracted: in Segment R on an interrogation transaction (LTI), data record number (DRN 0950) only... ...zation and Marketing input DICs insert the Continuation Indicator Code (DRN 8555) in position 80 of this record... The assigned NSN... for Procurement KFR, File Data Minus Security Classified Characteristics Data KFC... (DoD 4100.39-M, Volume 8, Chapter 5, Alphabetic Index of DIC)
1985-10-01
Key words: regions, Com-Geom region identification, GIFT, material... The technique of combinatorial geometry (Com-Geom) is used to describe targets, and the Com-Geom data are used as input to the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) target description data, which is the input data for the GIFT code.
Shared Bibliographic Input Network (SBIN) Conference Proceedings, October 1982
1982-10-01
...records for their document collections. In the process of developing the CIRCANET requirements, a flowchart (figure 3) of the ideal system was conceived... (b) If an error is made when taping, it can be detected and corrected on screen before writing onto the tape being used for input. (c) It is an easier and quicker method for beginners to become familiar with all of the fields and their entries. (d) If a record is lost during the process...
TADS: A CFD-Based Turbomachinery Analysis and Design System with GUI: Methods and Results. 2.0
NASA Technical Reports Server (NTRS)
Koiro, M. J.; Myers, R. A.; Delaney, R. A.
1999-01-01
The primary objective of this study was the development of a Computational Fluid Dynamics (CFD) based turbomachinery airfoil analysis and design system, controlled by a Graphical User Interface (GUI). The computer codes resulting from this effort are referred to as TADS (Turbomachinery Analysis and Design System). This document is the Final Report describing the theoretical basis and analytical results from the TADS system developed under Task 10 of NASA Contract NAS3-27394, ADPAC System Coupling to Blade Analysis & Design System GUI, Phase II - Loss, Design and Multi-stage Analysis. TADS couples a throughflow solver (ADPAC) with a quasi-3D blade-to-blade solver (RVCQ3D) or a 3-D solver with slip condition on the end walls (B2BADPAC) in an interactive package. Throughflow analysis and design capability was developed in ADPAC through the addition of blade force and blockage terms to the governing equations. A GUI was developed to simplify user input and automate the many tasks required to perform turbomachinery analysis and design. The coupling of the various programs was done in such a way that alternative solvers or grid generators could be easily incorporated into the TADS framework. Results of aerodynamic calculations using the TADS system are presented for a multistage compressor, a multistage turbine, two highly loaded fans, and several single-stage compressor and turbine example cases.
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing are attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
Emergency medical services : agenda for the future
DOT National Transportation Integrated Search
1996-08-01
The purpose of the EMS Agenda for the Future is to determine the most important directions for future EMS development, incorporating input from a broad, multidisciplinary spectrum of EMS stakeholders. This document provides guiding principles for the...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Demand Side Variability, and Network Variability studies, including input data, processing programs, and... should include the product or product groups carried under each listed contract; (k) Spreadsheets and...
Low-cost safety enhancements for stop-controlled and signalized intersections
DOT National Transportation Integrated Search
2009-05-01
The purpose of this document is to present information on suggested effective, low-cost intersection countermeasures developed using intersection safety research results and input from an intersection safety expert panel. These low-cost countermeasur...
MOVES2010a regional level sensitivity analysis
DOT National Transportation Integrated Search
2012-12-10
This document discusses the sensitivity of various input parameter effects on emission rates using the US Environmental Protection Agency's (EPA's) MOVES2010a model at the regional level. Pollutants included in the study are carbon monoxide (CO),...
Towards Rational Decision-Making in Secondary Education.
ERIC Educational Resources Information Center
Cohn, Elchanan
Without a conscious effort to achieve optimum resource allocation, there is a real danger that educational resources may be wasted. This document uses input-output analysis to develop a model for rational decision-making in secondary education. (LLR)
MAPSIT and a Roadmap for Lunar and Planetary Spatial Data Infrastructure
NASA Astrophysics Data System (ADS)
Radebaugh, J.; Archinal, B.; Beyer, R.; DellaGiustina, D.; Fassett, C.; Gaddis, L.; Hagerty, J.; Hare, T.; Laura, J.; Lawrence, S. J.; Mazarico, E.; Naß, A.; Patthoff, A.; Skinner, J.; Sutton, S.; Thomson, B. J.; Williams, D.
2017-10-01
We describe MAPSIT, and the development of a roadmap for lunar and planetary SDI, based on previous relevant documents and community input, and consider how to best advance lunar science, exploration, and commercial development.
40 CFR 60.665 - Reporting and recordkeeping requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... heater with a design heat input capacity of 44 MW (150 million Btu/hour) or greater is used to comply...) The average combustion temperature of the boiler or process heater with a design heat input capacity... design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission readings, heat content...
Hanford analytical sample projections FY 1998--FY 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, S.M.
1998-02-12
Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services, and miscellaneous Hanford support activities. In addition, details on laboratory-scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.
Gaia DR2 documentation Chapter 3: Astrometry
NASA Astrophysics Data System (ADS)
Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.
2018-04-01
This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely, the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries) which have been pre-processed and discussed in Chapter 2, the results of which were published in Fabricius et al. (2016). The models consist of reference systems and time scales, assumed linear stellar motion, and relativistic light deflection, in addition to fundamental constants and the transformation of coordinate systems. Higher-level inputs, such as planetary and solar system ephemerides, Gaia tracking and orbit information, initial quasar catalogues, and BAM data, are all needed for the processing described here. The astrometric calibration models are outlined, followed by the detailed processing steps which give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).
ERIC Educational Resources Information Center
Andrews, Deborah C., Ed.; Dyrud, Marilyn, Ed.
1996-01-01
Presents four articles that provide suggestions for teaching document design: (1) "Teaching the Rhetoric of Document Design" (Michael J. Hassett); (2) "Teaching by Example: Suggestions for Assignment Design" (Marilyn A. Dyrud); (3) "Teaching the Page as a Visual Unit" (Bill Hart-Davidson); and (4) "Designing a…
Development and Pilot Testing of a Video-Assisted Informed Consent Process
Sonne, Susan C.; Andrews, Jeannette O.; Gentilin, Stephanie M.; Oppenheimer, Stephanie; Obeid, Jihad; Brady, Kathleen; Wolf, Sharon; Davis, Randal; Magruder, Kathryn
2013-01-01
The informed consent process for research has come under scrutiny, as consent documents are increasingly long and difficult to understand. Innovations are needed to improve comprehension in order to make the consent process truly informed. We report on the development and pilot testing of video clips that could be used during the consent process to better explain research procedures to potential participants. Based on input from researchers and community partners, 15 videos of common research procedures/concepts were produced. The utility of the videos was then tested by embedding them in mock informed consent documents that were presented via an online electronic consent system designed for delivery via iPad. Three mock consents were developed, each containing five videos. All participants (n=61) read both a paper version and the video-assisted iPad version of the same mock consent and were randomized to which format they reviewed first. Participants were given a competency quiz that posed specific questions about the information in the consent after reviewing the first consent document to which they were exposed. Most participants (78.7%) preferred the video-assisted format compared to paper (12.9%). Nearly all (96.7%) reported that the videos improved their understanding of the procedures described in the consent document; however, comprehension of material did not significantly differ by consent format. Results suggest videos may be helpful in providing participants with information about study procedures in a way that is easy to understand. Additional testing of video consents for complex protocols and with subjects of lower literacy is warranted. PMID:23747986
Development and pilot testing of a video-assisted informed consent process.
Sonne, Susan C; Andrews, Jeannette O; Gentilin, Stephanie M; Oppenheimer, Stephanie; Obeid, Jihad; Brady, Kathleen; Wolf, Sharon; Davis, Randal; Magruder, Kathryn
2013-09-01
The informed consent process for research has come under scrutiny, as consent documents are increasingly long and difficult to understand. Innovations are needed to improve comprehension in order to make the consent process truly informed. We report on the development and pilot testing of video clips that could be used during the consent process to better explain research procedures to potential participants. Based on input from researchers and community partners, 15 videos of common research procedures/concepts were produced. The utility of the videos was then tested by embedding them in mock-informed consent documents that were presented via an online electronic consent system designed for delivery via iPad. Three mock consents were developed, each containing five videos. All participants (n = 61) read both a paper version and the video-assisted iPad version of the same mock consent and were randomized to which format they reviewed first. Participants were given a competency quiz that posed specific questions about the information in the consent after reviewing the first consent document to which they were exposed. Most participants (78.7%) preferred the video-assisted format compared to paper (12.9%). Nearly all (96.7%) reported that the videos improved their understanding of the procedures described in the consent document; however, the comprehension of material did not significantly differ by consent format. Results suggest videos may be helpful in providing participants with information about study procedures in a way that is easy to understand. Additional testing of video consents for complex protocols and with subjects of lower literacy is warranted. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Maghsoudi, Mohammad Javad; Mohamed, Z.; Sudin, S.; Buyamin, S.; Jaafar, H. I.; Ahmad, S. M.
2017-08-01
This paper proposes an improved input shaping scheme for efficient sway control of a nonlinear three-dimensional (3D) overhead crane with friction using the particle swarm optimization (PSO) algorithm. With this approach, a higher payload sway reduction is obtained because the input shaper is designed based on a complete nonlinear model, as compared to the analytical input shaping scheme derived from a linear second-order model. Zero Vibration (ZV) and Distributed Zero Vibration (DZV) shapers are designed using both analytical and PSO approaches for sway control of rail and trolley movements. To test the effectiveness of the proposed approach, MATLAB simulations and experiments on a laboratory 3D overhead crane are performed under various conditions involving different cable lengths and sway frequencies. Their performances are evaluated based on the maximum residual payload sway and Integrated Absolute Error (IAE) values, which indicate the total payload sway of the crane. In experiments, the superiority of the proposed approach over the analytical design is shown by 30-50% reductions of the IAE values for rail and trolley movements, for both ZV and DZV shapers. In addition, simulation results show higher sway reductions with the proposed approach. The proposed PSO-based input shaping design thus provides higher payload sway reduction for a 3D overhead crane with friction than the commonly designed input shapers.
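For reference, the analytical ZV shaper mentioned above consists of two impulses whose amplitudes and spacing follow from the sway frequency and damping ratio. The sketch below computes them for a linear second-order model and convolves a step command; the PSO-based design in the paper instead tunes the shaper against the full nonlinear crane model, so the numbers here are only the linear baseline.

```python
import numpy as np

def zv_shaper(wn, zeta):
    """Two-impulse Zero Vibration shaper for natural frequency wn (rad/s)
    and damping ratio zeta (standard linear second-order result)."""
    wd = wn * np.sqrt(1.0 - zeta**2)
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
    amps = np.array([1.0, K]) / (1.0 + K)     # impulse amplitudes sum to 1
    times = np.array([0.0, np.pi / wd])        # second impulse at half a damped period
    return amps, times

# Shape a unit-step trolley command sampled at dt (values are illustrative).
dt = 0.01
amps, times = zv_shaper(wn=2.0, zeta=0.02)
idx = np.round(times / dt).astype(int)
shaper = np.zeros(idx[-1] + 1)
shaper[idx] = amps
step = np.ones(500)
shaped_command = np.convolve(step, shaper)[:500]
print(amps, times)
```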
Computing Shapes Of Cascade Diffuser Blades
NASA Technical Reports Server (NTRS)
Tran, Ken; Prueger, George H.
1993-01-01
Computer program generates sizes and shapes of cascade-type blades for use in axial or radial turbomachine diffusers. Generates shapes of blades rapidly, incorporating extensive cascade data to determine optimum incidence and deviation angle for blade design based on the 65-series database of the National Advisory Committee for Aeronautics (NACA). Allows great variability in blade profile through input variables. Also provides for design of three-dimensional blades by allowing variable blade stacking. Enables designer to obtain computed blade-geometry data in various forms: as input for blade-loading analysis, as input for quasi-three-dimensional analysis of flow, or as points for transfer to computer-aided design.
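As a rough illustration of the kind of cascade correlation such a code applies, the sketch below sets blade metal angles from flow angles using a Carter-style deviation rule. The coefficient value and the fixed incidence are assumptions for illustration, not the NACA 65-series correlations actually built into the program.

```python
import math

def carter_deviation(camber_deg, solidity, m=0.26):
    """Carter-style deviation estimate: deviation = m * camber / sqrt(solidity).
    The coefficient m (assumed constant here) really varies with stagger and
    blade shape in published cascade correlations."""
    return m * camber_deg / math.sqrt(solidity)

def blade_angles(inlet_flow_deg, outlet_flow_deg, solidity, incidence_deg=2.0):
    """Iterate camber and deviation: inlet metal angle = inlet flow - incidence,
    outlet metal angle = outlet flow - deviation, camber = their difference."""
    camber = (inlet_flow_deg - incidence_deg) - outlet_flow_deg  # first guess, zero deviation
    for _ in range(5):
        dev = carter_deviation(camber, solidity)
        camber = (inlet_flow_deg - incidence_deg) - (outlet_flow_deg - dev)
    return camber, dev

print(blade_angles(45.0, 20.0, solidity=1.2))
```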
2015-10-20
From 2000 to 2011, the U.S. Geological Survey conducted 139 quantitative assessments of continuous (unconventional) oil and gas accumulations within the United States. This report documents those assessments more fully than previously done by providing detailed documentation of both the assessment input and output. This report also compiles the data into spreadsheet tables that can be more readily used to provide analogs for future assessments, especially for hypothetical continuous accumulations.
FIM Avionics Operations Manual
NASA Technical Reports Server (NTRS)
Alves, Erin E.
2017-01-01
This document describes the operation and use of the Flight Interval Management (FIM) Application installed on an electronic flight bag (EFB). Specifically, this document includes: 1) screen layouts for each page of the interface; 2) step-by-step instructions for data entry, data verification, and input error correction; 3) algorithm state messages and error condition alerting messages; 4) aircraft speed guidance and deviation indications; and 5) graphical display of the spatial relationships between the Ownship aircraft and the Target aircraft.
Kepler Planet Detection Metrics: Per-Target Detection Contours for Data Release 25
NASA Technical Reports Server (NTRS)
Burke, Christopher J.; Catanzarite, Joseph
2017-01-01
A necessary input to planet occurrence calculations is an accurate model for the pipeline completeness (Burke et al., 2015). This document describes the use of the Kepler planet occurrence rate products in order to calculate a per-target detection contour for the measured Data Release 25 (DR25) pipeline performance. A per-target detection contour measures for a given combination of orbital period, Porb, and planet radius, Rp, what fraction of transit signals are recoverable by the Kepler pipeline (Twicken et al., 2016; Jenkins et al., 2017). The steps for calculating a detection contour follow the procedure outlined in Burke et al. (2015), but have been updated to provide improved accuracy enabled by the substantially larger database of transit injection and recovery tests that were performed on the final version (i.e., SOC 9.3) of the Kepler pipeline (Christiansen, 2017; Burke & Catanzarite, 2017a). In the following sections, we describe the main inputs to the per-target detection contour and provide a worked example of the python software released with this document (Kepler Planet Occurrence Rate Tools, KeplerPORTs) that illustrates the generation of a detection contour in practice. As background material for this document and its nomenclature, we recommend the reader be familiar with the previous method of calculating a detection contour (Section 2 of Burke et al., 2015), input parameters relevant for describing the data quantity and quality of Kepler targets (Burke & Catanzarite, 2017b), and the extensive new transit injection and recovery tests of the Kepler pipeline (Christiansen et al., 2016; Burke & Catanzarite, 2017a; Christiansen, 2017).
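To make the idea of a per-target detection contour concrete, the toy sketch below evaluates a detection fraction on a (Porb, Rp) grid from an expected signal-to-noise ratio. The noise level, depth scaling, observing baseline, and logistic efficiency curve are all illustrative assumptions; this is not the DR25 products or the KeplerPORTs code.

```python
import numpy as np

# Toy per-target detection contour on a grid of orbital period and planet radius.
porb = np.linspace(10, 700, 70)          # orbital period (days)
rp = np.linspace(0.5, 4.0, 36)           # planet radius (Earth radii)
P, R = np.meshgrid(porb, rp)

depth_ppm = 84.0 * R**2                   # toy transit depth for a Sun-like star
n_transits = np.maximum(4.0 * 365.25 / P, 1.0)   # assumed 4-year baseline
cdpp_ppm = 60.0                           # assumed per-transit noise level
expected_mes = depth_ppm / cdpp_ppm * np.sqrt(n_transits)

# Assumed smooth detection efficiency: ~50% at an expected MES of 9.
detection_fraction = 1.0 / (1.0 + np.exp(-(expected_mes - 9.0)))
print(detection_fraction.shape, round(float(detection_fraction.max()), 3))
```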
Software design and documentation language
NASA Technical Reports Server (NTRS)
Kleine, H.
1977-01-01
A communications medium to support the design and documentation of complex software applications is studied. The medium also provides the following: (1) a processor which can convert design specifications into an intelligible, informative machine reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor.
Design and Documentation: The State of the Art.
ERIC Educational Resources Information Center
Gibbons, Andrew S.
1998-01-01
Although the trend is for less documentation, this article argues that more is needed to help in the analysis of design failure in instructional design. Presents arguments supporting documented design, including error recognition and correction, verification of completeness and soundness, sharing of new design principles, modifiability, error…
Computer simulation and design of a three degree-of-freedom shoulder module
NASA Technical Reports Server (NTRS)
Marco, David; Torfason, L.; Tesar, Delbert
1989-01-01
An in-depth kinematic analysis of a three degree of freedom fully-parallel robotic shoulder module is presented. The major goal of the analysis is to determine appropriate link dimensions which will provide a maximized workspace along with desirable input to output velocity and torque amplification. First order kinematic influence coefficients which describe the output velocity properties in terms of actuator motions provide a means to determine suitable geometric dimensions for the device. Through the use of computer simulation, optimal or near optimal link dimensions based on predetermined design criteria are provided for two different structural designs of the mechanism. The first uses three rotational inputs to control the output motion. The second design involves the use of four inputs, actuating any three inputs for a given position of the output link. Alternative actuator placements are examined to determine the most effective approach to control the output motion.
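The role of the first-order kinematic influence coefficients described above can be sketched as follows: a Jacobian-like matrix maps actuator rates to output rates, and its transpose maps output loads back to actuator torques, which is the velocity/torque amplification the link-dimension optimization trades off. The matrix entries below are arbitrary numbers for illustration, not the shoulder module's geometry.

```python
import numpy as np

# G: first-order kinematic influence coefficients (output rates per actuator rate).
G = np.array([[0.8, 0.1, 0.0],
              [0.0, 0.9, 0.2],
              [0.1, 0.0, 0.7]])

qdot = np.array([0.5, -0.2, 0.3])        # actuator rates
omega = G @ qdot                          # output angular velocity of the link

load = np.array([0.0, 0.0, 10.0])         # moment applied to the output link
tau = G.T @ load                           # actuator torques balancing the load

# Singular values bound velocity/torque amplification over all directions.
sig = np.linalg.svd(G, compute_uv=False)
print(omega, tau, sig.max() / sig.min())
```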
Tank Monitoring and Document control System (TMACS) As Built Software Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
GLASSCOCK, J.A.
This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document by the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the "point-processing" functionality, where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
I-81 ITS program evaluation framework
DOT National Transportation Integrated Search
2003-07-01
This document presents the evaluation framework for the I-81 ITS Model Safety Corridor Program. The objectives of the framework are threefold: 1) serve as input into the development of infrastructure in the I-81 Corridor to generate baseline data for...
Best Communication Practices for Preparation of Exceptional Event Demonstrations
OAQPS developed this document of Best Practices based on input received from EPA regional offices and selected state/local air agencies who submitted Exceptional Event Demonstrations under the 2007 version of the Exceptional Events Rule (EE Rule).
Safe mobility for a maturing society challenges and opportunities.
DOT National Transportation Integrated Search
2003-11-01
The U.S. Department of Transportation wishes to thank the many groups and individuals who contributed extensively to this document. Input came from participants in an array of local and state activities, creating a report that represents the ...
Communications and Information: Compendium of Communications and Information Terminology
2002-02-01
Entries include: Basic Access Module; BASIC—Beginners All-Purpose Symbolic Instruction Code; BBP—Baseband Processor; BBS—Bulletin Board Service (System); BBTC—Broadband... Topics covered include media, formats and labels, programming language, computer documentation, flowcharts and terminology, character codes, data communications, and input...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-19
... that there are numerous issues that this research does not address, including online data mining by... described in this document will provide data that, along with other input and considerations, will inform...
NASA Technical Reports Server (NTRS)
Solloway, C. B.; Wakeland, W.
1976-01-01
First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
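A first-order Markov population model of the kind described above can be sketched in a few lines: the state distribution is propagated through a row-stochastic transition matrix, with simple input checks standing in for the program's error-checking and defaults. The three states and their probabilities are assumptions for illustration.

```python
import numpy as np

def step_population(p, T):
    """Propagate a population distribution p one step through a first-order
    Markov transition matrix T (rows must sum to 1)."""
    T = np.asarray(T, dtype=float)
    p = np.asarray(p, dtype=float)
    if not np.allclose(T.sum(axis=1), 1.0):
        raise ValueError("each row of the transition matrix must sum to 1")
    if p.min() < 0:
        raise ValueError("population fractions must be non-negative")
    return p @ T

T = [[0.90, 0.08, 0.02],     # assumed states, e.g. remain / move / exit
     [0.10, 0.85, 0.05],
     [0.00, 0.00, 1.00]]
p = [0.7, 0.3, 0.0]
for _ in range(10):
    p = step_population(p, T)
print(np.round(p, 3))
```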
EPA sought advice from stakeholders regarding potential case studies; stakeholders were invited to provide suggestions and refinements to the prioritization of criteria and information listed in Table 1 of the document.
Lamas, Daniela; Panariello, Natalie; Henrich, Natalie; Hammes, Bernard; Hanson, Laura C; Meier, Diane E; Guinn, Nancy; Corrigan, Janet; Hubber, Sean; Luetke-Stahlman, Hannah; Block, Susan
2018-04-01
To develop a set of clinically relevant recommendations to improve the state of advance care planning (ACP) documentation in the electronic health record (EHR). Advance care planning (ACP) is a key process that supports goal-concordant care. For preferences to be honored, clinicians must be able to reliably record, find, and use ACP documentation. However, there are no standards to guide ACP documentation in the electronic health record (EHR). We interviewed 21 key informants to understand the strengths and weaknesses of EHR documentation systems for ACP and identify best practices. We analyzed these interviews using a qualitative content analysis approach and subsequently developed a preliminary set of recommendations. These recommendations were vetted and refined in a second round of input from a national panel of content experts. Informants identified six themes regarding current inadequacies in documentation and accessibility of ACP information and opportunities for improvement. We offer a set of concise, clinically relevant recommendations, informed by expert opinion, to improve the state of ACP documentation in the EHR.
The Use of User-Centered Participatory Design in Serious Games for Anxiety and Depression.
Dekker, Maria R; Williams, Alishia D
2017-12-01
There is increasing interest in using serious games to deliver or complement healthcare interventions for mental health, particularly for the most common mental health conditions such as anxiety and depression. Initial results seem promising, yet variations exist in the effectiveness of serious games, highlighting the importance of understanding optimal design features. It has been suggested that the involvement of end-users in the design and decision-making process could influence game effectiveness. In user-centered design (UCD) or participatory design (PD), users are involved in stages of the process, including planning, designing, implementing, and testing the serious game. To the authors' knowledge, no literature review to date has assessed the use of UCD/PD in games that are designed for mental health, specifically for anxiety or depression. The aim of this review is, therefore, to document the extent to which published studies of serious games that are designed to prevent or treat anxiety and depression have adopted a PD framework. A search of keywords in PubMed and PsychINFO databases through to December 2016 was conducted. We identified 20 serious games developed to prevent, treat or complement existing therapies for anxiety and/or depression. Half (N = 10; 50%) of these games were developed with input from the intended end-users, in either informant (N = 7; 70%) or full participatory co-design roles (N = 3; 30%). Less than half of games (45%) included users only in the testing phase.
40 CFR 61.305 - Reporting and recordkeeping.
Code of Federal Regulations, 2012 CFR
2012-07-01
... unit or process heater with a design heat input capacity of 44 MW (150 × 106 BTU/hr) or greater is used... or other flare design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission... temperature of the steam generating unit or process heater with a design heat input capacity of less than 44...
40 CFR 61.305 - Reporting and recordkeeping.
Code of Federal Regulations, 2014 CFR
2014-07-01
... unit or process heater with a design heat input capacity of 44 MW (150 × 106 BTU/hr) or greater is used... or other flare design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission... temperature of the steam generating unit or process heater with a design heat input capacity of less than 44...
40 CFR 61.305 - Reporting and recordkeeping.
Code of Federal Regulations, 2010 CFR
2010-07-01
... unit or process heater with a design heat input capacity of 44 MW (150 × 106 BTU/hr) or greater is used... or other flare design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission... temperature of the steam generating unit or process heater with a design heat input capacity of less than 44...
40 CFR 61.305 - Reporting and recordkeeping.
Code of Federal Regulations, 2013 CFR
2013-07-01
... unit or process heater with a design heat input capacity of 44 MW (150 × 106 BTU/hr) or greater is used... or other flare design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission... temperature of the steam generating unit or process heater with a design heat input capacity of less than 44...
40 CFR 61.305 - Reporting and recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-07-01
... unit or process heater with a design heat input capacity of 44 MW (150 × 106 BTU/hr) or greater is used... or other flare design (i.e., steam-assisted, air-assisted or nonassisted), all visible emission... temperature of the steam generating unit or process heater with a design heat input capacity of less than 44...
Design, Fabrication, and Modeling of a Novel Dual-Axis Control Input PZT Gyroscope.
Chang, Cheng-Yang; Chen, Tsung-Lin
2017-10-31
Conventional gyroscopes are equipped with a single-axis control input, limiting their performance. Although researchers have proposed control algorithms with dual-axis control inputs to improve gyroscope performance, most have verified the control algorithms through numerical simulations because they lacked practical devices with dual-axis control inputs. The aim of this study was to design a piezoelectric gyroscope equipped with a dual-axis control input so that researchers may experimentally verify those control algorithms in the future. Designing a piezoelectric gyroscope with a dual-axis control input is more difficult than designing a conventional gyroscope because the control input must be effective over a broad frequency range to compensate for imperfections, and the multiple mode shapes in flexural deformations complicate the relation between flexural deformation and the proof mass position. This study solved these problems by using a lead zirconate titanate (PZT) material, introducing additional electrodes for shielding, developing an optimal electrode pattern, and performing calibrations of undesired couplings. The results indicated that the fabricated device could be operated at 5.5±1 kHz to perform dual-axis actuations and position measurements. The calibration of the fabricated device was completed by system identification of a new dynamic model including gyroscopic motions, electromechanical coupling, mechanical coupling, electrostatic coupling, and capacitive output impedance. Finally, without the assistance of control algorithms, the "open loop sensitivity" of the fabricated gyroscope was 1.82 μV/deg/s with a nonlinearity of 9.5% full-scale output. This sensitivity is comparable with those of other PZT gyroscopes with single-axis control inputs.
Software design and documentation language, revision 1
NASA Technical Reports Server (NTRS)
Kleine, H.
1979-01-01
The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.
NASA Technical Reports Server (NTRS)
Baxley, Brian T.; Palmer, Michael T.; Swieringa, Kurt A.
2015-01-01
This document describes the IM cockpit interfaces, displays, and alerting capabilities that were developed for and used in the IMAC experiment, which was conducted at NASA Langley in the summer of 2015. Specifically, this document includes: (1) screen layouts for each page of the interface; (2) step-by-step instructions for data entry, data verification and input error correction; (3) algorithm state messages and error condition alerting messages; (4) aircraft speed guidance and deviation indications; and (5) graphical display of the spatial relationships between the Ownship aircraft and the Target aircraft. The controller displays for IM will be described in a separate document.
Designing Input Fields for Non-Narrative Open-Ended Responses in Web Surveys
Couper, Mick P.; Kennedy, Courtney; Conrad, Frederick G.; Tourangeau, Roger
2012-01-01
Web surveys often collect information such as frequencies, currency amounts, dates, or other items requiring short structured answers in an open-ended format, typically using text boxes for input. We report on several experiments exploring design features of such input fields. We find little effect of the size of the input field on whether frequency or dollar amount answers are well-formed or not. By contrast, the use of templates to guide formatting significantly improves the well-formedness of responses to questions eliciting currency amounts. For date questions (whether month/year or month/day/year), we find that separate input fields improve the quality of responses over single input fields, while drop boxes further reduce the proportion of ill-formed answers. Drop boxes also reduce completion time when the list of responses is short (e.g., months), but marginally increase completion time when the list is long (e.g., birth dates). These results suggest that non-narrative open questions can be designed to help guide respondents to provide answers in the desired format. PMID:23411468
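The notion of a "well-formed" answer used above can be made concrete with a small validator like the one below, which checks free-text currency and date answers against the formats a template or drop box would enforce. The regular expressions are assumptions about what counts as well-formed, not the authors' coding rules.

```python
import re

CURRENCY = re.compile(r"^\$?\d{1,3}(,\d{3})*(\.\d{2})?$|^\$?\d+(\.\d{2})?$")
DATE_MDY = re.compile(r"^(0?[1-9]|1[0-2])/(0?[1-9]|[12]\d|3[01])/(19|20)\d{2}$")

def well_formed_rate(answers, pattern):
    """Fraction of open-ended answers matching the expected format."""
    answers = [a.strip() for a in answers]
    return sum(bool(pattern.match(a)) for a in answers) / len(answers)

amounts = ["$1,200.00", "1200", "about a thousand", "$950"]
dates = ["3/14/1975", "14/3/1975", "03/14/75"]
print(well_formed_rate(amounts, CURRENCY))   # 0.75
print(well_formed_rate(dates, DATE_MDY))     # ~0.33
```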
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
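A minimal programmatic sketch of producing an SBML Level 3 document is shown below using the python-libsbml bindings. The compartment and species are placeholders, and only the attributes remembered here are set; the full set of mandatory attributes should be checked against the specification itself rather than this sketch.

```python
import libsbml  # python-libsbml bindings

# Build a tiny SBML Level 3 Version 2 model with one compartment and one species.
document = libsbml.SBMLDocument(3, 2)
model = document.createModel()
model.setId("toy_model")

comp = model.createCompartment()
comp.setId("cell")
comp.setConstant(True)

sp = model.createSpecies()
sp.setId("S1")
sp.setCompartment("cell")
sp.setInitialAmount(10.0)
sp.setHasOnlySubstanceUnits(False)
sp.setBoundaryCondition(False)
sp.setConstant(False)

# Validate against the SBML consistency rules and serialize to XML.
document.checkConsistency()
print(libsbml.writeSBMLToString(document))
```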
ISAMBARD: an open-source computational environment for biomolecular analysis, modelling and design.
Wood, Christopher W; Heal, Jack W; Thomson, Andrew R; Bartlett, Gail J; Ibarra, Amaurys Á; Brady, R Leo; Sessions, Richard B; Woolfson, Derek N
2017-10-01
The rational design of biomolecules is becoming a reality. However, further computational tools are needed to facilitate and accelerate this, and to make it accessible to more users. Here we introduce ISAMBARD, a tool for structural analysis, model building and rational design of biomolecules. ISAMBARD is open-source, modular, computationally scalable and intuitive to use. These features allow non-experts to explore biomolecular design in silico. ISAMBARD addresses a standing issue in protein design, namely, how to introduce backbone variability in a controlled manner. This is achieved through the generalization of tools for parametric modelling, describing the overall shape of proteins geometrically, and without input from experimentally determined structures. This will allow backbone conformations for entire folds and assemblies not observed in nature to be generated de novo, that is, to access the 'dark matter of protein-fold space'. We anticipate that ISAMBARD will find broad applications in biomolecular design, biotechnology and synthetic biology. A current stable build can be downloaded from the python package index (https://pypi.python.org/pypi/isambard/) with development builds available on GitHub (https://github.com/woolfson-group/) along with documentation, tutorial material and all the scripts used to generate the data described in this paper. d.n.woolfson@bristol.ac.uk or chris.wood@bristol.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Design Optimization Tool for Synthetic Jet Actuators Using Lumped Element Modeling
NASA Technical Reports Server (NTRS)
Gallas, Quentin; Sheplak, Mark; Cattafesta, Louis N., III; Gorton, Susan A. (Technical Monitor)
2005-01-01
The performance specifications of any actuator are quantified in terms of an exhaustive list of parameters such as bandwidth, output control authority, etc. Flow-control applications benefit from a known actuator frequency response function that relates the input voltage to the output property of interest (e.g., maximum velocity, volumetric flow rate, momentum flux, etc.). Clearly, the required performance metrics are application specific, and methods are needed to achieve the optimal design of these devices. Design and optimization studies have been conducted for piezoelectric cantilever-type flow control actuators, but the modeling issues are simpler compared to synthetic jets. Here, lumped element modeling (LEM) is combined with equivalent circuit representations to estimate the nonlinear dynamic response of a synthetic jet as a function of device dimensions, material properties, and external flow conditions. These models provide reasonable agreement between predicted and measured frequency response functions and thus are suitable for use as design tools. In this work, we have developed a Matlab-based design optimization tool for piezoelectric synthetic jet actuators based on the lumped element models mentioned above. Significant improvements were achieved by optimizing the piezoceramic diaphragm dimensions. Synthetic-jet actuators were fabricated and benchtop tested to fully document their behavior and validate a companion optimization effort. It is hoped that the tool developed from this investigation will assist in the design and deployment of these actuators.
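As a flavor of the lumped element approach described above, the sketch below estimates two of the frequencies such a model couples: the diaphragm's natural frequency from its lumped acoustic mass and compliance, and the cavity/orifice Helmholtz frequency. All dimensions and lumped values are invented for illustration and are not the authors' model or optimizer.

```python
import numpy as np

c0, rho = 343.0, 1.2               # speed of sound (m/s), air density (kg/m^3)

# Assumed lumped diaphragm parameters (a real LEM derives these from plate theory).
Mad = 2.0e2                         # acoustic mass of diaphragm (kg/m^4)
Cad = 5.0e-13                       # acoustic compliance of diaphragm (m^5/N)
f_diaphragm = 1.0 / (2 * np.pi * np.sqrt(Mad * Cad))

# Cavity compliance and orifice mass give the Helmholtz resonance.
V = 2.0e-6                          # cavity volume (m^3)
d, L = 1.0e-3, 1.5e-3               # orifice diameter and effective length (m)
A = np.pi * d**2 / 4
Cac = V / (rho * c0**2)             # acoustic compliance of the cavity
Mao = rho * L / A                   # acoustic mass of the orifice
f_helmholtz = 1.0 / (2 * np.pi * np.sqrt(Mao * Cac))

print(round(f_diaphragm, 1), round(f_helmholtz, 1))
```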
Guidelines for PCC inputs to AASHTOWare Pavement ME.
DOT National Transportation Integrated Search
2014-12-01
The objective of this research study was to develop guidelines for portland cement concrete (PCC) material inputs to the AASHTOWare Pavement ME Design program. The AASHTOWare Pavement ME Design is the software program used by the Mississippi Depa...
The Distributed Geothermal Market Demand Model (dGeo): Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O
The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
AEDT Software Requirements Documents - Draft
DOT National Transportation Integrated Search
2007-01-25
This software requirements document serves as the basis for designing and testing the Aviation Environmental Design Tool (AEDT) software. The intended audience for this document consists of the following groups: the AEDT designers, developers, and te...
ERIC Educational Resources Information Center
Boot, Eddy W.; Nelson, Jon; van Merrienboer, Jeroen J. G.; Gibbons, Andrew S.
2007-01-01
Designers and producers of instructional materials lack a common design language. As a result, producers have difficulties translating design documents into technical specifications. The 3D-model is introduced to improve the stratification, elaboration and formalisation of design documents. It is hypothesised that producers working with improved…
Tziraki, Chariklia; Berenbaum, Rakel; Gross, Daniel; Abikhzer, Judith; Ben-David, Boaz M
2017-07-31
The field of serious games for people with dementia (PwD) is mostly driven by game-design principals typically applied to games created by and for younger individuals. Little has been done developing serious games to help PwD maintain cognition and to support functionality. We aimed to create a theory-based serious game for PwD, with input from a multi-disciplinary team familiar with aging, dementia, and gaming theory, as well as direct input from end users (the iterative process). Targeting enhanced self-efficacy in daily activities, the goal was to generate a game that is acceptable, accessible and engaging for PwD. The theory-driven game development was based on the following learning theories: learning in context, errorless learning, building on capacities, and acknowledging biological changes-all with the aim to boost self-efficacy. The iterative participatory process was used for game screen development with input of 34 PwD and 14 healthy community dwelling older adults, aged over 65 years. Development of game screens was informed by the bio-psychological aging related disabilities (ie, motor, visual, and perception) as well as remaining neuropsychological capacities (ie, implicit memory) of PwD. At the conclusion of the iterative development process, a prototype game with 39 screens was used for a pilot study with 24 PwD and 14 healthy community dwelling older adults. The game was played twice weekly for 10 weeks. Quantitative analysis showed that the average speed of successful screen completion was significantly longer for PwD compared with healthy older adults. Both PwD and controls showed an equivalent linear increase in the speed for task completion with practice by the third session (P<.02). Most important, the rate of improved processing speed with practice was not statistically different between PwD and controls. This may imply that some form of learning occurred for PwD at a nonsignificantly different rate than for controls. Qualitative results indicate that PwD found the game engaging and fun. Healthy older adults found the game too easy. Increase in self-reported self-efficacy was documented with PwD only. Our study demonstrated that PwD's speed improved with practice at the same rate as healthy older adults. This implies that when tasks are designed to match PwD's abilities, learning ensues. In addition, this pilot study of a serious game, designed for PwD, was accessible, acceptable, and enjoyable for end users. Games designed based on learning theories and input of end users and a multi-disciplinary team familiar with dementia and aging may have the potential of maintaining capacity and improving functionality of PwD. A larger longer study is needed to confirm our findings and evaluate the use of these games in assessing cognitive status and functionality. ©Chariklia Tziraki, Rakel Berenbaum, Daniel Gross, Judith Abikhzer, Boaz M Ben-David. Originally published in JMIR Serious Games (http://games.jmir.org), 31.07.2017.
NASA Astrophysics Data System (ADS)
Chen, Chao; Liu, Qian; Zhao, Jun
2018-01-01
This paper studies the problem of stabilisation of switched nonlinear systems with output and input constraints. We propose a recursive approach to solve this issue. None of the subsystems are assumed to be stabilisable, while the switched system is stabilised by dual design of controllers for subsystems and a switching law. When only dealing with bounded input, we provide nested switching controllers using an extended backstepping procedure. If both input and output constraints are taken into consideration, a Barrier Lyapunov Function is employed to construct multiple Lyapunov functions for the switched nonlinear system in the backstepping procedure. As a practical example, the control design for an equilibrium manifold expansion model of an aero-engine is given to demonstrate the effectiveness of the proposed design method.
Manual for LS-DYNA Wood Material Model 143
DOT National Transportation Integrated Search
2007-08-01
An elastoplastic damage model with rate effects was developed for wood and was implemented into LS-DYNA, a commercially available finite element code. This manual documents the theory of the wood material model, describes the LS-DYNA input and output...
A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems
DOT National Transportation Integrated Search
2009-07-01
This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...
Hop, Skip and Jump: Animation Software.
ERIC Educational Resources Information Center
Eiser, Leslie
1986-01-01
Discusses the features of animation software packages, reviewing eight commercially available programs. Information provided for each program includes name, publisher, current computer(s) required, cost, documentation, input device, import/export capabilities, printing possibilities, what users can originate, types of image manipulation possible,…
Aquarius Salinity Retrieval Algorithm: Final Pre-Launch Version
NASA Technical Reports Server (NTRS)
Wentz, Frank J.; Le Vine, David M.
2011-01-01
This document provides the theoretical basis for the Aquarius salinity retrieval algorithm. The inputs to the algorithm are the Aquarius antenna temperature (T(sub A)) measurements along with a number of NCEP operational products and pre-computed tables of space radiation coming from the galaxy and sun. The output is sea-surface salinity and many intermediate variables required for the salinity calculation. This revision of the Algorithm Theoretical Basis Document (ATBD) is intended to be the final pre-launch version.
MAGMA: analysis of two-channel microarrays made easy.
Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph
2007-07-01
The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
Testing and validation of computerized decision support systems.
Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H
1996-01-01
Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. While requiring large amounts of effort, we feel that documenting and standardizing our testing methods are important steps toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.
Computer-Aided Engineering Of Cabling
NASA Technical Reports Server (NTRS)
Billitti, Joseph W.
1989-01-01
Program generates data sheets, drawings, and other information on electrical connections. DFACS program, centered around single data base, has built-in menus providing easy input of, and access to, data for all personnel involved in system, subsystem, and cabling. Enables parallel design of circuit-data sheets and drawings of harnesses. Also recombines raw information to generate automatically various project documents and drawings, including index of circuit-data sheets, list of electrical-interface circuits, lists of assemblies and equipment, cabling trees, and drawings of cabling electrical interfaces and harnesses. Purpose of program to provide engineering community with centralized data base for putting in, and gaining access to, functional definition of system as specified in terms of details of pin connections of end circuits of subsystems and instruments and data on harnessing. Primary objective to provide instantaneous single point of interchange of information, thus avoiding
An Evaluation of Automatic Control System Concepts for General Aviation Airplanes
NASA Technical Reports Server (NTRS)
Stewart, E. C.
1990-01-01
A piloted simulation study of automatic longitudinal control systems for general aviation airplanes has been conducted. These automatic control systems were designed to make the simulated airplane easy to fly for a beginning or infrequent pilot. Different control systems are presented and their characteristics are documented. In a conventional airplane control system each cockpit controller commands combinations of both the airspeed and the vertical speed. The best system in the present study decoupled the airspeed and vertical speed responses to cockpit controller inputs. An important feature of the automatic system was that neither changing flap position nor maneuvering in steeply banked turns affected either the airspeed or the vertical speed. All the pilots who flew the control system simulation were favorably impressed with the very low workload and the excellent handling qualities of the simulated airplane.
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
NASA Technical Reports Server (NTRS)
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
Engine structures modeling software system: Computer code. User's manual
NASA Technical Reports Server (NTRS)
1992-01-01
ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.
Orbital Maneuvering Engine Feed System Coupled Stability Investigation, Computer User's Manual
NASA Technical Reports Server (NTRS)
Schuman, M. D.; Fertig, K. W.; Hunting, J. K.; Kahn, D. R.
1975-01-01
An operating manual for the feed system coupled stability model was given, in partial fulfillment of a program designed to develop, verify, and document a digital computer model that can be used to analyze and predict engine/feed system coupled instabilities in pressure-fed storable propellant propulsion systems over a frequency range of 10 to 1,000 Hz. The first section describes the analytical approach to modelling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure, and presents the governing equations in each of the technical areas. This is followed by the program user's guide, which is a complete description of the structure and operation of the computerized model. Last, appendices provide an alphabetized FORTRAN symbol table, detailed program logic diagrams, computer code listings, and sample case input and output data listings.
An analysis of a candidate control algorithm for a ride quality augmentation system
NASA Technical Reports Server (NTRS)
Suikat, Reiner; Donaldson, Kent; Downing, David R.
1987-01-01
This paper presents a detailed analysis of a candidate algorithm for a ride quality augmentation system. The algorithm consists of a full-state feedback control law based on optimal control output weighting, estimators for angle of attack and sideslip, and a maneuvering algorithm. The control law is shown to perform well by both frequency and time domain analysis. The rms vertical acceleration is reduced by about 40 percent over the whole mission flight envelope. The estimators for the angle of attack and sideslip avoid the often inaccurate or costly direct measurement of those angles. The maneuvering algorithm will allow the augmented airplane to respond to pilot inputs. The design characteristics and performance are documented by the closed-loop eigenvalues; rms levels of vertical, lateral, and longitudinal acceleration; and representative time histories and frequency response.
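The control law summarized above is a full-state feedback design obtained from optimal control with output weighting. As an illustration only, a minimal sketch of computing that style of gain with scipy is shown below; the matrices and weights are placeholders, not the airplane model or weighting used in the study.

    # Minimal sketch: a full-state feedback gain from the continuous-time LQR,
    # illustrating the style of design described above. A, B, Q, R are placeholders,
    # not the simulated airplane model or the weights used in the study.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # hypothetical state matrix
    B = np.array([[0.0], [1.0]])               # hypothetical control-input matrix
    Q = np.diag([10.0, 1.0])                   # output/state weighting (assumed)
    R = np.array([[1.0]])                      # control-effort weighting (assumed)

    P = solve_continuous_are(A, B, Q, R)       # algebraic Riccati equation
    K = np.linalg.solve(R, B.T @ P)            # feedback gain for u = -K x
    print("feedback gain K =", K)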
DOT National Transportation Integrated Search
2009-02-01
The resilient modulus (MR) input parameters in the Mechanistic-Empirical Pavement Design Guide (MEPDG) program have a significant effect on the projected pavement performance. The MEPDG program uses three different levels of inputs depending on the d...
Development of traffic data input resources for the mechanistic empirical pavement design process.
DOT National Transportation Integrated Search
2011-12-12
The Mechanistic-Empirical Pavement Design Guide (MEPDG) for New and Rehabilitated Pavement Structures uses nationally based traffic data inputs and recommends that state DOTs develop their own site-specific and regional values. To support the MEP...
40 CFR 60.615 - Reporting and recordkeeping requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... or process heater with a design heat input capacity of 44 MW (150 million Btu/hour) or greater is...) The average combustion temperature of the boiler or process heater with a design heat input capacity... this subpart seeks to comply with § 60.612(b) through the use of a smokeless flare, flare design (i.e...
RX: a nonimaging concentrator.
Miñano, J C; Benítez, P; González, J C
1995-05-01
A detailed description of the design procedure for a new concentrator, RX, and some examples of its use are given. The method of design is basically the same as that used in the design of two other concentrators: the RR and the XR [Appl. Opt. 31, 3051 (1992)]. The RX is ideal in two-dimensional geometry. The performance of the rotational RX is good when the average angular spread of the input bundle is small: up to 95% of the power of the input bundle can be transferred to the output bundle (with the assumption of a constant radiance for the rays of the input bundle).
NASA Technical Reports Server (NTRS)
Griffin, Brian Joseph; Burken, John J.; Xargay, Enric
2010-01-01
This paper presents an L(sub 1) adaptive control augmentation system design for multi-input multi-output nonlinear systems in the presence of unmatched uncertainties which may exhibit significant cross-coupling effects. A piecewise continuous adaptive law is adopted and extended for applicability to multi-input multi-output systems that explicitly compensates for dynamic cross-coupling. In addition, explicit use of high-fidelity actuator models are added to the L1 architecture to reduce uncertainties in the system. The L(sub 1) multi-input multi-output adaptive control architecture is applied to the X-29 lateral/directional dynamics and results are evaluated against a similar single-input single-output design approach.
A Revised Trajectory Algorithm to Support En Route and Terminal Area Self-Spacing Concepts
NASA Technical Reports Server (NTRS)
Abbott, Terence S.
2010-01-01
This document describes an algorithm for the generation of a four dimensional trajectory. Input data for this algorithm are similar to an augmented Standard Terminal Arrival (STAR) with the augmentation in the form of altitude or speed crossing restrictions at waypoints on the route. This version of the algorithm accommodates descent Mach values that are different from the cruise Mach values. Wind data at each waypoint are also inputs into this algorithm. The algorithm calculates the altitude, speed, along path distance, and along path time for each waypoint.
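The along-path distance and time bookkeeping that the abstract mentions reduces to accumulating segment lengths and dividing each by ground speed, where ground speed is true airspeed adjusted by the along-track wind at the waypoint. A hedged sketch of that accumulation follows; the waypoints, speeds, and winds are invented, and the altitude and speed-restriction logic of the actual algorithm is not reproduced.

    # Illustrative along-path distance/time accumulation over waypoints.
    # Each waypoint carries the distance to the next fix (nmi), a true airspeed (kt),
    # and an along-track wind component (kt); all values are invented.
    waypoints = [
        {"name": "FIX1", "dist_nmi": 25.0, "tas_kt": 280.0, "wind_kt": -20.0},
        {"name": "FIX2", "dist_nmi": 18.0, "tas_kt": 250.0, "wind_kt": -10.0},
        {"name": "FIX3", "dist_nmi": 12.0, "tas_kt": 210.0, "wind_kt": 5.0},
    ]

    total_dist = 0.0
    total_time_hr = 0.0
    for wp in waypoints:
        ground_speed = wp["tas_kt"] + wp["wind_kt"]        # headwind is negative
        total_dist += wp["dist_nmi"]
        total_time_hr += wp["dist_nmi"] / ground_speed
        print(f'{wp["name"]}: {total_dist:6.1f} nmi, {total_time_hr * 60:5.1f} min')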
FEMFLOW3D; a finite-element program for the simulation of three-dimensional aquifers; version 1.0
Durbin, Timothy J.; Bond, Linda D.
1998-01-01
This document also includes model validation, source code, and example input and output files. Model validation was performed using four test problems. For each test problem, the results of a model simulation with FEMFLOW3D were compared with either an analytic solution or the results of an independent numerical approach. The source code, written in the ANSI x3.9-1978 FORTRAN standard, and the complete input and output of an example problem are listed in the appendixes.
NASA Technical Reports Server (NTRS)
Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.
1974-01-01
The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including predicting energy demands and consumables requirements of the shuttle electric power system, along with parametric and special-case studies. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.
Program document for Energy Systems Optimization Program 2 (ESOP2). Volume 1: Engineering manual
NASA Technical Reports Server (NTRS)
Hamil, R. G.; Ferden, S. L.
1977-01-01
The Energy Systems Optimization Program, which is used to provide analyses of Modular Integrated Utility Systems (MIUS), is discussed. Modifications to the input format to allow modular inputs in specified blocks of data are described. An optimization feature which enables the program to search automatically for the minimum value of one parameter while varying the value of other parameters is reported. New program option flags for prime mover analyses and solar energy for space heating and domestic hot water are also covered.
Space Transportation Engine Program (STEP), phase B
NASA Technical Reports Server (NTRS)
1990-01-01
The Space Transportation Engine Program (STEP) Phase 2 effort includes preliminary design and activities plan preparation that will allow a smooth and timely transition into a Prototype Phase and then into Phases 3, 4, and 5. A Concurrent Engineering approach using Total Quality Management (TQM) techniques is being applied to define an oxygen-hydrogen engine. The baseline from Phase 1/1' studies was used as a point of departure for trade studies and analyses. Existing STME system models are being enhanced as more detailed module/component characteristics are determined. Preliminary designs for the open expander, closed expander, and gas generator cycles were prepared, and recommendations for cycle selection were made at the Design Concept Review (DCR). As a result of the July '90 DCR, and information subsequently supplied to the Technical Review Team, a gas generator cycle was selected. Results of the various Advanced Development Programs (ADPs) for the Advanced Launch Systems (ALS) contributed to this effort. An active vehicle integration effort is supplying the NASA, Air Force, and vehicle contractors with engine parameters and data, and flowing down appropriate vehicle requirements. Engine design and analysis trade studies are being documented in a data base that was developed and is being used to organize information. To date, seventy-four trade studies have been input to the data base.
Modular Integrated Stackable Layers (MISL) MI_MSP430A Board Design Document (BDD)
NASA Technical Reports Server (NTRS)
Yim, Hester
2013-01-01
This is a board-level design document for Modular Integrated Stackable Layers (MISL) MI_MSP430A board (PIN MSP430F5438A). The Board Design Document (BDD) contains the description, features of microcontroller, electrical and mechanical design, and drawings.
NASA Technical Reports Server (NTRS)
Huffman, S.
1977-01-01
Detailed instructions on the use of two computer-aided-design programs for designing the energy storage inductor for single winding and two winding dc to dc converters are provided. Step by step procedures are given to illustrate the formatting of user input data. The procedures are illustrated by eight sample design problems which include the user input and the computer program output.
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four Projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations both at the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1 comprising aerospace applications has been released. Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings by the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - Too often, we say, 'The Proof is in the Pudding'. With help from many contributors, we hope to produce such a document. Problem is - not too many people are coming forward due to proprietary nature. So, we are asking to document only minimum information including problem description, what method was used, did it result in any savings, and how much?; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood & addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standard/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance & life.
Robust binarization of degraded document images using heuristics
NASA Astrophysics Data System (ADS)
Parker, Jon; Frieder, Ophir; Frieder, Gideon
2013-12-01
Historically significant documents are often discovered with defects that make them difficult to read and analyze. This fact is particularly troublesome if the defects prevent software from performing an automated analysis. Image enhancement methods are used to remove or minimize document defects, improve software performance, and generally make images more legible. We describe an automated, image enhancement method that is input page independent and requires no training data. The approach applies to color or greyscale images with hand written script, typewritten text, images, and mixtures thereof. We evaluated the image enhancement method against the test images provided by the 2011 Document Image Binarization Contest (DIBCO). Our method outperforms all 2011 DIBCO entrants in terms of average F1 measure - doing so with a significantly lower variance than top contest entrants. The capability of the proposed method is also illustrated using select images from a collection of historic documents stored at Yad Vashem Holocaust Memorial in Israel.
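The authors' heuristic method is not spelled out in the abstract. Purely as a point of reference for what document binarization involves, the sketch below shows a common local-mean thresholding baseline; the window size and offset are arbitrary choices, and this is not the method evaluated in the paper.

    # Simple local-mean thresholding baseline for document image binarization.
    # This is NOT the authors' heuristic method, only a common reference point.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def binarize_local_mean(gray, window=25, offset=10.0):
        """Mark a pixel as ink if it is darker than the local mean minus an offset."""
        gray = gray.astype(float)
        local_mean = uniform_filter(gray, size=window)
        return (gray < local_mean - offset).astype(np.uint8)   # 1 = ink, 0 = background

    # Synthetic "page": a dark text stroke on a light background.
    page = np.full((100, 100), 200.0)
    page[40:60, 20:80] = 50.0
    mask = binarize_local_mean(page)
    print("ink pixels found:", int(mask.sum()))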
Building accurate historic and future climate MEPDG input files for Louisiana DOTD.
DOT National Transportation Integrated Search
2017-02-01
The pavement design process (originally MEPDG, then DARWin-ME, and now Pavement ME Design) requires a multi-year set of hourly climate input data that influence pavement material properties. In Louisiana, the software provides nine locations with c...
The Software Design Document: More than a User's Manual.
ERIC Educational Resources Information Center
Bowers, Dennis
1989-01-01
Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…
NASA Technical Reports Server (NTRS)
Zak, J. Allen; Rodgers, William G., Jr.
2000-01-01
The quality of the Aircraft Vortex Spacing System (AVOSS) is critically dependent on representative wind profiles in the atmospheric boundary layer. These winds observed from a number of sensor systems around the Dallas-Fort Worth airport were combined into single vertical wind profiles by an algorithm developed and implemented by MIT Lincoln Laboratory. This process, called the AVOSS Winds Analysis System (AWAS), is used by AVOSS for wake corridor predictions. During times when AWAS solutions were available, the quality of the resultant wind profiles and variance was judged from a series of plots combining all sensor observations and AWAS profiles during the period 1200 to 0400 UTC daily. First, input data was evaluated for continuity and consistency from criteria established. Next, the degree of agreement among all wind sensor systems was noted and cases of disagreement identified. Finally, the resultant AWAS solution was compared to the quality-assessed input data. When profiles differed by a specified amount from valid sensor consensus winds, times and altitudes were flagged. Volume one documents the process and quality of input sensor data. Volume two documents the data processing/sorting process and provides the resultant flagged files.
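The flagging step described (marking times and altitudes where the AWAS profile departs from the sensor consensus by more than a specified amount) can be sketched as a simple threshold comparison; the altitudes, wind values, and tolerance below are placeholders rather than the project's actual criteria.

    # Flag altitudes where the analyzed wind departs from the sensor consensus
    # by more than a tolerance. Altitudes, speeds, and tolerance are illustrative.
    consensus = {100: 5.2, 200: 6.8, 300: 8.1, 400: 9.0}   # altitude (m) -> speed (m/s)
    awas      = {100: 5.4, 200: 9.9, 300: 8.0, 400: 9.2}
    tolerance = 2.0                                        # hypothetical threshold (m/s)

    flagged = [alt for alt in consensus if abs(awas[alt] - consensus[alt]) > tolerance]
    print("flagged altitudes:", flagged)                   # -> [200]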
NASA Astrophysics Data System (ADS)
Mohlman, H. T.
1983-04-01
The Air Force community noise prediction model (NOISEMAP) is used to describe the aircraft noise exposure around airbases and thereby aid airbase planners to minimize exposure and prevent community encroachment which could limit mission effectiveness of the installation. This report documents two computer programs (OMEGA 10 and OMEGA 11) which were developed to prepare aircraft flight and ground runup noise data for input to NOISEMAP. OMEGA 10 is for flight operations and OMEGA 11 is for aircraft ground runups. All routines in each program are documented at a level useful to a programmer working with the code or a reader interested in a general overview of what happens within a specific subroutine. Both programs input normalized, reference aircraft noise data; i.e., data at a standard reference distance from the aircraft, for several fixed engine power settings, a reference airspeed and standard day meteorological conditions. Both programs operate on these normalized, reference data in accordance with user-defined, non-reference conditions to derive single-event noise data for 22 distances (200 to 25,000 feet) in a variety of physical and psycho-acoustic metrics. These outputs are in formats ready for input to NOISEMAP.
Public Notification Rulemaking Documents
When EPA is directed to revise a regulation, the Agency publishes a proposed rule. The proposed rule includes all the ideas the Agency has for how to improve the Rule. The publication of the proposed rule is also a way to ask the public for input.
Appendix F: FreedomCAR and Vehicle Technologies Program inputs for FY 2008 benefits estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2009-01-18
Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.
Brownfields and Urban Agriculture: Interim Guidelines for Safe Gardening Practices
This document is a condensation of the input of experts from the government, the nonprofit sector, and academia who gathered to outline the range of issues which need to be addressed in order to safely grow food on former brownfield sites.
The dairy_wa.zip file is a zip file containing an Arc/Info export file and a text document. Note the DISCLAIM.TXT file as these data are not verified. Map extent: statewide. Input Source: Address database obtained from Wa Dept of Agriculture. Data was originally developed und...
DOE Office of Scientific and Technical Information (OSTI.GOV)
HCTT CHE
2009-12-16
The purpose of this document is to provide a suggested approach, based on input from pediatric stakeholders, to communicating pediatric-related information on pandemic influenza at the community level in a step-by-step manner.
Gaze and Feet as Additional Input Modalities for Interacting with Geospatial Interfaces
NASA Astrophysics Data System (ADS)
Çöltekin, A.; Hempel, J.; Brychtova, A.; Giannopoulos, I.; Stellmach, S.; Dachselt, R.
2016-06-01
Geographic Information Systems (GIS) are complex software environments and we often work with multiple tasks and multiple displays when we work with GIS. However, user input is still limited to mouse and keyboard in most workplace settings. In this project, we demonstrate how the use of gaze and feet as additional input modalities can overcome time-consuming and annoying mode switches between frequently performed tasks. In an iterative design process, we developed gaze- and foot-based methods for zooming and panning of map visualizations. We first collected appropriate gestures in a preliminary user study with a small group of experts, and designed two interaction concepts based on their input. After the implementation, we evaluated the two concepts comparatively in another user study to identify strengths and shortcomings in both. We found that continuous foot input combined with implicit gaze input is promising for supportive tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.
This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, phase-flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for both. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for a sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.
Performance Confirmation Data Aquisition System
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.W. Markman
2000-10-27
The purpose of this analysis is to identify and analyze concepts for the acquisition of data in support of the Performance Confirmation (PC) program at the potential subsurface nuclear waste repository at Yucca Mountain. The scope and primary objectives of this analysis are to: (1) Review the criteria for design as presented in the Performance Confirmation Data Acquisition/Monitoring System Description Document, by way of the Input Transmittal, Performance Confirmation Input Criteria (CRWMS M&O 1999c). (2) Identify and describe existing and potential new trends in data acquisition system software and hardware that would support the PC plan. The data acquisition software and hardware will support the field instruments and equipment that will be installed for the observation and perimeter drift borehole monitoring, and in-situ monitoring within the emplacement drifts. The exhaust air monitoring requirements will be supported by a data communication network interface with the ventilation monitoring system database. (3) Identify the concepts and features that a data acquisition system should have in order to support the PC process and its activities. (4) Based on PC monitoring needs and available technologies, further develop concepts of a potential data acquisition system network in support of the PC program and the Site Recommendation and License Application.
Script-independent text line segmentation in freestyle handwritten documents.
Li, Yi; Zheng, Yefeng; Doermann, David; Jaeger, Stefan; Li, Yi
2008-08-01
Text line segmentation in freestyle handwritten documents remains an open document analysis problem. Curvilinear text lines and small gaps between neighboring text lines present a challenge to algorithms developed for machine printed or hand-printed documents. In this paper, we propose a novel approach based on density estimation and a state-of-the-art image segmentation technique, the level set method. From an input document image, we estimate a probability map, where each element represents the probability that the underlying pixel belongs to a text line. The level set method is then exploited to determine the boundary of neighboring text lines by evolving an initial estimate. Unlike connected component based methods ( [1], [2] for example), the proposed algorithm does not use any script-specific knowledge. Extensive quantitative experiments on freestyle handwritten documents with diverse scripts, such as Arabic, Chinese, Korean, and Hindi, demonstrate that our algorithm consistently outperforms previous methods [1]-[3]. Further experiments show the proposed algorithm is robust to scale change, rotation, and noise.
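A hedged sketch of the density-estimation stage described above: smooth an ink mask into a per-pixel text-line probability map whose ridges mark candidate lines. The smoothing scale is an arbitrary choice here, and the level-set evolution that the paper uses to refine line boundaries is omitted.

    # Estimate a text-line probability map by smoothing an ink mask.
    # Only the density-estimation stage is sketched; the level-set evolution that
    # refines line boundaries in the paper is not reproduced here.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def line_probability_map(ink_mask, sigma=(3.0, 15.0)):
        """Smooth more along the line direction (columns) than across lines (rows)."""
        density = gaussian_filter(ink_mask.astype(float), sigma=sigma)
        return density / density.max() if density.max() > 0 else density

    # Two synthetic horizontal "text lines" of ink pixels.
    mask = np.zeros((60, 200))
    mask[10:14, 20:180] = 1.0
    mask[40:44, 20:180] = 1.0
    prob = line_probability_map(mask)
    print("row of strongest line ridge:", int(prob.mean(axis=1).argmax()))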
NASA Astrophysics Data System (ADS)
Ebrahimzadeh, Faezeh; Tsai, Jason Sheng-Hong; Chung, Min-Ching; Liao, Ying Ting; Guo, Shu-Mei; Shieh, Leang-San; Wang, Li
2017-01-01
In contrast to Part 1, Part 2 presents a generalised optimal linear quadratic digital tracker (LQDT) with universal applications for discrete-time (DT) systems. This includes (1) a generalised optimal LQDT design for the system with the pre-specified trajectories of the output and the control input and additionally with both the input-to-output direct-feedthrough term and known/estimated system disturbances or extra input/output signals; (2) a new optimal filter-shaped proportional-plus-integral state-feedback LQDT design for non-square non-minimum-phase DT systems to achieve a minimum-phase-like tracking performance; (3) a new approach for computing the control zeros of the given non-square DT systems; and (4) a one-learning-epoch input-constrained iterative learning LQDT design for repetitive DT systems.
Economic Input-Output Life Cycle Assessment of Water Reuse Strategies in Residential Buildings
This paper evaluates the environmental sustainability and economic feasibility of four water reuse designs through economic input-output life cycle assessments (EIO-LCA) and benefit/cost analyses. The water reuse designs include: 1. Simple Greywater Reuse System for Landscape Ir...
DOT National Transportation Integrated Search
2010-02-01
This study developed traffic inputs for use with the Guide for the Mechanistic-Empirical Design of New & Rehabilitated Pavement Structures (MEPDG) in Virginia and sought to determine if the predicted distresses showed differences between site-specifi...
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Keller, Vernon W.; Vaughan, William W.
2005-01-01
The description and interpretation of the terrestrial environment (0-90 km altitude) is an important driver of aerospace vehicle structural, control, and thermal system design. NASA is currently in the process of reviewing the meteorological information acquired over the past decade and producing an update to the 1993 Terrestrial Environment Guidelines for Aerospace Vehicle Design and Development handbook. This paper addresses the contents of this updated handbook, with special emphasis on new material being included in the areas of atmospheric thermodynamic models, wind dynamics, atmospheric composition, atmospheric electricity, cloud phenomena, atmospheric extremes, sea state, etc. In addition, the respective engineering design elements will be discussed relative to the importance and influence of terrestrial environment inputs that require consideration and interpretation for design applications. Specific lessons learned that have contributed to the advancements made in the acquisition, interpretation, application and awareness of terrestrial environment inputs for aerospace engineering applications are discussed.
Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.
2002-01-01
Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
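Of the three approaches named, Monte Carlo simulation is the most direct to illustrate. The sketch below propagates assumed input distributions through an invented weight function standing in for the aircraft sizing code; the distributions, function, and variable names are assumptions, not the study's.

    # Monte Carlo propagation of input uncertainty through a black-box analysis.
    # The "weight model" is a stand-in for the aircraft design program; the input
    # distributions are assumptions for the sketch, not the study's values.
    import numpy as np

    rng = np.random.default_rng(0)

    def weight_model(wing_area, aspect_ratio):
        """Hypothetical smooth weight response to two configuration variables."""
        return 10_000 + 35.0 * wing_area + 420.0 * aspect_ratio

    n = 100_000
    wing_area = rng.normal(180.0, 5.0, n)        # assumed mean/std of input 1
    aspect_ratio = rng.normal(8.0, 0.2, n)       # assumed mean/std of input 2
    weight = weight_model(wing_area, aspect_ratio)

    print(f"mean weight    : {weight.mean():.1f}")
    print(f"std of weight  : {weight.std():.1f}")
    print(f"95th percentile: {np.percentile(weight, 95):.1f}")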
Geochemical Data Package for Performance Assessment Calculations Related to the Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, Daniel I.
The Savannah River Site (SRS) disposes of low-level radioactive waste (LLW) and stabilizes high-level radioactive waste (HLW) tanks in the subsurface environment. Calculations used to establish the radiological limits of these facilities are referred to as Performance Assessments (PA), Special Analyses (SA), and Composite Analyses (CA). The objective of this document is to revise existing geochemical input values used for these calculations. This work builds on earlier compilations of geochemical data (2007, 2010), referred to as geochemical data packages. This work is being conducted as part of the on-going maintenance program of the SRS PA programs that periodically updates calculations and data packages when new information becomes available. Because application of values without full understanding of their original purpose may lead to misuse, this document also provides the geochemical conceptual model, the approach used for selecting the values, the justification for selecting data, and the assumptions made to assure that the conceptual and numerical geochemical models are reasonably conservative (i.e., bias the recommended input values to reflect conditions that will tend to predict the maximum risk to the hypothetical recipient). This document provides 1088 input parameters for geochemical parameters describing transport processes for 64 elements (>740 radioisotopes) potentially occurring within eight subsurface disposal or tank closure areas: Slit Trenches (ST), Engineered Trenches (ET), Low Activity Waste Vault (LAWV), Intermediate Level (ILV) Vaults, Naval Reactor Component Disposal Areas (NRCDA), Components-in-Grout (CIG) Trenches, Saltstone Facility, and Closed Liquid Waste Tanks. The geochemical parameters described here are the distribution coefficient (Kd value), the apparent solubility concentration (ks value), and the cementitious leachate impact factor.
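For context on how a distribution coefficient enters a transport calculation, the standard linear-sorption retardation relation is R = 1 + (bulk density / porosity) x Kd. The worked example below uses assumed soil properties and an example Kd, not values from this data package.

    # Retardation factor from a distribution coefficient under linear sorption:
    # R = 1 + (bulk_density / porosity) * Kd. Soil properties below are assumed,
    # not values taken from the SRS data package.
    bulk_density = 1.6     # g/cm^3, assumed
    porosity = 0.35        # dimensionless, assumed
    kd = 5.0               # mL/g, example distribution coefficient

    retardation = 1.0 + (bulk_density / porosity) * kd
    print(f"R = {retardation:.1f} (solute migrates ~{retardation:.0f}x slower than water)")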
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
1984-12-01
input/output relationship. These are obtained from the design specifications (10:681-684). Note that the first digit of the subscript of bkj refers to the output and the second digit to the input. Thus, bkj is a function of the response requirements on the output, yk, due to the input, rj. [The remainder of this record is unrecoverable OCR residue from a FORTRAN plot-subroutine listing.]
Bridge, Heather; Smolskis, Mary; Bianchine, Peter; Dixon, Dennis O.; Kelly, Grace; Herpin, Betsey; Tavel, Jorge
2009-01-01
Background: A clinical research protocol document must reflect both sound scientific rationale as well as local, national and, when applicable, international regulatory and human subject protections requirements. These requirements originate from a variety of sources, undergo frequent revision and are subject to interpretation. Tools to assist clinical investigators in the production of clinical protocols could facilitate navigating these requirements and ultimately increase the efficiency of clinical research. Purpose: The National Institute of Allergy and Infectious Diseases (NIAID) developed templates for investigators to serve as the foundation for protocol development. These protocol templates are designed as tools to support investigators in developing clinical protocols. Methods: NIAID established a series of working groups to determine how to improve its capacity to conduct clinical research more efficiently and effectively. The Protocol Template Working Group was convened to determine what protocol templates currently existed within NIAID and whether standard NIAID protocol templates should be produced. After review and assessment of existing protocol documents and requirements, the group reached consensus about required and optional content, determined the format and identified methods for distribution as well as education of investigators in the use of these templates. Results: The templates were approved by the NIAID Executive Committee in 2006 and posted as part of the NIAID Clinical Research Toolkit [1] website for broad access. These documents require scheduled revisions to stay current with regulatory and policy changes. Limitations: The structure of any clinical protocol template, whether comprehensive or specific to a particular study phase, setting or design, affects how it is used by investigators. Each structure presents its own set of advantages and disadvantages. While useful, protocol templates are not stand-alone tools for creating an optimal protocol document but must be complemented by institutional resources and support. Education and guidance of investigators in the appropriate use of templates is necessary to ensure a complete yet concise protocol document. Due to changing regulatory requirements, clinical protocol templates cannot become static but require frequent revisions. Conclusions: Standard protocol templates that meet applicable regulations can be important tools to assist investigators in the effective conduct of clinical research, but they require dedicated resources and ongoing input from key stakeholders. PMID:19625326
Brooks, Amy C; Fryer, Mike; Lawrence, Alan; Pascual, Juan; Sharp, Rachel
2017-03-01
The use of plant protection products on agricultural crops can result in exposure of birds and mammals to toxic chemicals. In the European Union, the risks from such exposures are assessed under the current (2009) guidance document from the European Food Safety Authority (EFSA), designed to increase the realism of the theoretical risk assessments in comparison to its predecessor (SANCO/4145/2000). Since its adoption over 7 yr ago, many plant protection products have been evaluated successfully using the 2009 EFSA guidance document. However, there are still significant areas of improvement recommended for future revisions of this guidance. The present Focus article discusses experiences to date with the current scheme, including levels of conservatism in input parameters and interpretation by regulatory authorities together with proposals for how the guidance document could be improved when it is revised in the not too distant future. Several areas for which further guidance is recommended have been identified, such as the derivation of ecologically relevant bird and mammal reproductive endpoints and the use of modeling approaches to contextualize risk assessments. Areas where existing databases could be improved were also highlighted, including the collation of relevant focal species across Europe and expansion of the residue database for food items. To produce a realistic and useable guidance document in the future, it is strongly recommended that there is open and constructive communication between industry, regulatory authorities, and the EFSA. Such collaboration would also encourage harmonization between member states, thus reducing workloads for both industry and regulatory authorities. Environ Toxicol Chem 2017;36:565-575. © 2017 SETAC.
Larson, Bruce A; Wambua, Nancy
2011-12-19
Information on the costs of implementing programmes designed to provide support of orphans and vulnerable children (OVC) in sub-Saharan Africa and elsewhere is increasingly being requested by donors for programme evaluation purposes. To date, little information exists to document the costs and structure of costs of OVC programmes as actually implemented "on the ground" by local non-governmental organizations (NGOs). This analysis provides a practical, six-step approach that NGOs can incorporate into routine operations to evaluate their costs of implementing their OVC programmes annually. This approach is applied to the Community-Based Care for Orphans and Vulnerable Children (CBCO) Program implemented by BIDII (a Kenyan NGO) in Eastern Province of Kenya. The costing methodology involves the following six steps: accessing and organizing the NGO's annual financial report into logical sub-categories; reorganizing the sub-categories into input cost categories to create a financial cost profile; estimating the annual equivalent payment for programme equipment; documenting donations to the NGO for programme implementation; including a portion of NGO organizational costs not attributed to specific programmes; and including the results of Steps 3-5 into an expanded cost profile. Detailed results are provided for the CBCO programme. This paper shows through a concrete example how NGOs implementing OVC programmes (and other public health programmes) can organize themselves for data collection and documentation prospectively during the implementation of their OVC programmes so that costing analyses become routine practice to inform programme implementation rather than a painful and flawed retrospective activity. Such information is required if the costs and outcomes achieved by OVC programmes will ever be clearly documented and compared across OVC programmes and other types of programmes (prevention, treatment, etc.).
Ada Structure Design Language (ASDL)
NASA Technical Reports Server (NTRS)
Chedrawi, Lutfi
1986-01-01
An artist acquires all the necessary tools before painting a scene. By the same analogy, a software engineer needs the necessary tools to provide their design with the proper means for implementation. Ada provides these tools. Yet, as an artist's painting needs a brochure to accompany it for further explanation of the scene, an Ada design also needs a document along with it to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code in length. But in a large environment, ranging from thousands of lines and above, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called Ada Structure Design Language (ASDL). This language sets some rules to help derive a well-formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.
NASA Technical Reports Server (NTRS)
Meyn, Larry A.
2018-01-01
One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
Existence conditions for unknown input functional observers
NASA Astrophysics Data System (ADS)
Fernando, T.; MacDougall, S.; Sreeram, V.; Trinh, H.
2013-01-01
This article presents necessary and sufficient conditions for the existence and design of an unknown input functional observer. The existence of the observer can be verified by computing a nullspace of a known matrix and testing some matrix rank conditions. The existence of the observer does not require the satisfaction of the observer matching condition (i.e. Equation (16) in Hou and Muller 1992, 'Design of Observers for Linear Systems with Unknown Inputs', IEEE Transactions on Automatic Control, 37, 871-875), is not limited to estimating scalar functionals and allows for arbitrary pole placement. The proposed observer always exists when a state observer exists for the unknown input system, and furthermore, the proposed observer can exist even in some instances when an unknown input state observer does not exist.
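For orientation, the generic setup for such observers can be written as follows (standard notation, not the article's exact formulation or conditions):

```latex
% Plant with unknown input d(t) and functional z(t) = L x(t) to be estimated:
\begin{align*}
  \dot{x}(t) &= A\,x(t) + B\,u(t) + D\,d(t), & y(t) &= C\,x(t), & z(t) &= L\,x(t),\\
  \dot{\omega}(t) &= N\,\omega(t) + J\,y(t) + H\,u(t), & \hat{z}(t) &= \omega(t) + E\,y(t).
\end{align*}
% The observer matrices (N, J, H, E) must be chosen so that the estimation error
% e(t) = \hat{z}(t) - z(t) converges to zero for every x(0), u, and unknown d;
% existence reduces to nullspace and rank conditions on (A, C, D, L).
```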
Mining knowledge from corpora: an application to retrieval and indexing.
Soualmia, Lina F; Dahamna, Badisse; Darmoni, Stéfan
2008-01-01
The present work aims at discovering new associations between medical concepts to be exploited as input in retrieval and indexing. The association rules method is applied to documents. The process is carried out on three major document categories referring to e-health information consumers: health professionals, students and lay people. Association rule evaluation is based on statistical measures combined with domain knowledge. The extracted association rules represent existing relations between medical concepts (60.62%) as well as new knowledge (54.21%). Based on these observations, 463 expert rules were defined by medical librarians for retrieval and indexing. Association rules thus bear out existing relations, produce new knowledge and support users and indexers in document retrieval and indexing.
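A minimal sketch of this kind of mining, assuming documents indexed by binary concept indicators and using the mlxtend library (the concept names, data, and thresholds are illustrative, not those of the study):

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy document-by-concept indicator matrix (illustrative medical concepts only).
docs = pd.DataFrame(
    [[1, 1, 0, 1],
     [1, 1, 1, 0],
     [0, 1, 1, 1],
     [1, 1, 0, 0]],
    columns=['asthma', 'bronchodilator', 'pediatrics', 'allergy'],
).astype(bool)

itemsets = apriori(docs, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric='confidence', min_threshold=0.8)
# Each rule (antecedent -> consequent) suggests a candidate relation between concepts.
print(rules[['antecedents', 'consequents', 'support', 'confidence']])
```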
NASA Technical Reports Server (NTRS)
Maples, A. L.
1981-01-01
The operation of solidification Model 2 is described and documentation of the software associated with the model is provided. Model 2 calculates the macrosegregation in a rectangular ingot of a binary alloy as a result of unsteady horizontal axisymmetric bidirectional solidification. The solidification program allows interactive modification of calculation parameters as well as selection of graphical and tabular output. In batch mode, parameter values are input in card image form and output consists of printed tables of solidification functions. The operational aspects of Model 2 that differ substantially from Model 1 are described. The global flow diagrams and data structures of Model 2 are included. The primary program documentation is the code itself.
Hierarchical resilience with lightweight threads.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wheeler, Kyle Bruce
2011-10-01
This paper proposes a methodology for providing robustness and resilience in a highly threaded distributed- and shared-memory environment based on well-defined inputs and outputs to lightweight tasks. These inputs and outputs form a failure 'barrier', allowing tasks to be restarted or duplicated as necessary. These barriers must be expanded based on task behavior, such as communication between tasks, but do not prohibit any given behavior. One trend in high-performance computing codes is a move toward self-contained functions that mimic functional programming. Software designers are moving toward a model of software design in which core functions are specified in side-effect-free or low-side-effect ways, wherein the inputs and outputs of the functions are well defined. This provides the ability to copy the inputs to wherever they need to be - whether that is the other side of the PCI bus or the other side of the network - do work on that input using local memory, and then copy the outputs back (as needed). This design pattern is popular among new distributed threading environment designs. Such designs include the Barcelona STARS system, distributed OpenMP systems, the Habanero-C and Habanero-Java systems from Vivek Sarkar at Rice University, the HPX/ParalleX model from LSU, as well as our own Scalable Parallel Runtime effort (SPR) and the Trilinos stateless kernels. This design pattern is also shared by CUDA and several OpenMP extensions for GPU-type accelerators (e.g., the PGI OpenMP extensions).
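A minimal sketch of the restart-on-failure idea, assuming a pure task whose inputs and outputs are explicit (this illustrates only the barrier concept, not the implementation described in the paper):

```python
import copy

def run_resilient(task, inputs, max_attempts=3):
    """Run a side-effect-free task with explicit inputs and outputs.
    Because the inputs form a failure barrier, a failed attempt can simply
    be re-executed from a private copy of the original inputs."""
    for attempt in range(max_attempts):
        try:
            return task(copy.deepcopy(inputs))   # copy inputs to local, private storage
        except Exception:
            if attempt == max_attempts - 1:
                raise                             # give up after the last attempt

# Example: a pure task; its only effect is the returned output dictionary.
result = run_resilient(lambda d: {'sum': sum(d['values'])}, {'values': [1, 2, 3]})
print(result)
```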
Electrometer Amplifier With Overload Protection
NASA Technical Reports Server (NTRS)
Woeller, F. H.; Alexander, R.
1986-01-01
Circuit features low noise, low input offset, and high linearity. Input preamplifier includes input-overload protection and a nulling circuit to subtract dc offset from the output. Prototype dc amplifier designed for use with an ion detector has features desirable in general laboratory and field instrumentation.
A parallel input composite transimpedance amplifier.
Kim, D J; Kim, C
2018-01-01
A new approach to high-performance current-to-voltage preamplifier design is presented. The design, using multiple operational amplifiers (op-amps), has a parasitic-capacitance compensation network and a composite amplifier topology for fast, precise, and low-noise performance. The input stage, consisting of parallel-linked JFET op-amps, together with a high-speed bipolar junction transistor (BJT) gain stage driving the output in the composite amplifier topology and the capacitance compensation feedback network, ensures wide-bandwidth stability in the presence of input capacitance above 40 nF. The design is ideal for any two-probe measurement, including high-impedance transport and scanning tunneling microscopy measurements.
A parallel input composite transimpedance amplifier
NASA Astrophysics Data System (ADS)
Kim, D. J.; Kim, C.
2018-01-01
A new approach to high-performance current-to-voltage preamplifier design is presented. The design, using multiple operational amplifiers (op-amps), has a parasitic-capacitance compensation network and a composite amplifier topology for fast, precise, and low-noise performance. The input stage, consisting of parallel-linked JFET op-amps, together with a high-speed bipolar junction transistor (BJT) gain stage driving the output in the composite amplifier topology and the capacitance compensation feedback network, ensures wide-bandwidth stability in the presence of input capacitance above 40 nF. The design is ideal for any two-probe measurement, including high-impedance transport and scanning tunneling microscopy measurements.
ERGONOMICS ABSTRACTS 48983-49619.
ERIC Educational Resources Information Center
Ministry of Technology, London (England). Warren Spring Lab.
The literature of ergonomics, or biotechnology, is classified into 15 areas--methods, systems of men and machines, visual and auditory and other inputs and processes, input channels, body measurements, design of controls and integration with displays, layout of panels and consoles, design of work space, clothing and personal equipment, special…
Microresonator electrode design
Olsson, III, Roy H.; Wojciechowski, Kenneth; Branch, Darren W.
2016-05-10
A microresonator with an input electrode and an output electrode patterned thereon is described. The input electrode includes a series of stubs configured to isolate acoustic waves so that the waves are not reflected into the microresonator. This design reduces the spurious modes of the microresonator.
Generative Representations for Evolving Families of Designs
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2003-01-01
Since typical evolutionary design systems encode only a single artifact with each individual, each time the objective changes a new set of individuals must be evolved. When this objective varies in a way that can be parameterized, a more general method is to use a representation in which a single individual encodes an entire class of artifacts. In addition to saving time by avoiding the need for multiple evolutionary runs, the evolution of parameter-controlled designs can create families of artifacts with the same style and a reuse of parts between members of the family. In this paper an evolutionary design system is described which uses a generative representation to encode families of designs. Because a generative representation is an algorithmic encoding of a design, its input parameters are a way to control aspects of the design it generates. By evaluating individuals multiple times with different input parameters, the evolutionary design system creates individuals in which the input parameter controls specific aspects of a design. This system is demonstrated on two design substrates: neural networks that solve the 3/5/7-parity problem and three-dimensional tables of varying heights.
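A toy sketch of the evaluation scheme: the same encoded individual is executed at several input-parameter values and its fitness aggregates the error at each, so evolution favors designs whose parameter truly controls the intended aspect (the trivial 'table' generator below is a stand-in, not the paper's generative representation):

```python
def generate_table(genome, target_height):
    """Toy generative encoding: the genome is a small procedure whose input
    parameter (target_height) controls the height of the generated table."""
    leg_scale, top_thickness = genome
    return {'height': leg_scale * target_height + top_thickness,
            'parts': ['top'] + ['leg'] * 4}

def fitness(genome, target_heights=(0.5, 0.75, 1.0)):
    # Evaluate one individual at several parameter values and sum the errors.
    errors = [abs(generate_table(genome, h)['height'] - h) for h in target_heights]
    return -sum(errors)

print(fitness((0.95, 0.02)))   # closer to 0.0 is better
```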
"Promotores'" Perspectives on a Male-to-Male Peer Network
ERIC Educational Resources Information Center
Macia, Laura; Ruiz, Hector Camilo; Boyzo, Roberto; Documet, Patricia Isabel
2016-01-01
Little documentation exists about male community health workers ("promotores") networks. The experiences of "promotores" can provide input on how to attract, train, supervise and maintain male "promotores" in CHW programs. We present the experience and perspectives of "promotores" who participated in a male…
CITE NLM: Natural-Language Searching in an Online Catalog.
ERIC Educational Resources Information Center
Doszkocs, Tamas E.
1983-01-01
The National Library of Medicine's Current Information Transfer in English public access online catalog offers unique subject search capabilities--natural-language query input, automatic medical subject headings display, closest match search strategy, ranked document output, dynamic end user feedback for search refinement. References, description…
FINAL REPORT FOR VERIFICATION OF THE METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)
The United States Environmental Protection Agency (USEPA) has prepared a computer process simulation package for the metal finishing industry that enables users to predict process outputs based upon process inputs and other operating conditions. This report documents the developm...
Analysis of Louisiana Vehicular Input Data for MOBILE 6.
DOT National Transportation Integrated Search
2008-06-01
The purpose of this study was to identify sources of data for MOBILE 6 and set procedures to prepare the data in the : format required for use in MOBILE 6. The Environmental Protection Agency (EPA) has provided a comprehensive set of : documents desc...
Holocene paleoecology of an estuary on Santa Rosa Island, California
Cole, K.L.; Liu, Gaisheng
1994-01-01
The middle to late Holocene history and early Anglo-European settlement impacts on Santa Rosa Island, California, were studied through the analysis of sediments in a small estuarine marsh. A 5.4-m-long sediment core produced a stratigraphic and pollen record spanning the last 5200 yr. Three major zones are distinguishable in the core. The lowermost zone (5200 to 3250 yr B.P.) represents a time of arid climate with predominantly marine sediment input and high Chenopodiaceae and Ambrosia pollen values. The intermediate zone (3250 yr B.P. to 1800 A.D.) is characterized by greater fresh water input and high values for Asteraceae and Cyperaceae pollen and charcoal particles. The uppermost zone (1800 A.D. to present) documents the unprecedented erosion, sedimentation, and vegetation change that resulted from the introduction of large exotic herbivores and exotic plants to the island during Anglo-European settlement. The identification of pollen grains of Torrey Pine (Pinus torreyana) documents the persistence of this endemic species on the island throughout the middle to late Holocene.
Tillman, Fred D.
2015-01-01
The Colorado River and its tributaries supply water to more than 35 million people in the United States and 3 million people in Mexico, irrigating more than 4.5 million acres of farmland, and generating about 12 billion kilowatt hours of hydroelectric power annually. The Upper Colorado River Basin, encompassing more than 110,000 square miles (mi2), contains the headwaters of the Colorado River (also known as the River) and is an important source of snowmelt runoff to the River. Groundwater discharge also is an important source of water in the River and its tributaries, with estimates ranging from 21 to 58 percent of streamflow in the upper basin. Planning for the sustainable management of the Colorado River in future climates requires an understanding of the Upper Colorado River Basin groundwater system. This report documents input datasets for a Soil-Water Balance groundwater recharge model that was developed for the Upper Colorado River Basin.
Souza, W.R.
1987-01-01
This report documents a graphical display program for the U.S. Geological Survey finite-element groundwater flow and solute transport model. Graphic features of the program, SUTRA-PLOT (SUTRA = saturated/unsaturated transport), include: (1) plots of the finite-element mesh, (2) velocity vector plots, (3) contour plots of pressure, solute concentration, temperature, or saturation, and (4) a finite-element interpolator for gridding data prior to contouring. SUTRA-PLOT is written in FORTRAN 77 on a PRIME 750 computer system, and requires Version 9.0 or higher of the DISSPLA graphics library. The program requires two input files: the SUTRA input data list and the SUTRA simulation output listing. The program is menu driven, and specifications for individual types of plots are entered and may be edited interactively. Installation instructions, a source code listing, and a description of the computer code are given. Six examples of plotting applications are used to demonstrate various features of the plotting program. (Author's abstract)
A High-Performance Reconfigurable Fabric for Cognitive Information Processing
2010-12-01
The element receives a data token from its control input (shown as a horizontal arrow above); the value of this data token is used to select an input port; this element is the dual of a merge.
US-CERT Control System Center Input/Output (I/O) Conceptual Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2005-02-01
This document was prepared for the US-CERT Control Systems Center of the National Cyber Security Division (NCSD) of the Department of Homeland Security (DHS). DHS has been tasked under the Homeland Security Act of 2002 to coordinate the overall national effort to enhance the protection of the national critical infrastructure. Homeland Security Presidential Directive HSPD-7 directs the federal departments to identify and prioritize critical infrastructure and protect it from terrorist attack. The US-CERT National Strategy for Control Systems Security was prepared by the NCSD to address the control system security component addressed in the National Strategy to Secure Cyberspace and the National Strategy for the Physical Protection of Critical Infrastructures and Key Assets. The US-CERT National Strategy for Control Systems Security identified five high-level strategic goals for improving cyber security of control systems; the I/O upgrade described in this document supports these goals. The vulnerability assessment Test Bed, located in the Information Operations Research Center (IORC) facility at Idaho National Laboratory (INL), consists of a cyber test facility integrated with multiple test beds that simulate the nation's critical infrastructure. The fundamental mission of the Test Bed is to provide industry owner/operators, system vendors, and multi-agency partners of the INL National Security Division a platform for vulnerability assessments of control systems. The Input/Output (I/O) upgrade to the Test Bed (see Work Package 3.1 of the FY-05 Annual Work Plan) will provide for the expansion of assessment capabilities within the IORC facility. It will also provide capabilities to connect test beds within the Test Range and other Laboratory resources. This will allow real-time I/O data input and communication channels for full replications of control systems (Process Control Systems [PCS], Supervisory Control and Data Acquisition Systems [SCADA], and components). This will be accomplished through the design and implementation of a modular infrastructure of control system, communications, networking, computing and associated equipment, and measurement/control devices. The architecture upgrade will provide a flexible patching system providing a quick 'plug and play' configuration through various communication paths to gain access to live I/O running over specific protocols. This will allow for in-depth assessments of control systems in a true-to-life environment. The full I/O upgrade will be completed through a two-phased approach. Phase I, funded by DHS, expands the capabilities of the Test Bed by developing an operational control system in two functional areas, the Science & Technology Applications Research (STAR) Facility and the expansion of various portions of the Test Bed. Phase II (see Appendix A), funded by other programs, will complete the full I/O upgrade to the facility.
Riss, Patrick J; Hong, Young T; Williamson, David; Caprioli, Daniele; Sitnikov, Sergey; Ferrari, Valentina; Sawiak, Steve J; Baron, Jean-Claude; Dalley, Jeffrey W; Fryer, Tim D; Aigbirhio, Franklin I
2011-01-01
The 5-hydroxytryptamine type 2a (5-HT2A) selective radiotracer [18F]altanserin has been subjected to a quantitative micro-positron emission tomography study in Lister Hooded rats. Metabolite-corrected plasma input modeling was compared with reference tissue modeling using the cerebellum as reference tissue. [18F]altanserin showed sufficient brain uptake in a distribution pattern consistent with the known distribution of 5-HT2A receptors. Full binding saturation and displacement was documented, and no significant uptake of radioactive metabolites was detected in the brain. Blood input as well as reference tissue models were equally appropriate to describe the radiotracer kinetics. [18F]altanserin is suitable for quantification of 5-HT2A receptor availability in rats. PMID:21750562
Terminal Area Simulation System User's Guide - Version 10.0
NASA Technical Reports Server (NTRS)
Switzer, George F.; Proctor, Fred H.
2014-01-01
The Terminal Area Simulation System (TASS) is a three-dimensional, time-dependent, large eddy simulation model that has been developed for studies of wake vortex and weather hazards to aviation, along with other atmospheric turbulence, and cloud-scale weather phenomenology. This document describes the source code for TASS version 10.0 and provides users with the documentation needed to run the model. The source code is programmed in the Fortran language and is formulated to take advantage of vector processing and efficient multi-processor scaling for execution on massively parallel supercomputer clusters. The code contains different initialization modules allowing the study of aircraft wake vortex interaction with the atmosphere and ground, atmospheric turbulence, atmospheric boundary layers, precipitating convective clouds, hail storms, gust fronts, microburst windshear, supercell and mesoscale convective systems, tornadic storms, and ring vortices. The model is able to operate in either two or three dimensions with equations numerically formulated on a Cartesian grid. The primary output from TASS consists of the time-dependent domain fields generated by the prognostic equations and the diagnosed variables. This document will enable a user to understand the general logic of TASS and shows how to configure and initialize the model domain. Also described are the formats of the input and output files, as well as the parameters that control the input and output.
Tools to Develop or Convert MOVES Inputs
The following tools are designed to help users develop inputs to MOVES and post-process the output. With the release of MOVES2014, EPA strongly encourages state and local agencies to develop local inputs based on MOVES fleet and activity categories.
Very High-Temperature Reactor (VHTR) Proliferation Resistance and Physical Protection (PR&PP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moses, David Lewis
2011-10-01
This report documents the detailed background information that has been compiled to support the preparation of a much shorter white paper on the design features and fuel cycles of Very High-Temperature Reactors (VHTRs), including the proposed Next-Generation Nuclear Plant (NGNP), to identify the important proliferation resistance and physical protection (PR&PP) aspects of the proposed concepts. The shorter white paper derived from the information in this report was prepared for the Department of Energy Office of Nuclear Science and Technology for the Generation IV International Forum (GIF) VHTR Systems Steering Committee (SSC) as input to the GIF Proliferation Resistance and Physical Protection Working Group (PR&PPWG) (http://www.gen-4.org/Technology/horizontal/proliferation.htm). The short white paper was edited by the GIF VHTR SSC to address their concerns and thus may differ from the information presented in this supporting report. The GIF PR&PPWG will use the derived white paper based on this report, along with other white papers on the six alternative Generation IV design concepts (http://www.gen-4.org/Technology/systems/index.htm), to employ an evaluation methodology that can be applied and will evolve from the earliest stages of design. This methodology will guide system designers, program policy makers, and external stakeholders in evaluating the response of each system, to determine each system's resistance to proliferation threats and robustness against sabotage and terrorism threats, and thereby guide future international cooperation on ensuring safeguards in the deployment of the Generation IV systems. The format and content of this report follow a template prepared by the GIF PR&PPWG. Other than the level of detail, the key exception to the specified template format is the addition of Appendix C, which documents the history and status of coated-particle fuel reprocessing technologies; these technologies have yet to be deployed commercially and have only been demonstrated in testing at a laboratory scale.
77 FR 66588 - Development of the Nationwide Interoperable Public Safety Broadband Network
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... architecture and applications as well as to invite input on other network design and business plan... Authority (FirstNet) as well as to invite input on other network design and business plan considerations... name and organizational affiliation of the filer. Do not submit Confidential Business Information or...
Spectrum analysis on quality requirements consideration in software design documents.
Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji
2013-12-01
Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively check whether the quality requirements considered in a requirements document carry through to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
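As a toy illustration of the idea of a document spectrum (the technique in the paper is more elaborate, and the keyword lists and file names below are assumptions), the relative emphasis of each quality characteristic in a document can be summarized by normalized keyword counts and then compared between the requirements and design documents:

```python
import re
from collections import Counter

QUALITY_TERMS = {   # illustrative keyword lists, one per quality characteristic
    'security': ['encrypt', 'authenticat', 'authoriz'],
    'performance': ['latency', 'throughput', 'response time'],
    'usability': ['usab', 'accessib', 'learnab'],
}

def spectrum(text):
    """Return the fraction of quality-keyword hits attributed to each characteristic."""
    text = text.lower()
    counts = Counter({q: sum(len(re.findall(t, text)) for t in terms)
                      for q, terms in QUALITY_TERMS.items()})
    total = sum(counts.values()) or 1
    return {q: counts[q] / total for q in QUALITY_TERMS}

requirements = open('requirements.txt').read()   # hypothetical document files
design = open('design.txt').read()
print(spectrum(requirements))
print(spectrum(design))
```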
Tolerance and UQ4SIM: Nimble Uncertainty Documentation and Analysis Software
NASA Technical Reports Server (NTRS)
Kleb, Bil
2008-01-01
Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and variabilities is a necessary first step toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. The basic premise of uncertainty markup is to craft a tolerance and tagging mini-language that offers a natural, unobtrusive presentation and does not depend on parsing each type of input file format. Each file is marked up with tolerances and, optionally, associated tags that serve to label the parameters and their uncertainties. The evolution of such a language, often called a Domain Specific Language or DSL, is given in [1], but in final form it parallels tolerances specified on an engineering drawing, e.g., 1 +/- 0.5, 5 +/- 10%, 2 +/- 1o, where % signifies percent and o signifies order of magnitude. Tags, necessary for error propagation, can be added by placing a quotation-mark-delimited tag after the tolerance, e.g., 0.7 +/- 20% 'T_effective'. In addition, tolerances might have different underlying distributions, e.g., Uniform, Normal, or Triangular, or the tolerances may merely be intervals due to lack of knowledge (uncertainty). Finally, to address pragmatic considerations such as older models that require specific number-field formats, C-style format specifiers can be appended to the tolerance like so, 1.35 +/- 10U_3.2f. As an example of use, consider figure 1, where a chemical reaction input file has been marked up to include tolerances and tags per table 1. Not only does the technique provide a natural method of specifying tolerances, but it also serves as in situ documentation of model uncertainties. This tolerance language comes with a utility to strip the tolerances (and tags) to provide a path to the nominal model parameter file. And, as shown in [1], having the ability to quickly mark and identify model parameter uncertainties facilitates error propagation, which in turn yields output uncertainties.
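A minimal sketch of parsing this style of markup, assuming a simplified grammar (nominal value, +/- tolerance with an optional % suffix, and an optional quoted tag); the exact UQ4SIM grammar, including distributions and format specifiers, may differ:

```python
import re

# Matches forms such as: 1 +/- 0.5, 5 +/- 10%, 0.7 +/- 20% 'T_effective'
TOL = re.compile(r"""(?P<nominal>[-+\d.eE]+)\s*\+/-\s*
                     (?P<tol>[-+\d.eE]+)(?P<unit>%?)\s*
                     (?:'(?P<tag>[^']+)')?""", re.VERBOSE)

def parse_tolerance(token):
    """Parse one marked-up value into its nominal value, absolute tolerance, and tag."""
    m = TOL.fullmatch(token.strip())
    if m is None:
        return None
    nominal, tol = float(m['nominal']), float(m['tol'])
    if m['unit'] == '%':
        tol = abs(nominal) * tol / 100.0
    return {'nominal': nominal, 'tolerance': tol, 'tag': m['tag']}

print(parse_tolerance("0.7 +/- 20% 'T_effective'"))
# -> nominal 0.7, absolute tolerance of about 0.14, tag 'T_effective'
```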
Subranging technique using superconducting technology
Gupta, Deepnarayan
2003-01-01
Subranging techniques using "digital SQUIDs" are used to design systems with large dynamic range, high resolution and large bandwidth. Analog-to-digital converters (ADCs) embodying the invention include a first SQUID based "coarse" resolution circuit and a second SQUID based "fine" resolution circuit to convert an analog input signal into "coarse" and "fine" digital signals for subsequent processing. In one embodiment, an ADC includes circuitry for supplying an analog input signal to an input coil having at least a first inductive section and a second inductive section. A first superconducting quantum interference device (SQUID) is coupled to the first inductive section and a second SQUID is coupled to the second inductive section. The first SQUID is designed to produce "coarse" (large amplitude, low resolution) output signals and the second SQUID is designed to produce "fine" (low amplitude, high resolution) output signals in response to the analog input signals.
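In a generic subranging converter (standard ADC notation, not necessarily the exact scheme of the SQUID implementation described), the coarse and fine codes are recombined so that the fine stage resolves the residue left by the coarse stage:

```latex
% Generic subranging reconstruction (illustrative):
\[
  \hat{V}_{\mathrm{in}} \approx \Delta_{\mathrm{c}}\, n_{\mathrm{c}} + \Delta_{\mathrm{f}}\, n_{\mathrm{f}},
  \qquad \Delta_{\mathrm{f}} = \frac{\Delta_{\mathrm{c}}}{2^{m}},
\]
% where n_c and n_f are the coarse and fine output codes, Delta_c and Delta_f their
% quantization steps, and m is the number of bits resolved by the fine stage.
```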
Visualizing the Topical Structure of the Medical Sciences: A Self-Organizing Map Approach
Skupin, André; Biberstine, Joseph R.; Börner, Katy
2013-01-01
Background We implement a high-resolution visualization of the medical knowledge domain using the self-organizing map (SOM) method, based on a corpus of over two million publications. While self-organizing maps have been used for document visualization for some time, (1) little is known about how to deal with truly large document collections in conjunction with a large number of SOM neurons, (2) post-training geometric and semiotic transformations of the SOM tend to be limited, and (3) no user studies have been conducted with domain experts to validate the utility and readability of the resulting visualizations. Our study makes key contributions to all of these issues. Methodology Documents extracted from Medline and Scopus are analyzed on the basis of indexer-assigned MeSH terms. Initial dimensionality is reduced to include only the top 10% most frequent terms and the resulting document vectors are then used to train a large SOM consisting of over 75,000 neurons. The resulting two-dimensional model of the high-dimensional input space is then transformed into a large-format map by using geographic information system (GIS) techniques and cartographic design principles. This map is then annotated and evaluated by ten experts stemming from the biomedical and other domains. Conclusions Study results demonstrate that it is possible to transform a very large document corpus into a map that is visually engaging and conceptually stimulating to subject experts from both inside and outside of the particular knowledge domain. The challenges of dealing with a truly large corpus come to the fore and require embracing parallelization and use of supercomputing resources to solve otherwise intractable computational tasks. Among the envisaged future efforts are the creation of a highly interactive interface and the elaboration of the notion of this map of medicine acting as a base map, onto which other knowledge artifacts could be overlaid. PMID:23554924
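A small-scale sketch of the core training step, using the MiniSom library on a toy term matrix (the study itself used indexer-assigned MeSH terms, over two million documents, and a map of more than 75,000 neurons):

```python
import numpy as np
from minisom import MiniSom

# Toy document-by-term matrix: rows are documents, columns are retained terms.
rng = np.random.default_rng(0)
doc_vectors = rng.random((500, 20))
doc_vectors /= np.linalg.norm(doc_vectors, axis=1, keepdims=True)

# A small SOM for illustration only.
som = MiniSom(x=30, y=30, input_len=20, sigma=2.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(doc_vectors)
som.train_random(doc_vectors, num_iteration=5000)

# Each document lands on its best-matching neuron, giving 2-D map coordinates
# that can then be rendered with GIS and cartographic tooling.
positions = np.array([som.winner(v) for v in doc_vectors])
print(positions[:5])
```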
TransGuide : model deployment initiative design report
DOT National Transportation Integrated Search
1998-09-01
This report documents the high-level design of the TransGuide MDI project and discusses the design trade-off decisions. A detailed, project-specific design is provided in each project's System Design document.
Warfighting Concepts to Future Weapon System Designs (WARCON)
2003-09-12
Only fragments of this excerpt are recoverable: software design documents and cost information may figure in litigation; deliverables include a material list and final engineering process maps; the design document describes the system as derived from the engineering design, software development, and the SRD; and it is important to establish a standard, formal design document early in the development phase, as software engineers produce the vision of the design effort.