Sample records for design methodology volume

  1. Transportation Energy Conservation Data Book: A Selected Bibliography. Edition 3,

    DTIC Science & Technology

    1978-11-01

    Charlottesville, VA 22901 TITLE: Computer-Based Resource Accounting Model TITLE: Methodology for the Design of Urban for Automobile Technology Impact...Evaluation System ACCOUNTING; INDUSTRIAL SECTOR; ENERGY (PIES) Documentation. Volume 6. CONSUMPTION; PERFORMANCE; DESIGN; WASTE HEAT; Methodology for... Methodology for the Design of Urban Transportation 000172 Energy Flows in the U.S., 1973 and 1974. Volume 1: Methodology. Update to the National Energy

  2. Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.

    DTIC Science & Technology

    1983-12-01

    4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." OCE Project Monitor was Mr. S. S. Gillespie. The...PREFACE...VOLUME I: STATE OF THE ART VARIABILITY OF AIRFIELD PAVEMENT MATERIALS; VOLUME II: MATHEMATICAL FORMULATION OF...VOLUME IV: PROBABILISTIC ANALYSIS OF RIGID AIRFIELD DESIGN BY ELASTIC LAYERED THEORY. VOLUME I: STATE OF THE ART VARIABILITY OF AIRFIELD PAVEMENT MATERIALS

  3. Early Childhood Longitudinal Study, Birth Cohort (ECLS-B): Methodology Report for the 9-Month Data Collection (2001-02). Volume 2: Sampling. NCES 2005-147

    ERIC Educational Resources Information Center

    Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry

    2005-01-01

    This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…

  4. Rural Schools Prototype Analysis. Volume II: Methodology. An Example Process of Identifying Determinants, Selecting Options, & Developing Schematic Designs.

    ERIC Educational Resources Information Center

    Construction Systems Management, Inc., Anchorage, AK.

    Volume II of a 3-volume report demonstrates the use of Design Determinants and Options (presented in Volume I) in the planning and design of small rural Alaskan secondary schools. Section I, a checklist for gathering site-specific information to be used as a data base for facility design, is organized in the same format as Volume I, which can be…

  5. OPUS: Optimal Projection for Uncertain Systems. Volume 1

    DTIC Science & Technology

    1991-09-01

    unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for...effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that...characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of

  6. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 1

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chinyere; Onyebueke, Landon

    1996-01-01

    This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.

  7. [Optimization of Polysaccharide Extraction from Spirodela polyrrhiza by Plackett-Burman Design Combined with Box-Behnken Response Surface Methodology].

    PubMed

    Jiang, Zheng; Wang, Hong; Wu, Qi-nan

    2015-06-01

    To optimize the processing of polysaccharide extraction from Spirodela polyrrhiza. Five factors related to the extraction rate of polysaccharide were optimized by the Plackett-Burman design. Based on this study, three factors, including alcohol volume fraction, extraction temperature and ratio of material to liquid, were regarded as investigation factors by Box-Behnken response surface methodology. The effect order of the three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza was as follows: extraction temperature, alcohol volume fraction, ratio of material to liquid. According to the Box-Behnken response, the best extraction conditions were: alcohol volume fraction of 81%, ratio of material to liquid of 1:42, extraction temperature of 100 degrees C, and extraction time of 60 min for four extractions. The Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study are effective and stable.
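
    The record above describes a Box-Behnken response-surface optimization. As a rough illustration of the underlying computation (not the authors' data or fitted model; the design points and response values below are hypothetical), a second-order surface can be fit to coded three-factor runs and searched for its predicted optimum:

```python
# Rough sketch of a Box-Behnken-style response-surface fit (hypothetical runs
# and responses, not the authors' data): fit a full second-order model to three
# coded factors and locate the predicted optimum by grid search.
import itertools
import numpy as np

def quadratic_terms(x1, x2, x3):
    """Full second-order model terms for three coded factors."""
    return [1.0, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

# Coded Box-Behnken design for three factors: 12 edge points plus 3 center runs.
design = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])
# Hypothetical extraction-rate responses for each run (placeholder numbers).
y = np.array([4.1, 4.6, 4.3, 4.9, 3.9, 4.4, 4.5, 5.0,
              4.0, 4.2, 4.6, 4.8, 5.2, 5.1, 5.3])

X = np.array([quadratic_terms(*row) for row in design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit of the surface

# Grid search over the coded region for the predicted optimum.
grid = np.linspace(-1.0, 1.0, 41)
best = max(itertools.product(grid, repeat=3),
           key=lambda p: float(np.dot(quadratic_terms(*p), coef)))
print("predicted optimum (coded factor settings):", best)
```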

  8. The Self-Concept. Volume 1, A Review of Methodological Considerations and Measuring Instruments. Revised Edition.

    ERIC Educational Resources Information Center

    Wylie, Ruth C.

    This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…

  9. Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...

  10. Architecture, Design, and System; Performance Assessment and Development Methodology for Computer-Based Systems. Volume 1. Methodology Description, Discussion, and Assessment,

    DTIC Science & Technology

    1983-12-30

    AD-A146 577: ARCHITECTURE, DESIGN, AND SYSTEM; PERFORMANCE ASSESSMENT AND DEVELOPMENT ME...(U) NAVAL SURFACE WEAPONS CENTER, SILVER SPRING, MD...NSWC TR 83-324: ARCHITECTURE, DESIGN, AND SYSTEM; PERFORMANCE ASSESSMENT AND DEVELOPMENT METHODOLOGY...REPORT NUMBER: NSWC TR 83-324. TITLE (and subtitle): ARCHITECTURE, DESIGN, AND SYSTEM; S

  11. N+3 Aircraft Concept Designs and Trade Studies. Volume 2; Appendices-Design Methodologies for Aerodynamics, Structures, Weight, and Thermodynamic Cycles

    NASA Technical Reports Server (NTRS)

    Greitzer, E. M.; Bonnefoy, P. A.; delaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Lovegren, J.

    2010-01-01

    Appendices A to F present the theory behind the TASOPT methodology and code. Appendix A describes the bulk of the formulation, while Appendices B to F develop the major sub-models for the engine, fuselage drag, BLI accounting, etc.

  12. C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1

    NASA Astrophysics Data System (ADS)

    Wilson, J. L.; Jolly, M. B.

    1984-01-01

    A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.

  13. SRB ascent aerodynamic heating design criteria reduction study, volume 1

    NASA Technical Reports Server (NTRS)

    Crain, W. K.; Frost, C. L.; Engel, C. D.

    1989-01-01

    An independent set of solid rocket booster (SRB) convective ascent design environments was produced which would serve as a check on the Rockwell IVBC-3 environments used to design the ascent phase of flight. In addition, support was provided for lowering the design environments such that Thermal Protection System (TPS), based on conservative estimates, could be removed, leading to a reduction in SRB refurbishment time and cost. Ascent convective heating rates and loads were generated at locations on the SRB where lowering the thermal environment would impact the TPS design. The ascent thermal environments are documented along with the wind tunnel/flight test data base used as well as the trajectory and environment generation methodology. The methodology, as well as environment summaries compared to the 1980 Design and Rockwell IVBC-3 Design Environments, is presented in this volume (Volume 1).

  14. Persian Basic Course: Volume I, Lesson 1-18.

    ERIC Educational Resources Information Center

    Defense Language Inst., Monterey, CA.

    The first of 10 volumes of a basic course in Persian, designed for use in the Defense Language Institute's intensive programs, is presented. The course, employing the audiolingual methodology, is designed to train native English speakers to level three proficiency in comprehension and speaking and level two proficiency in reading and writing…

  15. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  16. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  17. Site characterization methodology for aquifers in support of bioreclamation activities. Volume 2: Borehole flowmeter technique, tracer tests, geostatistics and geology. Final report, August 1987-September 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S.C.

    1993-08-01

    This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions, and the application of equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the bore hole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis and the guidelines for using groundwater models to design bioreclamation systems. Site characterization, Hydraulic conductivity, Groundwater flow, Geostatistics, Geohydrology, Monitoring wells.

  18. Study for the optimization of a transport aircraft wing for maximum fuel efficiency. Volume 1: Methodology, criteria, aeroelastic model definition and results

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.

    1985-01-01

    Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.

  19. Challenges and solutions for high-volume testing of silicon photonics

    NASA Astrophysics Data System (ADS)

    Polster, Robert; Dai, Liang Yuan; Oikonomou, Michail; Cheng, Qixiang; Rumley, Sebastien; Bergman, Keren

    2018-02-01

    The first generation of silicon photonic products is now commercially available. While silicon photonics possesses key economic advantages over classical photonic platforms, it has yet to become a commercial success because these advantages can be fully realized only when high-volume testing of silicon photonic devices is made possible. We discuss the costs, challenges, and solutions of photonic chip testing as reported in the recent research literature. We define and propose three underlying paradigms that should be considered when creating photonic test structures: Design for Fast Coupling, Design for Minimal Taps, and Design for Parallel Testing. We underline that a coherent test methodology must be established prior to the design of test structures, and demonstrate how an optimized methodology dramatically reduces the burden when designing for test, by reducing the needed complexity of test structures.

  20. Weapon System Costing Methodology for Aircraft Airframes and Basic Structures. Volume I. Technical Volume

    DTIC Science & Technology

    1975-06-01

    the Air Force Flight Dynamics Laboratory for use in conceptual and preliminary design phases of weapon system development. The methods are a...trade study method provides an iterative capability stemming from a direct interface with design synthesis programs. A detailed cost data base and...system for data expansion is provided. The methods are designed for ease in changing cost estimating relationships and estimating coefficients

  1. Tunnel and Station Cost Methodology Volume II: Stations

    DOT National Transportation Integrated Search

    1981-01-01

    The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...

  2. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  4. Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation

    NASA Technical Reports Server (NTRS)

    Garrison, J. M.

    1973-01-01

    The development of a weight/sizing design synthesis methodology for use in support of the mainline space shuttle program is discussed. The methodology has a minimum number of data inputs and quick turnaround capability. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystem trades on total system weight, and (3) determine the effects of weight on performance and performance on weight.

  5. Concept design and analysis of intermodal freight systems : volume II : Methodology and Results

    DOT National Transportation Integrated Search

    1980-01-01

    This report documents the concept design and analysis of intermodal freight systems. The primary objective of this project was to quantify the various tradeoffs and relationships between fundamental system design parameters and operating strategies, ...

  6. Deliverology

    ERIC Educational Resources Information Center

    Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen

    2017-01-01

    Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…

  7. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 1

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design. Presentations were made in the following areas of composite structural design: perspectives in composites; design methodology; design applications; design criteria; supporting technology; damage tolerance; and manufacturing.

  8. Volume, Conservation and Instruction: A Classroom Based Solomon Four Group Study of Conflict.

    ERIC Educational Resources Information Center

    Rowell, J. A.; Dawson, C. J.

    1981-01-01

    Summarizes a study to widen the applicability of Piagetian theory-based conflict methodology from individual situations to entire classes. A Solomon four group design was used to study effects of conflict instruction on students' (N=127) ability to conserve volume of noncompressible matter and to apply that knowledge to gas volume. (Author/JN)

  9. Lean for Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…

  10. Highway Safety Information System guidebook for the Maine state data files. Volume 1 : SAS file formats

    DOT National Transportation Integrated Search

    2012-05-05

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the ICM AMS methodology successfully and effectively. It provides a step-by-step approach to ...

  11. Descriptive Summaries of the Research Development Test & Evaluation. Army Appropriation FY 1984. Supporting Data FY 1984 Budget Estimate Submitted to Congress--February 1983. Volume I.

    DTIC Science & Technology

    1983-02-01

    successfully modeled to enhance future computer design simulations; (2) a new methodology for conducting dynamic analysis of vehicle mechanics was...to preliminary design methodology for tilt rotors, advancing blade concept configuration helicopters, and compound helicopters in conjunction with...feasibility of low-level personnel parachutes has been demonstrated. A study was begun to design a free-fall water container. An experimental program to

  12. Environmental exposure effects on composite materials for commercial aircraft

    NASA Technical Reports Server (NTRS)

    Hoffman, D. J.

    1978-01-01

    Activities reported include completion of the program design tasks, resolution of a high fiber volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and definition of the typical aircraft environment. Program design activities including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving lower numbers of bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has been centered on definition of the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.

  13. Design Science Methodology Applied to a Chemical Surveillance Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.

    Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  14. Six Sigma in Education

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.

    2017-01-01

    Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…

  15. Positive Deviance: Learning from Positive Anomalies

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick

    2017-01-01

    Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…

  16. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has been traditionally based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate to design complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. The deterministic approach may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.

  17. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 2

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design held at Lake Tahoe, Nevada, during 4-7 Nov. 1991. Presentations were made in the following areas of composite structural design: perspectives in composites, design methodology, design applications, design criteria, supporting technology, damage tolerance, and manufacturing.

  18. Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design, volume 3

    NASA Technical Reports Server (NTRS)

    Soderquist, Joseph R. (Compiler); Neri, Lawrence M. (Compiler); Bohon, Herman L. (Compiler)

    1992-01-01

    This publication contains the proceedings of the Ninth DOD/NASA/FAA Conference on Fibrous Composites in Structural Design held at Lake Tahoe, Nevada, during 4-7 Nov. 1991. Presentations were made in the following areas of composite structural design: perspectives in composites, design methodology, design applications, design criteria, supporting technology, damage tolerance, and manufacturing.

  19. Multi-Objective Optimization of Moving-magnet Linear Oscillatory Motor Using Response Surface Methodology with Quantum-Behaved PSO Operator

    NASA Astrophysics Data System (ADS)

    Lei, Meizhen; Wang, Liqiang

    2018-01-01

    To reduce the difficulty of manufacturing and increase the magnetic thrust density, a moving-magnet linear oscillatory motor (MMLOM) without inner stators was proposed. To get the optimal design of maximum electromagnetic thrust with minimal permanent magnet material, firstly, the 3D finite element analysis (FEA) model of the MMLOM was built and verified by comparison with prototype experiment results. Then the influence of the design parameters of the permanent magnet (PM) on the electromagnetic thrust was systematically analyzed by the 3D FEA to get the design parameters. Secondly, response surface methodology (RSM) was employed to build the response surface model of the new MMLOM, which can provide an analytical model of the PM volume and thrust. Then a multi-objective optimization method for the design parameters of the PM, using RSM with a quantum-behaved PSO (QPSO) operator, was proposed, along with a way to choose the best design parameters of the PM among the multi-objective optimization solution sets. The 3D FEA results for the optimal design candidates were then compared. The comparison showed that the proposed method can obtain the best combination of geometric parameters for reducing the PM volume and increasing the thrust.
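
    For readers unfamiliar with the quantum-behaved PSO operator mentioned in this record, the following is a minimal sketch of QPSO minimizing a stand-in scalarized objective; the objective function, bounds, and coefficients are hypothetical placeholders, not the paper's FEA-derived response surfaces:

```python
# Minimal QPSO sketch over a hypothetical two-objective scalarization
# (not the paper's response-surface models).
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Stand-in for the RSM models: penalize PM 'volume', reward 'thrust'."""
    volume = np.sum((x + 0.3) ** 2)          # hypothetical volume surrogate
    thrust = -np.sum(np.cos(2.0 * x))        # hypothetical (negated) thrust surrogate
    return 0.5 * volume + 0.5 * thrust       # simple weighted-sum scalarization

def qpso(fun, dim=3, n_particles=30, iters=200, lo=-1.0, hi=1.0):
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    pbest = x.copy()
    pval = np.array([fun(p) for p in pbest])
    gbest = pbest[pval.argmin()].copy()
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters                    # contraction-expansion coefficient
        mbest = pbest.mean(axis=0)                      # mean of personal bests
        phi = rng.uniform(size=(n_particles, dim))
        p = phi * pbest + (1.0 - phi) * gbest           # local attractor per particle
        u = rng.uniform(1e-12, 1.0, size=(n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        x = np.clip(p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u), lo, hi)
        vals = np.array([fun(xi) for xi in x])
        improved = vals < pval
        pbest[improved], pval[improved] = x[improved], vals[improved]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()

best_x, best_f = qpso(objective)
print("best coded design variables:", best_x, "objective:", best_f)
```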

  20. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.

  1. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 1: Concepts of Use, Initial System Requirements, Architecture, and AeroMACS Design Considerations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Isaacs, James; Henriksen, Steve; Zelkin, Natalie

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I (this document) is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.

  2. Volume, conservation and instruction: A classroom based solomon four group study of conflict

    NASA Astrophysics Data System (ADS)

    Rowell, J. A.; Dawson, C. J.

    The research reported is an attempt to widen the applicability of Piagetian theory-based conflict methodology from individual situations to whole classes. A Solomon four group experimental design, augmented by a delayed posttest, was used to provide a controlled framework for studying the effects of conflict instruction on Grade 8 students' ability to conserve volume of noncompressible matter, and to apply that knowledge to gas volume. The results, reported for individuals and groups, show the methodology can be effective, particularly when instruction is preceded by a pretest. Immediate posttest differences in knowledge of gas volume between spontaneous (pretest) conservers and instructed conservers of volume of noncompressible matter were no longer in evidence on the delayed posttest. This observation, together with the effects of pretesting and of the instructional sequence, is shown to have a consistent Piagetian interpretation. Practical implications are discussed.

  3. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  4. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability for each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  5. Family Day Care in the United States: Family Day Care Systems. Final Report of the National Day Care Home Study. Volume 5.

    ERIC Educational Resources Information Center

    Grasso, Janet; Fosburg, Steven

    Fifth in a series of seven volumes reporting the design, methodology, and findings of the 4-year National Day Care Home Study (NDCHS), this volume presents a descriptive and statistical analysis of the day care institutions that administer day care systems. These systems, such as Learning Unlimited in Los Angeles and the family day care program of…

  6. Fatigue criterion to system design, life and reliability

    NASA Technical Reports Server (NTRS)

    Zaretsky, E. V.

    1985-01-01

    A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on the work of W. Weibull and of G. Lundberg and A. Palmgren. The approach incorporates the computed life of elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis, allowing for life prediction and component or structural renewal rates with reasonable statistical certainty.
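
    The combination of elemental lives into a system life in this record follows the Weibull/Lundberg-Palmgren line of reasoning. A commonly used form of that combination is sketched below; the report's exact formulation may differ:

```latex
% Sketch of a Weibull-based life-combination rule in the spirit of the
% Lundberg-Palmgren approach (the report's exact formulation may differ):
% the system survives a life N only if every elemental stress volume survives,
% and, for a common Weibull slope e, the elemental lives L_i combine as follows.
\begin{align}
  S_{\mathrm{sys}}(N) &= \prod_{i=1}^{n} S_i(N), \\
  \left(\frac{1}{L_{\mathrm{sys}}}\right)^{e} &= \sum_{i=1}^{n} \left(\frac{1}{L_i}\right)^{e},
\end{align}
where $S_i(N)$ is the survival probability of elemental stress volume $i$ at life $N$
and $L_i$ is the life at which that element reaches a chosen reference survival probability.
```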

  7. Long-Term Bioeffects of 435-MHz Radiofrequency Radiation on Selected Blood-Borne Endpoints in Cannulated Rats. Volume 4. Plasma Catecholamines.

    DTIC Science & Technology

    1987-08-01

    out. To use each animal as its own control, arterial blood was sampled by means of chronically implanted aortic cannulas [12,13,14]. This simple...APPENDIX B: STATISTICAL METHODOLOGY. The balanced design of this experiment (requiring that 25 animals from each...protocol in that, in numerous cases, samples were collected at odd intervals (invalidating the orthogonality of the design) and the number of samples taken

  8. Planning criteria for express bus-fringe parking operations : volume I of express bus-fringe parking planning methodology.

    DOT National Transportation Integrated Search

    1975-01-01

    Tripmaker reactions to two recent express bus-fringe parking operations in Richmond and Norfolk-Virginia Beach, Virginia are examined. This travel behavior is interpreted to establish planning and design guidelines for locating and designing fringe lot...

  9. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    NASA Technical Reports Server (NTRS)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  10. Learning from Research on Teaching: Perspective, Methodology, and Representation. Advances in Research on Teaching. Volume 11

    ERIC Educational Resources Information Center

    Brophy, Jere, Ed.; Pinnegar, Stefinee, Ed.

    2005-01-01

    This volume is designed to accomplish three primary purposes: (1) illustrate a variety of qualitative methods that researchers have used to study teaching and teacher education; (2) assess the affordances and constraints of these methods and the ways that they focus and shape explorations of teaching; and (3) illuminate representative questions…

  11. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  12. Design-Based Implementation Research

    ERIC Educational Resources Information Center

    LeMahieu, Paul G.; Nordstrum, Lee E.; Potvin, Ashley Seidel

    2017-01-01

    Purpose: This paper is the second of seven in this volume elaborating different approaches to quality improvement in education. It delineates a methodology called design-based implementation research (DBIR). The approach used in this paper is aimed at iteratively improving the quality of classroom teaching and learning practices in defined problem…

  13. Military Personnel Attrition and Retention: Research in Progress

    DTIC Science & Technology

    1981-10-01

    designed to assess the relative risks of attrition among non-high school graduate Army enlistees. As a group, these individuals historically...this methodology see the Majchrzak paper in this volume). Research in the Planning Stage. Stage IV: Administrative Experimentation. In order to derive...different research designs: 1. True experimental designs; 2. Correlational research designs; 3. Quasi-experimental research designs. We rejected the idea

  14. C-Band Airport Surface Communications System Standards Development. Phase II Final Report. Volume 2: Test Bed Performance Evaluation and Final AeroMACS Recommendations

    NASA Technical Reports Server (NTRS)

    Hall, Edward; Magner, James

    2011-01-01

    This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.

  15. Application of the Response Surface Methodology to Optimize the Fermentation Parameters for Enhanced Docosahexaenoic Acid (DHA) Production by Thraustochytrium sp. ATCC 26185.

    PubMed

    Wu, Kang; Ding, Lijian; Zhu, Peng; Li, Shuang; He, Shan

    2018-04-22

    The aim of this study was to determine the cumulative effect of fermentation parameters and enhance the production of docosahexaenoic acid (DHA) by Thraustochytrium sp. ATCC 26185 using response surface methodology (RSM). Among the eight variables screened for effects of fermentation parameters on DHA production by Plackett-Burman design (PBD), the initial pH, inoculum volume, and fermentation volume were found to be most significant. The Box-Behnken design was applied to derive a statistical model for optimizing these three fermentation parameters for DHA production. The optimal parameters for maximum DHA production were initial pH: 6.89, inoculum volume: 4.16%, and fermentation volume: 140.47 mL, respectively. The maximum yield of DHA production was 1.68 g/L, which was in agreement with predicted values. An increase in DHA production was achieved by optimizing the initial pH, fermentation, and inoculum volume parameters. This optimization strategy led to a significant increase in the amount of DHA produced, from 1.16 g/L to 1.68 g/L. Thraustochytrium sp. ATCC 26185 is a promising resource for microbial DHA production due to the high-level yield of DHA that it produces, and the capacity for large-scale fermentation of this organism.
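
    The screening step in this record uses a Plackett-Burman design. A minimal sketch of how such a 12-run design is constructed and how main effects are ranked is shown below; the factor responses are hypothetical, not the study's data:

```python
# Minimal Plackett-Burman screening sketch (hypothetical responses, not the
# study's data): build the 12-run design from its standard generator row and
# estimate main effects for the first 8 of the 11 available factor columns.
import numpy as np

generator = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])  # PB12 generator row
rows = [np.roll(generator, k) for k in range(11)]                    # all cyclic shifts
design = np.vstack(rows + [-np.ones(11, dtype=int)])                 # plus the all-minus row

n_factors = 8                      # e.g., initial pH, inoculum volume, fermentation volume, ...
X = design[:, :n_factors]
# Hypothetical DHA yields (g/L) for the 12 runs.
y = np.array([1.1, 0.9, 1.4, 1.2, 0.8, 1.3, 1.0, 1.5, 0.7, 1.2, 0.9, 1.0])

# Main effect of each factor: mean response at +1 minus mean response at -1.
effects = np.array([y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
                    for j in range(n_factors)])
for j in np.argsort(-np.abs(effects)):
    print(f"factor {j + 1}: effect = {effects[j]:+.3f}")
```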

  16. Advanced piloted aircraft flight control system design methodology. Volume 2: The FCX flight control design expert system

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Mcruer, Duane T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.

  17. Architectural and Behavioral Systems Design Methodology and Analysis for Optimal Habitation in a Volume-Limited Spacecraft for Long Duration Flights

    NASA Technical Reports Server (NTRS)

    Kennedy, Kriss J.; Lewis, Ruthan; Toups, Larry; Howard, Robert; Whitmire, Alexandra; Smitherman, David; Howe, Scott

    2016-01-01

    As our human spaceflight missions change as we reach towards Mars, the risk of an adverse behavioral outcome increases, and requirements for crew health, safety, and performance, and the internal architecture, will need to change to accommodate unprecedented mission demands. Evidence shows that architectural arrangement and habitability elements impact behavior. Net habitable volume is the volume available to the crew after accounting for elements that decrease the functional volume of the spacecraft. Determination of minimum acceptable net habitable volume and associated architectural design elements, as mission duration and environment vary, is key to enabling, maintaining, and/or enhancing human performance and psychological and behavioral health. Current NASA efforts to derive minimum acceptable net habitable volumes and study the interaction of covariates and stressors, such as sensory stimulation, communication, autonomy, and privacy, and application to internal architecture design layouts, attributes, and use of advanced accommodations will be presented. Furthermore, implications of crew adaptation to available volume as they transfer from Earth accommodations, to deep space travel, to planetary surface habitats, and return, will be discussed.

  18. Hypersonics. Volume 1 - Defining the hypersonic environment; Proceedings of the First Joint Europe/U.S. Short Course on Hypersonics, Paris, France, Dec. 7-11, 1987

    NASA Astrophysics Data System (ADS)

    Bertin, John J.; Glowinski, Roland; Periaux, Jacques

    1989-05-01

    The present work discusses the general characterization of hypersonic flows, the hypersonic phenomena to be encountered by the Hermes spacecraft, industrial methodologies for the design of hypersonic vehicles, the definition of aerodynamic methodology, and hypersonic airbreathing-propulsion vehicle design practices applicable to the U.S. National Aerospace Plane. Also discussed are real gas effects in the hypersonic regime, the influence of thermochemistry and of nonequilibrium and surface catalysis on hypersonic vehicle design, the modelling of nonequilibrium effects in high speed flows, air-dissociation thermochemistry, and rarefied gas dynamics effects for spacecraft.

  19. External tank aerothermal design criteria verification, volume 1

    NASA Technical Reports Server (NTRS)

    Crain, William K.; Frost, Cynthia; Warmbrod, John

    1990-01-01

    The objective of this study was to produce an independent set of ascent environments which would serve as a check on the Rockwell IVBC-3 environments and provide an independent reevaluation of the thermal design criteria for the External Tank (ET). Design heating rates and loads were calculated at 367 acreage body point locations. Ascent flight regimes covered were lift-off, first stage ascent, Solid Rocket Booster (SRB) staging and second stage ascent through ET separation. The purpose here is to document these results, briefly describe the methodology used and present the environments along with a comparison with the Rockwell IVBC-3 counterpart. The methodology and environment summaries are given.

  20. Parent and Community Involvement in Education. Volume III: Technical Appendix--Research Design and Methodology. Studies of Education Reform.

    ERIC Educational Resources Information Center

    Anderson, Beckie; And Others

    Genuine educational reform depends on developing relationships with the home, community groups, politicians, and the business community (Seeley, 1981). This report is the third of three volumes that are products of a 3.5 year study of education reform, with a focus on the role of parent, family, and community involvement in the middle grades. The…

  1. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or...level graphics language. 7.4 User Interface Design Tools. As user interfaces have become more sophisticated, they have become harder to develop. Xl...Stephen M. Pizer, editors. Proceedings 1986 Workshop on Interactive 3D Graphics, October 1986. 18. J. S. Dumas. Designing User Interface Software. Prentice

  2. Design optimum frac jobs using virtual intelligence techniques

    NASA Astrophysics Data System (ADS)

    Mohaghegh, Shahab; Popa, Andrei; Ameri, Sam

    2000-10-01

    Designing optimal frac jobs is a complex and time-consuming process. It usually involves the use of a two- or three-dimensional computer model. For the computer models to perform as intended, a wealth of input data is required. The input data includes wellbore configuration and reservoir characteristics such as porosity, permeability, stress and thickness profiles of the pay layers as well as the overburden layers. Among other essential information required for the design process is fracturing fluid type and volume, proppant type and volume, injection rate, proppant concentration and frac job schedule. Some of the parameters such as fluid and proppant types have discrete possible choices. Other parameters such as fluid and proppant volume, on the other hand, assume values from within a range of minimum and maximum values. A potential frac design for a particular pay zone is a combination of all of these parameters. Finding the optimum combination is not a trivial process. It usually requires an experienced engineer and a considerable amount of time to tune the parameters in order to achieve a desirable outcome. This paper introduces a new methodology that integrates two virtual intelligence techniques, namely, artificial neural networks and genetic algorithms, to automate and simplify the optimum frac job design process. This methodology requires little input from the engineer beyond the reservoir characterizations and wellbore configuration. The software tool that has been developed based on this methodology uses the reservoir characteristics and an optimization criterion indicated by the engineer, for example a certain propped frac length, and provides the details of the optimum frac design that will result in the specified criterion. An ensemble of neural networks is trained to mimic the two- or three-dimensional frac simulator. Once successfully trained, these networks are capable of providing instantaneous results in response to any set of input parameters. These networks will be used as the fitness function for a genetic algorithm routine that will search for the best combination of the design parameters for the frac job. The genetic algorithm will search through the entire solution space and identify the optimal combination of parameters to be used in the design process. Considering the complexity of this task, this methodology converges relatively fast, providing the engineer with several near-optimum scenarios for the frac job design. These scenarios, which can be achieved in just a minute or two, can be valuable initial points for the engineer to start his/her design job and save him/her hours of runs on the simulator.
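
    A compact sketch of the surrogate-plus-genetic-algorithm idea described above is given below. The "simulator", training data, GA settings, and target frac length are all hypothetical stand-ins, and a single generic scikit-learn regressor replaces the paper's trained neural-network ensemble:

```python
# Sketch only (synthetic data, not the paper's frac simulator or trained ensemble):
# a neural-network surrogate stands in for the frac model and serves as the
# fitness function of a small genetic algorithm over normalized design parameters.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def simulator(X):
    """Synthetic 'propped frac length' as an unknown function of three inputs."""
    return 300*X[:, 0] + 200*np.sqrt(X[:, 1]) + 80*np.sin(3*X[:, 2]) + rng.normal(0, 5, len(X))

X_train = rng.uniform(0, 1, size=(500, 3))
y_train = simulator(X_train)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)           # feature scaling omitted for brevity

target_length = 450.0                     # hypothetical design criterion
def fitness(pop):
    """Closeness of the surrogate-predicted length to the target."""
    return -np.abs(surrogate.predict(pop) - target_length)

# Tiny genetic algorithm: truncation selection, blend crossover, Gaussian mutation.
pop = rng.uniform(0, 1, size=(60, 3))
for _ in range(40):
    parents = pop[np.argsort(-fitness(pop))[:20]]        # keep the best third
    idx_a, idx_b = rng.integers(0, 20, 60), rng.integers(0, 20, 60)
    alpha = rng.uniform(size=(60, 1))
    children = alpha*parents[idx_a] + (1 - alpha)*parents[idx_b]   # blend crossover
    children += rng.normal(0, 0.05, children.shape)                # mutation
    pop = np.clip(children, 0, 1)

best = pop[np.argmax(fitness(pop))]
print("near-optimum normalized design parameters:", best,
      "predicted length:", float(surrogate.predict(best[None, :])[0]))
```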

  3. Configuration evaluation and criteria plan. Volume 1: System trades study and design methodology plan (preliminary). Space Transportation Main Engine (STME) configuration study

    NASA Technical Reports Server (NTRS)

    Bair, E. K.

    1986-01-01

    The System Trades Study and Design Methodology Plan is used to conduct trade studies to define the combination of Space Shuttle Main Engine features that will optimize candidate engine configurations. This is accomplished by using vehicle sensitivities and engine parametric data to establish engine chamber pressure and area ratio design points for candidate engine configurations. Engineering analyses are to be conducted to refine and optimize the candidate configurations at their design points. The optimized engine data and characteristics are then evaluated and compared against other candidates being considered. The Evaluation Criteria Plan is then used to compare and rank the optimized engine configurations on the basis of cost.

  4. Development of Fatigue and Crack Propagation Design and Analysis Methodology in a Corrosive Environment for Typical Mechanically-Fastened Joints. Volume 3. Phase II Documentation.

    DTIC Science & Technology

    1984-10-01

    Fig. A-2: True Stress Versus Plastic Strain for Cyclic Response.

  5. NPAC-Nozzle Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Barnhart, Paul J.

    1997-01-01

    A simple and accurate nozzle performance analysis methodology has been developed. The geometry modeling requirements are minimal and very flexible, thus allowing rapid design evaluations. The solution techniques accurately couple: continuity, momentum, energy, state, and other relations which permit fast and accurate calculations of nozzle gross thrust. The control volume and internal flow analyses are capable of accounting for the effects of: over/under expansion, flow divergence, wall friction, heat transfer, and mass addition/loss across surfaces. The results from the nozzle performance methodology are shown to be in excellent agreement with experimental data for a variety of nozzle designs over a range of operating conditions.
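
    To make the control-volume thrust bookkeeping concrete, the following is an idealized isentropic estimate of gross thrust for a choked converging-diverging nozzle. It is not the NPAC formulation (which also accounts for divergence, wall friction, heat transfer, and mass addition across surfaces), and the operating conditions are hypothetical:

```python
# Idealized isentropic control-volume estimate of nozzle gross thrust
# (a sketch, not the NPAC methodology; conditions are hypothetical).
import math

def area_ratio(M, g):
    """Isentropic area ratio A/A* as a function of Mach number."""
    return (1.0 / M) * ((2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * M * M)) ** (
        (g + 1.0) / (2.0 * (g - 1.0)))

def exit_mach(ar, g, lo=1.0001, hi=50.0):
    """Supersonic exit Mach number for a given area ratio, found by bisection."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid, g) < ar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def gross_thrust(pt, Tt, A_throat, ar, p_amb, g=1.33, R=287.0):
    """Gross thrust from exit momentum plus the pressure-area term."""
    Me = exit_mach(ar, g)
    Te = Tt / (1.0 + 0.5 * (g - 1.0) * Me * Me)
    pe = pt * (Te / Tt) ** (g / (g - 1.0))
    Ve = Me * math.sqrt(g * R * Te)
    # Choked mass flow through the throat.
    mdot = (pt * A_throat / math.sqrt(Tt) * math.sqrt(g / R)
            * ((g + 1.0) / 2.0) ** (-(g + 1.0) / (2.0 * (g - 1.0))))
    return mdot * Ve + (pe - p_amb) * (ar * A_throat)

# Hypothetical case: 6 bar total pressure, 1600 K, 0.05 m^2 throat, area ratio 4.
print(f"gross thrust ~ {gross_thrust(6.0e5, 1600.0, 0.05, 4.0, 1.0e5):.0f} N")
```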

  6. Early Childhood Reform in Seven Communities: Front-Line Practice, Agency Management, and Public Policy. Volume III: Technical Appendix--Research Design and Methodology. Studies of Education Reform.

    ERIC Educational Resources Information Center

    Lopez, Elena

    The administration and funding of early childhood education programs have engendered recent federal policy debates. This volume is the third report in a series of three, which are derived from a study that examined how local organizations implement complex government programs for early childhood education. The study analyzed and documented…

  7. Aroma profile design of wine spirits: Multi-objective optimization using response surface methodology.

    PubMed

    Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco

    2018-04-15

    Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser constitute an innovative experimental system that allows fast and flexible management of rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart cut enhances the overall positive aroma compounds of the product while reducing off-flavor compounds. In contrast, the optimum levels of the heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavor reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
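
    The desirability-function step mentioned in this record combines several fitted responses into a single score to be maximized. The sketch below shows only that step, in coded factor units; the two response surfaces (`esters` and `off_flavors`), their coefficients and the desirability bounds are invented for illustration and are not the models fitted in the study.

```python
"""Hedged sketch of Derringer-style desirability scoring over hypothetical responses."""
import math


def d_maximize(y, low, high):
    """Desirability for a larger-is-better response, clipped to [0, 1]."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)


def d_minimize(y, low, high):
    """Desirability for a smaller-is-better response, clipped to [0, 1]."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return (high - y) / (high - low)


def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    return math.prod(ds) ** (1.0 / len(ds))


# Hypothetical fitted response surfaces in coded factor units (-1 .. +1)
def esters(x):       # fruity esters, to maximize
    return 50 + 12 * x["heart_cut"] - 8 * x["cooling"] - 5 * x["heart_cut"] ** 2


def off_flavors(x):  # head off-flavors, to minimize
    return 20 - 6 * x["head_cut"] + 4 * x["cooling"] + 3 * x["head_cut"] ** 2


best, best_D = None, -1.0
steps = [i / 10.0 for i in range(-10, 11)]
for hc in steps:
    for hd in steps:
        for cl in steps:
            x = {"heart_cut": hc, "head_cut": hd, "cooling": cl}
            D = overall_desirability([
                d_maximize(esters(x), 30.0, 70.0),
                d_minimize(off_flavors(x), 10.0, 30.0),
            ])
            if D > best_D:
                best, best_D = x, D

print(best, round(best_D, 3))
```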

  8. Development of estimation methodology for bicycle and pedestrian volumes based on existing counts.

    DOT National Transportation Integrated Search

    2013-10-01

    The Colorado Department of Transportation (CDOT) adopted the Bicycle and Pedestrian Policy directive in 2009, stating that "...the needs of bicyclists and pedestrians shall be included in the planning, design, and operation of transportation facil...

  9. The Ohio River Basin energy facility siting model. Volume 1: Methodology

    NASA Astrophysics Data System (ADS)

    Fowler, G. L.; Bailey, R. E.; Gordon, S. I.; Jansen, S. D.; Randolph, J. C.; Jones, W. W.

    1981-04-01

    The siting model developed for ORBES is specifically designed for regional policy analysis. The region includes 423 counties in an area that consists of all of Kentucky and substantial portions of Illinois, Indiana, Ohio, Pennsylvania, and West Virginia.

  10. A Review of Online Evidence-based Practice Point-of-Care Information Summary Providers

    PubMed Central

    Liberati, Alessandro; Moschetti, Ivan; Tagliabue, Ludovica; Moja, Lorenzo

    2010-01-01

    Background Busy clinicians need easy access to evidence-based information to inform their clinical practice. Publishers and organizations have designed specific tools to meet doctors’ needs at the point of care. Objective The aim of this study was to describe online point-of-care summaries and evaluate their breadth, content development, and editorial policy against their claims of being “evidence-based.” Methods We searched Medline, Google, librarian association websites, and information conference proceedings from January to December 2008. We included English Web-based point-of-care summaries designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, evidence-based information to clinicians. Two investigators independently extracted data on the general characteristics and content presentation of summaries. We assessed and ranked point-of-care products according to: (1) coverage (volume) of medical conditions, (2) editorial quality, and (3) evidence-based methodology. We explored how these factors were associated. Results We retrieved 30 eligible summaries. Of these products, 18 met our inclusion criteria and were qualitatively described, and 16 provided sufficient data for quantitative evaluation. The median volume of medical conditions covered was 80.6% (interquartile range, 68.9% - 84.2%) and varied for the different products. Similarly, differences emerged for editorial policy (median 8.0, interquartile range 5.8 - 10.3) and evidence-based methodology scores (median 10.0, interquartile range 1.0 - 12.8) on a 15-point scale. None of these dimensions turned out to be significantly associated with the other dimensions (editorial quality and volume, Spearman rank correlation r = -0.001, P = .99; evidence-based methodology and volume, r = -0.19, P = .48; editorial and evidence-based methodology, r = 0.43, P =.09). Conclusions Publishers are moving to develop point-of-care summary products. Some of these have better profiles than others, and there is room for improved reporting of the strengths and weaknesses of these products. PMID:20610379

  11. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data are from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data were taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data, both characteristic velocity efficiencies and energy release efficiencies, are presented for those tests of sufficient duration to record steady-state values.

  12. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling

    PubMed Central

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-01-01

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows an actuator to be defined for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding the active volume and the electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370
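
    As a loose illustration of the coupled sizing loop such a methodology implies, the sketch below sweeps two design variables and keeps the feasible candidate with the highest force density. The closed-form expressions for force, copper loss and thermal resistance are hypothetical surrogates standing in for the quasi-static parametric finite-element models used in the paper, and all coefficients and limits are invented.

```python
"""Illustrative coupled thermal-electromagnetic sizing loop (not the paper's models)."""

T_AMBIENT = 40.0    # deg C
T_MAX = 120.0       # maximum winding temperature for continuous operation, deg C
FORCE_SPEC = 120.0  # required axial force, N


def evaluate(current_density, active_length):
    """Return (force, active volume, steady-state temperature) for one candidate."""
    bore_area = 1.2e-3                                     # m^2, fixed cross-section
    volume = bore_area * active_length                     # active volume, m^3
    force = 0.2 * current_density * volume                 # surrogate for F ~ k*B*J*V
    copper_loss = 1.7e-8 * current_density ** 2 * volume   # P = rho_cu * J^2 * V
    r_thermal = 0.09 / active_length                       # K/W, drops as area grows
    temperature = T_AMBIENT + copper_loss * r_thermal
    return force, volume, temperature


best = None
for j in [k * 0.5e6 for k in range(2, 21)]:                # J: 1 to 10 A/mm^2
    for length in [k * 0.01 for k in range(5, 16)]:        # active length: 50-150 mm
        force, volume, temp = evaluate(j, length)
        if temp <= T_MAX and force >= FORCE_SPEC:          # thermal and force constraints
            force_density = force / volume
            if best is None or force_density > best[0]:
                best = (force_density, j, length, temp)

print("force density, J, length, temperature:", best)
```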

  13. Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling.

    PubMed

    Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro

    2016-03-11

    This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows an actuator to be defined for given specifications in a step-by-step way so that requirements are met and the temperature within the device is kept at or below the maximum allowed for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for both cases differ significantly, especially regarding the active volume and the electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator.

  14. United States Metric Board. A Study of Metric Measurement and Legislation. Volume 1.

    DTIC Science & Technology

    1979-09-10

    Excerpts from the report outline: Legal Advisory Panel (A. Panel Membership; B. Role of the Panel); Data Collection Methodology (A. Basic Research; B. Computer...). First, the Panel was involved in a review of the overall study design. Second, the Panel reviewed the various change mechanisms which were identified... collection methodology. X summarizes the relevant experiences of Canada and Australia.

  15. Methodology for the systems engineering process. Volume 1: System functional activities

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    Systems engineering is examined in terms of the functional activities that are performed in the conduct of a system definition/design, and system development is described in a parametric analysis that combines functions, performance, and design variables. Emphasis is placed on identification of activities performed by design organizations and design specialty groups, as well as by a central systems engineering organizational element. Identification of specific roles and responsibilities for performing functions and for monitoring and controlling activities within the system development operation is also emphasized.

  16. Life cycle design and design management strategies in fashion apparel manufacturing

    NASA Astrophysics Data System (ADS)

    Tutia, R.; Mendes, FD; Ventura, A.

    2017-10-01

    The generation of solid textile waste during product development and clothing production causes serious damage to the environment and must be minimized. The greatest volume of textile residue is generated by the cutting department, in the form of parings and snips of fabric that are not used in the productive process (Milan et al., 2007). One way to conceive new, environmentally conscious products is to adopt a methodology based on Life Cycle Design (LCD) and Design Management.

  17. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  18. Rapid Airplane Parametric Input Design (RAPID)

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.; Bloor, Malcolm I. G.; Wilson, Michael J.; Thomas, Almuttil M.

    2004-01-01

    An efficient methodology is presented for defining a class of airplane configurations. Inclusive in this definition are surface grids, volume grids, and grid sensitivity. A small set of design parameters and grid control parameters govern the process. The general airplane configuration has wing, fuselage, vertical tail, horizontal tail, and canard components. The wing, tail, and canard components are manifested by solving a fourth-order partial differential equation subject to Dirichlet and Neumann boundary conditions. The design variables are incorporated into the boundary conditions, and the solution is expressed as a Fourier series. The fuselage has circular cross section, and the radius is an algebraic function of four design parameters and an independent computational variable. Volume grids are obtained through an application of the Control Point Form method. Grid sensitivity is obtained by applying the automatic differentiation precompiler ADIFOR to software for the grid generation. The computed surface grids, volume grids, and sensitivity derivatives are suitable for a wide range of Computational Fluid Dynamics simulation and configuration optimizations.
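
    To make the fuselage step concrete, the sketch below assembles a surface grid for a circular-cross-section body whose radius varies along the axis. The `radius` function and its parameters are hypothetical; RAPID's actual algebraic radius function of four design parameters is not reproduced here.

```python
"""Hypothetical body-of-revolution surface grid, loosely in the spirit of RAPID."""
import math


def radius(x, nose_len=3.0, r_max=1.5, tail_len=5.0, body_len=20.0):
    """Piecewise-smooth radius distribution along the body axis (illustrative)."""
    if x < nose_len:                                   # blended nose
        return r_max * math.sin(0.5 * math.pi * x / nose_len)
    if x > body_len - tail_len:                        # closing tail cone
        t = (body_len - x) / tail_len
        return r_max * math.sin(0.5 * math.pi * t)
    return r_max                                       # constant mid-body


def fuselage_surface_grid(n_axial=41, n_circ=25, body_len=20.0):
    """Return a list of (x, y, z) grid points on the circular-section body."""
    grid = []
    for i in range(n_axial):
        x = body_len * i / (n_axial - 1)
        r = radius(x, body_len=body_len)
        for j in range(n_circ):
            theta = 2.0 * math.pi * j / (n_circ - 1)
            grid.append((x, r * math.cos(theta), r * math.sin(theta)))
    return grid


points = fuselage_surface_grid()
print(len(points), points[0], points[-1])
```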

  19. Multicultural Prism: Voices from the Field. Volume 3.

    ERIC Educational Resources Information Center

    Adams, J. Q., Ed.; Welsch, Janice R., Ed.

    Focusing on multicultural issues, this collection presents essays and syllabi for courses in the fields of teacher education, composition, psychology, music, public health, and counselor education/college student personnel designed to prompt educators to make those courses more inclusive in both content and methodology. The essays and syllabi…

  20. Photovoltaic system criteria documents. Volume 5: Safety criteria for photovoltaic applications

    NASA Technical Reports Server (NTRS)

    Koenig, John C.; Billitti, Joseph W.; Tallon, John M.

    1979-01-01

    A methodology is described for determining potential safety hazards involved in the construction and operation of photovoltaic power systems, and guidelines are provided for the implementation of safety considerations in the specification, design, and operation of photovoltaic systems. Safety verification procedures for use in solar photovoltaic systems are established.

  1. Bibliography on Cold Regions Science and Technology. Volume 44, Part 1, 1990

    DTIC Science & Technology

    1990-12-01


  2. Combat Service Support MOD II Design (CSS MOD II). Volume 2. Appendixes

    DTIC Science & Technology

    1986-10-01

    ... through contractual agreement with Michael Jackson, Ltd., London. The spelling, syntax, and word usage adopted throughout the document have been made ... to conform to Army standards. The design team members wish to thank the following individuals for their assistance: Mr. John Cameron, Michael Jackson, Ltd. ... The Jackson Structured Programming (JSP) methodology is a product of Michael Jackson Systems, Ltd., London, England. In 1984, Dr. Wilbur Payne, Director, TRADOC Operations Research ...

  3. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  4. Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.

    PubMed

    Stern, Cindy; Chur-Hansen, Anna

    2013-02-27

    This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.

  5. A methodology for designing aircraft to low sonic boom constraints

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.; Needleman, Kathy E.

    1991-01-01

    A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.

  6. Recommended System Design for the Occupational Health Management Information System (OHMIS). Volume 1.

    DTIC Science & Technology

    1983-04-01

    Management Information System (OHMIS). The system design includes: detailed function data flows for each of the core data processing functions of OHMIS, in the form of input/processing/output algorithms; detailed descriptions of the inputs and outputs; performance specifications of OHMIS; resources required to develop and operate OHMIS (Vol II). In addition, the report provides a summary of the rationale used to develop the recommended system design, a description of the methodology used to develop the recommended system design, and a review of existing

  7. Design of pellet surface grooves for fission gas plenum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, T.J.; Jones, L.R.; Macici, N.

    1986-01-01

    In the Canada deuterium uranium pressurized heavy water reactor, short (50-cm) Zircaloy-4 clad bundles are fueled on-power. Although internal void volume within the fuel rods is adequate for the present once-through natural uranium cycle, the authors have investigated methods for increasing the internal gas storage volume needed in high-power, high-burnup, experimental ceramic fuels. This present work sought to prove the methodology for design of gas storage volume within the fuel pellets - specifically the use of grooves pressed or machined into the relatively cool pellet/cladding interface. Preanalysis and design of pellet groove shape and volume was accomplished using the TRUMP heat transfer code. Postirradiation examination (PIE) was used to check the initial design and heat transfer assumptions. Fission gas release was found to be higher for the grooved pellet rods than for the comparison rods with hollow or unmodified pellets. This had been expected from the initial TRUMP thermal analyses. The ELESIM fuel modeling code was used to check in-reactor performance, but some modifications were necessary to accommodate the loss of heat transfer surface to the grooves. It was concluded that for plenum design purposes, circumferential pellet grooves could be adequately modeled by the codes TRUMP and ELESIM.

  8. AAHPER Research Consortium Symposium Papers: Socio-Psychological Dimensions, Research Design and Safety. Volume II, Book 3.

    ERIC Educational Resources Information Center

    Cox, Richard H., Ed.

    This collection focuses on research topics in physical education and athletics and includes the following papers: Methodological Problems in the Assessment of Personality from the Psychoanalytic, Behavioral and Cognitive Positions; Some Factors Affecting the Performance of Women in Sports and Activity; Multivariate Considerations in Children's…

  9. HTGR plant availability and reliability evaluations. Volume I. Summary of evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, G.J.; Hannaman, G.W.; Jacobsen, F.K.

    1976-12-01

    The report (1) describes a reliability assessment methodology for systematically locating and correcting areas which may contribute to unavailability of new and uniquely designed components and systems, (2) illustrates the methodology by applying it to such components in a high-temperature gas-cooled reactor (Public Service Company of Colorado's Fort St. Vrain 330-MW(e) HTGR), and (3) compares the results of the assessment with actual experience. The methodology can be applied to any component or system; however, it is particularly valuable for assessments of components or systems which provide essential functions, or the failure or mishandling of which could result in relatively large economic losses.

  10. Optimization of cyanide extraction from wastewater using emulsion liquid membrane system by response surface methodology.

    PubMed

    Xue, Juan Qin; Liu, Ni Na; Li, Guo Ping; Dang, Long Tao

    To solve the disposal problem of cyanide wastewater, removal of cyanide from wastewater using a water-in-oil emulsion type of emulsion liquid membrane (ELM) was studied in this work. Specifically, the effects of the surfactant Span-80, the carrier trioctylamine (TOA), the stripping agent NaOH solution and the emulsion-to-external-phase volume ratio on removal of cyanide were investigated. Removal of total cyanide was determined using the silver nitrate titration method. Regression analysis and optimization of the conditions were conducted using the Design-Expert software and response surface methodology (RSM). The actual cyanide removals and the removals predicted using RSM analysis were in close agreement, and the optimal conditions were determined to be as follows: the volume fraction of Span-80, 4% (v/v); the volume fraction of TOA, 4% (v/v); the concentration of NaOH, 1% (w/v); and the emulsion-to-external-phase volume ratio, 1:7. Under the optimum conditions, the removal of total cyanide was 95.07% versus an RSM-predicted removal of 94.90%, a small deviation. The treatment of cyanide wastewater using an ELM is an effective technique for application in industry.
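
    The sketch below illustrates the RSM step in isolation: a second-order model is fitted by least squares to a small central composite design and the predicted optimum is located on a grid of coded levels. The two factors shown and every response value are invented for illustration; they are not the Design-Expert models or data of the study.

```python
"""Hedged RSM sketch: quadratic fit to a hypothetical central composite design."""
import numpy as np

a = 1.414  # axial distance of the central composite design (coded units)
X = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],        # factorial points
    [-a, 0], [a, 0], [0, -a], [0, a],          # axial points
    [0, 0], [0, 0],                            # center points
])
# Hypothetical total-cyanide removals (%), one per run, same order as X
y = np.array([86.2, 90.1, 84.5, 92.3, 85.0, 91.8, 88.4, 87.1, 94.6, 95.0])


def model_matrix(X):
    """Second-order model: intercept, linear, squared, and interaction terms."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])


beta, *_ = np.linalg.lstsq(model_matrix(X), y, rcond=None)

# Predicted optimum over a grid of coded factor levels
g = np.linspace(-a, a, 57)
x1g, x2g = np.meshgrid(g, g)
pts = np.column_stack([x1g.ravel(), x2g.ravel()])
pred = model_matrix(pts) @ beta
i = int(np.argmax(pred))
print(f"Predicted max removal {pred[i]:.1f}% at coded levels {pts[i]}")
```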

  11. Bivariate analysis of floods in climate impact assessments.

    PubMed

    Brunner, Manuela Irene; Sikorska, Anna E; Seibert, Jan

    2018-03-01

    Climate impact studies regarding floods usually focus on peak discharges and a bivariate assessment of peak discharges and hydrograph volumes is not commonly included. A joint consideration of peak discharges and hydrograph volumes, however, is crucial when assessing flood risks for current and future climate conditions. Here, we present a methodology to develop synthetic design hydrographs for future climate conditions that jointly consider peak discharges and hydrograph volumes. First, change factors are derived based on a regional climate model and are applied to observed precipitation and temperature time series. Second, the modified time series are fed into a calibrated hydrological model to simulate runoff time series for future conditions. Third, these time series are used to construct synthetic design hydrographs. The bivariate flood frequency analysis used in the construction of synthetic design hydrographs takes into account the dependence between peak discharges and hydrograph volumes, and represents the shape of the hydrograph. The latter is modeled using a probability density function while the dependence between the design variables peak discharge and hydrograph volume is modeled using a copula. We applied this approach to a set of eight mountainous catchments in Switzerland to construct catchment-specific and season-specific design hydrographs for a control and three scenario climates. Our work demonstrates that projected climate changes have an impact not only on peak discharges but also on hydrograph volumes and on hydrograph shapes both at an annual and at a seasonal scale. These changes are not necessarily proportional which implies that climate impact assessments on future floods should consider more flood characteristics than just flood peaks. Copyright © 2017. Published by Elsevier B.V.
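
    A minimal sketch of the bivariate building block is shown below: flood peaks and hydrograph volumes are sampled with dependence imposed through a Gaussian copula and transformed through Gumbel marginal quantile functions. The copula family, marginal parameters and thresholds are hypothetical and are not those fitted in the study.

```python
"""Hedged sketch: dependent peak/volume sampling via a Gaussian copula."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000
rho = 0.7                                   # hypothetical peak-volume dependence

# 1. Correlated standard normals -> uniform scores via the normal CDF (Gaussian copula)
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)

# 2. Transform the uniform scores through Gumbel marginal quantile functions
peak = stats.gumbel_r.ppf(u[:, 0], loc=120.0, scale=35.0)    # m^3/s
volume = stats.gumbel_r.ppf(u[:, 1], loc=8.0, scale=2.5)     # hm^3

# 3. Empirical joint exceedance probability for a (hypothetical) design event
p_joint = np.mean((peak > 250.0) & (volume > 15.0))
print(f"P(peak > 250 m^3/s and volume > 15 hm^3) ~ {p_joint:.4f}")
```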

  12. Human Integration Design Processes (HIDP)

    NASA Technical Reports Server (NTRS)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASA-STD-3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference missions. The HIDP is a reference document that is intended to be used during the development of crewed space systems and operations to guide human-systems development process activities.

  13. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.

  14. The Use of Feedback in Lab Energy Conservation: Fume Hoods at MIT

    ERIC Educational Resources Information Center

    Wesolowski, Daniel; Olivetti, Elsa; Graham, Amanda; Lanou, Steve; Cooper, Peter; Doughty, Jim; Wilk, Rich; Glicksman, Leon

    2010-01-01

    Purpose: The purpose of this paper is to report on the results of an Massachusetts Institute of Technology Chemistry Department campaign to reduce energy consumption in chemical fume hoods. Hood use feedback to lab users is a crucial component of this campaign. Design/methodology/approach: Sash position sensor data on variable air volume fume…

  15. Collective Bargaining in Higher Education and the Professions. Bibliography No. 21.

    ERIC Educational Resources Information Center

    Lowe, Ida B., Ed.; Johnson, Beth Hillman, Ed.

    This bibliography of 885 citations is an annual accounting of the literature on collective bargaining in higher education and the professions for 1992. The research design and methodology used in the preparation of this volume relied on computer searches of various data bases, as well as manual retrieval of citations not available on data bases.…

  16. Collective Bargaining in Higher Education and the Professions. Bibliography No. 20.

    ERIC Educational Resources Information Center

    Lowe, Ida B.; Johnson, Beth Hillman

    This bibliography of 834 citations is an annual accounting of literature on collective bargaining in higher education and the professions for 1991. The research design and methodology used in the preparation of this volume relied on computer searches of various databases and manual retrieval of other citations not available on databases.…

  17. Modular space station phase B extension program cost and schedules. Volume 1: Cost and schedule estimating process and results

    NASA Technical Reports Server (NTRS)

    Frassinelli, G. J.

    1972-01-01

    Cost estimates and funding schedules are presented for a given configuration and costing ground rules. Cost methodology is described and the cost evolution from a baseline configuration to a selected configuration is given, emphasizing cases in which cost was a design driver. Programmatic cost avoidance techniques are discussed.

  18. Collective Bargaining in Higher Education and the Professions. Bibliography No. 22.

    ERIC Educational Resources Information Center

    Lowe, Ida B., Ed.; Johnson, Beth Hillman, Ed.

    This bibliography of 886 citations is an annual accounting of the literature on collective bargaining in higher education and the professions for 1993. The research design and methodology used in the preparation of this volume relied on computer searches of various data bases, as well as manual retrieval of citations not available on data bases.…

  19. Requirements for Information Professionals in a Digital Environment: Some Thoughts

    ERIC Educational Resources Information Center

    Ataman, Bekir Kemal

    2009-01-01

    Purpose: The purpose of this paper is to point out the increasing need to provide information professionals with a sound grounding in the technological aspects of their profession. Design/methodology/approach: The paper sets out by describing the sudden increase in volumes of information that confront our society, and then looks at how the younger…

  20. Human-Computer Interaction: A Journal of Theoretical, Empirical and Methodological Issues of User Science and of System Design. Volume 7, Number 1

    DTIC Science & Technology

    1992-01-01

    Editorial board excerpt: Norman, University of California, San Diego, CA; Dan R. Olsen, Jr., Brigham...; Peter G. Polson, University of Colorado, Boulder, CO; James R. Rhyne, IBM T. J. Watson... ...and artificial intelligence, among which are: reasoning about concurrent systems, including program verification (Barringer, 1985), operating...

  1. Making intelligent systems team players: Case studies and design issues. Volume 1: Human-computer interaction design

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra L.; Woods, David D.; Potter, Scott S.; Johannesen, Leila; Holloway, Matthew; Forbus, Kenneth D.

    1991-01-01

    Initial results are reported from a multi-year, interdisciplinary effort to provide guidance and assistance for designers of intelligent systems and their user interfaces. The objective is to achieve more effective human-computer interaction (HCI) for systems with real time fault management capabilities. Intelligent fault management systems within the NASA were evaluated for insight into the design of systems with complex HCI. Preliminary results include: (1) a description of real time fault management in aerospace domains; (2) recommendations and examples for improving intelligent systems design and user interface design; (3) identification of issues requiring further research; and (4) recommendations for a development methodology integrating HCI design into intelligent system design.

  2. A normative price for energy from an electricity generation system: An Owner-dependent Methodology for Energy Generation (system) Assessment (OMEGA). Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Mcmaster, K. M.

    1981-01-01

    The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
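
    A minimal sketch of the net-present-value and break-even idea is given below. The cash-flow structure, discount rate and numbers are hypothetical, and the actual OMEGA equations (which also fold in taxes, depreciation, residual value and revenue detail) are not reproduced.

```python
"""Minimal NPV / break-even sketch with hypothetical cash flows (not the OMEGA model)."""


def npv(energy_price, capital_cost, annual_energy_kwh, annual_om_cost,
        discount_rate=0.08, lifetime_years=20):
    """Net present value of a generation system at a given energy price ($/kWh)."""
    value = -capital_cost
    for year in range(1, lifetime_years + 1):
        cash_flow = energy_price * annual_energy_kwh - annual_om_cost
        value += cash_flow / (1.0 + discount_rate) ** year
    return value


def break_even_price(capital_cost, annual_energy_kwh, annual_om_cost):
    """Energy price that drives NPV to zero, found by bisection on price."""
    lo, hi = 0.0, 10.0   # $/kWh search bracket
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if npv(mid, capital_cost, annual_energy_kwh, annual_om_cost) < 0.0:
            lo = mid     # NPV still negative: price must rise
        else:
            hi = mid
    return 0.5 * (lo + hi)


# Hypothetical system: $2.5M capital, 1.8 GWh/yr output, $40k/yr O&M
print(f"Break-even price: ${break_even_price(2.5e6, 1.8e6, 4.0e4):.3f}/kWh")
```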

  3. Non-conventional technologies for data collection in Brazilian dissertations and theses.

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Rodrigues, Cláudia Cristiane Filgueira Martins; de Lima, Kálya Yasmine Nunes; Alves, Kisna Yasmin Andrade; Santos, Viviane Euzébia Pereira

    2015-01-01

    To characterize non-conventional technologies used for data collection in dissertations and theses available in the Catalog of Theses and Dissertations (CEPEn) of the Brazilian Nursing Association (ABEn). This is a documentary study whose data were collected from the catalogs of theses and dissertations available on the ABEn website, Volumes XIX to XXI. The indicators collected were: academic level; educational institution; year; qualification of the author; setting; non-conventional technology used; type of technology; association with conventional techniques; methodological design; benefits and methodological limitations. From a total of 6346 studies, only 121 (1.91%) used non-conventional technologies for data collection, representing the final sample of the study. It is concluded that Brazilian nursing research still needs methodological innovations for data collection.

  4. A methodology to derive Synthetic Design Hydrographs for river flood management

    NASA Astrophysics Data System (ADS)

    Tomirotti, Massimo; Mignosa, Paolo

    2017-12-01

    The design of flood protection measures requires in many cases not only the estimation of the peak discharges, but also of the volume of the floods and its time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing respectively the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a single SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, which accounts in a very convenient way for the variability of the shapes of the observed hydrographs at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.

  5. Rapid Airplane Parametric Input Design (RAPID)

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1995-01-01

    RAPID is a methodology and software system to define a class of airplane configurations and directly evaluate surface grids, volume grids, and grid sensitivity on and about the configurations. A distinguishing characteristic which separates RAPID from other airplane surface modellers is that the output grids and grid sensitivity are directly applicable in CFD analysis. A small set of design parameters and grid control parameters govern the process which is incorporated into interactive software for 'real time' visual analysis and into batch software for the application of optimization technology. The computed surface grids and volume grids are suitable for a wide range of Computational Fluid Dynamics (CFD) simulation. The general airplane configuration has wing, fuselage, horizontal tail, and vertical tail components. The double-delta wing and tail components are manifested by solving a fourth order partial differential equation (PDE) subject to Dirichlet and Neumann boundary conditions. The design parameters are incorporated into the boundary conditions and therefore govern the shapes of the surfaces. The PDE solution yields a smooth transition between boundaries. Surface grids suitable for CFD calculation are created by establishing an H-type topology about the configuration and incorporating grid spacing functions in the PDE equation for the lifting components and the fuselage definition equations. User specified grid parameters govern the location and degree of grid concentration. A two-block volume grid about a configuration is calculated using the Control Point Form (CPF) technique. The interactive software, which runs on Silicon Graphics IRIS workstations, allows design parameters to be continuously varied and the resulting surface grid to be observed in real time. The batch software computes both the surface and volume grids and also computes the sensitivity of the output grid with respect to the input design parameters by applying the precompiler tool ADIFOR to the grid generation program. The output of ADIFOR is a new source code containing the old code plus expressions for derivatives of specified dependent variables (grid coordinates) with respect to specified independent variables (design parameters). The RAPID methodology and software provide a means of rapidly defining numerical prototypes, grids, and grid sensitivity of a class of airplane configurations. This technology and software is highly useful for CFD research for preliminary design and optimization processes.

  6. Automation Applications in an Advanced Air Traffic Management System : Volume 3. Methodology for Man-Machine Task Allocation

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...

  7. Application of the experimental design of experiments (DoE) for the determination of organotin compounds in water samples using HS-SPME and GC-MS/MS.

    PubMed

    Coscollà, Clara; Navarro-Olivares, Santiago; Martí, Pedro; Yusà, Vicent

    2014-02-01

    When attempting to discover the important factors and then optimise a response by tuning these factors, experimental design (design of experiments, DoE) provides a powerful suite of statistical methodology. DoE identifies significant factors and then optimises a response with respect to them during method development. In this work, a headspace solid-phase micro-extraction (HS-SPME) combined with gas chromatography tandem mass spectrometry (GC-MS/MS) methodology for the simultaneous determination of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), has been optimized using a statistical design of experiments (DoE). The analytical method is based on ethylation with NaBEt4 and simultaneous headspace solid-phase micro-extraction of the derivative compounds followed by GC-MS/MS analysis. The main experimental parameters influencing the extraction efficiency selected for optimization were pre-incubation time, incubation temperature, agitator speed, extraction time, desorption temperature, buffer (pH, concentration and volume), headspace volume, sample salinity, preparation of standards, ultrasonic time and desorption time in the injector. The main factors (excitation voltage, excitation time, ion source temperature, isolation time and electron energy) affecting the GC-IT-MS/MS response were also optimized using the same statistical design of experiments. The proposed method presented good linearity (coefficient of determination R(2) > 0.99) and repeatability (1-25%) for all the compounds under study. The accuracy of the method, measured as the average percentage recovery of the compounds in spiked surface and marine waters, was higher than 70% for all compounds studied. Finally, the optimized methodology was applied to real aqueous samples, enabling the simultaneous determination of all compounds under study in surface and marine water samples obtained from the Valencia region (Spain). © 2013 Elsevier B.V. All rights reserved.
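
    As a small illustration of the DoE logic, the sketch below builds a coded two-level full factorial for three of the HS-SPME factors and estimates their main effects from hypothetical responses. The real study screened and optimized a larger factor set with response-surface-style models; the responses here are invented.

```python
"""Hedged sketch of a coded 2^3 screening design with hypothetical responses."""
import itertools

factors = ["extraction_time", "incubation_temperature", "sample_salinity"]
runs = list(itertools.product([-1, 1], repeat=len(factors)))   # coded 2^3 design

# Hypothetical normalized peak areas, one per run, in the same order as `runs`
response = [0.42, 0.55, 0.48, 0.66, 0.51, 0.70, 0.58, 0.83]


def main_effect(index):
    """Average response at the +1 level minus average at the -1 level."""
    high = [y for run, y in zip(runs, response) if run[index] == 1]
    low = [y for run, y in zip(runs, response) if run[index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)


for i, name in enumerate(factors):
    print(f"{name:>24s}: main effect = {main_effect(i):+.3f}")
```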

  8. A Human Factors Evaluation of a Methodology for Pressurized Crew Module Acceptability for Zero-Gravity Ingress of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sanchez, Merri J.

    2000-01-01

    This project aimed to develop a methodology for evaluating performance and acceptability characteristics of pressurized crew module volume suitability for zero-gravity (zero-g) ingress of a spacecraft, and to evaluate the operational acceptability of the NASA crew return vehicle (CRV) for zero-g ingress of the astronaut crew, volume for crew tasks, and general crew module and seat layout. No standard or methodology has been established for evaluating volume acceptability in human spaceflight vehicles. Volume affects astronauts' ability to ingress and egress the vehicle, and to maneuver in and perform critical operational tasks inside the vehicle. Much research has been conducted on aircraft ingress, egress, and rescue in order to establish military and civil aircraft standards. However, due to the extremely limited number of human-rated spacecraft, this topic has been unaddressed. The NASA CRV was used for this study. The prototype vehicle can return a 7-member crew from the International Space Station in an emergency. The vehicle's internal arrangement must be designed to facilitate rapid zero-g ingress, zero-g maneuverability, ease of one-g egress and rescue, and ease of operational tasks in multiple acceleration environments. A full-scale crew module mockup was built and outfitted with representative adjustable seats, crew equipment, and a volumetrically equivalent hatch. Human factors testing was conducted in three acceleration environments using ground-based facilities and the KC-135 aircraft. Performance and acceptability measurements were collected. Data analysis was conducted using analysis of variance and nonparametric techniques.

  9. Computational multiobjective topology optimization of silicon anode structures for lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Mitchell, Sarah L.; Ortiz, Michael

    2016-09-01

    This study utilizes computational topology optimization methods for the systematic design of optimal multifunctional silicon anode structures for lithium-ion batteries. In order to develop next generation high performance lithium-ion batteries, key design challenges relating to the silicon anode structure must be addressed, namely the lithiation-induced mechanical degradation and the low intrinsic electrical conductivity of silicon. As such this work considers two design objectives, the first being minimum compliance under design dependent volume expansion, and the second maximum electrical conduction through the structure, both of which are subject to a constraint on material volume. Density-based topology optimization methods are employed in conjunction with regularization techniques, a continuation scheme, and mathematical programming methods. The objectives are first considered individually, during which the influence of the minimum structural feature size and prescribed volume fraction are investigated. The methodology is subsequently extended to a bi-objective formulation to simultaneously address both the structural and conduction design criteria. The weighted sum method is used to derive the Pareto fronts, which demonstrate a clear trade-off between the competing design objectives. A rigid frame structure was found to be an excellent compromise between the structural and conduction design criteria, providing both the required structural rigidity and direct conduction pathways. The developments and results presented in this work provide a foundation for the informed design and development of silicon anode structures for high performance lithium-ion batteries.
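
    The weighted-sum step can be illustrated in isolation, as in the sketch below, where a single design variable trades a structural objective against a conduction objective and the weight is swept to trace a Pareto front. Both objective expressions are hypothetical stand-ins, not the finite-element objectives of the density-based formulation.

```python
"""Hedged weighted-sum bi-objective sketch over one hypothetical design variable."""


def compliance(s):
    """Structural objective (minimize): improves as the frame share s grows."""
    return 1.0 / (0.05 + s)


def resistance(s):
    """Conduction objective (minimize): improves as the conduction share (1 - s) grows."""
    return 1.0 / (0.05 + (1.0 - s))


candidates = [i / 200.0 for i in range(1, 200)]    # admissible splits of the material budget

print(" w    s*   compliance  resistance")
for k in range(11):                                # sweep the weighting factor
    w = k / 10.0
    s_best = min(candidates,
                 key=lambda s: w * compliance(s) + (1.0 - w) * resistance(s))
    print(f"{w:.1f}  {s_best:.3f}   {compliance(s_best):8.2f}  {resistance(s_best):8.2f}")
```

    Sweeping the weight from 0 to 1 traces the trade-off curve; in the paper this role is played by the Pareto front between the compliance and conduction objectives.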

  10. Multiphysics Thermal-Fluid Design Analysis of a Non-Nuclear Tester for Hot-Hydrogen Materials and Component Development

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Foote, John; Litchford, Ron

    2006-01-01

    The objective of this effort is to perform design analyses for a non-nuclear hot-hydrogen materials tester, as a first step towards developing an efficient and accurate multiphysics thermo-fluid computational methodology to predict environments for a hypothetical solid-core nuclear thermal engine thrust chamber design and analysis. The computational methodology is based on a multidimensional, finite-volume, turbulent, chemically reacting, thermally radiating, unstructured-grid, and pressure-based formulation. The multiphysics invoked in this study include hydrogen dissociation kinetics and thermodynamics, turbulent flow, and convective and thermal radiative heat transfer. The goals of the design analyses are to maintain maximum hot-hydrogen jet impingement energy and to minimize chamber wall heating. The results of analyses on three test fixture configurations and the rationale for the final selection are presented. The interrogation of physics revealed that reactions of hydrogen dissociation and recombination are highly correlated with local temperature and are necessary for accurate prediction of the hot-hydrogen jet temperature.

  11. Integrated orbital servicing study for low-cost payload programs. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Derocher, W. L., Jr.

    1975-01-01

    Various operating methodologies to achieve low-cost space operations were investigated as part of the Space Transportation System (STS) planning. The emphasis was to show that the development investment, initial fleet costs, and supporting facilities for the STS could be effectively offset by exploiting the capabilities of the STS to satisfy mission requirements and reduce the cost of payload programs. The following major conclusions were reached: (1) the development of an on-orbit servicer maintenance system is compatible with many spacecraft programs and is recommended as the most cost-effective system, (2) spacecraft can be designed to be serviceable with acceptable design, weight, volume, and cost effects, (3) use of on-orbit servicing over a 12 year period results in savings ranging between four and nine billion dollars, (4) the pivoting arm on-orbit servicer was selected and a preliminary design was prepared, (5) orbital maintenance has no significant impact on the STS.

  12. Understanding Skill in EVA Mass Handling. Volume 4; An Integrated Methodology for Evaluating Space Suit Mobility and Stability

    NASA Technical Reports Server (NTRS)

    McDonald, P. Vernon; Newman, Dava

    1999-01-01

    The empirical investigation of extravehicular activity (EVA) mass handling conducted on NASA's Precision Air-Bearing Floor led to a Phase I SBIR from JSC. The purpose of the SBIR was to design an innovative system for evaluating space suit mobility and stability in conditions that simulate EVA on the surface of the Moon or Mars. The approach we used to satisfy the Phase I objectives was based on a structured methodology for the development of human-systems technology. Accordingly, the project was broken down into a number of tasks and subtasks. In sequence, the major tasks were: 1) Identify missions and tasks that will involve EVA and resulting mobility requirements in the near and long term; 2) Assess possible methods for evaluating mobility of space suits during field-based EVA tests; 3) Identify requirements for behavioral evaluation by interacting with NASA stakeholders; 4) Identify necessary and sufficient technology for implementation of a mobility evaluation system; and 5) Prioritize and select technology solutions. The work conducted in these tasks is described in this final volume of the series on EVA mass handling. While prior volumes in the series focus on novel data-analytic techniques, this volume addresses technology that is necessary for minimally intrusive data collection and near-real-time data analysis and display.

  13. Understanding leachate flow in municipal solid waste landfills by combining time-lapse ERT and subsurface flow modelling - Part II: Constraint methodology of hydrodynamic models.

    PubMed

    Audebert, M; Oxarango, L; Duquennoi, C; Touze-Foltz, N; Forquet, N; Clément, R

    2016-09-01

    Leachate recirculation is a key process in the operation of municipal solid waste landfills as bioreactors. To ensure optimal water content distribution, bioreactor operators need tools to design leachate injection systems. Prediction of leachate flow by subsurface flow modelling could provide useful information for the design of such systems. However, hydrodynamic models require additional data to constrain them and to assess hydrodynamic parameters. Electrical resistivity tomography (ERT) is a suitable method to study leachate infiltration at the landfill scale. It can provide spatially distributed information which is useful for constraining hydrodynamic models. However, this geophysical method does not allow ERT users to directly measure water content in waste. The MICS (multiple inversions and clustering strategy) methodology was proposed to delineate the infiltration area precisely during a time-lapse ERT survey in order to avoid the use of empirical petrophysical relationships, which are not adapted to a heterogeneous medium such as waste. The infiltration shapes and hydrodynamic information extracted with MICS were used to constrain hydrodynamic models in assessing parameters. The constraint methodology developed in this paper was tested on two hydrodynamic models: an equilibrium model, where flow within the waste medium is estimated using a single continuum approach, and a non-equilibrium model, where flow is estimated using a dual continuum approach. The latter represents leachate flows into fractures. Finally, this methodology provides insight into the advantages and limitations of hydrodynamic models. Furthermore, we suggest an explanation for the large volume detected by MICS when a small volume of leachate is injected. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A Methodology for Calculating EGS Electricity Generation Potential Based on the Gringarten Model for Heat Extraction From Fractured Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustine, Chad

    Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications on reservoir design are discussed.
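
    For comparison, the sketch below implements the simpler volumetric ("heat in place") estimate that this record contrasts with its Gringarten-based methodology; the Gringarten analytical solution itself is not reproduced. The property values and the 5% recovery factor reflect the "existing methodology" assumption cited above and are typical round numbers rather than data from the study.

```python
"""Hedged volumetric ('heat in place') estimate, not the Gringarten-based methodology."""


def egs_potential_mwe(volume_km3, t_reservoir_c, t_reject_c=80.0,
                      recovery_factor=0.05, thermal_efficiency=0.12,
                      lifetime_years=30.0):
    """Average electric output (MWe) from a rock volume spread over the plant lifetime."""
    rho_c = 2.7e6                      # volumetric heat capacity of rock, J/(m^3 K)
    volume_m3 = volume_km3 * 1.0e9
    heat_in_place_j = rho_c * volume_m3 * (t_reservoir_c - t_reject_c)
    electric_j = heat_in_place_j * recovery_factor * thermal_efficiency
    seconds = lifetime_years * 365.25 * 24 * 3600
    return electric_j / seconds / 1.0e6


# One cubic kilometre at 200 degC with the 5% recovery factor assumed by existing methods
print(f"{egs_potential_mwe(1.0, 200.0):.1f} MWe")
```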

  15. Foundations for Measuring Volume Rendering Quality

    NASA Technical Reports Server (NTRS)

    Williams, Peter L.; Uselton, Samuel P.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The goal of this paper is to provide a foundation for objectively comparing volume rendered images. The key elements of the foundation are: (1) a rigorous specification of all the parameters that need to be specified to define the conditions under which a volume rendered image is generated; (2) a methodology for difference classification, including a suite of functions or metrics to quantify and classify the difference between two volume rendered images that will support an analysis of the relative importance of particular differences. The results of this method can be used to study the changes caused by modifying particular parameter values, to compare and quantify changes between images of similar data sets rendered in the same way, and even to detect errors in the design, implementation or modification of a volume rendering system. If one has a benchmark image, for example one created by a high accuracy volume rendering system, the method can be used to evaluate the accuracy of a given image.
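
    Item (2) calls for a suite of difference metrics; the sketch below shows metrics of that general kind applied to two renderings stored as arrays of identical shape. The specific metrics, thresholds and classification buckets are illustrative choices, not the suite defined in the paper.

```python
"""Hedged sketch of per-pixel difference metrics between two rendered images."""
import numpy as np


def difference_report(img_a, img_b, small=2.0, large=16.0):
    """Quantify and classify per-pixel differences between two renderings."""
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    diff = np.abs(a - b).max(axis=-1)          # worst channel difference per pixel
    return {
        "rmse": float(np.sqrt(np.mean((a - b) ** 2))),
        "max_abs_diff": float(diff.max()),
        "pct_identical": float(np.mean(diff == 0.0) * 100.0),
        "pct_small_diff": float(np.mean((diff > 0.0) & (diff <= small)) * 100.0),
        "pct_large_diff": float(np.mean(diff > large) * 100.0),
    }


if __name__ == "__main__":
    # Synthetic stand-ins for two volume-rendered frames of shape (H, W, 3)
    rng = np.random.default_rng(0)
    base = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)
    perturbed = np.clip(base.astype(int) + rng.integers(-3, 4, base.shape), 0, 255)
    print(difference_report(base, perturbed))
```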

  16. Reactor safeguards system assessment and design. Volume I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varnado, G.B.; Ericson, D.M. Jr.; Daniel, S.L.

    1978-06-01

    This report describes the development and application of a methodology for evaluating the effectiveness of nuclear power reactor safeguards systems. Analytic techniques are used to identify the sabotage acts which could lead to release of radioactive material from a nuclear power plant, to determine the areas of a plant which must be protected to assure that significant release does not occur, to model the physical plant layout, and to evaluate the effectiveness of various safeguards systems. The methodology was used to identify those aspects of reactor safeguards systems which have the greatest effect on overall system performance and which, therefore, should be emphasized in the licensing process. With further refinements, the methodology can be used by the licensing reviewer to aid in assessing proposed or existing safeguards systems.

  17. Fundamentals handbook of electrical and computer engineering. Volume 1 Circuits fields and electronics

    NASA Astrophysics Data System (ADS)

    Chang, S. S. L.

    State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.

  18. Integration Methodology For Oil-Free Shaft Support Systems: Four Steps to Success

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.; DellaCorte, Christopher; Bruckner, Robert J.

    2010-01-01

    Commercial applications for Oil-Free turbomachinery are slowly becoming a reality. Micro-turbine generators, high-speed electric motors, and electrically driven centrifugal blowers are a few examples of products available in today's commercial marketplace. Gas foil bearing technology makes most of these applications possible. A significant volume of component-level research has led to recent acceptance of gas foil bearings in several specialized applications, including those mentioned above. Component tests identifying characteristics such as load carrying capacity, power loss, thermal behavior, and rotordynamic coefficients all help the engineer design foil bearing machines, but the development process can be just as important. As the technology gains momentum and acceptance in a wider array of machinery, the complexity and variety of applications will grow beyond the current class of machines. Following a robust integration methodology will help improve the probability of successful development of future Oil-Free turbomachinery. This paper describes a previously successful four-step integration methodology used in the development of several Oil-Free turbomachines. Proper application of the methods put forward here enables successful design of Oil-Free turbomachinery. In addition, when significant design changes are made or unique machinery is developed, this four-step process must be considered.

  19. Handbook of Research Methods in Social and Personality Psychology

    NASA Astrophysics Data System (ADS)

    Reis, Harry T.; Judd, Charles M.

    2000-03-01

    This volume provides an overview of research methods in contemporary social psychology. Coverage includes conceptual issues in research design, methods of research, and statistical approaches. Because the range of research methods available for social psychology has expanded extensively in the past decade, both traditional and innovative methods are presented. The goal is to introduce new and established researchers alike to new methodological developments in the field.

  20. Symposium on General Linear Model Approach to the Analysis of Experimental Data in Educational Research (Athens, Georgia, June 29-July 1, 1967). Final Report.

    ERIC Educational Resources Information Center

    Bashaw, W. L., Ed.; Findley, Warren G., Ed.

    This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…

  1. Proceedings of Selected Research and Development Presentations at the 1997 National Convention of the Association for Educational Communications and Technology Sponsored by the Research and Theory Division (19th, Albuquerque, NM, February 14-18, 1997).

    ERIC Educational Resources Information Center

    Abel, Omalley, Ed.; And Others

    1997-01-01

    This proceedings volume contains 57 papers. Subjects addressed include: cooperative technology education; children's learning strategies with hypermedia lessons; problem-based learning; instructional methodologies for lifelong learning; interactive television (ITV) design; theoretical bases for Human Performance Technology (HPT); use of cognitive…

  2. An engineering closure for heavily under-resolved coarse-grid CFD in large applications

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Yu, Fujiang; Jordan, Thomas

    2016-11-01

    Even though high-performance computing allows a very detailed description of a wide range of scales in scientific computations, engineering simulations used for design studies commonly resolve only the large scales, thus speeding up simulation time. The coarse-grid CFD (CGCFD) methodology is developed for flows with repeated flow patterns as often observed in heat exchangers or porous structures. It is proposed to use inviscid Euler equations on a very coarse numerical mesh. This coarse mesh need not conform to the geometry in all details. To reinstate the physics of the unresolved smaller scales, inexpensive subgrid models are employed. Subgrid models are systematically constructed by analyzing well-resolved generic representative simulations. By varying the flow conditions in these simulations, correlations are obtained. These provide, for each individual coarse mesh cell, a volume force vector and a volume porosity; moreover, surface porosities are derived for all vertices. CGCFD is related to the immersed boundary method, as both exploit volume forces and non-body-conformal meshes. Yet CGCFD differs with respect to the coarser mesh and the use of the Euler equations. We describe the methodology based on a simple test case and the application of the method to a 127-pin wire-wrap fuel bundle.
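
    A minimal sketch of how such a per-cell correlation could be tabulated from resolved reference runs and then evaluated by the coarse solver. The drag-type functional form, the reference data, and the coefficient values are illustrative assumptions, not the correlations developed in the paper.

```python
import numpy as np

# Hypothetical tabulation of a coarse-cell volume-force correlation: for each
# reference flow condition we know the cell-mean velocity and the momentum
# sink exerted by the unresolved structure, and we fit a quadratic drag law
#   F = -(a*|u| + b*|u|^2) * sign(u)
# for that cell. Form and numbers are illustrative only.

ref_speed = np.array([0.5, 1.0, 2.0, 4.0])            # m/s, resolved runs
ref_force = np.array([-12.0, -30.0, -90.0, -300.0])   # N/m^3, extracted sink

# Least-squares fit of -F = a*u + b*u^2
A = np.column_stack([ref_speed, ref_speed ** 2])
a, b = np.linalg.lstsq(A, -ref_force, rcond=None)[0]

def subgrid_volume_force(u):
    """Volume force (N/m^3) the coarse solver would add in this cell."""
    speed = abs(u)
    return -np.sign(u) * (a * speed + b * speed ** 2)

if __name__ == "__main__":
    print(f"fitted coefficients: a={a:.2f}, b={b:.2f}")
    for u in (0.8, 1.5, 3.0):
        print(f"u = {u} m/s -> F = {subgrid_volume_force(u):.1f} N/m^3")
```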

  3. Optimized Design and Analysis of Sparse-Sampling fMRI Experiments

    PubMed Central

    Perrachione, Tyler K.; Ghosh, Satrajit S.

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742
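
    The first recommendation (a physiologically informed model incorporating hemodynamic response convolution) can be sketched as follows for a sparse design: the stimulus train is convolved with a canonical double-gamma hemodynamic response and then sampled only at the sparse acquisition times. The TR, event times, and HRF parameters are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.stats import gamma

def double_gamma_hrf(t, peak=6.0, under=16.0, ratio=1.0 / 6.0):
    """Canonical double-gamma hemodynamic response (arbitrary units)."""
    return gamma.pdf(t, peak) - ratio * gamma.pdf(t, under)

def sparse_design_column(event_times, acq_times, dt=0.1):
    """Convolve a stimulus train with the HRF, then sample the predicted
    response only at the sparse acquisition times (the volumes collected)."""
    t = np.arange(0.0, acq_times.max() + 32.0, dt)
    stim = np.zeros_like(t)
    stim[np.searchsorted(t, event_times)] = 1.0
    bold = np.convolve(stim, double_gamma_hrf(np.arange(0.0, 32.0, dt)))[: t.size]
    return np.interp(acq_times, t, bold)

if __name__ == "__main__":
    # Illustrative sparse design: one volume every 10 s (acquisition + silent
    # delay), stimuli presented at a high rate during the silent gaps.
    acq_times = np.arange(0.0, 300.0, 10.0)
    event_times = np.arange(2.0, 300.0, 2.5)
    column = sparse_design_column(event_times, acq_times)
    print(column[:5])
```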

  4. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power.

  5. Phase 1 of the near-term hybrid passenger vehicle development program. Appendix C: Preliminary design data package, volume 1

    NASA Technical Reports Server (NTRS)

    Piccolo, R.

    1979-01-01

    The methodology used for vehicle layout and component definition is described, as well as techniques for system optimization and energy evaluation. The preliminary design is examined with particular attention given to the body and structure; propulsion system; crash analysis and handling; internal combustion engine; separately excited DC motor; Ni-Zn battery; transmission; control system; vehicle auxiliaries; weight breakdown; and life cycle costs. Formulas are given for the quantification of energy consumption, and results are compared with the reference vehicle.
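
    As a hedged illustration of the kind of energy-consumption quantification such a design study formulates, the sketch below computes road-load energy per kilometre from rolling resistance and aerodynamic drag divided by an overall drivetrain efficiency. The vehicle parameters are assumptions for demonstration, not values from the report.

```python
# Road-load energy per kilometre at constant speed: rolling resistance plus
# aerodynamic drag, divided by an overall drivetrain efficiency.
# All vehicle parameters below are illustrative assumptions.

RHO_AIR = 1.2   # air density, kg/m^3
G = 9.81        # gravitational acceleration, m/s^2

def energy_per_km_wh(speed_kmh, mass_kg=1600.0, c_rr=0.012,
                     cd=0.40, frontal_area_m2=2.0, drivetrain_eff=0.70):
    v = speed_kmh / 3.6                                       # m/s
    force_roll = c_rr * mass_kg * G                           # N
    force_aero = 0.5 * RHO_AIR * cd * frontal_area_m2 * v**2  # N
    energy_j_per_m = (force_roll + force_aero) / drivetrain_eff
    return energy_j_per_m * 1000.0 / 3600.0                   # Wh/km

if __name__ == "__main__":
    for v in (50, 90, 120):
        print(f"{v} km/h: {energy_per_km_wh(v):.0f} Wh/km at the energy source")
```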

  6. Asymmetric Base-Bleed Effect on Aerospike Plume-Induced Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Droege, Alan; D'Agostino, Mark; Lee, Young-Ching; Williams, Robert

    2004-01-01

    A computational heat transfer design methodology was developed to study the dual-engine linear aerospike plume-induced base-heating environment during one-power-pack-out operation in ascent flight. It includes a three-dimensional, finite volume, viscous, chemically reacting, and pressure-based computational fluid dynamics formulation; a special base-bleed boundary condition; and a three-dimensional, finite volume, spectral-line-based weighted-sum-of-gray-gases absorption computational radiation heat transfer formulation. A separate radiation model was used for diagnostic purposes. The computational methodology was systematically benchmarked. In this study, near-base radiative heat fluxes were computed, and they compared well with those measured during static linear aerospike engine tests. The base-heating environment of 18 trajectory points selected from three power-pack-out scenarios was computed, and the computed asymmetric base-heating physics were analyzed. The power-pack-out condition has the most impact on convective base heating when it happens early in flight. The source of its impact comes from the asymmetric and reduced base bleed.

  7. Public acceptability of highway safety countermeasures. Volume 1, Background of study and methodology

    DOT National Transportation Integrated Search

    1981-06-01

    This study provides information about public attitudes towards proposed highway safety countermeasures in three program areas: alcohol and drugs, unsafe driving behaviors, and pedestrian safety. This volume describes the three research methodologies ...

  8. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    DOT National Transportation Integrated Search

    1979-09-01

    This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  9. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
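
    The general idea of propagating parameter uncertainty through an analytical failure model to obtain a failure probability can be sketched with a short Monte Carlo calculation. The lognormal distributions, the power-law life model, and the mission life below are assumptions for illustration; they are not the documented PFA models or software.

```python
import numpy as np

# Illustrative Monte Carlo propagation of parameter uncertainty through a
# simple analytical life model, in the spirit of probabilistic failure
# assessment. All distributions and constants are assumed for demonstration.

rng = np.random.default_rng(42)
n = 100_000

stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa, uncertain load
coeff = rng.lognormal(mean=np.log(1e12), sigma=0.30, size=n)     # model scatter
exponent = 3.0                                                    # assumed S-N slope

cycles_to_failure = coeff / stress ** exponent   # analytical life model
mission_cycles = 10_000.0                        # assumed required life

p_failure = np.mean(cycles_to_failure < mission_cycles)
print(f"estimated failure probability: {p_failure:.2e}")
```

    In the actual PFA framework, such prior distributions would additionally be updated with test and flight experience; that statistical step is not shown here.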

  10. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  11. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  12. High performance MPEG-audio decoder IC

    NASA Technical Reports Server (NTRS)

    Thorn, M.; Benbassat, G.; Cyr, K.; Li, S.; Gill, M.; Kam, D.; Walker, K.; Look, P.; Eldridge, C.; Ng, P.

    1993-01-01

    The emerging digital audio and video compression technology brings both an opportunity and a new challenge to IC design. The pervasive application of compression technology to consumer electronics will require high-volume, low-cost ICs and fast time to market for prototypes and production units. At the same time, the algorithms used in compression technology result in complex VLSI ICs. The conflicting challenges of algorithm complexity, low cost, and fast time to market have an impact on device architecture and design methodology. The work presented in this paper concerns the design of a dedicated, high-precision, Moving Picture Experts Group (MPEG) audio decoder.

  13. Design methodology analysis: design and operational energy studies in a new high-rise office building. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-02-01

    Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.

  14. Brain imaging registry for neurologic diagnosis and research

    NASA Astrophysics Data System (ADS)

    Hoo, Kent S., Jr.; Wong, Stephen T. C.; Knowlton, Robert C.; Young, Geoffrey S.; Walker, John; Cao, Xinhua; Dillon, William P.; Hawkins, Randall A.; Laxer, Kenneth D.

    2002-05-01

    The purpose of this paper is to demonstrate the importance of building a brain imaging registry (BIR) on top of existing medical information systems, including the Picture Archiving and Communication System (PACS) environment. We describe the design framework for a cluster of data marts whose purpose is to provide clinicians and researchers efficient access to a large volume of raw and processed patient images and associated data originating from multiple operational systems over time and spread out across different hospital departments and laboratories. The framework is designed using object-oriented analysis and design methodology. The BIR data marts each contain complete image and textual data relating to patients with a particular disease.
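
    A minimal sketch of a disease-specific data-mart record of the general kind described, linking a registry patient to PACS/DICOM image references and associated text. The class and field names (and the example study UID) are hypothetical, not the registry's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record layout for a disease-specific brain-imaging data mart.
# Field names are illustrative, not the schema used by the cited registry.

@dataclass
class ImagingStudy:
    modality: str     # e.g. "MR", "PET"
    study_uid: str    # reference into the PACS archive (placeholder format)
    acquired: str     # ISO date

@dataclass
class RegistryRecord:
    patient_id: str
    disease: str      # e.g. "temporal lobe epilepsy"
    studies: List[ImagingStudy] = field(default_factory=list)
    reports: List[str] = field(default_factory=list)   # associated text data

    def add_study(self, study: ImagingStudy, report: str) -> None:
        self.studies.append(study)
        self.reports.append(report)

if __name__ == "__main__":
    rec = RegistryRecord("P0001", "temporal lobe epilepsy")
    rec.add_study(ImagingStudy("MR", "1.2.840.99999.1.1", "2001-11-05"),
                  "Left hippocampal atrophy noted on volumetric MRI.")
    print(f"{rec.patient_id}: {len(rec.studies)} study recorded for {rec.disease}")
```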

  15. Application of response surface methodology to maximize the productivity of scalable automated human embryonic stem cell manufacture.

    PubMed

    Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J

    2013-01-01

    Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
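
    A minimal sketch of a design-of-experiments response-surface fit of the general kind described: a quadratic model in two coded factors fitted by least squares and searched for a predicted optimum. The factor names, design points, and yields are invented for illustration and are not the study's data.

```python
import numpy as np

# Fit y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2 to a small
# central-composite-style design. Factors (coded units) and responses are
# invented for illustration only.

x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1.41, 1.41, 0, 0])    # e.g. seeding density
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, 0, -1.41, 1.41])    # e.g. media volume
y = np.array([62, 70, 75, 88, 90, 91, 89, 55, 80, 68, 84])   # cell yield (%)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(a, b):
    """Predicted response at coded factor settings (a, b)."""
    return float(np.dot([1.0, a, b, a * a, b * b, a * b], beta))

# Crude grid search for the predicted optimum inside the design region.
grid = np.linspace(-1.41, 1.41, 141)
best = max((predict(a, b), a, b) for a in grid for b in grid)
print(f"predicted optimum yield {best[0]:.1f}% at x1={best[1]:.2f}, x2={best[2]:.2f}")
```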

  16. The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Fan, Delin; Zhang, Yizhuo

    This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. The particleboard glue supplying and dosing system case study defines the problem and the solution using the proposed methodology. Status differences in the glue dosing process of particleboard production often make the dosed glue volume inaccurate. To solve this problem, TRIZ technical contradictions and inventive principles were applied to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the prior-action principle led to a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and also demonstrates the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
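
    A minimal discrete-time sketch of the Smith-predictor idea proposed for the glue dosing loop: a known transport dead time is compensated so that a PI controller effectively acts on a delay-free model prediction. The first-order plant, the dead time, and the gains are assumptions for illustration, not the paper's glue-dosing model.

```python
# Smith predictor around a first-order plant with dead time:
#   plant: y[k+1] = a*y[k] + b*u[k-d]
# The PI controller acts on the delay-free model output plus the measured
# model-plant mismatch. Plant parameters, delay, and gains are illustrative.

a, b, d = 0.9, 0.1, 10          # plant pole, gain, dead time (samples)
kp, ki = 2.0, 0.15              # PI gains (assumed)
setpoint, n_steps = 1.0, 200

y = 0.0                         # measured glue flow
ym = 0.0                        # delay-free internal model state
u_buf = [0.0] * d               # past control moves feeding the real plant
delay_buf = [0.0] * d           # past model outputs (delayed model branch)
integ = 0.0

for k in range(n_steps):
    # Smith predictor feedback: delay-free model + (plant - delayed model)
    feedback = ym + (y - delay_buf[0])
    err = setpoint - feedback
    integ += ki * err
    u = kp * err + integ

    # push the current model output into the delay line before advancing it
    delay_buf = delay_buf[1:] + [ym]

    # advance the real plant (sees the move from d samples ago) and the model
    y = a * y + b * u_buf[0]
    ym = a * ym + b * u
    u_buf = u_buf[1:] + [u]

print(f"glue flow after {n_steps} steps: {y:.3f} (setpoint {setpoint})")
```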

  17. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  18. Multidisciplinary Optimization Approach for Design and Operation of Constrained and Complex-shaped Space Systems

    NASA Astrophysics Data System (ADS)

    Lee, Dae Young

    The design of a small satellite is challenging since it is constrained by mass, volume, and power. To mitigate these constraint effects, designers adopt deployable configurations on the spacecraft, which result in an interesting and difficult optimization problem. The resulting optimization problem is challenging due to the computational complexity caused by the large number of design variables and the model complexity created by the deployables. Adding to these complexities, there is a lack of integration of design optimization into operational optimization and the utility maximization of spacecraft in orbit. The developed methodology enables satellite Multidisciplinary Design Optimization (MDO) that is extendable to on-orbit operation. Optimization of on-orbit operations is possible with MDO since the model predictive controller developed in this dissertation guarantees the achievement of the on-ground design behavior in orbit. To enable the design optimization of highly constrained and complex-shaped space systems, the spherical coordinate analysis technique, called the "Attitude Sphere", is extended and merged with additional engineering tools such as OpenGL. OpenGL's graphics acceleration facilitates the accurate estimation of the shadow-degraded photovoltaic cell area. This technique is applied to the design optimization of the satellite Electric Power System (EPS), and the design result shows that photovoltaic power generation can be increased by more than 9%. Based on this initial methodology, the goal of this effort is extended from single-discipline optimization to multidisciplinary optimization, which includes the design and operation of the EPS, the Attitude Determination and Control System (ADCS), and the communication system. The geometry optimization satisfies the conditions of the ground development phase; however, the operation optimization may not be as successful as expected in orbit due to disturbances. To address this issue, for ADCS operations, controllers based on Model Predictive Control that are effective for constraint handling were developed and implemented. All the suggested design and operation methodologies are applied to the mission "CADRE", a space weather mission scheduled for operation in 2016. This application demonstrates the usefulness and capability of the methodology to enhance CADRE's capabilities, and its ability to be applied to a variety of missions.

  19. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 4: IPAD system design

    NASA Technical Reports Server (NTRS)

    Goldfarb, W.; Carpenter, L. C.; Redhed, D. D.; Hansen, S. D.; Anderson, L. O.; Kawaguchi, A. S.

    1973-01-01

    The computing system design of IPAD is described and the requirements which form the basis for the system design are discussed. The system is presented in terms of a functional design description and technical design specifications. The functional design specifications give the detailed description of the system design using top-down structured programming methodology. Human behavioral characteristics, which specify the system design at the user interface, security considerations, and standards for system design, implementation, and maintenance are also part of the technical design specifications. Detailed specifications of the two most common computing system types in use by the major aerospace companies which could support the IPAD system design are presented. The report of a study to investigate migration of IPAD software between the two candidate 3rd generation host computing systems and from these systems to a 4th generation system is included.

  20. Effect of sampling volume on dry powder inhaler (DPI)-emitted aerosol aerodynamic particle size distributions (APSDs) measured by the Next-Generation Pharmaceutical Impactor (NGI) and the Andersen eight-stage cascade impactor (ACI).

    PubMed

    Mohammed, Hlack; Roberts, Daryl L; Copley, Mark; Hammond, Mark; Nichols, Steven C; Mitchell, Jolyon P

    2012-09-01

    Current pharmacopeial methods for testing dry powder inhalers (DPIs) require that 4.0 L be drawn through the inhaler to quantify the aerodynamic particle size distribution of "inhaled" particles. This volume comfortably exceeds the internal dead volume of the Andersen eight-stage cascade impactor (ACI) and the Next Generation Pharmaceutical Impactor (NGI) as designated multistage cascade impactors. Two DPIs, the second (DPI-B) having a similar resistance to the first (DPI-A), were used to evaluate ACI and NGI performance at 60 L/min following the methodology described in the European and United States Pharmacopeias. At sampling times ≥2 s (equivalent to volumes ≥2.0 L), both impactors provided consistent measures of therapeutically important fine particle mass (FPM) from both DPIs, independent of sample duration. At shorter sample times, FPM decreased substantially with the NGI, indicative of incomplete aerosol bolus transfer through the system, whose dead space was 2.025 L. However, the ACI provided consistent measures of both variables across the range of sampled volumes evaluated, even when this volume was less than 50% of its internal dead space of 1.155 L. Such behavior may be indicative of maldistribution of the flow profile from the relatively narrow exit of the induction port to the uppermost stage of the impactor at start-up. An explanation of the anomalous ACI behavior from first principles requires resolution of the rapidly changing unsteady flow and pressure conditions at start-up, and is the subject of ongoing research by the European Pharmaceutical Aerosol Group. Meanwhile, these experimental findings are provided to advocate a prudent approach by retaining the current pharmacopeial methodology.
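
    The timing relationship the study relies on can be made explicit with a short calculation using only the figures quoted in the abstract (60 L/min flow, NGI dead volume 2.025 L, ACI dead volume 1.155 L):

```python
# Sampled volume, draw duration, and comparison against the impactor dead
# volumes quoted in the abstract (4.0 L pharmacopeial draw at 60 L/min).

flow_l_per_min = 60.0
flow_l_per_s = flow_l_per_min / 60.0          # 1 L/s

NGI_DEAD_L = 2.025
ACI_DEAD_L = 1.155

for sampled_volume_l in (4.0, 2.0, 1.0, 0.5):
    duration_s = sampled_volume_l / flow_l_per_s
    print(f"{sampled_volume_l:.1f} L sampled -> {duration_s:.1f} s draw; "
          f"exceeds NGI dead volume: {sampled_volume_l >= NGI_DEAD_L}, "
          f"exceeds ACI dead volume: {sampled_volume_l >= ACI_DEAD_L}")
```

    This makes clear why the 2 s (2.0 L) threshold matters for the NGI: below it, the drawn volume no longer exceeds the NGI's internal dead space.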

  1. System data communication structures for active-control transport aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.

  2. Design Methodology for Bonded-Bolted Composite Joints. Volume I. Analysis Derivations and Illustrative Solutions

    DTIC Science & Technology

    1982-02-01

    Only table-of-contents and figure-caption fragments were recovered from the scanned report, covering topics such as experimental evidence; load redistribution due to disbonds in adhesive in stepped-lap joints; load sharing between single and multirow fasteners; failure criteria at fastener holes; and elastic versus fully plastic adhesive shear behavior with and without sign reversal.

  3. Long life assurance study for manned spacecraft long life hardware. Volume 2: Long life assurance studies of EEE parts and packaging

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Guidelines for the design, development, and fabrication of electronic components and circuits for use in spacecraft construction are presented. The subjects discussed involve quality control procedures and test methodology for the following subjects: (1) monolithic integrated circuits, (2) hybrid integrated circuits, (3) transistors, (4) diodes, (5) tantalum capacitors, (6) electromechanical relays, (7) switches and circuit breakers, and (8) electronic packaging.

  4. Air & Space Power Journal. Volume 28, Number 3, May-June 2014

    DTIC Science & Technology

    2014-06-01

    Only disconnected excerpt fragments were recovered from this issue: "critical role that AETC organizations—such as the Air Force Security Assistance Training Squadron; HQ AETC/A3Q; the AAA; and, potentially, the IAAFA..."; "Potential defects in the design are more likely than computer hacking and are most effectively abated through comprehensive testing demanded by the best..."; "fascinating picture of the potential employment methodologies and skill sets demanded of crews that operate assets like FQ-X. From a cyber-defense..."

  5. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  6. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    PubMed

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

    Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and to financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and its uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that was actually remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted using the proposed methodology.
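
    The core classification step in a simulation-based approach can be sketched as follows: given many equally probable simulated concentration fields, each remediation unit is flagged when its probability of exceeding the clean-up threshold passes a chosen level, and the flagged volume and its spread across simulations are reported. The toy simulations, threshold, decision probability, and unit size below are assumptions, not the site's data.

```python
import numpy as np

# Toy stand-in for conditional simulations: n_sim equally probable lead
# concentration fields (mg/kg) over a 20 x 20 grid of remediation units.
rng = np.random.default_rng(1)
n_sim, nx, ny = 200, 20, 20
hotspot = np.exp(-(((np.arange(nx)[:, None] - 6) ** 2
                    + (np.arange(ny)[None, :] - 12) ** 2) / 30.0))
fields = 150.0 + 600.0 * hotspot + rng.lognormal(4.0, 0.6, (n_sim, nx, ny))

threshold = 400.0                     # clean-up level, mg/kg (assumed)
p_decision = 0.5                      # classify as polluted if P(exceed) > 0.5
unit_volume_m3 = 10.0 * 10.0 * 0.5    # assumed remediation unit size

p_exceed = np.mean(fields > threshold, axis=0)     # per-unit exceedance probability
polluted_units = p_exceed > p_decision
classified_volume = polluted_units.sum() * unit_volume_m3

# Uncertainty: distribution of the polluted volume across individual simulations
volumes = (fields > threshold).sum(axis=(1, 2)) * unit_volume_m3
print(f"classified remediation volume: {classified_volume:.0f} m^3")
print(f"5th-95th percentile across simulations: "
      f"{np.percentile(volumes, 5):.0f}-{np.percentile(volumes, 95):.0f} m^3")
```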

  7. Engine System Loads Development for the Fastrac 60K Flight Engine

    NASA Technical Reports Server (NTRS)

    Frady, Greg; Christensen, Eric R.; Mims, Katherine; Harris, Don; Parks, Russell; Brunty, Joseph

    2000-01-01

    Early implementation of structural dynamics finite element analyses for the calculation of design loads is considered common design practice in high-volume manufacturing industries such as the automotive and aeronautical industries. However, given the rarity of rocket engine development program starts, these tools are relatively new to the design of rocket engines. In the new Fastrac engine program, the focus has been on reducing the cost-to-weight ratio; current structural dynamics analysis practices were tailored in order to meet both production and structural design goals. Perturbation of rocket engine design parameters resulted in a number of Fastrac load cycles necessary to characterize the impact of mass and stiffness changes. The evolution of loads and load extraction methodologies, parametric considerations, and a discussion of load path sensitivities are presented.

  8. Microstructure Optimization of Dual-Phase Steels Using a Representative Volume Element and a Response Surface Method: Parametric Study

    NASA Astrophysics Data System (ADS)

    Belgasam, Tarek M.; Zbib, Hussein M.

    2017-12-01

    Dual-phase (DP) steels have received widespread attention for their low density and high strength. This low density is of value to the automotive industry for the weight reduction it offers and the attendant fuel savings and emission reductions. Recent studies on developing DP steels showed that the combination of strength/ductility could be significantly improved by changing the volume fraction and grain size of the phases in the microstructure, depending on microstructure properties. Consequently, DP steel manufacturers are interested in predicting microstructure properties and in optimizing microstructure design. In this work, a microstructure-based approach using representative volume elements (RVEs) was developed. The approach examined the flow behavior of DP steels using virtual tension tests with an RVE to identify specific mechanical properties. Microstructures with varied martensite and ferrite grain sizes, martensite volume fractions, carbon content, and morphologies were studied in 3D RVE approaches. The effect of these microstructure parameters on the combination of strength/ductility of DP steels was examined numerically using the finite element method by implementing a dislocation-density-based elastic-plastic constitutive model, and a response surface methodology was used to determine the optimum conditions for a required combination of strength/ductility. The results from the numerical simulations are compared with experimental results found in the literature. The developed methodology proves to be a powerful tool for studying the effect and interaction of key microstructural parameters on strength and ductility and thus can be used to identify optimum microstructural conditions.

  9. Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design

    NASA Astrophysics Data System (ADS)

    Iqbal, Liaquat Ullah

    An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher-fidelity Computer Aided Design, Engineering, and Manufacturing (CAD, CAE, and CAM) tools such as CATIA, FLUENT, ANSYS, and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low- to medium-fidelity codes, such as the aerodynamic panel code CMARC and sizing and constraint analysis codes, thus providing multi-fidelity capabilities to the aircraft designer. The higher-fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural, and environmental performance is provided for the application of structured design methods such as Quality Function Deployment (QFD) and Pugh's Method. The higher-fidelity tools bring the quantitative aspects of a design, such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift-over-drag ratio, and structural weight, as well as the qualitative aspects, such as external geometry definition, internal layout, and coloring scheme, early into the design process. The performance and safety risks involved with new technologies can be reduced by modeling and assessing their impact on the performance of the aircraft more accurately. The methodology also enables the design and evaluation of novel concepts such as the blended wing body (BWB) and hybrid wing body (HWB) concepts. Higher-fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for performance gains in aerodynamics and ascertain the risks of structural failure due to the different pressure distribution in the fuselage as compared with the tube-and-wing design. The higher-fidelity aerodynamic and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in achieving better designs with reduced risk in less time and at lower cost. The approach is shown to eliminate the traditional boundary between the conceptual and preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples of the validation and utilization of the Multidisciplinary Design and Optimization (MDO) tool are presented using missions for Medium- and High-Altitude Long-Range/Endurance Unmanned Aerial Vehicles (UAVs).

  10. Grid Generation Techniques Utilizing the Volume Grid Manipulator

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1998-01-01

    This paper presents grid generation techniques available in the Volume Grid Manipulator (VGM) code. The VGM code is designed to manipulate existing line, surface, and volume grids to improve the quality of the data. It embodies an easy-to-read, rich language of commands that enables such alterations as topology changes, grid adaption, and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections, which are common curves used in the generation and manipulation of points, lines, surfaces, and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamics simulations. Compared to previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretching as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be performed efficiently to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow field domain is appended to an existing X33 Venturestar volume grid; negative volumes, resulting from grid expansions to enable flow field capture on a simple geometry, are corrected; and geometrical changes to a vehicle component of the X33 Venturestar are shown.

  11. Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; DeLoach, R.; Cutler, A. D.

    2002-01-01

    We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of an 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment, and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, allowing interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a cosine-series bivariate function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 K, comfortably meeting the +/- 50 K requirement specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
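
    A minimal sketch of fitting a smooth bivariate surface to scattered single-point temperature measurements by linear least squares, in the spirit of the cosine-series response-surface fit described. The cosine-product basis, basis size, and synthetic data are assumptions; the paper's exact basis and coefficients are not reproduced.

```python
import numpy as np

def cosine_basis(x, y, n=4):
    """Low-order cosine-product bivariate basis on coordinates scaled to [0, 1]."""
    terms = [np.cos(np.pi * i * x) * np.cos(np.pi * j * y)
             for i in range(n) for j in range(n)]
    return np.column_stack(terms)

# Synthetic "measurements": temperature at scattered (x, y) points in a plane,
# with CARS-like single-shot scatter. Values are illustrative only.
rng = np.random.default_rng(3)
x, y = rng.random(120), rng.random(120)
t_true = 1200.0 + 400.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.4) ** 2) / 0.05)
t_meas = t_true + rng.normal(scale=30.0, size=x.size)

coeffs, *_ = np.linalg.lstsq(cosine_basis(x, y), t_meas, rcond=None)

def t_fit(xq, yq):
    """Fitted mean-temperature surface evaluated at query points."""
    return cosine_basis(np.atleast_1d(xq), np.atleast_1d(yq)) @ coeffs

print(f"fitted temperature near the plume centre: {t_fit(0.5, 0.4)[0]:.0f} K")
```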

  12. ERic Acute StrokE Recanalization: A study using predictive analytics to assess a new device for mechanical thrombectomy.

    PubMed

    Siemonsen, Susanne; Forkert, Nils D; Bernhardt, Martina; Thomalla, Götz; Bendszus, Martin; Fiehler, Jens

    2017-08-01

    Aim and hypothesis: Using a new study design, we investigate whether next-generation mechanical thrombectomy devices improve clinical outcomes in ischemic stroke patients. We hypothesize that this new methodology is superior to intravenous tissue plasminogen activator therapy alone. Methods and design: ERic Acute StrokE Recanalization is an investigator-initiated, prospective, single-arm, multicenter, controlled, open-label study to compare the safety and effectiveness of a new recanalization device and distal access catheter in acute ischemic stroke patients with symptoms attributable to acute ischemic stroke and vessel occlusion of the internal cerebral artery or middle cerebral artery. Study outcome: The primary effectiveness endpoint is the volume of saved tissue, defined as the difference between the actual infarct volume and the brain volume that is predicted to develop infarction by an optimized high-level machine learning model trained on data from a historical cohort treated with IV tissue plasminogen activator. Sample size estimates: Based on our own preliminary data, 45 patients fulfilling all inclusion criteria need to complete the study to show an efficacy >38% with a power of 80% and a one-sided alpha error risk of 0.05 (based on a one-sample t-test). Discussion: ERic Acute StrokE Recanalization is the first prospective study in interventional stroke therapy to use predictive analytics as its primary and secondary endpoints. Such a trial design cannot replace randomized controlled trials with clinical endpoints. However, ERic Acute StrokE Recanalization could serve as an exemplary trial design for evaluating nonpivotal neurovascular interventions.
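
    Per patient, the primary endpoint reduces to a simple volume difference, sketched below. The voxel size and the placeholder masks are assumptions, and the machine-learning infarct prediction itself is not reproduced.

```python
import numpy as np

def saved_tissue_volume_ml(predicted_infarct_mask, actual_infarct_mask,
                           voxel_volume_ml=0.008):
    """Primary endpoint: predicted infarct volume minus actual infarct volume.

    Both masks are boolean voxel arrays of identical shape; the prediction
    would come from a model trained on an IV-tPA historical cohort (not
    shown here). The 2 mm isotropic voxel (0.008 mL) is an assumption.
    """
    predicted_ml = predicted_infarct_mask.sum() * voxel_volume_ml
    actual_ml = actual_infarct_mask.sum() * voxel_volume_ml
    return predicted_ml - actual_ml

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    predicted = rng.random((64, 64, 32)) < 0.10                 # placeholder mask
    actual = predicted & (rng.random((64, 64, 32)) < 0.6)       # placeholder mask
    print(f"saved tissue: {saved_tissue_volume_ml(predicted, actual):.1f} mL")
```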

  13. The RAAF Logistics Study. Volume 4,

    DTIC Science & Technology

    1986-10-01

    Only fragments were recovered from the scanned volume. The contents entries include the use of issue-based root definitions, the application of soft systems methodology to information systems analysis, a conclusion, and a list of abbreviations; a cited reference is 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The recoverable text notes that the soft systems methodology was developed to tackle ... and that, although it has many advantages which recommend it to this type of study area, it does not model the time evolution of a system.

  14. The Visible Heart® project and free-access website 'Atlas of Human Cardiac Anatomy'.

    PubMed

    Iaizzo, Paul A

    2016-12-01

    Pre- and post-evaluations of implantable cardiac devices require innovative and critical testing in all phases of the design process. The Visible Heart® Project was successfully launched in 1997, and 3 years later the Atlas of Human Cardiac Anatomy website was online. The Visible Heart® methodologies and the Atlas website can be used to better understand human cardiac anatomy and disease states and/or to improve cardiac device design throughout the development process. To date, Visible Heart® methodologies have been used to reanimate 75 human hearts, all considered non-viable for transplantation. The Atlas is a unique free-access website featuring novel images of functional and fixed human cardiac anatomies from >400 human heart specimens. Furthermore, this website includes educational tutorials on anatomy, physiology, congenital heart disease, and various imaging modalities. For instance, the Device Tutorial provides examples of commonly deployed devices that were present at the time of in vitro reanimation or were subsequently delivered, including leads, catheters, valves, annuloplasty rings, leadless pacemakers, and stents. Another section of the website displays 3D models of vasculature, blood volumes, and/or tissue volumes reconstructed from computed tomography (CT) and magnetic resonance imaging (MRI) of various heart specimens. A new section allows the user to interact with various heart models. Visible Heart® methodologies have enabled our laboratory to reanimate 75 human hearts and visualize functional cardiac anatomies and device/tissue interfaces. The website freely shares all images, video clips, and CT/MRI DICOM files in honour of the generous gifts received from donors and their families. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016.

  15. Optimizing product life cycle processes in design phase

    NASA Astrophysics Data System (ADS)

    Faneye, Ola. B.; Anderl, Reiner

    2002-02-01

    Life cycle concepts do not only serve as a basis for helping product developers understand the dependencies between products and their life cycles; they also help in identifying potential opportunities for improvement in products. Common traditional concepts focus mainly on energy and material flow across life phases, necessitating the availability of metrics derived from a reference product. Knowledge of life cycle processes gained from an existing product is directly reused in its redesign. Depending on sales volume, however, the environmental impact incurred before such an optimization can be substantial. With modern information technologies, computer-aided life cycle methodologies can today be applied well before product use. On the basis of a virtual prototype, life cycle processes are analyzed and optimized using simulation techniques. This preventive approach not only helps in minimizing (or even eliminating) environmental burdens caused by the product; costs incurred due to changes in the real product can also be avoided. The paper highlights the relationship between product and life cycle and presents a computer-based methodology for optimizing the product life cycle during design, as presented by SFB 392: Design for Environment - Methods and Tools at the Technical University of Darmstadt.

  16. Health and safety impacts of nuclear, geothermal, and fossil-fuel electric generation in California. Volume 9. Methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nero, A.V.; Quinby-Hunt, M.S.

    1977-01-01

    This report sets forth methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities for electric power generation. The review is divided into a Notice of Intention process and an Application for Certification process, in accordance with the structure to be used by the California Energy Resources Conservation and Development Commission, the first emphasizing site-specific considerations, the second examining the detailed facility design as well. The Notice of Intention review is divided into three possible stages: an examination of emissions and site characteristics, a basic impact analysis, and an assessment of public impacts. The Application for Certification review is divided into five possible stages: a review of the Notice of Intention treatment, review of the emission control equipment, review of the safety design, review of the general facility design, and an overall assessment of site and facility acceptability.

  17. Use of a geomorphological transfer function to model design floods in small hillside catchments in semiarid Tunisia

    NASA Astrophysics Data System (ADS)

    Nasri, S.; Cudennec, C.; Albergel, J.; Berndtsson, R.

    2004-02-01

    In the beginning of the 1990s, the Tunisian Ministry of Agriculture launched an ambitious program for constructing small hillside reservoirs in the northern and central regions of the country. At present, more than 720 reservoirs have been created. They consist of small compacted earth dams supplied with a horizontal overflow weir. Due to the lack of hydrological data and the area's extreme floods, however, it is very difficult to design the overflow weirs. Also, the catchments are very sensitive to erosion and the reservoirs are rapidly silted up. Consequently, prediction of flood volumes for important rainfall events becomes crucial, yet few hydrological observations exist for the catchment areas. For this purpose, a geomorphological model methodology is presented to predict the shape and volume of hydrographs for important floods. This model is built around a production function that defines the net storm rainfall (the portion of rainfall during a storm which reaches a stream channel as direct runoff) from the total rainfall (observed rainfall in the catchment) and a transfer function based on the most complete possible definition of the surface drainage system. Observed rainfall at 5-min time steps was used in the model. Runoff generation in the model is based on surface drainage characteristics which can be easily extracted from maps. The model was applied to two representative experimental catchments in central Tunisia. The conceptual rainfall-runoff model based on surface topography and drainage network was seen to reproduce observed runoff satisfactorily. The calibrated model was used to estimate runoff for 5-, 10-, 20-, and 50-year rainfall return periods in terms of runoff volume, maximum runoff, and the general shape of the runoff hydrograph. Practical conclusions are drawn for designing hillside reservoirs and for extrapolating results with this model methodology to ungauged small catchments in semiarid Tunisia.
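
    The transfer-function step can be sketched as a convolution of the net storm rainfall (at the 5-min steps used in the model) with a geomorphology-derived unit hydrograph. The runoff coefficient, catchment area, storm, and hydrograph shape below are illustrative assumptions standing in for the calibrated Tunisian values.

```python
import numpy as np

dt_s = 300.0                               # 5-min time step, as in the model
area_km2 = 2.5                             # assumed catchment area

# Net rainfall (mm per 5 min) from the production function; here a constant
# runoff coefficient applied to an illustrative storm hyetograph.
rainfall_mm = np.array([0, 2, 6, 12, 9, 4, 1, 0], dtype=float)
runoff_coeff = 0.35                        # assumed
net_rain_mm = runoff_coeff * rainfall_mm

# Geomorphological transfer function: a unit hydrograph (response to 1 mm of
# net rain), here a simple triangular shape standing in for the one derived
# from the drainage-network geometry. It sums to 1 to conserve volume.
unit_hydrograph = np.array([0.05, 0.2, 0.35, 0.25, 0.1, 0.05])

# Convolution routes each 5-min block of net rain to the outlet; convert
# mm over the catchment area to discharge in m^3/s.
routed_mm = np.convolve(net_rain_mm, unit_hydrograph)
discharge_m3s = routed_mm * 1e-3 * area_km2 * 1e6 / dt_s

print(f"peak discharge: {discharge_m3s.max():.1f} m^3/s")
print(f"runoff volume:  {discharge_m3s.sum() * dt_s:.0f} m^3")
```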

  18. Proceedings of the Workshop on Identification and Control of Flexible Space Structures, volume 1

    NASA Technical Reports Server (NTRS)

    Rodriguez, G. (Editor)

    1985-01-01

    Identification and control of flexible space structures were studied. The application of the most advanced modeling, estimation, identification, and control methodologies to flexible space structures was explored. The following general areas were discussed: space platforms, antennas, and flight experiments; control/structure interaction - modeling, integrated design and optimization, control and stabilization, and shape control; control technology; control of space stations; and large antenna control, dynamics and control experiments, and control/structure interaction experiments.

  19. Foundational Performance Analyses of Pressure Gain Combustion Thermodynamic Benefits for Gas Turbines

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Kaemming, Thomas A.

    2012-01-01

    A methodology is described whereby the work extracted by a turbine exposed to the fundamentally nonuniform flowfield from a representative pressure gain combustor (PGC) may be assessed. The method uses an idealized constant volume cycle, often referred to as an Atkinson or Humphrey cycle, to model the PGC. Output from this model is used as input to a scalable turbine efficiency function (i.e., a map), which in turn allows for the calculation of useful work throughout the cycle. Integration over the entire cycle yields mass-averaged work extraction. The unsteady turbine work extraction is compared to steady work extraction calculations based on various averaging techniques for characterizing the combustor exit pressure and temperature. It is found that averages associated with momentum flux (as opposed to entropy or kinetic energy) provide the best match. This result suggests that momentum-based averaging is the most appropriate figure-of-merit to use as a PGC performance metric. Using the mass-averaged work extraction methodology, it is also found that the design turbine pressure ratio for maximum work extraction is significantly higher than that for a turbine fed by a constant pressure combustor with similar inlet conditions and equivalence ratio. Limited results are presented whereby the constant volume cycle is replaced by output from a detonation-based PGC simulation. The results in terms of averaging techniques and design pressure ratio are similar.
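    The thermodynamic benefit alluded to here can be illustrated with textbook ideal-cycle formulas. The sketch below is a generic comparison of an ideal constant-volume (Humphrey/Atkinson-type) cycle with an ideal constant-pressure Brayton cycle at the same compressor temperature ratio; it is not the authors' model, and the temperature ratios used are arbitrary illustrative assumptions.

```python
# Generic ideal-cycle comparison (illustrative only; not the authors' PGC model).
# Humphrey cycle: isentropic compression, constant-volume heat addition,
# isentropic expansion, constant-pressure heat rejection.
gamma = 1.4  # ratio of specific heats (air)

def brayton_efficiency(t2_over_t1):
    # Ideal Brayton efficiency depends only on the compression temperature ratio.
    return 1.0 - 1.0 / t2_over_t1

def humphrey_efficiency(t2_over_t1, t3_over_t2):
    # Standard closed-form result for the ideal constant-volume (Humphrey) cycle.
    return 1.0 - gamma * (1.0 / t2_over_t1) * (t3_over_t2 ** (1.0 / gamma) - 1.0) / (t3_over_t2 - 1.0)

if __name__ == "__main__":
    t2_over_t1 = 2.0   # assumed compressor temperature ratio
    t3_over_t2 = 3.0   # assumed combustor temperature ratio
    print(f"Brayton : {brayton_efficiency(t2_over_t1):.3f}")
    print(f"Humphrey: {humphrey_efficiency(t2_over_t1, t3_over_t2):.3f}")
```

    For the assumed ratios the constant-volume cycle returns a noticeably higher ideal efficiency, which is the qualitative benefit that the work-extraction analysis above quantifies for a coupled turbine.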

  20. Definition and preliminary design of the Laser Atmospheric Wind Sounder (LAWS) phase 1. Volume 3: Program cost estimates

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Cost estimates for phase C/D of the laser atmospheric wind sounder (LAWS) program are presented. This information provides a framework for cost, budget, and program planning estimates for LAWS. Volume 3 is divided into three sections. Section 1 details the approach taken to produce the cost figures, including the assumptions regarding the schedule for phase C/D and the methodology and rationale for costing the various work breakdown structure (WBS) elements. Section 2 shows a breakdown of the cost by WBS element, with the cost divided into non-recurring and recurring expenditures. Note that throughout this volume the cost is given in 1990 dollars, with bottom-line totals also expressed in 1988 dollars (1 dollar(88) = 0.931 dollar(90)). Section 3 shows a breakdown of the cost by year. The WBS and WBS dictionary are included as an attachment to this report.

  1. Effects of obesity on lung volume and capacity in children and adolescents: a systematic review

    PubMed Central

    Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno

    2016-01-01

    Objective: To assess the effects of obesity on lung volume and capacity in children and adolescents. Data source: This is a systematic review, carried out in the Pubmed, Lilacs, Scielo and PEDro databases, using the following keywords: Plethysmography, Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, published in English, Portuguese or Spanish, were selected. Methodological quality was assessed by the Agency for Healthcare Research and Quality. Data synthesis: Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score) and 50.0% reported data on abdominal circumference. All demonstrated that obesity causes negative effects on lung volume and capacity, causing a reduction mainly in functional residual capacity in 75.0% of the studies, in the expiratory reserve volume in 50.0% and in the residual volume in 25.0%. The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Conclusions: Obesity causes deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. PMID:27130483

  2. Mitigating mechanical failure of crystalline silicon electrodes for lithium batteries by morphological design [Morphological design of silicon electrode with anisotropic interface reaction rate for lithium ion batteries]

    DOE PAGES

    An, Yonghao; Wood, Brandon C.; Ye, Jianchao; ...

    2015-06-08

    Although crystalline silicon (c-Si) anodes promise very high energy densities in Li-ion batteries, their practical use is complicated by amorphization, large volume expansion and severe plastic deformation upon lithium insertion. Recent experiments have revealed the existence of a sharp interface between crystalline Si (c-Si) and the amorphous LixSi alloy during lithiation, which propagates with a velocity that is orientation dependent; the resulting anisotropic swelling generates substantial strain concentrations that initiate cracks even in nanostructured Si. Here we describe a novel strategy to mitigate lithiation-induced fracture by using pristine c-Si structures with engineered anisometric morphologies that are deliberately designed to counteract the anisotropy in the crystalline/amorphous interface velocity. This produces a much more uniform volume expansion, significantly reducing strain concentration. Based on a new, validated methodology that improves previous models of anisotropic swelling of c-Si, we propose optimal morphological designs for c-Si pillars and particles. The advantages of the new morphologies are clearly demonstrated by mesoscale simulations and verified by experiments on engineered c-Si micropillars. The results of this study illustrate that morphological design is effective in improving the fracture resistance of micron-sized Si electrodes, which will facilitate their practical application in next-generation Li-ion batteries. In conclusion, the model and design approach presented in this paper also have general implications for the study and mitigation of mechanical failure of electrode materials that undergo large anisotropic volume change upon ion insertion and extraction.

  3. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  4. Stabilized Finite Elements in FUN3D

    NASA Technical Reports Server (NTRS)

    Anderson, W. Kyle; Newman, James C.; Karman, Steve L.

    2017-01-01

    A Streamline-Upwind Petrov-Galerkin (SUPG) stabilized finite-element discretization has been implemented as a library into the FUN3D unstructured-grid flow solver. Motivation for the selection of this methodology is given, details of the implementation are provided, and the discretization for the interior scheme is verified for linear and quadratic elements by using the method of manufactured solutions. A methodology is also described for capturing shocks, and simulation results are compared to the finite-volume formulation that is currently the primary method employed for routine engineering applications. The finite-element methodology is demonstrated to be more accurate than the finite-volume technology, particularly on tetrahedral meshes where the solutions obtained using the finite-volume scheme can suffer from adverse effects caused by bias in the grid. Although no effort has been made to date to optimize computational efficiency, the finite-element scheme is competitive with the finite-volume scheme in terms of computer time to reach convergence.
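    The method-of-manufactured-solutions verification mentioned above can be illustrated with a deliberately simple stand-in problem. The sketch below is not FUN3D and not an SUPG discretization; it verifies a second-order central-difference Poisson solver against a manufactured solution and reports the observed order of accuracy, which is the same verification logic.

```python
# Minimal method-of-manufactured-solutions (MMS) order check, illustrative only.
# A 1D Poisson problem -u'' = f with zero boundary values is discretized with
# second-order central differences; f is manufactured from a chosen exact
# solution and the observed convergence order is reported.
import math

def mms_error(n):
    h = 1.0 / n
    u_exact = lambda x: math.sin(math.pi * x)
    f = lambda x: math.pi ** 2 * math.sin(math.pi * x)   # equals -u'' of the chosen solution
    # Assemble and solve the tridiagonal system with the Thomas algorithm.
    a = [-1.0] * (n - 1)                     # sub-diagonal
    b = [2.0] * (n - 1)                      # diagonal
    c = [-1.0] * (n - 1)                     # super-diagonal
    d = [h * h * f((i + 1) * h) for i in range(n - 1)]
    for i in range(1, n - 1):                # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):           # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(u[i] - u_exact((i + 1) * h)) for i in range(n - 1))

e1, e2 = mms_error(32), mms_error(64)
print("observed order ~", math.log(e1 / e2, 2))   # expect a value close to 2
```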

  5. Double Sided-Design of Electrodes Driving Tunable Dielectrophoretic Miniature Lens.

    PubMed

    Almoallem, Yousuf; Jiang, Hongrui

    2017-10-01

    We demonstrate the design methodology, geometrical analysis, device fabrication, and testing of a double-sided design (DSD) of tunable-focus dielectrophoretic liquid miniature lenses. This design is intended to reduce the driving voltage for tuning the lens, utilizing a double-sided electrode design that enhances the electric field magnitude. Fabricated devices were tested and measurements on a goniometer showed changes of up to 14° in the contact angle when the dielectrophoretic force was applied under 25 Vrms. Correspondingly, the back focal length of the liquid lens changed from 67.1 mm to 14.4 mm when the driving voltage was increased from zero to 25 Vrms. The driving voltage was significantly lower than those previously reported with similar device dimensions using single-sided electrode designs. This design allows for a range of both positive and negative menisci dependent on the volume of the lens liquid initially dispensed.

  6. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
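    A minimal illustration of the incremental ('delta' or correction) form is given below for a generic linear system; the matrix, right-hand side, and the Jacobi-type approximate operator standing in for the approximate factorization are all toy assumptions, not the flow-solver equations described above.

```python
# Sketch of the incremental ("delta"/correction) iteration for A x = b:
# an approximate operator M (standing in for the approximate factorization)
# is applied to the current residual and the correction is accumulated into x.
import numpy as np

def incremental_solve(A, b, M_inv, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x            # residual of the current iterate
        dx = M_inv @ r           # "delta" form: solve for the correction, not x itself
        x += dx
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
    return x

rng = np.random.default_rng(0)
A = np.eye(50) * 4.0 + 0.1 * rng.standard_normal((50, 50))   # diagonally dominant toy matrix
b = rng.standard_normal(50)
M_inv = np.diag(1.0 / np.diag(A))                            # Jacobi-type approximate inverse
x = incremental_solve(A, b, M_inv)
print(np.linalg.norm(A @ x - b))                             # residual norm after convergence
```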

  7. Space Biology Initiative. Trade Studies, volume 2

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The six studies which are the subjects of this report are entitled: Design Modularity and Commonality; Modification of Existing Hardware (COTS) vs. New Hardware Build Cost Analysis; Automation Cost vs. Crew Utilization; Hardware Miniaturization versus Cost; Space Station Freedom/Spacelab Modules Compatibility vs. Cost; and Prototype Utilization in the Development of Space Hardware. The product of these six studies was intended to provide a knowledge base and methodology that enables equipment produced for the Space Biology Initiative program to meet specific design and functional requirements in the most efficient and cost effective form consistent with overall mission integration parameters. Each study promulgates rules of thumb, formulas, and matrices that serve as a handbook for the use and guidance of designers and engineers in design, development, and procurement of Space Biology Initiative (SBI) hardware and software.

  8. Space Biology Initiative. Trade Studies, volume 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The six studies which are addressed are entitled: Design Modularity and Commonality; Modification of Existing Hardware (COTS) vs. New Hardware Build Cost Analysis; Automation Cost vs. Crew Utilization; Hardware Miniaturization versus Cost; Space Station Freedom/Spacelab Modules Compatibility vs. Cost; and Prototype Utilization in the Development of Space Hardware. The product of these six studies was intended to provide a knowledge base and methodology that enables equipment produced for the Space Biology Initiative program to meet specific design and functional requirements in the most efficient and cost effective form consistent with overall mission integration parameters. Each study promulgates rules of thumb, formulas, and matrices that serve as a handbook for the use and guidance of designers and engineers in design, development, and procurement of Space Biology Initiative (SBI) hardware and software.

  9. Optical and digital pattern recognition; Proceedings of the Meeting, Los Angeles, CA, Jan. 13-15, 1987

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Editor); Schenker, Paul (Editor)

    1987-01-01

    The papers presented in this volume provide an overview of current research in both optical and digital pattern recognition, with a theme of identifying overlapping research problems and methodologies. Topics discussed include image analysis and low-level vision, optical system design, object analysis and recognition, real-time hybrid architectures and algorithms, high-level image understanding, and optical matched filter design. Papers are presented on synthetic estimation filters for a control system; white-light correlator character recognition; optical AI architectures for intelligent sensors; interpreting aerial photographs by segmentation and search; and optical information processing using a new photopolymer.

  10. Teleoperator system man-machine interface requirements for satellite retrieval and satellite servicing. Volume 1: Requirements

    NASA Technical Reports Server (NTRS)

    Malone, T. B.

    1972-01-01

    Requirements were determined analytically for the man-machine interface for a teleoperator system performing on-orbit satellite retrieval and servicing. Requirements are basically of two types: mission/system requirements, and design requirements or design criteria. Two types of teleoperator systems were considered: a free-flying vehicle, and a shuttle-attached manipulator. No attempt was made to evaluate the relative effectiveness or efficiency of the two system concepts. The methodology used entailed an application of the Essex Man-Systems analysis technique as well as a complete familiarization with relevant work being performed at government agencies and by private industry.

  11. The impact of domestic rainwater harvesting systems in storm water runoff mitigation at the urban block scale.

    PubMed

    Palla, A; Gnecco, I; La Barbera, P

    2017-04-15

    In the framework of storm water management, Domestic Rainwater Harvesting (DRWH) systems are recently recognized as source control solutions according to LID principles. In order to assess the impact of these systems in storm water runoff control, a simple methodological approach is proposed. The hydrologic-hydraulic modelling is undertaken using EPA SWMM; the DRWH is implemented in the model by using a storage unit linked to the building water supply system and to the drainage network. The proposed methodology has been implemented for a residential urban block located in Genoa (Italy). Continuous simulations are performed by using the high-resolution rainfall data series for the 'do nothing' and DRWH scenarios. The latter includes the installation of a DRWH system for each building of the urban block. Referring to the test site, the peak and volume reduction rate evaluated for the 2125 rainfall events are respectively equal to 33 and 26 percent, on average (with maximum values of 65 percent for peak and 51 percent for volume). In general, the adopted methodology indicates that the hydrologic performance of the storm water drainage network equipped with DRWH systems is noticeable even for the design storm event (T = 10 years) and the rainfall depth seems to affect the hydrologic performance at least when the total depth exceeds 20 mm. Copyright © 2017 Elsevier Ltd. All rights reserved.
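    The peak and volume reduction rates quoted above are simple relative differences between paired hydrographs; a sketch of that bookkeeping is shown below with synthetic series (the discharge values and time step are placeholders, not the Genoa case-study output).

```python
# Sketch of how event peak and volume reduction rates could be computed from
# paired hydrographs (baseline "do nothing" vs. DRWH scenario). Synthetic data.
def reduction_rates(q_baseline, q_drwh, dt_s=60.0):
    # Event volume is the discharge integral; with a common time step the dt
    # cancels in the ratio, but it is kept explicit for clarity.
    vol_base = sum(q_baseline) * dt_s
    vol_drwh = sum(q_drwh) * dt_s
    return 1.0 - max(q_drwh) / max(q_baseline), 1.0 - vol_drwh / vol_base

q_base = [0.0, 0.4, 1.2, 2.0, 1.1, 0.5, 0.1]   # m3/s at 1-min steps (assumed)
q_drwh = [0.0, 0.2, 0.8, 1.3, 0.9, 0.4, 0.1]
peak, vol = reduction_rates(q_base, q_drwh)
print(f"peak reduction {peak:.0%}, volume reduction {vol:.0%}")
```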

  12. Membranes with artificial free-volume for biofuel production

    PubMed Central

    Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.

    2015-01-01

    Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity. PMID:26104672

  13. Membranes with artificial free-volume for biofuel production

    NASA Astrophysics Data System (ADS)

    Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.

    2015-06-01

    Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.

  14. Membranes with artificial free-volume for biofuel production

    DOE PAGES

    Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; ...

    2015-06-24

    Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. Moreover, we found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.

  15. [Purification Technology Optimization for Saponins from Ziziphi Spinosae Semen with Macroporous Adsorption Resin by Box-Behnken Design-Response Surface Methodology].

    PubMed

    Zhao, Hui-ru; Ren, Zao; Liu, Chun-ye

    2015-04-01

    To compare the purification effect of saponins from Ziziphi Spinosae Semen with different types of macroporous adsorption resin, and to optimize its purification technology. The type of macroporous resin was selected by a static adsorption method. The optimum technological conditions for saponins from Ziziphi Spinosae Semen were screened by single-factor tests and Box-Behnken Design-Response Surface Methodology. AB-8 macroporous resin had a better purification effect on total saponins than the other resins; the optimum technological parameters were as follows: column height-diameter ratio of 5:1, sample solution concentration of 2.52 mg/mL, resin adsorption quantity of 8.915 mg/g, elution with 3 BV of water, adsorption and elution flow rate of 2 BV/h, elution solvent of 75% ethanol, and elution solvent volume of 5 BV. AB-8 macroporous resin has a good purification effect on jujuboside A. The optimized technology is stable and feasible.
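    For readers unfamiliar with the design itself, a three-factor Box-Behnken design in coded units can be constructed as in the sketch below (a generic construction; the number of centre points is an assumption, and no mapping to the actual factors of this study is implied).

```python
# Generic construction of a three-factor Box-Behnken design in coded (-1, 0, +1) units.
from itertools import combinations

def box_behnken(n_factors=3, n_center=5):
    runs = []
    # Edge midpoints: every pair of factors at +/-1 with the remaining factor at 0.
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs += [[0] * n_factors for _ in range(n_center)]   # replicated centre points
    return runs

design = box_behnken()
print(len(design), "runs")        # 12 edge-midpoint runs + 5 centre runs = 17
for row in design:
    print(row)
```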

  16. Design optimization of an axial-field eddy-current magnetic coupling based on magneto-thermal analytical model

    NASA Astrophysics Data System (ADS)

    Fontchastagner, Julien; Lubin, Thierry; Mezani, Smaïl; Takorabet, Noureddine

    2018-03-01

    This paper presents a design optimization of an axial-flux eddy-current magnetic coupling. The design procedure is based on a torque formula derived from a 3D analytical model and a population-based algorithm. The main objective of this paper is to determine the best design in terms of magnet volume in order to transmit a torque between two movers, while ensuring a low slip speed and a good efficiency. The torque formula is very accurate and computationally efficient, and is valid for any slip speed value. Nevertheless, in order to solve more realistic problems and take into account thermal effects on the torque value, a thermal model based on convection heat transfer coefficients is also established and used in the design optimization procedure. Results show the effectiveness of the proposed methodology.

  17. LL13-MatModelRadDetect-PD2Jf Final Report: Materials Modeling for High-Performance Radiation Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lordi, Vincenzo

    The aims of this project are to enable rational materials design for select high-payoff challenges in radiation detection materials by using state-of-the-art predictive atomistic modeling techniques. Three specific high-impact challenges are addressed: (i) design and optimization of electrical contact stacks for TlBr detectors to stabilize temporal response at room temperature; (ii) identification of chemical design principles of host glass materials for large-volume, low-cost, high-performance glass scintillators; and (iii) determination of the electrical impacts of dislocation networks in Cd1-xZnxTe (CZT) that limit its performance and usable single-crystal volume. The specific goals are to establish design and process strategies to achieve improved materials for high performance detectors. Each of the major tasks is discussed below in three sections, which include the goals for the task and a summary of the major results, followed by a listing of publications that contain the full details, including details of the methodologies used. The appendix lists 12 conference presentations given for this project, including 1 invited talk and 1 invited poster.

  18. Volume and methodological quality of randomized controlled trials in laparoscopic surgery: assessment over a 10-year period.

    PubMed

    Antoniou, Stavros A; Andreou, Alexandros; Antoniou, George A; Koch, Oliver O; Köhler, Gernot; Luketina, Ruzica-R; Bertsias, Antonios; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-11-01

    Measures have been taken to improve methodological quality of randomized controlled trials (RCTs). This review systematically assessed the trends in volume and methodological quality of RCTs on minimally invasive surgery within a 10-year period. RCTs on minimally invasive surgery were searched in the 10 most cited general surgical journals and the 5 most cited journals of laparoscopic interest for the years 2002 and 2012. Bibliometric and methodological quality components were abstracted using the Scottish Intercollegiate Guidelines Network. The pooled number of RCTs from low-contribution regions demonstrated an increasing proportion of the total published RCTs, compensating for a concomitant decrease of the respective contributions from Europe and North America. International collaborations were more frequent in 2012. Acceptable or high quality RCTs accounted for 37.9% and 54.4% of RCTs published in 2002 and 2012, respectively. Components of external validity were poorly reported. Both the volume and the reporting quality of laparoscopic RCTs have increased from 2002 to 2012, but there seems to be ample room for improvement of methodological quality. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. United States Air Force Summer Research Program - 1993 Summer Research Extension Program Final Reports, Volume 4A, Wright Laboratory

    DTIC Science & Technology

    1994-11-01

    Erdman Solar to Thermal Energy Physics and Astronomy University of Iowa, Iowa City, IA PL/RK 6 A Detailed Investigation of Low-and High-Power Arcjet...Properties of Dr. Mary Potasek Strained Layer Sem Applied Physics Columbia University, New York, NY WL/ML 27 Development of Control Design Methodologies...concrete is also presented. Finally, the model is extended to include penetration into multiple layers of different target materials. Comparisons are

  20. Interactive planning workshop. Volume 2. Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-01-01

    The Division of Fossil Fuel Utilization has sponsored a series of interactive planning workshops designed to involve private citizens and representatives in industry, the academic community, public interest groups, and state and local governments in the division's planning process. The findings of the Mt. Hood Interactive Planning Workshop are presented in this summary. This conclave was held at Timberline Lodge on October 15-17, 1978, and was hosted by the Mt. Hood Community College of Gresham, Oregon. Participants examined the division's program goals, planning process, and project appraisal methodology.

  1. Development of Fatigue and Crack Propagation Design and Analysis Methodology in a Corrosive Environment for Typical Mechanically-Fastened Joints. Volume 2. State-of-the-Art Assessment.

    DTIC Science & Technology

    1983-03-01

    [120] hypothesized a linear summation model to predict the corrosion-fatigue behavior above KIscc for a high-strength steel. The model considers the...[120] could satisfactorily predict the rates of corrosion-fatigue-crack growth for 18-Ni maraging steels tested in several gaseous and aqueous...NADC-83126-60 Vol. II 6. The corrosion fatigue behavior of titanium alloys is very complex. Therefore, a better understanding of corrosion fatigue

  2. Experimental investigation of the mass flow gain factor in a draft tube with cavitation vortex rope

    NASA Astrophysics Data System (ADS)

    Landry, C.; Favrel, A.; Müller, A.; Yamamoto, K.; Alligné, S.; Avellan, F.

    2017-04-01

    At off-design operating conditions, cavitating flow is often observed in hydraulic machines. The presence of a cavitation vortex rope may induce draft tube surge and electrical power swings at part load and full load operations. The stability analysis of these operating conditions requires a numerical pipe model taking into account the complexity of the two-phase flow. Among the hydroacoustic parameters describing the cavitating draft tube flow in the numerical model, the mass flow gain factor, representing the mass excitation source expressed as the rate of change of the cavitation volume as a function of the discharge, remains difficult to model. This paper presents a quasi-static method to estimate the mass flow gain factor in the draft tube for a given cavitation vortex rope volume in the case of a reduced-scale physical model of a ν = 0.27 Francis turbine. The methodology is based on an experimental identification of the natural frequency of the test rig hydraulic system for different Thoma numbers. With the identification of the natural frequency, it is possible to model the wave speed, the cavitation compliance and the volume of the cavitation vortex rope. By applying this new methodology for different discharge values, it becomes possible to identify the mass flow gain factor and improve the accuracy of the system stability analysis.
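    For reference, the lumped-parameter relations that underlie this kind of identification can be sketched as follows; the notation below follows the generic hydroacoustic convention (equivalent inductance L_eq, draft tube section of area A and length ℓ) and is an assumption, not necessarily the authors' exact symbols or model.

```latex
% Sketch of generic lumped-parameter relations (symbols are assumptions, not the paper's notation).
\begin{align}
  C_c  &= -\frac{\partial V_c}{\partial H}            && \text{cavitation compliance} \\
  \chi &= -\frac{\partial V_c}{\partial Q}            && \text{mass flow gain factor} \\
  f_0  &= \frac{1}{2\pi\sqrt{L_{\mathrm{eq}}\, C_c}}  && \text{natural frequency of the lumped system} \\
  a    &= \sqrt{\frac{g\, A\, \ell}{C_c}}             && \text{equivalent wave speed over a section of area } A \text{ and length } \ell
\end{align}
```

    With f_0 identified experimentally for each Thoma number and L_eq known from the test-rig layout, the compliance, the equivalent wave speed and ultimately the cavity volume and mass flow gain factor can be estimated, which is the quasi-static logic described in the abstract.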

  3. Comparison of supercritical fluid extraction and ultrasound-assisted extraction of fatty acids from quince (Cydonia oblonga Miller) seed using response surface methodology and central composite design.

    PubMed

    Daneshvand, Behnaz; Ara, Katayoun Mahdavi; Raofie, Farhad

    2012-08-24

    Fatty acids of Cydonia oblonga Miller cultivated in Iran were obtained by supercritical (carbon dioxide) extraction and ultrasound-assisted extraction methods. The oils were analyzed by capillary gas chromatography using mass spectrometric detection. The compounds were identified according to their retention indices and mass spectra (EI, 70 eV). The experimental parameters of SFE such as pressure, temperature, modifier volume, and static and dynamic extraction time were optimized using a Central Composite Design (CCD) after a 2^5 factorial design. Pressure and dynamic extraction time had a significant effect on the extraction yield, while the other factors (temperature, static extraction time and modifier volume) were not identified as significant factors under the selected conditions. The results of the chemometric analysis showed the highest yield for SFE (24.32%), which was obtained at a pressure of 353 bar, temperature of 35°C, modifier (methanol) volume of 150 μL, and static and dynamic extraction times of 10 and 60 min, respectively. Ultrasound-assisted extraction (UAE) of fatty acids from C. oblonga Miller was optimized using a rotatable central composite design. The optimum conditions were as follows: solvent (n-hexane) volume, 22 mL; extraction time, 30 min; and extraction temperature, 55°C. This resulted in a maximum oil recovery of 19.5%. The extracts with higher yield from both methods were subjected to transesterification and GC-MS analysis. The results show that the oil obtained by SFE under the optimal operating conditions had a fatty acid composition similar to that of the oil obtained by UAE under optimum conditions, and no significant differences were found. The major components of the oil extract were linoleic, palmitic, oleic, stearic and eicosanoic acids. Copyright © 2012 Elsevier B.V. All rights reserved.
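    The chemometric step common to CCD/RSM studies such as this one is a quadratic least-squares response surface; the sketch below shows that fit on synthetic data in coded variables (it is not the authors' data set or software).

```python
# Generic quadratic response-surface fit in coded variables (synthetic data only).
import numpy as np

def quadratic_design_matrix(X):
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                                      # linear terms
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]   # interactions
    cols += [X[:, i] ** 2 for i in range(k)]                                 # pure quadratics
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 2))                 # two coded factors, e.g. pressure and time (assumed)
y = 20 + 3 * X[:, 0] + 1.5 * X[:, 1] - 2 * X[:, 0] ** 2 + rng.normal(0, 0.2, 20)
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(np.round(beta, 2))   # intercept, linear, interaction, quadratic coefficients
```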

  4. Effects of obesity on lung volume and capacity in children and adolescents: a systematic review.

    PubMed

    Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno

    2016-12-01

    To assess the effects of obesity on lung volume and capacity in children and adolescents. This is a systematic review, carried out in the Pubmed, Lilacs, Scielo and PEDro databases, using the following keywords: Plethysmography, Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, published in English, Portuguese or Spanish, were selected. Methodological quality was assessed by the Agency for Healthcare Research and Quality. Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score) and 50.0% reported data on abdominal circumference. All demonstrated that obesity causes negative effects on lung volume and capacity, causing a reduction mainly in functional residual capacity in 75.0% of the studies, in the expiratory reserve volume in 50.0% and in the residual volume in 25.0%. The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Obesity causes deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. Copyright © 2016 Sociedade de Pediatria de São Paulo. Published by Elsevier Editora Ltda. All rights reserved.

  5. Habitat Utilization Assessment - Building in Behaviors

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Blume, Jennifer

    2004-01-01

    Habitability, and the associated architectural and design attributes of an environment, is a powerful performance shaping factor. By identifying how inhabitants use an area, we can draw conclusions about what design or architectural attributes cause what behaviors and systematically design in desired human performance. We are analyzing how a crew uses a long duration habitat and work environment during a four-day underwater mission and identifying certain architectural and design attributes that are related to, and potential enablers of, certain crew behaviors. By identifying how inhabitants use the habitat, we can draw conclusions about what habitability attributes cause what behaviors and systematically design in desired human performance (applicable to NASA's Bioastronautics Human Behavior and Performance Critical Path Roadmap question 6.12). This assessment replicates a methodology reported in a chapter titled "Sociokinetic Analysis as a Tool for Optimization of Environmental Design" by C. Adams.' That study collected video imagery of certain areas of a closed habitat during a 91 day test and from that data calculated time spent in different volumes during the mission, and characterized the behaviors occurring in certain habitat volumes thus concluding various rules for design of such habitats. This study assesses the utilization of the Aquarius Habitat, an underwater station, which will support six Aquanauts for a fourteen-day mission during which the crew will perform specific scientific and engineering studies. Video is recorded for long uninterrupted periods of time during the mission and from that data the time spent in each area is calculated. In addition, qualitative and descriptive analysis of the types of behaviors in each area is performed with the purpose of identifying any behaviors that are not typical of a certain area. If a participant uses an area in a way different from expected, a subsequent analysis of the features of that area may result in conclusions of performance shaping factors. With the addition of this study, we can make comparisons between the two different habitats and begin drawing correlation judgments about design features and behavior. Ideally, this methodology should be repeated in additional Aquarius missions and other analog environments because the real information will come from comparisons between habitats.

  6. An investigation of the self-heating phenomenon in viscoelastic materials subjected to cyclic loadings accounting for prestress

    NASA Astrophysics Data System (ADS)

    de Lima, A. M. G.; Rade, D. A.; Lacerda, H. B.; Araújo, C. A.

    2015-06-01

    It has been demonstrated by many authors that the internal damping mechanism of viscoelastic materials offers many possibilities for practical engineering applications. However, in traditional procedures for the analysis and design of viscoelastic dampers subjected to cyclic loadings, a uniform, constant temperature is generally assumed and the self-heating phenomenon is not taken into account. Moreover, for viscoelastic materials subjected to dynamic loadings superimposed on static preloads, such as engine mounts, these procedures can lead to poor designs or even severe failures, since the energy dissipated within the volume of the material leads to temperature rises. In this paper, a hybrid numerical-experimental investigation of the effects of static preloads on the self-heating phenomenon in viscoelastic dampers subjected to harmonic loadings is reported. After presenting the theoretical foundations, the numerical and experimental results obtained in terms of the temperature evolutions at different points within the volume of the viscoelastic material for various static preloads are compared, and the main features of the methodology are discussed.

  7. Micro-machined resonator oscillator

    DOEpatents

    Koehler, Dale R.; Sniegowski, Jeffry J.; Bivens, Hugh M.; Wessendorf, Kurt O.

    1994-01-01

    A micro-miniature resonator-oscillator is disclosed. Due to the miniaturization of the resonator-oscillator, oscillation frequencies of one MHz and higher are utilized. A thickness-mode quartz resonator is housed in a micro-machined silicon package and operated as a "telemetered sensor beacon," that is, a digital, self-powered, remote, parameter-measuring transmitter in the FM band. The resonator design uses trapped-energy principles and temperature-dependence methodology through crystal orientation control, with operation in the 20-100 MHz range. High-volume batch-processing manufacturing is utilized, with package and resonator assembly at the wafer level. Unique design features include squeeze-film damping for robust vibration and shock performance, capacitive coupling through micro-machined diaphragms allowing resonator excitation at the package exterior, circuit integration, and extremely small (0.1 in. square) dimensioning. A family of micro-miniature sensor beacons is also disclosed with widespread applications as bio-medical sensors, vehicle status monitors and high-volume animal identification and health sensors. The sensor family allows measurement of temperatures, chemicals, acceleration and pressure. A microphone and clock realization is also available.

  8. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    ERIC Educational Resources Information Center

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  9. Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Putman, E.

    2017-12-01

    Standing dead trees (SDTs) are an important forest component and impact a variety of ecosystem processes, yet the carbon pool dynamics of SDTs are poorly constrained in terrestrial carbon cycling models. The ability to model wood decay and carbon cycling in relation to detectable changes in tree structure and volume over time would greatly improve such models. The overall objective of this study was to provide automated aboveground volume estimates of SDTs and automated procedures to detect, quantify, and characterize structural losses over time with terrestrial lidar data. The specific objectives of this study were: 1) develop an automated SDT volume estimation algorithm providing accurate volume estimates for trees scanned in dense forests; 2) develop an automated change detection methodology to accurately detect and quantify SDT structural loss between subsequent terrestrial lidar observations; and 3) characterize the structural loss rates of pine and oak SDTs in southeastern Texas. A voxel-based volume estimation algorithm, "TreeVolX", was developed and incorporates several methods designed to robustly process point clouds of varying quality levels. The algorithm operates on horizontal voxel slices by segmenting the slice into distinct branch or stem sections then applying an adaptive contour interpolation and interior filling process to create solid reconstructed tree models (RTMs). TreeVolX estimated large and small branch volume with an RMSE of 7.3% and 13.8%, respectively. A voxel-based change detection methodology was developed to accurately detect and quantify structural losses and incorporated several methods to mitigate the challenges presented by shifting tree and branch positions as SDT decay progresses. The volume and structural loss of 29 SDTs, composed of Pinus taeda and Quercus stellata, were successfully estimated using multitemporal terrestrial lidar observations over elapsed times ranging from 71 - 753 days. Pine and oak structural loss rates were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.
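    As a simplified illustration of the voxel bookkeeping behind such volume estimates, the sketch below counts occupied voxels of a synthetic point cloud and multiplies by the single-voxel volume; unlike the TreeVolX algorithm described above, it does not segment slices or fill branch interiors, so it measures only the occupied shell.

```python
# Toy voxel-based volume estimate from a 3D point cloud (illustrative only).
import numpy as np

def voxel_volume(points, voxel_size=0.05):
    # Count occupied voxels and multiply by the volume of a single voxel.
    idx = np.floor(points / voxel_size).astype(int)
    occupied = {tuple(v) for v in idx}
    return len(occupied) * voxel_size ** 3

rng = np.random.default_rng(2)
# Synthetic points on a 0.3 m-radius, 2 m-tall cylinder surface (a stand-in "stem").
theta = rng.uniform(0, 2 * np.pi, 20000)
z = rng.uniform(0, 2.0, 20000)
pts = np.column_stack([0.3 * np.cos(theta), 0.3 * np.sin(theta), z])
print(f"occupied-shell volume ~ {voxel_volume(pts):.3f} m^3")
```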

  10. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    Advanced materials systems are new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to superior properties over conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling, and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact the properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.

  11. Railroad Classification Yard Technology Manual: Volume II : Yard Computer Systems

    DOT National Transportation Integrated Search

    1981-08-01

    This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...

  12. An optimization methodology for heterogeneous minor actinides transmutation

    NASA Astrophysics Data System (ADS)

    Kooyman, Timothée; Buiron, Laurent; Rimpault, Gérald

    2018-04-01

    In the case of a closed fuel cycle, minor actinides transmutation can lead to a strong reduction in spent fuel radiotoxicity and decay heat. In the heterogeneous approach, minor actinides are loaded in dedicated targets located at the core periphery so that long-lived minor actinides undergo fission and are turned into shorter-lived fission products. However, such targets require a specific design process due to high helium production in the fuel, a high flux gradient at the core periphery and low power production. Additionally, the targets are generally manufactured with a high content of minor actinides in order to compensate for the low flux level at the core periphery. This leads to negative impacts on the fuel cycle in terms of neutron source and decay heat of the irradiated targets, which penalize their handling and reprocessing. In this paper, a simplified methodology for the design of targets is coupled with a method for the optimization of transmutation which takes into account both transmutation performances and fuel cycle impacts. The uncertainties and performances of this methodology are evaluated and shown to be sufficient to carry out scoping studies. An illustration is then made by considering the use of moderating material in the targets, which has a positive impact on the minor actinides consumption but a negative impact both on fuel cycle constraints (higher decay heat and neutron source) and on assembly design (higher helium production and lower fuel volume fraction). It is shown that the use of moderating material is an optimal solution of the transmutation problem with regards to consumption and fuel cycle impacts, even when taking geometrical design considerations into account.

  13. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions. Volume 2, Appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude, and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical & Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties.

  14. A methodology for rapid vehicle scaling and configuration space exploration

    NASA Astrophysics Data System (ADS)

    Balaba, Davis

    2009-12-01

    The Configuration-space Exploration and Scaling Methodology (CESM) entails the representation of component or sub-system geometries as matrices of points in 3D space. These typically large matrices are reduced using minimal convex sets or convex hulls. This reduction leads to significant gains in collision detection speed at minimal approximation expense. (The Gilbert-Johnson-Keerthi algorithm [79] is used for collision detection purposes in this methodology.) Once the components are laid out, their collective convex hull (from here on out referred to as the super-hull) is used to approximate the inner mold line of the minimum enclosing envelope of the vehicle concept. A sectional slicing algorithm is used to extract the sectional dimensions of this envelope. An offset is added to these dimensions in order to come up with the sectional fuselage dimensions. Once the lift and control surfaces are added, vehicle level objective functions can be evaluated and compared to other designs. The size of the design space, coupled with the fact that some key constraints such as the number of collisions are discontinuous, dictates that a domain-spanning optimization routine be used. Also, as this is a conceptual design tool, the goal is to provide the designer with a diverse baseline geometry space from which to choose. For these reasons, a domain-spanning algorithm with counter-measures against speciation and genetic drift is the recommended optimization approach. The Non-dominated Sorting Genetic Algorithm (NSGA-II) [60] is shown to work well for the proof of concept study. There are two major reasons why the need to evaluate higher fidelity, custom geometric scaling laws became a part of this body of work. First of all, historical-data based regressions become implicitly unreliable when the vehicle concept in question is designed around a disruptive technology. Second, it was shown that simpler approaches such as photographic scaling can result in highly suboptimal concepts even for very small scaling factors. Yet good scaling information is critical to the success of any conceptual design process. In the CESM methodology, it is assumed that the new technology has matured enough to permit the prediction of the scaling behavior of the various subsystems in response to requirement changes. Updated subsystem geometry data is generated by applying the new requirement settings to the affected subsystems. All collisions are then eliminated using the NSGA-II algorithm. This is done while minimizing the adverse impact on the vehicle packing density. Once all collisions are eliminated, the vehicle geometry is reconstructed and system level data such as fuselage volume can be harvested. This process is repeated for all requirement settings. Dimensional analysis and regression can be carried out using this data and all other pertinent metrics in the manner described by Mendez [124] and Segel [173]. The dominant parameters for each response show up in the dimensionally consistent groups that form the independent variables. More importantly, the impact of changes in any of these variables on system level dependent variables can be easily and rapidly evaluated. In this way, the conceptual design process can be accelerated without sacrificing analysis accuracy. Scaling laws for take-off gross weight and fuselage volume as functions of fuel cell specific power and power density for a notional General Aviation vehicle are derived for the proof of concept.
CESM enables the designer to maintain design freedom by portably carrying multiple designs deeper into the design process. Also since CESM is a bottom-up approach, all proposed baseline concepts are implicitly volumetrically feasible. System level geometry parameters become fall-outs as opposed to inputs. This is a critical attribute as, without the benefit of experience, a designer would be hard pressed to set the appropriate ranges for such parameters for a vehicle built around a disruptive technology. Furthermore, scaling laws generated from custom data for each concept are subject to less design noise than say, regression based approaches. Through these laws, key physics-based characteristics of vehicle subsystems such as energy density can be mapped onto key system level metrics such as fuselage volume or take-off gross weight. These laws can then substitute some historical-data based analyses thereby improving the fidelity of the analyses and reducing design time. (Abstract shortened by UMI.)
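    A minimal sketch of the convex-hull reduction step is shown below using scipy's ConvexHull on a placeholder point matrix; the component geometry is random, and this is not the CESM implementation or its GJK collision logic.

```python
# Minimal sketch of the convex-hull reduction step: a component's point matrix is
# replaced by its convex hull, whose vertex set and volume can feed packing-density
# bookkeeping. The points are random placeholders for a real component geometry.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(3)
component_points = rng.uniform(-1.0, 1.0, size=(500, 3))   # stand-in component geometry
hull = ConvexHull(component_points)
print(f"hull vertices: {len(hull.vertices)}, hull volume: {hull.volume:.3f}")
```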

  15. Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005

    ERIC Educational Resources Information Center

    Coffman, Julia, Ed.

    2005-01-01

    This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…

  16. [Definition of low threshold volumes for quality assurance: conceptual and methodological issues involved in the definition and evaluation of thresholds for volume outcome relations in clinical care].

    PubMed

    Wetzel, Hermann

    2006-01-01

    In a large number of mostly retrospective association studies, a statistical relationship between volume and quality of health care has been reported. However, the relevance of these results is frequently limited by methodological shortcomings. In this article, criteria for the evidence and definition of thresholds for volume-outcome relations are proposed, e.g. the specification of relevant outcomes for quality indicators, analysis of volume as a continuous variable with an adequate case-mix and risk adjustment, accounting for cluster effects and considering mathematical models for the derivation of cut-off values. Moreover, volume thresholds are regarded as surrogate parameters for the indirect classification of the quality of care, whose diagnostic validity and effectiveness in improving health care quality need to be evaluated in prospective studies.
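    As a hedged illustration of the article's recommendation to treat volume as a continuous, risk-adjusted covariate rather than a categorical threshold, the sketch below fits a logistic model to simulated data; the variables, effect sizes, and the single case-mix adjuster are assumptions, not registry results.

```python
# Simulated illustration of modelling outcome on continuous (log) volume with
# case-mix adjustment, rather than on arbitrary volume categories.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000
volume = rng.lognormal(mean=3.0, sigma=0.8, size=n)          # annual case volume (assumed)
age = rng.normal(65, 10, n)                                   # case-mix adjuster (assumed)
logit = -2.0 - 0.4 * np.log(volume) + 0.03 * (age - 65)       # assumed true model
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                 # adverse outcome indicator

X = np.column_stack([np.log(volume), age])
model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients (log-volume, age):", np.round(model.coef_[0], 3))
```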

  17. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 3

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural failure is rarely a "sudden death" type of event; such sudden failures may occur only under abnormal loadings like bomb or gas explosions and very strong earthquakes. In most cases, structures fail due to damage accumulated under normal loadings such as wind loads and dead and live loads. The consequence of cumulative damage will affect the reliability of surviving components and finally cause collapse of the system. The cumulative damage effects on system reliability under time-invariant loadings are of practical interest in structural design and therefore will be investigated in this study. The scope of this study is, however, restricted to the consideration of damage accumulation as the increase in the number of failed components due to the violation of their strength limits.

  18. [Optimization of ultrasonic-assisted extraction of total flavonoids from leaves of the Artocarpus heterophyllus by response surface methodology].

    PubMed

    Wang, Hong-wu; Liu, Yan-qing; Wang, Yuan-hong

    2011-07-01

    To investigate the ultrasonic-assisted extraction of total flavonoids from leaves of Artocarpus heterophyllus. The effects of ethanol concentration, extraction time, and liquid-solid ratio on flavonoid yield were investigated. A 17-run response surface design involving three factors at three levels was generated by the Design-Expert software, and the experimental data obtained were subjected to quadratic regression analysis to create a mathematical model describing flavonoid extraction. The optimum ultrasonic-assisted extraction conditions were an ethanol volume fraction of 69.4% and a liquid-solid ratio of 22.6:1 for 32 min. Under these optimized conditions, the yield of flavonoids was 7.55 mg/g. The Box-Behnken design and response surface analysis can effectively optimize the ultrasonic-assisted extraction of total flavonoids from Artocarpus heterophyllus.
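
    As a rough illustration of the response-surface step described above, the sketch below fits a quadratic model to a three-factor design and locates its optimum; the design matrix and yields are placeholders, not the paper's 17-run Box-Behnken data.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from scipy.optimize import minimize

        # Placeholder design: ethanol (%), time (min), liquid-solid ratio (mL/g)
        X = np.array([[50, 20, 15], [70, 20, 15], [90, 20, 15],
                      [50, 30, 25], [70, 30, 25], [90, 30, 25],
                      [50, 40, 35], [70, 40, 35], [90, 40, 35]], float)
        y = np.array([5.2, 6.8, 5.9, 5.9, 7.4, 6.3, 5.5, 6.9, 5.8])  # mg/g, made up

        quad = PolynomialFeatures(degree=2)
        model = LinearRegression().fit(quad.fit_transform(X), y)

        # Maximize the fitted surface inside the design ranges.
        neg = lambda x: -model.predict(quad.transform(x.reshape(1, -1)))[0]
        opt = minimize(neg, x0=[70, 30, 25], bounds=[(50, 90), (20, 40), (15, 35)])
        print("predicted optimum factors:", opt.x.round(1), "yield:", round(-opt.fun, 2))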

  19. Recovery of Navy distillate fuel from reclaimed product. Volume II. Literature review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brinkman, D.W.; Whisman, M.L.

    In an effort to assist the Navy to better utilize its waste hydrocarbons, NIPER, with support from the US Department of Energy, is conducting research designed to ultimately develop a practical technique for converting Reclaimed Product (RP) into specification Naval Distillate Fuel (F-76). This first phase of the project was focused on reviewing the literature and available information from equipment manufacturers. The literature survey has been carefully culled for methodology applicable to the conversion of RP into diesel fuel suitable for Navy use. Based upon the results of this study, a second phase has been developed and outlined in which experiments will be performed to determine the most practical recycling technologies. It is realized that the final selection of one particular technology may be site-specific due to vast differences in RP volume and available facilities. A final phase, if funded, would involve full-scale testing of one of the recommended techniques at a refueling depot. The Phase I investigations are published in two volumes. Volume 1, Technical Discussion, includes the narrative and Appendices I and II. Appendix III, a detailed Literature Review, includes both a narrative portion and an annotated bibliography containing about 800 references and abstracts. This appendix, because of its volume, has been published separately as Volume 2.

  20. Control volume based hydrocephalus research; a phantom study

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Voorhees, Abram; Madsen, Joseph; Wei, Timothy

    2009-11-01

    Hydrocephalus is a complex spectrum of neurophysiological disorders involving perturbation of the intracranial contents; primarily increased intraventricular cerebrospinal fluid (CSF) volume and intracranial pressure are observed. CSF dynamics are highly coupled to the cerebral blood flows and pressures as well as the mechanical properties of the brain. Hydrocephalus, as such, is a very complex biological problem. We propose integral control volume analysis as a method of tracking these important interactions using mass and momentum conservation principles. As a first step in applying this methodology in humans, an in vitro phantom is used as a simplified model of the intracranial space. The phantom's design consists of a rigid container filled with a compressible gel. Within the gel a hollow spherical cavity represents the ventricular system and a cylindrical passage represents the spinal canal. A computer controlled piston pump supplies sinusoidal volume fluctuations into and out of the flow phantom. MRI is used to measure fluid velocity and volume change as functions of time. Independent pressure measurements and momentum flow rate measurements are used to calibrate the MRI data. These data are used as a framework for future work with live patients and normal individuals. Flow and pressure measurements on the flow phantom will be presented through the control volume framework.
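
    A minimal sketch of the control-volume bookkeeping this abstract describes is given below; the inflow and outflow waveforms are made-up stand-ins for the MRI-derived velocity data, and the only physics used is conservation of mass for the cavity.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        t = np.linspace(0.0, 2.0, 201)                 # s, two pump cycles
        q_in = 0.8 * np.sin(2 * np.pi * t)             # mL/s into the cavity (assumed)
        q_out = 0.8 * np.sin(2 * np.pi * t - 0.3)      # mL/s out through the canal (assumed)

        # Mass conservation for the cavity control volume: dV/dt = q_in - q_out
        dV = cumulative_trapezoid(q_in - q_out, t, initial=0.0)
        print(f"peak volume excursion: {dV.max():.3f} mL")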

  1. Prospective power calculations for the Four Lab study of a multigenerational reproductive/developmental toxicity rodent bioassay using a complex mixture of disinfection by-products in the low-response region.

    PubMed

    Dingus, Cheryl A; Teuschler, Linda K; Rice, Glenn E; Simmons, Jane Ellen; Narotsky, Michael G

    2011-10-01

    In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA's Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss.

  2. Prospective Power Calculations for the Four Lab Study of A Multigenerational Reproductive/Developmental Toxicity Rodent Bioassay Using A Complex Mixture of Disinfection By-Products in the Low-Response Region

    PubMed Central

    Dingus, Cheryl A.; Teuschler, Linda K.; Rice, Glenn E.; Simmons, Jane Ellen; Narotsky, Michael G.

    2011-01-01

    In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA’s Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss. PMID:22073030
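
    The clustered power calculation summarized in the two records above can be approximated with a design-effect correction; the sketch below is a generic illustration with made-up effect size, intra-litter correlation, and litter size, not the Four Lab study's actual inputs.

        from math import sqrt
        from scipy.stats import norm

        def power_two_groups(n1_pups, n2_pups, delta, sd, icc, litter, alpha=0.05):
            # Approximate power for a mean difference when pups are clustered within
            # litters; the design effect inflates the variance of the group means.
            deff = 1.0 + (litter - 1.0) * icc
            se = sd * sqrt(deff) * sqrt(1.0 / n1_pups + 1.0 / n2_pups)
            return norm.cdf(abs(delta) / se - norm.ppf(1.0 - alpha / 2.0))

        # e.g. a 40:60 control:treated split of 200 dams with ~10 pups per litter
        print(power_two_groups(n1_pups=800, n2_pups=1200, delta=0.35,
                               sd=0.9, icc=0.3, litter=10))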

  3. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.

  4. Decision support for redesigning wastewater treatment technologies.

    PubMed

    McConville, Jennifer R; Künzle, Rahel; Messmer, Ulrike; Udert, Kai M; Larsen, Tove A

    2014-10-21

    This paper offers a methodology for structuring the design space for innovative process engineering technology development. The methodology is exemplified in the evaluation of a wide variety of treatment technologies for source-separated domestic wastewater within the scope of the Reinvent the Toilet Challenge. It narrows the decision-making field based on a strict interpretation of treatment objectives for undiluted urine and dry feces and on macroenvironmental factors (STEEPLED analysis) that influence decision criteria. Such an evaluation identifies promising paths for technology development, such as focusing on space-saving processes or the need for more innovation in low-cost, energy-efficient urine treatment methods. Critical macroenvironmental factors, such as housing density, transportation infrastructure, and climate conditions, were found to affect technology decisions regarding reactor volume, weight of outputs, energy consumption, atmospheric emissions, investment cost, and net revenue. The analysis also identified a number of qualitative factors that should be carefully weighed when pursuing technology development, such as the availability of O&M resources, health and safety goals, and other ethical issues. Use of this methodology allows for coevolution of innovative technology within context constraints; however, for full-scale technology choices in the field, only very mature technologies can be evaluated.

  5. Technology CAD for integrated circuit fabrication technology development and technology transfer

    NASA Astrophysics Data System (ADS)

    Saha, Samar

    2003-07-01

    In this paper, systematic simulation-based methodologies for integrated circuit (IC) manufacturing technology development and technology transfer are presented. In technology development, technology computer-aided design (TCAD) tools are used to optimize the device and process parameters to develop a new generation of IC manufacturing technology by reverse engineering from the target product specifications. In technology transfer to a manufacturing co-location, TCAD is used for process centering with respect to the high-volume manufacturing equipment of the target manufacturing facility. A quantitative model is developed to demonstrate the potential benefits of the simulation-based methodology in reducing the cycle time and cost of typical technology development and technology transfer projects over the traditional practices. The strategy for predictive simulation to improve the effectiveness of a TCAD-based project is also discussed.

  6. Space Tug Docking Study. Volume 5: Cost Analysis

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The cost methodology, summary cost data, resulting cost estimates by Work Breakdown Structure (WBS), technical characteristics data, program funding schedules, and the WBS for the costing are discussed. Cost estimates for two tasks of the study are reported. The first task developed cost estimates for design, development, test and evaluation (DDT&E) and theoretical first unit (TFU) at the component level (Level 7) for all items reported in the data base. Task B developed total subsystem DDT&E costs and funding schedules for the three candidate Rendezvous and Docking Systems: manual, autonomous, and hybrid.

  7. Design of transcranial magnetic stimulation coils with optimal trade-off between depth, focality, and energy.

    PubMed

    Gomez, Luis J; Goetz, Stefan M; Peterchev, Angel V

    2018-08-01

    Transcranial magnetic stimulation (TMS) is a noninvasive brain stimulation technique used for research and clinical applications. Existing TMS coils are limited in their precision of spatial targeting (focality), especially for deeper targets. This paper presents a methodology for designing TMS coils to achieve an optimal trade-off between the depth and focality of the induced electric field (E-field), as well as the energy required by the coil. A multi-objective optimization technique is used for computationally designing TMS coils that achieve optimal trade-offs between E-field focality, depth, and energy (fdTMS coils). The fdTMS coil winding(s) maximize focality (minimize the volume of the brain region with E-field above a given threshold) while reaching a target at a specified depth and not exceeding predefined peak E-field strength and required coil energy. Spherical and MRI-derived head models are used to compute the fundamental depth-focality trade-off as well as focality-energy trade-offs for specific target depths. Across stimulation target depths of 1.0-3.4 cm from the brain surface, the suprathreshold volume can be theoretically decreased by 42%-55% compared to existing TMS coil designs. The suprathreshold volume of a figure-8 coil can be decreased by 36%, 44%, or 46%, for matched, doubled, or quadrupled energy. For matched focality and energy, the depth of a figure-8 coil can be increased by 22%. Computational design of TMS coils could enable more selective targeting of the induced E-field. The presented results appear to be the first significant advancement in the depth-focality trade-off of TMS coils since the introduction of the figure-8 coil three decades ago, and likely represent the fundamental physical limit.

  8. Enhancement of docosahexaenoic acid production by Schizochytrium SW1 using response surface methodology

    NASA Astrophysics Data System (ADS)

    Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul

    2015-09-01

    In this study, three factors (fructose concentration, agitation speed and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted using 500 mL flasks with a 100 mL working volume at 30°C for 96 hours. The ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, namely agitation speed and MSG concentration, significantly affect DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analysis. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed and 12 g/L MSG. Consequently, the quadratic model was validated by applying the estimated optimum conditions, which confirmed the model validity; a DHA level of 52.86% was produced.

  9. Optimization of microwave assisted extraction of essential oils from Iranian Rosmarinus officinalis L. using RSM.

    PubMed

    Akhbari, Maryam; Masoum, Saeed; Aghababaei, Fahimeh; Hamedi, Sepideh

    2018-06-01

    In this study, the efficiencies of conventional hydro-distillation and novel microwave hydro-distillation methods in extracting essential oil from Rosmarinus officinalis leaves have been compared. In order to attain the best yield and also the highest quality of the essential oil in the microwave-assisted method, the optimal values of operating parameters such as extraction time, microwave irradiation power and water volume to plant mass ratio were investigated using a central composite design under response surface methodology. Optimal conditions for obtaining the maximum extraction yield in the microwave-assisted method were predicted as follows: extraction time of 85 min, microwave power of 888 W, and water volume to plant mass ratio of 0.5 ml/g. The extraction yield at these predicted conditions was computed as 0.7756%. The qualities of the essential oils obtained under the designed experiments were optimized based on the total contents of four major compounds (α-pinene, 1,8-cineole, camphor and verbenone), as determined by gas chromatography coupled with mass spectrometry (GC-MS). The highest essential oil quality (55.87%) was obtained at an extraction time of 68 min, microwave irradiation power of 700 W, and water volume to plant mass ratio of zero.

  10. A Dual Wedge Microneedle for sampling of perilymph solution via round window membrane

    PubMed Central

    Watanabe, Hirobumi; Cardoso, Luis; Lalwani, Anil K.; Kysar, Jeffrey W.

    2017-01-01

    Objective Precision medicine for inner-ear disease is hampered by the absence of a methodology to sample inner-ear fluid atraumatically. The round window membrane (RWM) is an attractive portal for accessing cochlear fluids as it heals spontaneously. In this study, we report on the development of a microneedle for perilymph sampling that minimizes the size of the RWM perforation, facilitates quick aspiration, and provides precise volume control. Methods Considering the mechanical anisotropy of the RWM and the hydrodynamics through a microneedle, a 31G stainless steel pipe was machined into a wedge-shaped design via electrical discharge machining. Guinea pig RWMs were penetrated in vitro, and 1 μl of perilymph was sampled and analyzed via UV-vis spectroscopy. Results The prototype wedge-shaped needle created an oval perforation with minor and major diameters of 143 and 344 μm (n=6). The sampling duration and standard deviation of aspirated volume were seconds and 6.8%, respectively. The protein concentration was 1.74 mg/mL. Conclusion The prototype needle facilitated precise perforation of RWMs and rapid aspiration of cochlear fluid with precise volume control. The needle design is promising and requires testing in human cadaveric temporal bone and further optimization to become clinically viable. PMID:26888440

  11. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 2. Final report and case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The final report for the project comprises five volumes. This volume presents the study conclusions, summarizes the methodology used (more detail is found in Volume 3), discusses four case study applications of the model, and contains profiles of coastal communities in an Appendix.

  12. Novel model of stator design to reduce the mass of superconducting generators

    NASA Astrophysics Data System (ADS)

    Kails, Kevin; Li, Quan; Mueller, Markus

    2018-05-01

    High temperature superconductors (HTS), with much higher current density than conventional copper wires, make it feasible to develop very powerful and compact power generators. Thus, they are considered one promising solution for large (10+ MW) direct-drive offshore wind turbines due to their low tower head mass. However, most HTS generator designs are based on a radial topology, which requires an excessive amount of HTS material and suffers from cooling and reliability issues. Axial flux machines, on the other hand, offer higher torque/volume ratios than radial machines, which makes them an attractive option where space and transportation become an issue. However, their disadvantage is heavy structural mass. In this paper a novel stator design is introduced for HTS axial flux machines which enables a reduction in their structural mass. The stator is for the first time designed with a 45° angle that deflects the air gap closing forces into the vertical direction, reducing the axial forces. The reduced axial forces improve the structural stability and consequently simplify the structural design. The novel methodology was then validated through an existing design of an HTS axial flux machine, achieving a ∼10% mass reduction from 126 tonnes down to 115 tonnes. In addition, the air gap flux density increases due to the new claw pole shapes, improving the power density from 53.19 to 61.90 W/kg. It is expected that HTS axial flux machines designed with the new methodology offer a competitive advantage over other proposed superconducting generator designs in terms of cost, reliability and power density.

  13. Concept design theory and model for multi-use space facilities: Analysis of key system design parameters through variance of mission requirements

    NASA Astrophysics Data System (ADS)

    Reynerson, Charles Martin

    This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.

  14. Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoessel, Chris

    2013-11-13

    This project developed a new high-performance R-10/high SHGC window design, reviewed market positioning and evaluated manufacturing solutions required for broad market adoption. The project objectives were accomplished by: identifying viable technical solutions based on modeling of modern and potential coating stacks and IGU designs; development of new coating material sets for HM thin film stacks, as well as improved HM IGU designs to accept multiple layers of HM films; matching promising new coating designs with new HM IGU designs to demonstrate performance gains; and, in cooperation with a window manufacturer, assessing the potential for high-volume manufacturing and cost efficiency of a HM-based R-10 window with improved solar heat gain characteristics. A broad view of available materials and design options was applied to achieve the desired improvements. Gated engineering methodologies were employed to guide the development process from concept generation to a window demonstration. The project determined that a slightly de-rated window performance allows formulation of a path to achieve the desired cost reductions to support end consumer adoption.

  15. Thermal power systems small power systems applications project. Volume 2: Detailed report

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    Small power system technology as applied to power plants up to 10 MW in size was considered. Markets for small power systems were characterized and cost goals were established for the project. Candidate power plant system design concepts were selected for evaluation and preliminary performance and cost assessments were made. Breakeven capital costs were determined for leading contenders among the candidate systems. The potential use of small power systems in providing part of the demand for pumping power by the extensive aqueduct system of California was studied. Criteria and methodologies were developed for the ranking of candidate power plant system design concepts. Experimental power plant concepts of 1 MW rating were studied to define a power plant configuration for subsequent detail design, construction, testing and evaluation. Site selection criteria and ground rules were developed.

  16. Preparation of modified semi-coke by microwave heating and adsorption kinetics of methylene blue.

    PubMed

    Wang, Xin; Peng, Jin-Hui; Duan, Xin-Hui; Srinivasakannan, Chandrasekar

    2013-01-01

    Preparation of modified semi-coke from virgin semi-coke has been achieved by microwave heating, using phosphoric acid as the modifying agent. Process optimization using a Central Composite Design (CCD) of the Response Surface Methodology (RSM) technique for the preparation of modified semi-coke is presented in this paper. The optimum conditions for producing modified semi-coke were: concentration of phosphoric acid 2.04, heating time 20 minutes and temperature 587 degrees C, with an optimum iodine number of 862 mg/g and a yield of 47.48%. The textural characteristics of the modified semi-coke were analyzed using scanning electron microscopy (SEM) and the nitrogen adsorption isotherm. The BET surface area of the modified semi-coke was estimated to be 989.60 m2/g, with a pore volume of 0.74 cm3/g and a pore diameter of 3.009 nm, with the micro-pore volume contributing 62.44%. The Methylene Blue monolayer adsorption capacity was found to be mg/g at K. The adsorption capacity of the modified semi-coke highlights its suitability for liquid phase adsorption applications, with potential usage in waste water treatment.

  17. Evaluating and optimizing horticultural regimes in space plant growth facilities

    NASA Technical Reports Server (NTRS)

    Berkovich, Y. A.; Chetirkin, P. V.; Wheeler, R. M.; Sager, J. C.

    2004-01-01

    In designing innovative space plant growth facilities (SPGF) for long-duration space flight, various limitations must be addressed, including onboard resources: volume, energy consumption, heat transfer and crew labor expenditure. Evaluating onboard resources with the equivalent mass methodology and applying it to the design of such facilities is of limited accuracy. This is due to uncertainty in the structure and an incomplete understanding of the properties of all associated hardware, including the technology in these systems. We present a simple optimization criterion for horticultural regimes in SPGF: Qmax = max[M x (EBI)² / (V x E x T)], where M is the crop harvest in terms of total dry biomass in the plant growth system; EBI is the edible biomass index (harvest index); V is the volume occupied by the crop; E is the crop light energy supply during growth; and T is the crop growth duration. The criterion directly reflects the consumption of onboard resources for crop production. (c) 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
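
    A toy evaluation of this criterion is sketched below; the regime numbers are invented for illustration, and the units (kg dry mass, m3, kW·h, days) are assumptions rather than values from the paper.

        def q_criterion(M, EBI, V, E, T):
            # Q = M * EBI**2 / (V * E * T); larger means better use of onboard resources.
            return M * EBI**2 / (V * E * T)

        regimes = {
            "dense planting, high light": dict(M=4.2, EBI=0.45, V=1.8, E=950.0, T=70.0),
            "sparse planting, low light": dict(M=3.1, EBI=0.55, V=1.8, E=600.0, T=85.0),
        }
        best = max(regimes, key=lambda name: q_criterion(**regimes[name]))
        print(best, round(q_criterion(**regimes[best]), 6))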

  18. Micro-machined resonator oscillator

    DOEpatents

    Koehler, D.R.; Sniegowski, J.J.; Bivens, H.M.; Wessendorf, K.O.

    1994-08-16

    A micro-miniature resonator-oscillator is disclosed. Due to the miniaturization of the resonator-oscillator, oscillation frequencies of one MHz and higher are utilized. A thickness-mode quartz resonator is housed in a micro-machined silicon package and operated as a telemetered "sensor beacon," that is, a digital, self-powered, remote, parameter measuring-transmitter in the FM band. The resonator design uses trapped energy principles and temperature dependence methodology through crystal orientation control, with operation in the 20-100 MHz range. High-volume batch-processing manufacturing is utilized, with package and resonator assembly at the wafer level. Unique design features include squeeze-film damping for robust vibration and shock performance, capacitive coupling through micro-machined diaphragms allowing resonator excitation at the package exterior, circuit integration, and extremely small (0.1 in. square) dimensioning. A family of micro-miniature sensor beacons is also disclosed with widespread applications as bio-medical sensors, vehicle status monitors, and high-volume animal identification and health sensors. The sensor family allows measurement of temperatures, chemicals, acceleration and pressure. A microphone and clock realization is also available. 21 figs.

  19. Neutralization of red mud with pickling waste liquor using Taguchi's design of experimental methodology.

    PubMed

    Rai, Suchita; Wasewar, Kailas L; Lataye, Dilip H; Mishra, Rajshekhar S; Puttewar, Suresh P; Chaddha, Mukesh J; Mahindiran, P; Mukhopadhyay, Jyoti

    2012-09-01

    'Red mud' or 'bauxite residue', a waste generated from alumina refineries, is highly alkaline in nature with a pH of 10.5-12.5. Red mud poses serious environmental problems such as alkali seepage into ground water and alkaline dust generation. One option for making red mud less hazardous and environmentally benign is its neutralization with acid or an acidic waste. Hence, in the present study, neutralization of alkaline red mud was carried out using a highly acidic waste (pickling waste liquor). Pickling waste liquor is a mixture of strong acids used for descaling or cleaning surfaces in the steel-making industry. The aim of the study was to examine the feasibility of neutralizing the two wastes together using Taguchi's design of experiments methodology. This would make both wastes less hazardous and safe for disposal. The effects of slurry solids, volume of pickling liquor, stirring time and temperature on the neutralization process were investigated. The analysis of variance (ANOVA) shows that the volume of the pickling liquor is the most significant parameter, followed by the quantity of red mud, contributing 69.18% and 18.48%, respectively. Under the optimized parameters, a pH value of 7 can be achieved by mixing the two wastes. About 25-30% of the total soda from the red mud is neutralized and the alkalinity is reduced by 80-85%. The mineralogy and morphology of the neutralized red mud have also been studied. The data presented will be useful in view of the environmental concerns of red mud disposal.
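
    The percent-contribution figures quoted above come from a standard Taguchi-style ANOVA decomposition; the sketch below reproduces that bookkeeping on a made-up two-factor, three-level response table rather than the paper's data.

        import numpy as np

        # Made-up L9-style table: factor levels per run and the resulting slurry pH
        levels = np.array([[1, 1], [1, 2], [1, 3], [2, 1], [2, 2], [2, 3],
                           [3, 1], [3, 2], [3, 3]])
        ph = np.array([9.1, 7.8, 7.0, 8.6, 7.4, 6.9, 8.2, 7.2, 6.8])

        grand = ph.mean()
        ss_total = ((ph - grand) ** 2).sum()
        for j, name in enumerate(["red mud quantity", "pickling liquor volume"]):
            # Sum of squares for the factor: n_level * (level mean - grand mean)^2
            ss = sum((levels[:, j] == lv).sum() * (ph[levels[:, j] == lv].mean() - grand) ** 2
                     for lv in (1, 2, 3))
            print(f"{name}: {100 * ss / ss_total:.1f}% contribution")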

  20. Compulsory Education: Statistics, Methodology, Reforms and New Tendencies. Conference Papers for the 8th Session of the International Standing Conference for the History of Education (Parma, Italy, September 3-6, 1986). Volume IV.

    ERIC Educational Resources Information Center

    Genovesi, Giovanni, Ed.

    This collection, the last of four volumes on the history of compulsory education among the nations of Europe and the western hemisphere, analyzes statistics, methodology, reforms, and new tendencies. Twelve of the document's 18 articles are written in English, 3 are written in French and 3 are in Italian. Summaries accompany most articles; three…

  1. Systematic reviews and meta-analyses on treatment of asthma: critical evaluation

    PubMed Central

    Jadad, Alejandro R; Moher, Michael; Browman, George P; Booker, Lynda; Sigouin, Christopher; Fuentes, Mario; Stevens, Robert

    2000-01-01

    Objective To evaluate the clinical, methodological, and reporting aspects of systematic reviews and meta-analyses on the treatment of asthma and to compare those published by the Cochrane Collaboration with those published in paper based journals. Design Analysis of studies identified from Medline, CINAHL, HealthSTAR, EMBASE, Cochrane Library, personal collections, and reference lists. Studies Articles describing a systematic review or a meta-analysis of the treatment of asthma that were published as a full report, in any language or format, in a peer reviewed journal or the Cochrane Library. Main outcome measures General characteristics of studies reviewed and methodological characteristics (sources of articles; language restrictions; format, design, and publication status of studies included; type of data synthesis; and methodological quality). Results 50 systematic reviews and meta-analyses were included. More than half were published in the past two years. Twelve reviews were published in the Cochrane Library and 38 were published in 22 peer reviewed journals. Forced expiratory volume in one second was the most frequently used outcome, but few reviews evaluated the effect of treatment on costs or patient preferences. Forty reviews were judged to have serious or extensive flaws. All six reviews associated with industry were in this group. Seven of the 10 most rigorous reviews were published in the Cochrane Library. Conclusions Most reviews published in peer reviewed journals or funded by industry have serious methodological flaws that limit their value to guide decisions. Cochrane reviews are more rigorous and better reported than those published in peer reviewed journals. PMID:10688558

  2. Assessing the quality of the volume-outcome relationship in uro-oncology.

    PubMed

    Mayer, Erik K; Purkayastha, Sanjay; Athanasiou, Thanos; Darzi, Ara; Vale, Justin A

    2009-02-01

    To systematically assess the quality of evidence for the volume-outcome relationship in uro-oncology, and thus facilitate the formulation of health policy within this speciality, as 'Implementation of Improving Outcome Guidance' has led to centralization of uro-oncology based on published studies supporting a 'higher volume-better outcome' relationship, while improved awareness of methodological drawbacks in health services research has questioned the strength of this proposed relationship. We systematically searched previous relevant reports and extracted all articles from 1980 onwards assessing the volume-outcome relationship for cystectomy, prostatectomy and nephrectomy at the institution and/or surgeon level. Studies were assessed for their methodological quality using a previously validated rating system. Where possible, meta-analytical methods were used to calculate overall differences in outcome measures between low- and high-volume healthcare providers. In all, 22 studies were included in the final analysis; 19 of these were published in the last 5 years. Only four studies appropriately explored the effect of both the institution and surgeon volume on outcome measures. Mortality and length of stay were the most frequently measured outcomes. The median total quality scores within each of the operation types were 8.5, 9 and 8 for cystectomy, prostatectomy and nephrectomy, respectively (possible maximum score 18). Random-effects modelling showed a higher risk of mortality in low-volume institutions than in higher-volume institutions for both cystectomy and nephrectomy (odds ratio 1.88, 95% confidence interval 1.54-2.29, and 1.28, 1.10-1.49, respectively). The methodological quality of volume-outcome research as applied to cystectomy, prostatectomy and nephrectomy is only modest at best. Accepting several limitations, pooled analysis confirms a higher-volume, lower-mortality relationship for cystectomy and nephrectomy. Future research should focus on the development of a quality framework with a validated scoring system for the bench-marking of data to improve validity and facilitate rational policy-making within the speciality of uro-oncology.
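
    As an illustration of the random-effects pooling used for such volume-outcome comparisons, the sketch below applies DerSimonian-Laird weighting to made-up study-level odds ratios; the numbers are not those of the review.

        import numpy as np
        from scipy.stats import norm

        or_i = np.array([2.1, 1.6, 1.9, 1.4])        # low- vs high-volume mortality ORs (placeholder)
        se_log = np.array([0.25, 0.30, 0.20, 0.35])  # SE of log(OR) per study (placeholder)

        y, w = np.log(or_i), 1.0 / se_log**2
        fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - fixed) ** 2)
        tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_star = 1.0 / (se_log**2 + tau2)            # random-effects weights
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se_pooled = 1.0 / np.sqrt(np.sum(w_star))
        lo = pooled - norm.ppf(0.975) * se_pooled
        hi = pooled + norm.ppf(0.975) * se_pooled
        print(f"pooled OR {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")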

  3. Mexico City Air Quality Research Initiative; Volume 5, Strategic evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-03-01

    Members of the Task HI (Strategic Evaluation) team were responsible for the development of a methodology to evaluate policies designed to alleviate air pollution in Mexico City. This methodology utilizes information from various reports that examined ways to reduce pollutant emissions, results from models that calculate the improvement in air quality due to a reduction in pollutant emissions, and the opinions of experts as to the requirements and trade-offs that are involved in developing a program to address the air pollution problem in Mexico City. The methodology combines these data to produce comparisons between different approaches to improving Mexico City's air quality. These comparisons take into account not only objective factors such as the air quality improvement or cost of the different approaches, but also subjective factors such as public acceptance or political attractiveness of the different approaches. The end result of the process is a ranking of the different approaches and, more importantly, the process provides insights into the implications of implementing a particular approach or policy.

  4. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  5. Optimization of green infrastructure network at semi-urbanized watersheds to manage stormwater volume, peak flow and life cycle cost: Case study of Dead Run watershed in Maryland

    NASA Astrophysics Data System (ADS)

    Heidari Haratmeh, B.; Rai, A.; Minsker, B. S.

    2016-12-01

    Green Infrastructure (GI) has become widely known as a sustainable solution for stormwater management in urban environments. Despite growing recognition and acknowledgment, researchers and practitioners lack clear and explicit guidelines on how GI practices should be implemented in urban settings. This study is developing a noise-tolerant, multi-objective, multi-scale genetic algorithm that determines optimal GI networks for environmental, economic and social objectives. The methodology accounts for uncertainty in modeling results and is designed to perform at the sub-watershed as well as the patch scale using two different simulation models, SWMM and RHESSys, in a cloud-based implementation with a Web interface. As an initial case study, a semi-urbanized watershed, Dead Run 5, in Baltimore County, Maryland, is selected. The objectives of the study are to minimize life cycle cost, to maximize human preference and well-being, and to minimize the difference between pre-development hydrographs (generated from current rainfall events and design storms) and those that result from proposed GI scenarios. Initial results for the Dead Run 5 watershed suggest that placing GI in the proximity of the watershed outlet optimizes life cycle cost, stormwater volume, and peak flow capture. The framework can easily present outcomes of GI design scenarios to both designers and local stakeholders; future plans include receiving feedback from users on candidate designs and interactively updating optimal GI network designs in a crowd-sourced design process. This approach can also help derive design guidelines that better meet stakeholder needs.

  6. Reevaluation of tephra volumes for the 1982 eruption of El Chichón volcano, Mexico

    NASA Astrophysics Data System (ADS)

    Nathenson, M.; Fierstein, J.

    2012-12-01

    In a recent numerical simulation of tephra transport and deposition for the 1982 eruption, Bonasia et al. (2012) used masses for the tephra layers (A-1, B, and C) based on the volume data of Carey and Sigurdsson (1986) calculated by the methodology of Rose et al. (1973). For reasons that are not clear, using the same methodology we obtained volumes for layers A-1 and B much smaller than those previously reported. For example, for layer A-1, Carey and Sigurdsson (1986) reported a volume of 0.60 km3, whereas we obtain a volume of 0.23 km3. Moreover, applying the more recent methodology of tephra-volume calculation (Pyle, 1989; Fierstein and Nathenson, 1992) and using the isopach maps in Carey and Sigurdsson (1986), we calculate a total tephra volume of 0.52 km3 (A-1, 0.135; B, 0.125; and C, 0.26 km3). In contrast, Carey and Sigurdsson (1986) report a much larger total volume of 2.19 km3. Such disagreement not only reflects the differing methodologies; we propose that the volumes calculated with the methodology of Pyle and of Fierstein and Nathenson, which uses straight lines on a plot of log thickness versus square root of area, better represent the actual fall deposits. After measuring the areas for the isomass contours for the HAZMAP and FALL3D simulations in Bonasia et al. (2012), we applied the Pyle-Fierstein and Nathenson methodology to calculate the tephra masses deposited on the ground. These masses from five of the simulations range from 70% to 110% of those reported by Carey and Sigurdsson (1986), whereas that for layer B in the HAZMAP calculation is 160%. In the Bonasia et al. (2012) study, the mass erupted by the volcano is a critical input used in the simulation to produce an ash cloud that deposits tephra on the ground. Masses on the ground (as calculated by us) for five of the simulations range from 20% to 46% of the masses used as simulation inputs, whereas that for layer B in the HAZMAP calculation is 74%. It is not clear why the percentages are so variable, nor why the output volumes are such small percentages of the input erupted mass. From our volume calculations, the masses on the ground from the simulations are factors of 2.3 to 10 times what was actually deposited. Given this finding from our reevaluation of volumes, the simulations appear to overestimate the hazards from eruptions of the sizes that occurred at El Chichón. References: Bonasia, R., A. Costa, A. Folch, G. Macedonio, and L. Capra (2012), Numerical simulation of tephra transport and deposition of the 1982 El Chichón eruption and implications for hazard assessment, J. Volcanol. Geotherm. Res., 231-232, 39-49. Carey, S., and H. Sigurdsson (1986), The 1982 eruptions of El Chichón volcano, Mexico: Observations and numerical modelling of tephra-fall distribution, Bull. Volcanol., 48, 127-141. Fierstein, J., and M. Nathenson (1992), Another look at the calculation of fallout tephra volumes, Bull. Volcanol., 54, 156-167. Pyle, D.M. (1989), The thickness, volume and grainsize of tephra fall deposits, Bull. Volcanol., 51, 1-15. Rose, W.I., Jr., S. Bonis, R.E. Stoiber, M. Keller, and T. Bickford (1973), Studies of volcanic ash from two recent Central American eruptions, Bull. Volcanol., 37, 338-364.
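
    A minimal sketch of the exponential-thinning volume calculation of Pyle (1989) and Fierstein and Nathenson (1992) cited above is given below: fit ln(thickness) against the square root of isopach area and integrate. The isopach values are placeholders, not the El Chichón data.

        import numpy as np

        thickness_m = np.array([1.0, 0.5, 0.2, 0.1, 0.05])      # isopach thicknesses (placeholder)
        area_km2 = np.array([12.0, 45.0, 160.0, 430.0, 980.0])  # enclosed areas (placeholder)

        # Straight line on a log-thickness vs sqrt(area) plot: T(A) = T0 * exp(-k * sqrt(A))
        slope, intercept = np.polyfit(np.sqrt(area_km2), np.log(thickness_m), 1)
        T0, k = np.exp(intercept), -slope
        volume_km3 = 2.0 * T0 / k**2 * 1e-3    # integral of T dA; m * km^2 -> km^3
        print(f"T0 = {T0:.2f} m, k = {k:.4f} km^-1, volume = {volume_km3:.3f} km^3")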

  7. A model for the influences of soluble and insoluble solids, and treated volume on the ultraviolet-C resistance of heat-stressed Salmonella enterica in simulated fruit juices.

    PubMed

    Estilo, Emil Emmanuel C; Gabriel, Alonzo A

    2018-02-01

    This study was conducted to determine the effects of the intrinsic juice characteristics insoluble solids (IS, 0-3% w/v) and soluble solids (SS, 0-70 °Brix), and the extrinsic process parameter treated volume (250-1000 mL), on the UV-C inactivation rates of heat-stressed Salmonella enterica in simulated fruit juices (SFJs). A Rotatable Central Composite Design of Experiment (CCRD) was used to determine combinations of the test variables, while Response Surface Methodology (RSM) was used to characterize and quantify the influences of the test variables on microbial inactivation. The heat-stressed cells exhibited log-linear UV-C inactivation behavior (R² of 0.952 to 0.999) in all CCRD combinations, with D(UV-C) values ranging from 10.0 to 80.2 mJ/cm². The D(UV-C) values obtained from the CCRD fitted significantly into a quadratic model (P < 0.0001). RSM results showed that the individual linear terms (IS, SS, volume), individual quadratic terms (IS² and volume²), and factor interactions (IS × volume and SS × volume) significantly influence UV-C inactivation. Validation of the model in SFJs with combinations not included in the CCRD showed that the predictions were within acceptable error margins. Copyright © 2017. Published by Elsevier Ltd.
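
    The D(UV-C) values above come from log-linear survivor curves; a minimal sketch of that estimation, using made-up dose-survivor data rather than the study's measurements, is:

        import numpy as np

        dose = np.array([0.0, 5.0, 10.0, 15.0, 20.0])         # UV-C dose, mJ/cm^2 (placeholder)
        counts = np.array([1e7, 3.2e6, 9.8e5, 3.1e5, 1.0e5])  # survivors, CFU/mL (placeholder)

        # log-linear model: log10(N/N0) = -dose / D
        slope, intercept = np.polyfit(dose, np.log10(counts / counts[0]), 1)
        D_uvc = -1.0 / slope    # dose required for one log10 reduction
        print(f"D(UV-C) = {D_uvc:.1f} mJ/cm^2")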

  8. Recovery of Navy distillate fuel from reclaimed product. Volume I. Technical discussion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brinkman, D.W.; Whisman, M.L.

    1984-11-01

    In an effort to assist the Navy to better utilize its waste hydrocarbons, NIPER, with support from the US Department of Energy, is conducting research designed to ultimately develop a practical technique for converting Reclaimed Product (RP) into specification Naval Distillate Fuel (F-76). The first phase of the project was focused on reviewing the literature and available information from equipment manufacturers. The literature survey has been carefully culled for methodology applicable to the conversion of RP into diesel fuel suitable for Navy use. Based upon the results of this study, a second phase has been developed and outlined in which experiments will be performed to determine the most practical recycling technologies. It is realized that the final selection of one particular technology may be site-specific due to vast differences in RP volume and available facilities. A final phase, if funded, would involve full-scale testing of one of the recommended techniques at a refueling depot. The Phase I investigations are published in two volumes. Volume 1, Technical Discussion, includes the narrative and Appendices I and II. Appendix III, a detailed Literature Review, includes both a narrative portion and an annotated bibliography containing about 800 references and abstracts. This appendix, because of its volume, has been published separately as Volume 2. 18 figures, 4 tables.

  9. LLCEDATA and LLCECALC for Windows version 1.0, Volume 3: Software verification and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

    1998-09-04

    LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of data bases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank from which it originates. LLCECALC reads the EDF and the gamma assay file (AV2) that is produced by the flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, which discusses system limitations and provides recommendations to the LLCE process. Volume 3 documents LLCEDATA and LLCECALC's verification and validation. Two of the three installation test cases from Volume 1 are independently confirmed. Data bases used in LLCEDATA are verified and referenced. Both phases of LLCECALC processing, gamma and characterization, are extensively tested to verify that the methodology and algorithms used are correct.

  10. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
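
    As a toy illustration of the availability parameter this volume formalizes, the sketch below runs a small Monte Carlo simulation (one of the techniques the abstract lists) with assumed failure-time and restore-time distributions and checks it against the familiar steady-state ratio; none of the numbers come from the report.

        import numpy as np

        rng = np.random.default_rng(0)
        mtbf, repair_mu, repair_sigma = 400.0, 2.0, 0.5   # hours, log-hours (assumed)

        up = rng.exponential(mtbf, size=100_000)          # time between failures
        down = rng.lognormal(repair_mu, repair_sigma, size=100_000)  # restore time
        A_o = up.sum() / (up.sum() + down.sum())
        print(f"simulated operational availability = {A_o:.3f}")

        # Closed-form check: A_o = MTBF / (MTBF + mean downtime)
        mdt = np.exp(repair_mu + repair_sigma**2 / 2)
        print(f"analytic: {mtbf / (mtbf + mdt):.3f}")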

  11. An algorithm for analytical solution of basic problems featuring elastostatic bodies with cavities and surface flaws

    NASA Astrophysics Data System (ADS)

    Penkov, V. B.; Levina, L. V.; Novikova, O. S.; Shulmin, A. S.

    2018-03-01

    Herein we propose a methodology for structuring a full parametric analytical solution to problems featuring elastostatic media based on state-of-the-art computing facilities that support computerized algebra. The methodology includes: direct and reverse application of P-Theorem; methods of accounting for physical properties of media; accounting for variable geometrical parameters of bodies, parameters of boundary states, independent parameters of volume forces, and remote stress factors. An efficient tool to address the task is the sustainable method of boundary states originally designed for the purposes of computerized algebra and based on the isomorphism of Hilbertian spaces of internal states and boundary states of bodies. We performed full parametric solutions of basic problems featuring a ball with a nonconcentric spherical cavity, a ball with a near-surface flaw, and an unlimited medium with two spherical cavities.

  12. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
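
    For orientation, a minimal sketch of the traditional nested ("double-loop") formulation that the unilevel and decoupled methods are compared against is given below: an outer design optimization with an inner reliability evaluation at each iterate. The limit state, distributions, and numbers are invented for illustration, and the inner loop collapses to a closed form here only because the example limit state is linear in normal variables.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        MU_LOAD, SD_LOAD = 1000.0, 150.0      # N, normal load (assumed)
        MU_STR, SD_STR = 400.0, 30.0          # MPa, normal strength (assumed)

        def reliability_index(area_mm2):
            # beta for the limit state g = strength - load / area (both normal).
            mu_g = MU_STR - MU_LOAD / area_mm2
            sd_g = np.sqrt(SD_STR**2 + (SD_LOAD / area_mm2) ** 2)
            return mu_g / sd_g

        beta_target = norm.ppf(1 - 1e-3)      # required reliability index (~3.09)
        res = minimize(lambda d: d[0],        # mass taken as proportional to area
                       x0=[5.0], bounds=[(1.0, 50.0)], method="SLSQP",
                       constraints=[{"type": "ineq",
                                     "fun": lambda d: reliability_index(d[0]) - beta_target}])
        print("optimal area:", res.x[0], "Pf:", norm.cdf(-reliability_index(res.x[0])))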

  13. Principles of Protein Stability and Their Application in Computational Design.

    PubMed

    Goldenzweig, Adi; Fleishman, Sarel

    2018-01-26

    Proteins are increasingly used in basic and applied biomedical research. Many proteins, however, are only marginally stable and can be expressed in limited amounts, thus hampering research and applications. Research has revealed the thermodynamic, cellular, and evolutionary principles and mechanisms that underlie marginal stability. With this growing understanding, computational stability design methods have advanced over the past two decades starting from methods that selectively addressed only some aspects of marginal stability. Current methods are more general and, by combining phylogenetic analysis with atomistic design, have shown drastic improvements in solubility, thermal stability, and aggregation resistance while maintaining the protein's primary molecular activity. Stability design is opening the way to rational engineering of improved enzymes, therapeutics, and vaccines and to the application of protein design methodology to large proteins and molecular activities that have proven challenging in the past. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  14. A lunar base reference mission for the phased implementation of bioregenerative life support system components

    NASA Technical Reports Server (NTRS)

    Dittmer, Laura N.; Drews, Michael E.; Lineaweaver, Sean K.; Shipley, Derek E.; Hoehn, A.

    1991-01-01

    Previous design efforts for a cost effective and reliable regenerative life support system (RLSS) provided the foundation for the characterization of organisms, or 'biological processors', in engineering terms, and a methodology was developed for their integration into an engineered ecological LSS in order to minimize the mass flow imbalances between consumers and producers. These techniques for the design and evaluation of bioregenerative LSS have now been integrated into a lunar base reference mission, emphasizing the phased implementation of components of such a BLSS. In parallel, a designer's handbook was compiled from knowledge and experience gained during past design projects to aid in the design and planning of future space missions requiring advanced RLSS technologies. The lunar base reference mission addresses in particular the phased implementation and integration of BLSS components and includes the resulting infrastructure burdens and needs, such as mass, power, volume, and structural requirements of the LSS. Also, operational aspects such as manpower requirements and the possible need for and application of 'robotics' were addressed.

  15. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is possible with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low-temperature operation, modularity, and rapid refuelability, making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development, with fuel cell gravimetric and volumetric power density nearly doubling every 2-3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion-based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all-electric aircraft subsystems. In addition, fuel cell design and performance data are closely protected, which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion-based propulsion result in more highly constrained design spaces that are problematic for design space exploration. To begin addressing the current gaps in fuel cell aircraft development, a methodology has been developed to explore and characterize the near-term performance of fuel cell powered UAVs. The first step of the methodology is the development of a valid MDA. This is accomplished by using propagated uncertainty estimates to guide the decomposition of an MDA into key contributing analyses (CAs) that can be individually refined and validated to increase the overall accuracy of the MDA. To assist in MDA development, a flexible framework for simultaneously solving the CAs is specified. This enables the MDA to be easily adapted to changes in technology and the changes in data that occur throughout a design process. Various CAs that model a polymer electrolyte membrane fuel cell (PEMFC) UAV are developed, validated, and shown to be in agreement with hardware-in-the-loop simulations of a fully developed fuel cell propulsion system. After creating a valid MDA, the final step of the methodology is the synthesis of the MDA with an uncertainty propagation analysis, an optimization routine, and a chance-constrained problem formulation. This synthesis allows an efficient calculation of the probabilistic constraint boundaries and Pareto frontiers that will govern the design space and influence design decisions relating to optimization and uncertainty mitigation. A key element of the methodology is uncertainty propagation. The methodology uses Systems Sensitivity Analysis (SSA) to estimate the uncertainty of key performance metrics due to uncertainties in design variables and uncertainties in the accuracy of the CAs. A summary of SSA is given, along with key rules for properly decomposing an MDA for use with SSA. Verification of SSA uncertainty estimates via Monte Carlo simulations is provided for both an example problem and a detailed MDA of a fuel cell UAV. Implementation of the methodology was performed on a small fuel cell UAV designed to carry a 2.2 kg payload with 24 hours of endurance. Uncertainty distributions for both design variables and the CAs were estimated based on experimental results and were found to dominate the design space. To reduce uncertainty and test the flexibility of the MDA framework, CAs were replaced with either empirical or semi-empirical relationships during the optimization process. The final design was validated via a hardware-in-the-loop simulation. Finally, the fuel cell UAV probabilistic design space was studied. A graphical representation of the design space was generated and the optima due to deterministic and probabilistic constraints were identified. The methodology was used to identify Pareto frontiers of the design space, which were shown on contour plots of the design space. Unanticipated discontinuities of the Pareto fronts were observed as different constraints became active, providing useful information on which to base design and development decisions.
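
    The SSA-versus-Monte-Carlo verification step lends itself to a compact illustration. The endurance surrogate, its input statistics, and the power-to-weight constant below are invented placeholders, not the dissertation's MDA; the point is only the comparison between a first-order sensitivity-based variance estimate and a sampling check.

```python
"""First-order (sensitivity-based) uncertainty propagation checked against
Monte Carlo sampling, on a made-up endurance surrogate."""
import numpy as np

def endurance_hr(eta, e_spec, m_frac):
    """Hypothetical endurance surrogate: efficiency * specific energy
    [Wh/kg] * fuel mass fraction / assumed cruise power-to-weight [W/kg]."""
    p_w = 12.0  # assumed
    return eta * e_spec * m_frac / p_w

mu = np.array([0.45, 1200.0, 0.30])     # means of the three inputs (assumed)
sig = np.array([0.03, 100.0, 0.02])     # input standard deviations (assumed)

# First-order propagation: var(f) ~= sum_i (df/dx_i)^2 * var(x_i)
eps = 1e-6
grad = np.array([(endurance_hr(*(mu + eps * np.eye(3)[i])) -
                  endurance_hr(*mu)) / eps for i in range(3)])
sigma_first_order = np.sqrt(np.sum((grad * sig) ** 2))

# Monte Carlo check
rng = np.random.default_rng(0)
samples = rng.normal(mu, sig, size=(100_000, 3))
mc = endurance_hr(samples[:, 0], samples[:, 1], samples[:, 2])
print(f"first-order sigma: {sigma_first_order:.3f} h,"
      f"  Monte Carlo sigma: {mc.std():.3f} h")
```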

  16. Methodologic quality and relevance of references in pharmaceutical advertisements in a Canadian medical journal.

    PubMed Central

    Lexchin, J; Holbrook, A

    1994-01-01

    OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560

  17. Systems design study of the Pioneer Venus spacecraft. Volume 2. Preliminary program development plan

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The preliminary development plan for the Pioneer Venus program is presented. This preliminary plan treats only developmental aspects that would have a significant effect on program cost. These significant development areas were: master program schedule planning; test planning, both unit and system testing for probes/orbiter/probe bus; ground support equipment; performance assurance; and science integration. Various test planning options and test method techniques were evaluated in terms of achieving a low-cost program without degrading mission performance or system reliability. The approaches studied and the methodology of the selected approach are defined.

  18. Industrial machinery noise impact modeling, volume 1

    NASA Astrophysics Data System (ADS)

    Hansen, C. H.; Kugler, B. A.

    1981-07-01

    The development of a machinery noise computer model which may be used to assess the effect of occupational noise on the health and welfare of industrial workers is discussed. The purpose of the model is to provide EPA with the methodology to evaluate the personnel noise problem, to identify the equipment types responsible for the exposure and to assess the potential benefits of a given noise control action. Due to its flexibility in design and application, the model and supportive computer program can be used by other federal agencies, state governments, labor and industry as an aid in the development of noise abatement programs.

  19. Proceedings of the Workshop on Identification and Control of Flexible Space Structures, Volume 2

    NASA Technical Reports Server (NTRS)

    Rodriguez, G. (Editor)

    1985-01-01

    The results of a workshop on identification and control of flexible space structures held in San Diego, CA, July 4 to 6, 1984, are discussed. The main objective of the workshop was to provide a forum for exchanging ideas on applying the most advanced modeling, estimation, identification, and control methodologies to flexible space structures. The workshop responded to the rapidly growing interest within NASA in large space systems (space station, platforms, antennas, flight experiments) currently under design. Dynamic structural analysis, control theory, structural vibration and stability, and distributed parameter systems are discussed.

  20. Biomedical Informatics for Computer-Aided Decision Support Systems: A Survey

    PubMed Central

    Belle, Ashwin; Kon, Mark A.; Najarian, Kayvan

    2013-01-01

    The volumes of current patient data as well as their complexity make clinical decision making more challenging than ever for physicians and other care givers. This situation calls for the use of biomedical informatics methods to process data and form recommendations and/or predictions to assist such decision makers. The design, implementation, and use of biomedical informatics systems in the form of computer-aided decision support have become essential and widely used over the last two decades. This paper provides a brief review of such systems, their application protocols and methodologies, and the future challenges and directions they suggest. PMID:23431259

  1. Globally optimal, minimum stored energy, double-doughnut superconducting magnets.

    PubMed

    Tieng, Quang M; Vegh, Viktor; Brereton, Ian M

    2010-01-01

    The use of a minimum stored energy, current density map-based methodology for designing closed-bore symmetric superconducting magnets was described recently. The technique is further developed here to cater for the design of interventional-type MRI systems, and in particular open symmetric magnets of the double-doughnut configuration; this extends the work to multiple magnet domain configurations. The use of double-doughnut magnets in MRI scanners has previously been hindered by the inability to deliver strong magnetic fields over a sufficiently large volume appropriate for imaging, essentially limiting spatial resolution, signal-to-noise ratio, and field of view. The requirement of dedicated interventional space restricts the manner in which the coils can be arranged and placed. The minimum stored energy optimal coil arrangement ensures that the field strength is maximized over a specific region of imaging. The design method yields open, dual-domain magnets capable of delivering greater field strengths than those used prior to this work, and at the same time it provides an increase in the field-of-view volume. Simulation results are provided for 1-T double-doughnut magnets with at least a 50-cm 1-ppm (parts per million) field of view and a 0.7-m gap between the two doughnuts. Copyright (c) 2009 Wiley-Liss, Inc.
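
    A drastically reduced, on-axis version of the minimum-stored-energy idea can be written as a small linear solve that trades stored coil energy against field error at target points. Loop radii, positions, wire radius, target field, and the weighting factor are all assumed for illustration, mutual inductances are ignored, and only the on-axis field is considered, so this is a toy, not the paper's current-density-map method.

```python
"""Minimum-stored-energy flavour of coil-current selection, reduced to a
toy on-axis problem with four coaxial loops arranged as two 'doughnuts'."""
import numpy as np

mu0 = 4e-7 * np.pi
radii = np.array([0.5, 0.6, 0.5, 0.6])     # m, assumed coil radii
zpos = np.array([-0.5, -0.4, 0.4, 0.5])    # m, assumed coil positions
wire_r = 0.005                             # m, effective conductor radius

# On-axis field of a loop: Bz = mu0*I*a^2 / (2*(a^2 + (z - z0)^2)^1.5)
z_targets = np.linspace(-0.2, 0.2, 9)      # points inside the imaging region
A = np.array([[mu0 * a**2 / (2 * (a**2 + (z - z0)**2) ** 1.5)
               for a, z0 in zip(radii, zpos)] for z in z_targets])
b = np.full(len(z_targets), 1.0)           # target field: 1 T on axis

# Thin-loop self-inductance approximation; stored energy ~ 0.5 * I^T D I
L = mu0 * radii * (np.log(8 * radii / wire_r) - 2.0)
D = np.diag(L)

# Minimise 0.5*I^T D I + lam*||A I - b||^2  ->  one linear system.
# Larger lam favours field accuracy over stored energy.
lam = 1e9
I = np.linalg.solve(D + 2 * lam * A.T @ A, 2 * lam * A.T @ b)
print("currents [A-turns]:", np.round(I))
print("field at targets [T]:", np.round(A @ I, 4))
print("stored energy [J]:", 0.5 * I @ D @ I)
```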

  2. An overview of urban stormwater-management practices in Miami-Dade County, Florida

    USGS Publications Warehouse

    Chin, David A.

    2004-01-01

    Agencies with jurisdiction over stormwater-management systems in Miami-Dade County, Florida, include the Miami-Dade Department of Environmental Resources Management (DERM), South Florida Water Management District (SFWMD), and Florida Department of Transportation (FDOT). These agencies are primarily concerned with minor drainage systems that handle runoff from storms with return periods of 10 years or less (DERM), major drainage systems that handle runoff from storms with return periods of 25 years or more (SFWMD), and runoff from major roadways (FDOT). All drainage regulations require retention of at least a specified water-quality volume (a defined volume of surface runoff), typically the first inch of runoff. The DERM and FDOT intensity-duration-frequency (IDF) curves used as a basis for design are broadly similar but not identical, with differences particularly apparent for short-duration storms. The SFWMD 25-year 3-day storm incorporates an IDF curve that is substantially different from both the DERM and FDOT IDF curves. A DERM methodology for designing closed exfiltration systems is applicable to storms of 1-hour duration, but is not applicable to all storms with a given T-year return period. A trench design that is applicable to all storms with a given T-year return period is presented as an alternative approach.
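
    The "first inch of runoff" retention criterion mentioned above reduces to simple arithmetic once a contributing area is fixed. The 12-acre site below is hypothetical.

```python
"""Back-of-the-envelope water-quality volume for an assumed drainage area."""

AREA_ACRES = 12.0                 # contributing drainage area (assumed)
WQ_DEPTH_IN = 1.0                 # first inch of runoff

acre_ft = AREA_ACRES * WQ_DEPTH_IN / 12.0        # acre-feet (12 in per ft)
cubic_ft = acre_ft * 43_560.0                    # 1 acre-ft = 43,560 ft^3
print(f"Water-quality volume: {acre_ft:.2f} acre-ft ({cubic_ft:,.0f} ft^3)")
```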

  3. A methodology for finding the optimal iteration number of the SIRT algorithm for quantitative Electron Tomography.

    PubMed

    Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen

    2017-02-01

    The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. Copyright © 2016 Elsevier B.V. All rights reserved.
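
    A bare-bones SIRT loop on a one-dimensional toy problem shows the iteration-number sensitivity the study addresses. The projection operator, noise level, and the crude edge-sharpness stopping metric below are all invented; the paper's method uses a statistical analysis of edge intensity profiles in the reconstructed volume, and TOMOJ/TOMO3D internals are not reproduced here.

```python
"""Minimal SIRT iteration on a toy linear system, with a naive
edge-based metric for picking an iteration number."""
import numpy as np

rng = np.random.default_rng(1)
n = 64
x_true = np.zeros(n)
x_true[20:40] = 1.0                        # a 1-D "object" with sharp edges
A = rng.random((80, n))                    # toy projection operator
b = A @ x_true + rng.normal(0, 0.05, 80)   # noisy projections

row_inv = 1.0 / A.sum(axis=1)              # R = diag(1 / row sums)
col_inv = 1.0 / A.sum(axis=0)              # C = diag(1 / column sums)

x = np.zeros(n)
edge_sharpness = []
for k in range(200):
    x = x + col_inv * (A.T @ (row_inv * (b - A @ x)))   # SIRT update
    x = np.clip(x, 0, None)                             # non-negativity
    edge_sharpness.append(np.abs(np.diff(x)).max())     # crude edge metric

best_k = int(np.argmax(edge_sharpness)) + 1
print("iteration with sharpest edges:", best_k)
```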

  4. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  5. Feedback linearization based control of a variable air volume air conditioning system for cooling applications.

    PubMed

    Thosar, Archana; Patra, Amit; Bhattacharyya, Souvik

    2008-07-01

    The design of a nonlinear control system for a Variable Air Volume Air Conditioning (VAVAC) plant through feedback linearization is presented in this article. VAVAC systems attempt to reduce building energy consumption while maintaining the primary role of air conditioning. The temperature of the space is maintained at a constant level by establishing a balance between the cooling load generated in the space and the air supply delivered to meet the load. The dynamic model of a VAVAC plant is derived and formulated as a MIMO bilinear system. Feedback linearization is applied for decoupling and linearization of the nonlinear model. Simulation results for a laboratory-scale plant are presented to demonstrate the potential of this methodology for maintaining comfort while achieving energy-optimal performance. Results obtained with a conventional PI controller and a feedback linearizing controller are compared, and the superiority of the proposed approach is clearly established.
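
    Feedback linearization of a bilinear plant can be illustrated on a single-zone toy model in which the supply airflow multiplies the state. The constants, load, and saturation limits below are assumptions for illustration, and the model is SISO rather than the MIMO plant treated in the paper.

```python
"""Feedback-linearisation sketch for a toy bilinear zone-temperature model:
dT/dt = q/C - a*m*(T - Ts), where the control m (supply airflow) enters
multiplicatively, making the plant bilinear in state and input."""
import numpy as np

a, C, Ts = 0.8, 50.0, 14.0     # assumed plant constants; 14 C supply air
q = 120.0                      # internal cooling load (assumed units)
T_set = 24.0
k = 0.5                        # gain of the linear outer loop

def control(T):
    """Cancel the nonlinearity so that dT/dt = v = -k*(T - T_set)."""
    v = -k * (T - T_set)
    denom = a * (T - Ts)
    m = (q / C - v) / max(denom, 1e-3)     # guard the singularity at T ~ Ts
    return np.clip(m, 0.0, 10.0)           # actuator limits

T, dt = 30.0, 0.1
for step in range(600):
    m = control(T)
    T += dt * (q / C - a * m * (T - Ts))
print(f"temperature after {600 * dt:.0f} time units: {T:.2f} (setpoint {T_set})")
```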

  6. Airport Landside. Volume I. Planning Guide.

    DOT National Transportation Integrated Search

    1982-01-01

    This volume describes a methodology for performing airport landside planning by applying the Airport Landside Simulation Model (ALSIM) developed by TSC. For this analysis, the airport landside is defined as extending from the airport boundary to the ...

  7. Towards a Methodology for the Design of Multimedia Public Access Interfaces.

    ERIC Educational Resources Information Center

    Rowley, Jennifer

    1998-01-01

    Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…

  8. An Assessment of Alternate Thermal Protection Systems for the Space Shuttle Orbiter. Volume 1; Executive Summary

    NASA Technical Reports Server (NTRS)

    Hays, D.

    1982-01-01

    Alternate thermal protection system (TPS) concepts for the Space Shuttle Orbiter were assessed. Metallic, ablator, and carbon-carbon concepts which are the result of previous design, manufacturing, and testing effort were considered. Emphasis was placed on improved TPS durability, which could potentially reduce life cycle costs and improve Orbiter operational characteristics. Integrated concept/orbiter point designs were generated and analyzed on the basis of Shuttle design environments and criteria. A merit function evaluation methodology based on mission impact, life cycle costs, and risk was developed to compare the candidate concepts and to identify the best alternate. Voids and deficiencies in the technology were identified, along with recommended activities to overcome them. Finally, programmatic plans, including ROM costs and schedules, were developed for all activities required to bring the selected alternate system up to operational readiness.
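
    A weighted merit function of the kind described can be sketched in a few lines. The concept names echo the study, but the weights and scores below are invented placeholders, not the study's evaluation data.

```python
"""Illustrative weighted merit-function ranking of candidate TPS concepts."""

weights = {"mission_impact": 0.4, "life_cycle_cost": 0.4, "risk": 0.2}

# Scores on a 0-10 scale, higher is better (all values hypothetical)
candidates = {
    "metallic":      {"mission_impact": 7, "life_cycle_cost": 8, "risk": 6},
    "ablator":       {"mission_impact": 5, "life_cycle_cost": 6, "risk": 8},
    "carbon-carbon": {"mission_impact": 8, "life_cycle_cost": 5, "risk": 5},
}

def merit(scores):
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(candidates.items(), key=lambda kv: -merit(kv[1])):
    print(f"{name:13s} merit = {merit(scores):.2f}")
```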

  9. Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.

    1986-01-01

    This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h_i/h_u vs. M_inf Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).

  10. Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.

    1986-01-01

    This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).

  11. Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2

    NASA Technical Reports Server (NTRS)

    Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.

    1986-01-01

    This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).

  12. Working Papers in Dialogue Modeling, Volume 2.

    ERIC Educational Resources Information Center

    Mann, William C.; And Others

    The technical working papers that comprise the two volumes of this document are related to the problem of creating a valid process model of human communication in dialogue. In Volume 2, the first paper concerns study methodology, and raises such issues as the choice between system-building and process-building, and the advantages of studying cases…

  13. Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates

    Treesearch

    John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin

    2014-01-01

    Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...

  14. Human Rehabilitation Techniques. Project Papers. Volume IV, Part B.

    ERIC Educational Resources Information Center

    Dudek, R. A.; And Others

    Volume IV, Part B of a six-volume final report (which covers the findings of a research project on policy and technology related to rehabilitation of disabled individuals) presents a continuation of papers (Part A) giving an overview of project methodology, much of the data used in projecting consequences and policymaking impacts in project…

  15. CO2 storage resources, reserves, and reserve growth: Toward a methodology for integrated assessment of the storage capacity of oil and gas reservoirs and saline formations

    USGS Publications Warehouse

    Burruss, Robert

    2009-01-01

    Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of the geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations of known traps (KTs) within the AU that have produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservative volume of contingent resource. Application of the concept of reserve growth to the CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible.
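
    The aggregation-plus-reserve-growth logic is essentially arithmetic, illustrated below with hypothetical known-trap volumes and an assumed growth multiplier.

```python
"""Arithmetic illustration: contingent storage resource as the sum of
known-trap volumes, later revised by a reserve-growth factor.
All numbers are hypothetical."""

known_trap_volumes_mt = [35.0, 12.5, 8.0, 22.0]   # CO2 storage per trap, Mt
contingent_resource = sum(known_trap_volumes_mt)  # most conservative estimate

growth_factor = 1.4   # assumed multiplier learned from storage performance
revised_resource = contingent_resource * growth_factor

print(f"initial contingent resource: {contingent_resource:.1f} Mt CO2")
print(f"after reserve growth:        {revised_resource:.1f} Mt CO2")
```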

  16. CO2 storage resources, reserves, and reserve growth: Toward a methodology for integrated assessment of the storage capacity of oil and gas reservoirs and saline formations

    USGS Publications Warehouse

    Burruss, R.C.

    2009-01-01

    Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of the geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations of known traps (KTs) within the AU that have produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservative volume of contingent resource. Application of the concept of reserve growth to the CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow over time, although negative growth is possible. © 2009 Elsevier Ltd. All rights reserved.

  17. A cross impact methodology for the assessment of US telecommunications system with application to fiber optics development, volume 2

    NASA Technical Reports Server (NTRS)

    Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.

    1979-01-01

    The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.

  18. Alternative occupied volume integrity (OVI) tests and analyses.

    DOT National Transportation Integrated Search

    2013-10-01

    FRA, supported by the Volpe Center, conducted research on alternative methods of evaluating occupied volume integrity (OVI) in passenger railcars. Guided by this research, an alternative methodology for evaluating OVI that ensures an equivalent or gr...

  19. Methodology update for estimating volume to service flow ratio.

    DOT National Transportation Integrated Search

    2015-12-01

    Volume/service flow ratio (VSF) is calculated by the Highway Performance Monitoring System (HPMS) software as an indicator of peak-hour congestion. It is an essential input to the Kentucky Transportation Cabinet's (KYTC) key planning applications, ...
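
    In simplified form, the ratio is just peak-hour demand divided by the service flow the section can carry at the target level of service. The AADT, K-factor, and service flow below are placeholders; the HPMS procedure derives these inputs in considerably more detail.

```python
"""Simplified illustration of a volume/service-flow ratio calculation."""

aadt = 42_000          # annual average daily traffic (assumed)
k_factor = 0.095       # share of daily traffic in the design/peak hour (assumed)
service_flow = 5_400   # veh/h the section can serve at the target LOS (assumed)

peak_hour_volume = aadt * k_factor
vsf = peak_hour_volume / service_flow
print(f"peak-hour volume: {peak_hour_volume:.0f} veh/h, VSF = {vsf:.2f}")
```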

  20. Airport Landside - Volume III : ALSIM Calibration and Validation.

    DOT National Transportation Integrated Search

    1982-06-01

    This volume discusses calibration and validation procedures applied to the Airport Landside Simulation Model (ALSIM), using data obtained at Miami, Denver and LaGuardia Airports. Criteria for the selection of a validation methodology are described. T...

  1. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960s and early 1970s, which are unique to that design, or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  2. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960s and early 1970s, which are unique to that design, or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  3. Experimental investigation of the structural behavior of equine urethra.

    PubMed

    Natali, Arturo Nicola; Carniel, Emanuele Luigi; Frigo, Alessandro; Fontanella, Chiara Giulia; Rubini, Alessandro; Avital, Yochai; De Benedictis, Giulia Maria

    2017-04-01

    An integrated experimental and computational investigation was developed, aiming to provide a methodology for characterizing the structural response of the urethral duct. The investigation provides information that is suitable for the actual comprehension of lower urinary tract mechanical functionality and the optimal design of prosthetic devices. Experimental activity entailed the execution of inflation tests performed on segments of horse penile urethras from both proximal and distal regions. Inflation tests were developed imposing different volumes. Each test was performed according to a two-step procedure. The tubular segment was inflated almost instantaneously during the first step, while volume was held constant for about 300 s to allow the development of relaxation processes during the second step. Tests performed on the same specimen were interspersed by 600 s of rest to allow the recovery of the specimen mechanical condition. Results from experimental activities were statistically analyzed and processed by means of a specific mechanical model. Such a computational model was developed with the purpose of interpreting the general pressure-volume-time response of biologic tubular structures. The model includes parameters that interpret the elastic and viscous behavior of hollow structures, directly correlated with the results from the experimental activities. Post-processing of experimental data provided information about the non-linear elastic and time-dependent behavior of the urethral duct. In detail, statistically representative pressure-volume and pressure relaxation curves were identified and summarized by structural parameters. Considering elastic properties, initial stiffness ranged from 0.677 ± 0.026 kPa for the proximal region to 0.262 ± 0.006 kPa for the distal region of the penile urethra. Viscous parameters showed values typical of soft biological tissues: τ1 = 0.153 ± 0.018 s and τ2 = 17.458 ± 1.644 s for the proximal region, and τ1 = 0.201 ± 0.085 s and τ2 = 8.514 ± 1.379 s for the distal region. A general procedure for the mechanical characterization of the urethral duct has been provided. The proposed methodology allows identifying mechanical parameters that properly express the mechanical behavior of the biological tube. The approach is especially suitable for evaluating the influence of degenerative phenomena on lower urinary tract mechanical functionality. This information is essential for the optimal design of potential surgical procedures and devices. Copyright © 2017 Elsevier B.V. All rights reserved.
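
    Fitting a two-time-constant relaxation of the form reported above can be sketched with a standard nonlinear least-squares call. The synthetic pressure trace and starting values below are invented and are not the study's data or its full pressure-volume-time model.

```python
"""Sketch of fitting a two-term exponential relaxation to a synthetic
pressure trace during a constant-volume hold."""
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, p_inf, a1, tau1, a2, tau2):
    return p_inf + a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

rng = np.random.default_rng(0)
t = np.linspace(0, 300, 6001)                      # s, hold phase
p_true = relaxation(t, 2.0, 1.0, 0.2, 0.8, 15.0)   # kPa, invented parameters
p_meas = p_true + rng.normal(0, 0.02, t.size)      # add measurement noise

p0 = (2.0, 0.8, 0.3, 0.6, 12.0)                    # rough initial guess
popt, _ = curve_fit(relaxation, t, p_meas, p0=p0)
print("fitted (p_inf, a1, tau1, a2, tau2):", np.round(popt, 3))
```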

  4. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.

  5. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    PubMed

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for clustering data from such visualizations. The methodology is theoretically justified, and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
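
    A generic Gaussian-mixture clustering of synthetic calcium-like traces gives the flavour of the approach; the summary features, synthetic signals, and use of scikit-learn's general-purpose GaussianMixture are simplifications and not the paper's mixture model for functional data.

```python
"""Generic Gaussian-mixture clustering of synthetic calcium-like traces."""
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)

# Three synthetic response types: silent, slow wave, fast oscillation
traces = np.vstack([
    rng.normal(0, 0.1, (100, t.size)),
    np.sin(0.5 * t) + rng.normal(0, 0.1, (100, t.size)),
    np.sin(3.0 * t) + rng.normal(0, 0.1, (100, t.size)),
])

# Reduce each trace to a few summary features before mixture modelling
features = np.column_stack([
    traces.mean(axis=1),
    traces.std(axis=1),
    np.abs(np.fft.rfft(traces, axis=1))[:, 1:20].argmax(axis=1),
])

labels = GaussianMixture(n_components=3, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```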

  6. Presentations at the Tri-Service Cloud Modeling Workshop (2nd), Held at the Naval Surface Weapons Center, White Oak, Maryland, 26-27 June 1984. Volume 1.

    DTIC Science & Technology

    1984-08-01

    ...produce even the most basic binary cloud data and methodologies needed to support the evaluation programs." In view of this recognized deficiency, the... There was an exchange of information with non-DoD agencies, with presentations made by NASA and NOAA (see pp. 537, 569). A brief report by the steering... on cloud data bases and methodologies for users. To achieve these actions requires explicit support. *See classified supplementary volume.

  7. The volume-mortality relation for radical cystectomy in England: retrospective analysis of hospital episode statistics

    PubMed Central

    Bottle, Alex; Darzi, Ara W; Athanasiou, Thanos; Vale, Justin A

    2010-01-01

    Objectives To investigate the relation between volume and mortality after adjustment for case mix for radical cystectomy in the English healthcare setting using improved statistical methodology, taking into account the institutional and surgeon volume effects and institutional structural and process of care factors. Design Retrospective analysis of hospital episode statistics using multilevel modelling. Setting English hospitals carrying out radical cystectomy in the seven financial years 2000/1 to 2006/7. Participants Patients with a primary diagnosis of cancer undergoing an inpatient elective cystectomy. Main outcome measure Mortality within 30 days of cystectomy. Results Compared with low volume institutions, medium volume ones had significantly higher odds of in-hospital and total mortality: odds ratio 1.72 (95% confidence interval 1.00 to 2.98, P=0.05) and 1.82 (1.08 to 3.06, P=0.02). This was only seen in the final model, which included adjustment for structural and process of care factors. The surgeon volume-mortality relation showed weak evidence of reduced odds of in-hospital mortality (by 35%) for the high volume surgeons, although this did not reach statistical significance at the 5% level. Conclusions The relation between case volume and mortality after radical cystectomy for bladder cancer became evident only after adjustment for structural and process of care factors, including staffing levels of nurses and junior doctors, in addition to case mix. At least for this relatively uncommon procedure, adjusting for these confounders when examining the volume-outcome relation is critical before considering centralisation of care to a few specialist institutions. Outcomes other than mortality, such as functional morbidity and disease recurrence, may ultimately argue for centralising care. PMID:20305302

  8. Passenger rail vehicle safety assessment methodology. Volume II, Detailed analyses and simulation results.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...

  9. Coyote Papers: The University of Arizona Working Papers in Linguistics, Volume 11. Special Volume on Native American Languages.

    ERIC Educational Resources Information Center

    Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.

    The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…

  10. Selected Bibliographies for Pharmaceutical Supply Systems. Volume 5: Pharmaceutical Supply Systems Bibliographies. International Health Planning Reference Series.

    ERIC Educational Resources Information Center

    Schaumann, Leif

    Intended as a companion piece to volume 7 in the Method Series, Pharmaceutical Supply System Planning (CE 024 234), this fifth of six volumes in the International Health Planning Reference Series is a combined literature review and annotated bibliography dealing with alternative methodologies for planning and analyzing pharmaceutical supply…

  11. Upward Mobility Programs in the Service Sector for Disadvantaged and Dislocated Workers. Volume II: Technical Appendices.

    ERIC Educational Resources Information Center

    Tao, Fumiyo; And Others

    This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…

  12. A methodology to address mixed AGN and starlight contributions in emission line galaxies found in the RESOLVE survey and ECO catalog

    NASA Astrophysics Data System (ADS)

    Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE

    2017-01-01

    We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.
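
    For orientation, the traditional BPT classification regions the abstract refers to are usually drawn with the Kewley et al. (2001) and Kauffmann et al. (2003) demarcation curves; the sketch below applies those standard literature curves to a few made-up line ratios and does not reproduce the mixed AGN/starlight models or the RESOLVE/ECO data themselves.

```python
"""Standard BPT ([NII]/Halpha vs [OIII]/Hbeta) classification using the
commonly quoted Kauffmann et al. (2003) and Kewley et al. (2001) curves.
The example line ratios are invented."""

def bpt_class(x, y):
    """x = log10([NII]/Halpha), y = log10([OIII]/Hbeta)."""
    if x < 0.05 and y < 0.61 / (x - 0.05) + 1.30:
        return "star-forming"          # below the Kauffmann+03 line
    if x < 0.47 and y < 0.61 / (x - 0.47) + 1.19:
        return "composite"             # between the two demarcations
    return "AGN"

for ratios in [(-0.5, -0.3), (-0.35, 0.2), (0.2, 0.8)]:
    print(ratios, "->", bpt_class(*ratios))
```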

  13. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the new methodology as web services and incorporated the system into the cloud. We have also developed a provenance management system for CMDA in which CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
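
    One of the listed diagnostics, the time-lagged correlation map, is easy to illustrate on synthetic series; the data and the three-step lag below are fabricated stand-ins for the satellite and model fields CMDA actually works with.

```python
"""Toy time-lagged correlation diagnostic on synthetic series."""
import numpy as np

rng = np.random.default_rng(0)
n = 500
driver = rng.normal(size=n)
response = np.roll(driver, 3) + 0.5 * rng.normal(size=n)   # lags by 3 steps

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

lags = range(-10, 11)
corrs = [lagged_corr(driver, response, k) for k in lags]
best = max(zip(lags, corrs), key=lambda kv: abs(kv[1]))
print("lag with max |correlation|:", best)
```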

  14. Effects of Direct-to-Consumer Advertising on Patient Prescription Requests and Physician Prescribing: A Systematic Review of Psychiatry-Relevant Studies

    PubMed Central

    Becker, Sara J.; Midoun, Miriam M.

    2017-01-01

    Objective To systematically analyze the effects of direct-to-consumer advertising (DTCA) on patient requests for medication and physician prescribing across psychiatry-relevant studies. Data Sources MEDLINE, PsycINFO, Thomson's ISI Web of Knowledge, and Google Scholar were searched from 1999 through 2014 using variations of the terms direct-to-consumer advertising and psychiatric. Reference lists and an online repository of DTCA manuscripts were also scrutinized. Study Selection English-language studies collecting data at the point of service, focusing on or including psychiatric medication, and assessing DTCA's effects on patient and/or physician behavior were included. Of 989 articles identified, 69 received full-text review. Four studies across five manuscripts met inclusion criteria. Data Extraction Data were extracted on participants, study design, methodological quality, and results. Methodological quality of individual studies was assessed using adapted criteria from the Effective Public Health Practice Project. Confidence in conclusions across studies was determined using principles from the well-established GRADE system. Findings Due to lack of replication across strong randomized controlled trials (RCTs), no conclusions merited high confidence. With moderate confidence, we concluded that DTCA requests: 1) are granted most of the time [1 RCT, 3 observational]; 2) prompt higher prescribing volume [1 RCT, 1 observational]; 3) promote greater adherence to minimally acceptable treatment guidelines for patients with depression [1 RCT]; and 4) stimulate overprescribing among patients with an adjustment disorder [1 RCT]. Conclusions Findings suggest that DTCA requests are typically accommodated, promote higher prescribing volume, and have competing effects on treatment quality. More methodologically strong studies are needed to increase confidence in conclusions. PMID:27631149

  15. Effects of Direct-To-Consumer Advertising on Patient Prescription Requests and Physician Prescribing: A Systematic Review of Psychiatry-Relevant Studies.

    PubMed

    Becker, Sara J; Midoun, Miriam M

    2016-10-01

    To systematically analyze the effects of direct-to-consumer advertising (DTCA) on patient requests for medication and physician prescribing across psychiatry-relevant studies. MEDLINE, PsycINFO, Thomson Reuters' ISI Web of Knowledge, and Google Scholar were searched (1999-2014) using variations of the terms direct-to-consumer advertising and psychiatric. Reference lists and an online repository of DTCA manuscripts were also scrutinized. English-language studies collecting data at the point of service, focusing on or including psychiatric medication, and assessing the effects of DTCA on patient and/or physician behavior were included. Of 989 articles identified, 69 received full-text review. Four studies across 5 manuscripts met inclusion criteria. Data were extracted on participants, study design, methodological quality, and results. Methodological quality of individual studies was assessed using adapted criteria from the Effective Public Health Practice Project. Confidence in conclusions across studies was determined using principles from the well-established GRADE system. Due to lack of replication across strong randomized controlled trials (RCTs), no conclusions merited high confidence. With moderate confidence, we concluded that DTCA requests (1) are granted most of the time (1 RCT, 3 observational), (2) prompt higher prescribing volume (1 RCT, 1 observational), (3) promote greater adherence to minimally acceptable treatment guidelines for patients with depression (1 RCT), and (4) stimulate overprescribing among patients with an adjustment disorder (1 RCT). Findings suggest that DTCA requests are typically accommodated, promote higher prescribing volume, and have competing effects on treatment quality. More methodologically strong studies are needed to increase confidence in conclusions. © Copyright 2016 Physicians Postgraduate Press, Inc.

  16. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    PubMed Central

    2012-01-01

    Background Response surface methodology by Box–Behnken design, employing a multivariate approach, enables substantial improvement in method development using fewer experiments and without wasting large volumes of organic solvents, which would otherwise lead to high analysis cost. This methodology has not previously been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of this methodology in the optimization and validation of a new microwell-based UV-visible spectrophotometric method for the determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the time of reaction, and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity, and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high throughput, consumes a minimal volume of organic solvent (thus reducing analysts' exposure to the toxic effects of organic solvents, an environmentally friendly "green" approach), and reduces analysis cost 50-fold. PMID:23146143
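
    The Box-Behnken/quadratic-response-surface machinery can be sketched without any specialised DOE library: build the 15-run coded design for three factors and fit a full quadratic model by least squares. The simulated absorbance response and its coefficients below are invented and do not reproduce the paper's experimental data.

```python
"""Hand-rolled three-factor Box-Behnken design plus a quadratic
response-surface fit on a simulated response."""
import itertools
import numpy as np

# Box-Behnken (3 factors): all +/-1 pairs with the third factor at 0,
# plus replicated centre points.
pairs = list(itertools.combinations(range(3), 2))
runs = []
for i, j in pairs:
    for a, b in itertools.product((-1, 1), repeat=2):
        row = [0, 0, 0]
        row[i], row[j] = a, b
        runs.append(row)
runs += [[0, 0, 0]] * 3
X = np.array(runs, dtype=float)            # 15 coded runs

# Simulated absorbance response (placeholder model + noise)
rng = np.random.default_rng(0)
y = (0.8 + 0.10 * X[:, 0] + 0.05 * X[:, 1] + 0.07 * X[:, 2]
     - 0.06 * X[:, 0] ** 2 + 0.03 * X[:, 0] * X[:, 2]
     + rng.normal(0, 0.01, len(X)))

# Full quadratic model: intercept, linear, interaction, squared terms
def design_matrix(X):
    cols = [np.ones(len(X))]
    cols += [X[:, k] for k in range(3)]
    cols += [X[:, i] * X[:, j] for i, j in pairs]
    cols += [X[:, k] ** 2 for k in range(3)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print("fitted quadratic coefficients:", np.round(coef, 3))
```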

  17. Users manual for the NASA Lewis three-dimensional ice accretion code (LEWICE 3D)

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.; Potapczuk, Mark G.

    1993-01-01

    A description of the methodology, the algorithms, and the input and output data along with an example case for the NASA Lewis 3D ice accretion code (LEWICE3D) has been produced. The manual has been designed to help the user understand the capabilities, the methodologies, and the use of the code. The LEWICE3D code is a conglomeration of several codes for the purpose of calculating ice shapes on three-dimensional external surfaces. A three-dimensional external flow panel code is incorporated which has the capability of calculating flow about arbitrary 3D lifting and nonlifting bodies with external flow. A fourth order Runge-Kutta integration scheme is used to calculate arbitrary streamlines. An Adams type predictor-corrector trajectory integration scheme has been included to calculate arbitrary trajectories. Schemes for calculating tangent trajectories, collection efficiencies, and concentration factors for arbitrary regions of interest for single droplets or droplet distributions have been incorporated. A LEWICE 2D based heat transfer algorithm can be used to calculate ice accretions along surface streamlines. A geometry modification scheme is incorporated which calculates the new geometry based on the ice accretions generated at each section of interest. The three-dimensional ice accretion calculation is based on the LEWICE 2D calculation. Both codes calculate the flow, pressure distribution, and collection efficiency distribution along surface streamlines. For both codes the heat transfer calculation is divided into two regions, one above the stagnation point and one below the stagnation point, and solved for each region assuming a flat plate with pressure distribution. Water is assumed to follow the surface streamlines, hence starting at the stagnation zone any water that is not frozen out at a control volume is assumed to run back into the next control volume. After the amount of frozen water at each control volume has been calculated the geometry is modified by adding the ice at each control volume in the surface normal direction.
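
    The fourth-order Runge-Kutta streamline integration mentioned above is shown here on a made-up analytic flow (uniform stream plus a point vortex); the real code integrates through the panel-method velocity field, which is not reproduced here.

```python
"""Classical RK4 streamline trace through a simple analytic 2-D flow."""
import numpy as np

def velocity(p):
    """Uniform stream plus a point vortex at the origin (toy flow)."""
    x, y = p
    r2 = x * x + y * y + 1e-9
    return np.array([1.0 - y / r2, x / r2])

def rk4_step(p, h):
    k1 = velocity(p)
    k2 = velocity(p + 0.5 * h * k1)
    k3 = velocity(p + 0.5 * h * k2)
    k4 = velocity(p + h * k3)
    return p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

p = np.array([-5.0, 0.4])           # seed point upstream
path = [p]
for _ in range(400):
    p = rk4_step(p, 0.05)
    path.append(p)
print("streamline end point:", np.round(path[-1], 3))
```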

  18. Coal gasification systems engineering and analysis, volume 2

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The major design related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses were conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for K-T, Texaco, and B&W design. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high technology risk issues. The issues were identified and ranked in terms of importance and tractability, and a plan developed for obtaining data or developing technology required to mitigate the risk.

  19. Load and resistance factor rating (LRFR) in New York State : volume II.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  20. Load and resistance factor rating (LRFR) in NYS : volume II final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  1. Load and resistance factor rating (LRFR) in NYS : volume I final report.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...

  2. 1998 motor vehicle occupant safety survey. Volume 1, methodology report

    DOT National Transportation Integrated Search

    2000-03-01

    This is the Methodology Report for the 1998 Motor Vehicle Occupant Safety Survey. The survey is conducted on a biennial basis (initiated in 1994), and is administered by telephone to a randomly selected national sample. Two questionnaires are used, e...

  3. Load and resistance factor rating (LRFR) in New York State : volume I.

    DOT National Transportation Integrated Search

    2011-09-01

    This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The propo...

  4. National survey of drinking and driving attitudes and behaviors : 2008. Volume 3, methodology report

    DOT National Transportation Integrated Search

    2010-08-01

    This report presents the details of the methodology used for the 2008 National Survey of Drinking and Driving Attitudes and Behaviors conducted by Gallup, Inc. for the National Highway Traffic Safety Administration (NHTSA). This survey represents t...

  5. International Linear Collider Technical Design Report (Volumes 1 through 4)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison M.

    2013-03-27

    The design report consists of four volumes: Volume 1, Executive Summary; Volume 2, Physics; Volume 3, Accelerator (Part I, R and D in the Technical Design Phase, and Part II, Baseline Design); and Volume 4, Detectors.

  6. Calculation of Derivative Thermodynamic Hydration and Aqueous Partial Molar Properties of Ions Based on Atomistic Simulations.

    PubMed

    Dahlgren, Björn; Reif, Maria M; Hünenberger, Philippe H; Hansen, Niels

    2012-10-09

    The raw ionic solvation free energies calculated on the basis of atomistic (explicit-solvent) simulations are extremely sensitive to the boundary conditions and treatment of electrostatic interactions used during these simulations. However, as shown recently [Kastenholz, M. A.; Hünenberger, P. H. J. Chem. Phys. 2006, 124, 224501 and Reif, M. M.; Hünenberger, P. H. J. Chem. Phys. 2011, 134, 144104], the application of an appropriate correction scheme allows for a conversion of the methodology-dependent raw data into methodology-independent results. In this work, methodology-independent derivative thermodynamic hydration and aqueous partial molar properties are calculated for the Na(+) and Cl(-) ions at P° = 1 bar and T° = 298.15 K, based on the SPC water model and on ion-solvent Lennard-Jones interaction coefficients previously reoptimized against experimental hydration free energies. The hydration parameters considered are the hydration free energy and enthalpy. The aqueous partial molar parameters considered are the partial molar entropy, volume, heat capacity, volume-compressibility, and volume-expansivity. Two alternative calculation methods are employed to access these properties. Method I relies on the difference in average volume and energy between two aqueous systems involving the same number of water molecules, either in the absence or in the presence of the ion, along with variations of these differences corresponding to finite pressure or/and temperature changes. Method II relies on the calculation of the hydration free energy of the ion, along with variations of this free energy corresponding to finite pressure or/and temperature changes. Both methods are used considering two distinct variants in the application of the correction scheme. In variant A, the raw values from the simulations are corrected after the application of finite difference in pressure or/and temperature, based on correction terms specifically designed for derivative parameters at P° and T°. In variant B, these raw values are corrected prior to differentiation, based on corresponding correction terms appropriate for the different simulation pressures P and temperatures T. The results corresponding to the different calculation schemes show that, except for the hydration free energy itself, accurate methodological independence and quantitative agreement with even the most reliable experimental parameters (ion-pair properties) are not yet reached. Nevertheless, approximate internal consistency and qualitative agreement with experimental results can be achieved, but only when an appropriate correction scheme is applied, along with a careful consideration of standard-state issues. In this sense, the main merit of the present study is to set a clear framework for these types of calculations and to point toward directions for future improvements, with the ultimate goal of reaching a consistent and quantitative description of single-ion hydration thermodynamics in molecular dynamics simulations.
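    Method II above amounts to numerically differentiating the corrected hydration free energy with respect to temperature and pressure. The minimal sketch below shows such finite differences; the free-energy values and step sizes are placeholders, and the correction scheme and simulations that would produce them are assumed to live elsewhere.

    ```python
    # Finite-difference sketch of derivative hydration properties from a (hypothetical)
    # table of corrected hydration free energies dG_hyd(T, P) in kJ/mol.
    dG = {                     # (T in K, P in bar) -> hydration free energy, kJ/mol
        (288.15, 1.0): -372.4,
        (298.15, 1.0): -371.3,
        (308.15, 1.0): -370.1,
        (298.15, 201.0): -371.1,
    }

    dT, dP = 10.0, 200.0
    # Hydration entropy: S_hyd = -(d dG / dT)_P, central difference around 298.15 K.
    S_hyd = -(dG[(308.15, 1.0)] - dG[(288.15, 1.0)]) / (2.0 * dT)        # kJ/(mol K)
    # Partial molar volume contribution: V_hyd = (d dG / dP)_T, forward difference from 1 bar.
    V_hyd = (dG[(298.15, 201.0)] - dG[(298.15, 1.0)]) / dP               # kJ/(mol bar)

    print(f"hydration entropy ~ {S_hyd * 1000.0:+.1f} J/(mol K)")
    # 1 kJ/(mol bar) = 1e4 cm^3/mol, since 1 bar = 1e5 Pa and 1 kJ = 1e3 J.
    print(f"partial molar volume ~ {V_hyd * 1.0e4:+.2f} cm^3/mol")
    ```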

  7. Scope of ACE in Australia. Volume 1: Implications for Improved Data Collection and Reporting [and] Volume 2: Analysis of Existing Information in National Education and Training Data Collection.

    ERIC Educational Resources Information Center

    Borthwick, J.; Knight, B.; Bender, A.; Loveder, P.

    These two volumes provide information on the scope of adult and community education (ACE) in Australia and implications for improved data collection and reporting. Volume 1 begins with a glossary. Chapter 1 addresses project objectives and processes and methodology. Chapter 2 analyzes the scope and diversity of ACE in terms of what is currently…

  8. Surrogate Plant Data Base : Volume 2. Appendix C : Facilities Planning Baseline Data

    DOT National Transportation Integrated Search

    1983-05-01

    This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...

  9. Surrogate Plant Data Base : Volume 4. Appendix E : Medium and Heavy Truck Manufacturing

    DOT National Transportation Integrated Search

    1983-05-01

    This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...

  10. Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency

    DTIC Science & Technology

    2013-03-01

    assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil , 1998). Controllers...Behaviorial Sciences: Volume 1: Methodological Issues Volume 2: Statistical Issues, 1, 257. Laudeman, I. V., Shelden, S. G., Branstrom, R., & Brasil

  11. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES BY AMANDA DONNELLY A THESIS...work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and...collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including, net assessment, scenarios and

  12. Improving stability and strength characteristics of framed structures with nonlinear behavior

    NASA Technical Reports Server (NTRS)

    Pezeshk, Shahram

    1990-01-01

    In this paper an optimal design procedure is introduced to improve the overall performance of nonlinear framed structures. The design methodology presented here is a multiple-objective optimization procedure whose objective functions involve the buckling eigenvalues and eigenvectors of the structure. A constant-volume constraint with bounds on the design variables is used in conjunction with an optimality criterion approach. The method provides a general tool for solving complex design problems and generally leads to structures with better limit strength and stability. Many algorithms have been developed to improve the limit strength of structures. In most applications geometrically linear analysis is employed, with the consequence that the overall strength of the design is overestimated. Directly optimizing the limit load of the structure would require a full nonlinear analysis at each iteration, which would be prohibitively expensive. The objective of this paper is to develop an algorithm that can improve the limit load of geometrically nonlinear framed structures while avoiding the nonlinear analysis. One of the novelties of the new design methodology is its ability to efficiently model and design structures under multiple loading conditions. These loading conditions can be different factored loads or any kind of loads that can be applied to the structure simultaneously or independently. Attention is focused on the optimal design of space framed structures. Three-dimensional design problems are more complicated to carry out, but they yield insight into the real behavior of the structure and can help avoid some of the problems that might appear in a planar design procedure, such as the need for an out-of-plane buckling constraint. Although researchers in the field of structural engineering generally agree that optimum design of three-dimensional building frames, especially in seismic regions, would be beneficial, methods have been slow to emerge. Most of the research in this area has dealt with the optimization of truss and plane frame structures.
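    As a toy illustration of an objective built from buckling eigenvalues (not the authors' algorithm), the sketch below solves a small generalized eigenproblem K·φ = λ·K_G·φ with made-up stiffness and geometric-stiffness matrices and forms a weighted objective from the lowest buckling factors.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # Toy symmetric stiffness (K) and geometric stiffness (Kg) matrices; in a real
    # frame model these would come from the finite element assembly.
    K = np.array([[ 4.0, -1.0,  0.0],
                  [-1.0,  3.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    Kg = np.array([[1.0, 0.2, 0.0],
                   [0.2, 1.5, 0.1],
                   [0.0, 0.1, 1.2]])

    # Buckling load factors: K phi = lam * Kg phi (both matrices positive definite here).
    lam, phi = eigh(K, Kg)

    # Multi-objective surrogate: a weighted sum of the lowest buckling factors,
    # with the heaviest weight on the most critical (lowest) mode.
    weights = np.array([0.6, 0.3, 0.1])
    objective = float(weights @ np.sort(lam)[: len(weights)])
    print("buckling load factors:", np.round(np.sort(lam), 3))
    print("weighted objective   :", round(objective, 3))
    ```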

  13. Machine intelligence and autonomy for aerospace systems

    NASA Technical Reports Server (NTRS)

    Heer, Ewald (Editor); Lum, Henry (Editor)

    1988-01-01

    The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.

  14. The method of space-time and conservation element and solution element: A new approach for solving the Navier-Stokes and Euler equations

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung

    1995-01-01

    A new numerical framework for solving conservation laws is being developed. This new framework differs substantially in both concept and methodology from the well-established methods, i.e., finite difference, finite volume, finite element, and spectral methods. It is conceptually simple and designed to overcome several key limitations of the above traditional methods. A two-level scheme for solving the convection-diffusion equation is constructed and used to illuminate the major differences between the present method and those previously mentioned. This explicit scheme, referred to as the a-mu scheme, has two independent marching variables.
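    For readers who want a baseline against which the a-mu scheme could be compared, the sketch below marches a conventional explicit upwind-convection, central-diffusion scheme for the same model convection-diffusion equation; it is emphatically not the CE/SE or a-mu scheme itself, and all parameters are arbitrary.

    ```python
    import numpy as np

    # Conventional explicit (upwind convection + central diffusion) marching scheme for
    # u_t + a u_x = mu u_xx, provided only as a baseline; it is NOT the a-mu CE/SE scheme.
    a, mu = 1.0, 0.01
    nx, L = 200, 1.0
    dx = L / nx
    dt = 0.4 * min(dx / a, dx * dx / (2 * mu))   # respect convection and diffusion limits

    x = np.linspace(0.0, L, nx, endpoint=False)
    u = np.exp(-200.0 * (x - 0.3) ** 2)          # initial Gaussian pulse

    for _ in range(400):
        um = np.roll(u, 1)                        # periodic neighbours
        up = np.roll(u, -1)
        u = u - a * dt / dx * (u - um) + mu * dt / dx**2 * (up - 2 * u + um)

    print("peak value after marching:", round(float(u.max()), 4))
    ```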

  15. Flight-vehicle materials, structures, and dynamics - Assessment and future directions. Vol. 3 - Ceramics and ceramic-matrix composites

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R. (Editor)

    1992-01-01

    The present volume discusses ceramics and ceramic-matrix composites in prospective aerospace systems, monolithic ceramics, transformation-toughened and whisker-reinforced ceramic composites, glass-ceramic matrix composites, reaction-bonded Si3N4 and SiC composites, and chemical vapor-infiltrated composites. Also discussed are the sol-gel-processing of ceramic composites, the fabrication and properties of fiber-reinforced ceramic composites with directed metal oxidation, the fracture behavior of ceramic-matrix composites (CMCs), the fatigue of fiber-reinforced CMCs, creep and rupture of CMCs, structural design methodologies for ceramic-based materials systems, the joining of ceramics and CMCs, and carbon-carbon composites.

  16. A Grammatico-Semantic Exploration of the Problems of Sentence Formation and Interpretation in the Classroom, Volume 1. Final Report.

    ERIC Educational Resources Information Center

    Bateman, Donald R.; Zidonis, Frank J.

    In the introduction to this volume of a two volume document (See also TE 002 131.) written for curriculum developers, Donald Bateman identifies the recent periods in the development of linguistic thought and methodology, and presents language curriculum development as the continuing exploration of the processes of evolving linguistic structures.…

  17. A prototype software methodology for the rapid evaluation of biomanufacturing process options.

    PubMed

    Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli

    2007-10-01

    A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
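    The screening step described above collapses several attributes into a single score. A minimal sketch of such multi-attribute scoring is given below; the option names, attribute values, and weights are invented for illustration and do not reproduce the paper's model.

    ```python
    # Minimal multi-attribute scoring of process options (illustrative numbers only).
    # Attributes: product mass (maximize), cost (minimize), process time (minimize).
    options = {
        "baseline (precip. + centrifuge)": {"mass_kg": 1.0, "cost": 100.0, "time_h": 40.0},
        "higher feed volume":              {"mass_kg": 1.6, "cost": 130.0, "time_h": 46.0},
        "microfiltration":                 {"mass_kg": 1.1, "cost": 115.0, "time_h": 38.0},
        "Protein G + higher feed":         {"mass_kg": 1.7, "cost": 150.0, "time_h": 30.0},
    }
    weights = {"mass_kg": 0.5, "cost": 0.3, "time_h": 0.2}
    maximize = {"mass_kg"}                      # remaining attributes are minimized

    def score(opt):
        s = 0.0
        for attr, w in weights.items():
            vals = [o[attr] for o in options.values()]
            lo, hi = min(vals), max(vals)
            norm = (opt[attr] - lo) / (hi - lo)             # 0..1 within the option set
            s += w * (norm if attr in maximize else 1.0 - norm)
        return s

    for name, opt in sorted(options.items(), key=lambda kv: -score(kv[1])):
        print(f"{name:32s} score = {score(opt):.3f}")
    ```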

  18. Experimental Design of a UCAV-Based High-Energy Laser Weapon

    DTIC Science & Technology

    2016-12-01

    propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their... Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the...73 A. DESIGN OF EXPERIMENTS METHODOLOGY .............................73 B. OPERATIONAL CONCEPT

  19. Modeling an impact of road geometric design on vehicle energy consumption

    NASA Astrophysics Data System (ADS)

    Luin, Blaž; Petelin, Stojan; Al-Mansour, Fouad

    2017-11-01

    Some roads connect traffic origins and destinations directly; others follow winding, indirect routes. Indirect connections result in longer distances driven and increased fuel consumption. A similar effect is observed on congested roads and on mountain roads with many changes in altitude. A framework is therefore proposed to assess road networks on the basis of energy consumption. It is shown that road geometry has a significant impact on overall traffic energy consumption and emissions. The methodology presented in the paper analyzes the impact of traffic volume, the shares of vehicle classes, and road network configuration on the energy used by the vehicles. It can be used to optimize energy consumption through efficient traffic management and to choose the optimum new road in the design phase. This is especially important because the energy consumed by the vehicles shortly after construction exceeds the energy spent on the road construction itself.
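    A much-simplified version of the segment-wise energy bookkeeping such a framework relies on is sketched below; the vehicle parameters, segments, and coefficients are placeholders, and the calculation ignores the congestion, vehicle classes, and traffic volumes that the paper's methodology accounts for.

    ```python
    # Toy per-vehicle energy estimate over road segments (made-up coefficients).
    MASS, G = 1500.0, 9.81              # kg, m/s^2
    CRR, RHO, CDA = 0.01, 1.2, 0.7      # rolling resistance, air density, drag area (m^2)

    def segment_energy(length_m, grade, speed_ms):
        """Mechanical energy (J) to traverse one segment at constant speed."""
        rolling = CRR * MASS * G * length_m
        aero = 0.5 * RHO * CDA * speed_ms ** 2 * length_m
        climb = max(0.0, MASS * G * grade * length_m)   # no credit for downhill regeneration
        return rolling + aero + climb

    # Direct route vs. a longer, hillier alternative (illustrative segments: length, grade, speed).
    direct  = [(4000, 0.00, 22.0), (3000, 0.01, 22.0)]
    winding = [(5000, 0.02, 16.0), (4000, -0.02, 16.0), (2500, 0.03, 14.0)]
    for name, route in [("direct", direct), ("winding", winding)]:
        e = sum(segment_energy(*seg) for seg in route) / 3.6e6   # J -> kWh
        print(f"{name:8s} route energy ~ {e:.2f} kWh")
    ```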

  20. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  1. Parametric Optimization of Thermoelectric Generators for Waste Heat Recovery

    NASA Astrophysics Data System (ADS)

    Huang, Shouyuan; Xu, Xianfan

    2016-10-01

    This paper presents a methodology for design optimization of thermoelectric-based waste heat recovery systems called thermoelectric generators (TEGs). The aim is to maximize the power output from thermoelectrics which are used as add-on modules to an existing gas-phase heat exchanger, without negative impacts, e.g., maintaining a minimum heat dissipation rate from the hot side. A numerical model is proposed for TEG coupled heat transfer and electrical power output. This finite-volume-based model simulates different types of heat exchangers, i.e., counter-flow and cross-flow, for TEGs. Multiple-filled skutterudites and bismuth-telluride-based thermoelectric modules (TEMs) are applied, respectively, in higher and lower temperature regions. The response surface methodology is implemented to determine the optimized TEG size along and across the flow direction and the height of thermoelectric couple legs, and to analyze their covariance and relative sensitivity. A genetic algorithm is employed to verify the globality of the optimum. The presented method will be generally useful for optimizing heat-exchanger-based TEG performance.
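    The optimization step can be pictured as a global search over a fitted response surface. The sketch below maximizes a hypothetical quadratic surrogate of TEG power with SciPy's differential evolution (a genetic-algorithm-style optimizer); the surrogate coefficients, variable names, and bounds are assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical quadratic response-surface surrogate for TEG power output (W) as a
    # function of module length along the flow (m), width across it (m), and leg height (mm).
    def neg_power(x):
        L, W, h = x
        p = (40.0 * L + 55.0 * W + 6.0 * h
             - 30.0 * L**2 - 45.0 * W**2 - 0.8 * h**2
             + 10.0 * L * W - 0.5 * L * h)
        return -p                              # minimize the negative -> maximize power

    bounds = [(0.1, 1.0), (0.1, 1.0), (1.0, 6.0)]   # design-variable ranges (assumed)
    result = differential_evolution(neg_power, bounds, seed=0, tol=1e-8)
    L, W, h = result.x
    print(f"optimum: L={L:.3f} m, W={W:.3f} m, leg height={h:.2f} mm, power~{-result.fun:.1f} W")
    ```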

  2. An economic analysis of robotically assisted hysterectomy.

    PubMed

    Wright, Jason D; Ananth, Cande V; Tergas, Ana I; Herzog, Thomas J; Burke, William M; Lewin, Sharyn N; Lu, Yu-Shiang; Neugut, Alfred I; Hershman, Dawn L

    2014-05-01

    To perform an econometric analysis to examine the influence of procedure volume, variation in hospital accounting methodology, and use of various analytic methodologies on cost of robotically assisted hysterectomy for benign gynecologic disease and endometrial cancer. A national sample was used to identify women who underwent laparoscopic or robotically assisted hysterectomy for benign indications or endometrial cancer from 2006 to 2012. Surgeon and hospital volume were classified as the number of procedures performed before the index surgery. Total costs as well as fixed and variable costs were modeled using multivariable quantile regression methodology. A total of 180,230 women, including 169,324 women who underwent minimally invasive hysterectomy for benign indications and 10,906 patients whose hysterectomy was performed for endometrial cancer, were identified. The unadjusted median cost of robotically assisted hysterectomy for benign indications was $8,152 (interquartile range [IQR] $6,011-10,932) compared with $6,535 (IQR $5,127-8,357) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing surgeon and hospital volume. The unadjusted median cost of robotically assisted hysterectomy for endometrial cancer was $9,691 (IQR $7,591-12,428) compared with $8,237 (IQR $6,400-10,807) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing hospital volume from $2,471 for the first 5 to 15 cases to $924 for more than 50 cases. Based on surgeon volume, robotically assisted hysterectomy for endometrial cancer was $1,761 more expensive than laparoscopy for those who had performed fewer than five cases; the differential declined to $688 for more than 50 procedures compared with laparoscopic hysterectomy. The cost of robotic gynecologic surgery decreases with increased procedure volume. However, in all of the scenarios modeled, robotically assisted hysterectomy remained substantially more costly than laparoscopic hysterectomy.
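    The multivariable quantile regression referred to above can be sketched with statsmodels on simulated data, as below; the cost model, coefficients, and sample are synthetic and only illustrate how a median regression with a volume-by-approach interaction captures a shrinking cost differential.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic illustration of median (quantile) regression of procedure cost on surgeon
    # volume and surgical approach; the data below are simulated, not the study's records.
    rng = np.random.default_rng(0)
    n = 2000
    volume = rng.integers(1, 80, n)                      # prior procedures by the surgeon
    robotic = rng.integers(0, 2, n)                      # 1 = robotically assisted
    cost = (6500 + 1800 * robotic - 15 * volume * robotic
            - 5 * volume + rng.gamma(2.0, 600.0, n))     # skewed cost distribution
    df = pd.DataFrame({"cost": cost, "volume": volume, "robotic": robotic})

    median_fit = smf.quantreg("cost ~ volume * robotic", df).fit(q=0.5)
    print(median_fit.params)   # the volume:robotic term captures the shrinking differential
    ```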

  3. Methodology for Determining the Avoidable and Fully Allocated Costs of Amtrak Routes : Volume II, Appendix A

    DOT National Transportation Integrated Search

    2009-08-01

    The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...

  4. Methodology for Determining the Avoidable and Fully Allocated Costs of Amtrak Routes : Volume III, Appendix B-H

    DOT National Transportation Integrated Search

    2009-08-01

    The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...

  5. Methodology for determining the avoidable and fully allocated costs of Amtrak routes, volume 1 : main report

    DOT National Transportation Integrated Search

    2009-08-01

    The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...

  6. Critical Race Design: An Emerging Methodological Approach to Anti-Racist Design and Implementation Research

    ERIC Educational Resources Information Center

    Khalil, Deena; Kier, Meredith

    2017-01-01

    This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…

  7. Preparation of activated petroleum coke for removal of naphthenic acids model compounds: Box-Behnken design optimization of KOH activation process.

    PubMed

    Niasar, Hojatallah Seyedy; Li, Hanning; Das, Sreejon; Kasanneni, Tirumala Venkateswara Rao; Ray, Madhumita B; Xu, Chunbao Charles

    2018-04-01

    This study employed Box-Behnken design and response surface methodology to optimize activation parameters for the production of activated petroleum coke (APC) adsorbent from petroleum coke (PC) to achieve the highest adsorption capacity for three model naphthenic acids. APC adsorbent with a BET surface area of 1726 m²/g and a total pore volume of 0.85 cc/g was produced at the optimum activation conditions (KOH/coke mass ratio of 3.0, activation temperature of 790 °C, and activation time of 3.47 h). Effects of the activation parameters on the adsorption performance (adsorption capacity and kinetics) were investigated. With the APC obtained at the optimum activation conditions, maximum adsorption capacities of 451, 362, and 320 mg/g were achieved for 2-naphthoic acid, diphenylacetic acid and cyclohexanepentanoic acid (CP), respectively. Although APC adsorbents with a higher specific surface area and pore volume generally provide better adsorption capacity, the textural properties (surface area and pore volume) are not the only parameters determining the adsorption capacity of the APC adsorbents. Other parameters, such as surface functionalities, also play an effective role in the adsorption capacity of the produced APC adsorbents for NAs. The KOH activation process, in particular the acid-washing step, distinctly reduced the sulfur and metal contents of the raw PC, decreasing the potential for metals to leach from the APC adsorbents during adsorption. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Utility of Army Design Methodology in U.S. Coast Guard Counter Narcotic Interdiction Strategy

    DTIC Science & Technology

    2017-06-09

    This study investigates the utility of using Army Design Methodology (ADM) to...

  9. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8225, NOV 2017, US Army Research Laboratory: Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques.

  10. Using Concentration Curves to Assess Organization-Specific Relationships between Surgeon Volumes and Outcomes.

    PubMed

    Kanter, Michael H; Huang, Yii-Chieh; Kally, Zina; Gordon, Margo A; Meltzer, Charles

    2018-06-01

    A well-documented association exists between higher surgeon volumes and better outcomes for many procedures, but surgeons may be reluctant to change practice patterns without objective, credible, and near real-time data on their performance. In addition, published thresholds for procedure volumes may be biased or perceived as arbitrary; typical reports compare surgeons grouped into discrete procedure volume categories, even though the volume-outcomes relationship is likely continuous. The concentration curves methodology, which has been used to analyze whether health outcomes vary with socioeconomic status, was adapted to explore the association between procedure volume and outcomes as a continuous relationship so that data for all surgeons within a health care organization could be included. Using widely available software and requiring minimal analytic expertise, this approach plots cumulative percentages of two variables of interest against each other and assesses the characteristics of the resulting curve. Organization-specific relationships between surgeon volumes and outcomes were examined for three example types of procedures: uncomplicated hysterectomies, infant circumcisions, and total thyroidectomies. The concentration index was used to assess whether outcomes were equally distributed unrelated to volumes. For all three procedures, the concentration curve methodology identified associations between surgeon procedure volumes and selected outcomes that were specific to the organization. The concentration indices confirmed the higher prevalence of examined outcomes among low-volume surgeons. The curves supported organizational discussions about surgical quality. Concentration curves require minimal resources to identify organization- and procedure-specific relationships between surgeon procedure volumes and outcomes and can support quality improvement. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.
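    A minimal sketch of the concentration-curve bookkeeping follows: surgeons are ranked by procedure volume, cumulative shares of procedures and outcomes are accumulated, and a concentration index is taken as one minus twice the area under the curve. The surgeon volumes and event rates are simulated, not data from the study.

    ```python
    import numpy as np

    # Concentration-curve sketch: rank surgeons from lowest to highest procedure volume and
    # accumulate their shares of procedures and of adverse outcomes. All data are synthetic.
    rng = np.random.default_rng(1)
    volumes = rng.integers(2, 120, 60)                       # procedures per surgeon
    rates = np.clip(0.10 - 0.0006 * volumes, 0.02, None)     # lower volume -> higher event rate
    events = rng.binomial(volumes, rates)

    order = np.argsort(volumes)                              # lowest-volume surgeons first
    cum_proc = np.concatenate(([0.0], np.cumsum(volumes[order]) / volumes.sum()))
    cum_evt = np.concatenate(([0.0], np.cumsum(events[order]) / events.sum()))

    # Concentration index: 1 - 2 * (area under the concentration curve), trapezoidal rule.
    area = np.sum(0.5 * (cum_evt[1:] + cum_evt[:-1]) * np.diff(cum_proc))
    concentration_index = 1.0 - 2.0 * area
    print(f"concentration index = {concentration_index:+.3f} "
          "(negative when outcomes concentrate among low-volume surgeons)")
    ```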

  11. Planned Environmental Microbiology Aspects of Future Lunar and Mars Missions

    NASA Technical Reports Server (NTRS)

    Ott, C. Mark; Castro, Victoria A.; Pierson, Duane L.

    2006-01-01

    With the establishment of the Constellation Program, NASA has initiated efforts designed similar to the Apollo Program to return to the moon and subsequently travel to Mars. Early lunar sorties will take 4 crewmembers to the moon for 4 to 7 days. Later missions will increase in duration up to 6 months as a lunar habitat is constructed. These missions and vehicle designs are the forerunners of further missions destined for human exploration of Mars. Throughout the planning and design process, lessons learned from the International Space Station (ISS) and past programs will be implemented toward future exploration goals. The standards and requirements for these missions will vary depending on life support systems, mission duration, crew activities, and payloads. From a microbiological perspective, preventative measures will remain the primary techniques to mitigate microbial risk. Thus, most of the effort will focus on stringent preflight monitoring requirements and engineering controls designed into the vehicle, such as HEPA air filters. Due to volume constraints in the CEV, in-flight monitoring will be limited for short-duration missions to the measurement of biocide concentration for water potability. Once long-duration habitation begins on the lunar surface, a more extensive environmental monitoring plan will be initiated. However, limited in-flight volume constraints and the inability to return samples to Earth will increase the need for crew capabilities in determining the nature of contamination problems and method of remediation. In addition, limited shelf life of current monitoring hardware consumables and limited capabilities to dispose of biohazardous trash will drive flight hardware toward non-culture based methodologies, such as hardware that rapidly distinguishes biotic versus abiotic surface contamination. As missions progress to Mars, environmental systems will depend heavily on regeneration of air and water and biological waste remediation and regeneration systems, increasing the need for environmental monitoring. Almost complete crew autonomy will be needed for assessment and remediation of contamination problems. Cabin capacity will be limited; thus, current methods of microbial monitoring will be inadequate. Future methodology must limit consumables, and these consumables must have a shelf life of over three years. In summary, missions to the moon and Mars will require a practical design that prudently uses available resources to mitigate microbial risk to the crew.

  12. Small Power Systems Solar Electric Workshop Proceedings. Volume 1: Executive report. Volume 2: Invited papers

    NASA Technical Reports Server (NTRS)

    Ferber, R. (Editor); Evans, D. (Editor)

    1978-01-01

    The background, objectives and methodology used for the Small Power Systems Solar Electric Workshop are described, and a summary of the results and conclusions developed at the workshop regarding small solar thermal electric power systems is presented.

  13. Russkij jazyk za rubezom. Jahrgang 1974 ("The Russian Language Abroad." Volume 1974)

    ERIC Educational Resources Information Center

    Huebner, Wolfgang

    1975-01-01

    Articles in the 1974 volume of this periodical are briefly reviewed, preponderantly under the headings of teaching materials, methodology, linguistics, scientific reports, and chronicle. Reviews and supplements, tapes and other materials are also included. (Text is in German.) (IFS/WGA)

  14. Three-dimensional analysis of anisotropic spatially reinforced structures

    NASA Technical Reports Server (NTRS)

    Bogdanovich, Alexander E.

    1993-01-01

    The material-adaptive three-dimensional analysis of inhomogeneous structures based on the meso-volume concept and the application of deficient spline functions for displacement approximations is proposed. The general methodology is demonstrated on the example of a brick-type mosaic parallelepiped arbitrarily composed of anisotropic meso-volumes. A partition of each meso-volume into sub-elements, the application of deficient spline functions for a local approximation of displacements and, finally, the use of the variational principle allow one to obtain displacements, strains, and stresses at any point within the structural part. All of the necessary external and internal boundary conditions (including the conditions of continuity of transverse stresses at interfaces between adjacent meso-volumes) can be satisfied with requisite accuracy by increasing the density of the sub-element mesh. The application of the methodology to textile composite materials is described. Several numerical examples for woven and braided rectangular composite plates and stiffened panels under transverse bending are considered. Some typical effects of stress concentrations due to the material inhomogeneities are demonstrated.

  15. Engine System Loads Analysis Compared to Hot-Fire Data

    NASA Technical Reports Server (NTRS)

    Frady, Gregory P.; Jennings, John M.; Mims, Katherine; Brunty, Joseph; Christensen, Eric R.; McConnaughey, Paul R. (Technical Monitor)

    2002-01-01

    Early implementation of structural dynamics finite element analyses for the calculation of design loads is considered common design practice in high-volume manufacturing industries such as the automotive and aeronautical industries. However, because rocket engine development programs start so rarely, these tools are relatively new to the design of rocket engines. In the NASA MC-1 engine program, the focus was to reduce the cost-to-weight ratio. The structural dynamics analysis techniques were tailored in this program to meet both production and structural design goals. Perturbation of rocket engine design parameters resulted in a number of MC-1 load cycles necessary to characterize the impact of mass and stiffness changes. The evolution of loads and load extraction methodologies, parametric considerations, and a discussion of load path sensitivities are important during the design and integration of a new engine system. During the final stages of development, it is important to verify the results of an engine system model to determine the validity of the results. During the final stages of the MC-1 program, hot-fire test results were obtained and compared to the structural design loads calculated by the engine system model. These comparisons are presented in this paper.

  16. Volume equations for the Northern Research Station's Forest Inventory and Analysis Program as of 2010

    Treesearch

    Patrick D. Miles; Andrew D. Hill

    2010-01-01

    The U.S. Forest Service's Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. This report documents the methodology used to estimate live-tree gross, net, and sound volume for the 24 States inventoried by the Northern Research Station's (NRS) FIA unit. Sound volume is of particular interest...

  17. A Synergy between the Technological Process and a Methodology for Web Design: Implications for Technological Problem Solving and Design

    ERIC Educational Resources Information Center

    Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna

    2004-01-01

    Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…

  18. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  19. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  20. SU-E-J-250: A Methodology for Active Bone Marrow Protection for Cervical Cancer Intensity-Modulated Radiotherapy Using 18F-FLT PET/CT Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, C; Yin, Y

    Purpose: The purpose of this study was to compare radiation therapy treatment plans that spare active bone marrow versus whole pelvic bone marrow using 18F-FLT PET/CT images. Methods: We have developed an IMRT planning methodology that incorporates functional PET imaging using 18F-FLT/CT scans. Plans were generated for two cervical cancer patients: the pelvic active bone marrow region was incorporated as an avoidance region defined by SUV > 2, and the other avoidance region was the whole pelvic bone marrow. Dose objectives were set to reduce the volume of active bone marrow and of whole bone marrow receiving dose. The volumes receiving 10 Gy (V10) and 20 Gy (V20) for active bone marrow were evaluated. Results: Active bone marrow regions identified by 18F-FLT with an SUV > 2 represented an average of 48.0% of the total osseous pelvis for the two cases studied. Improved dose-volume histograms for the identified bone marrow SUV volumes and decreases in V10 (average 18%) and V20 (average 14%) were achieved without clinically significant changes to PTV or OAR doses. Conclusion: Incorporation of 18F-FLT PET/CT in IMRT planning provides a methodology to reduce radiation dose to active bone marrow without compromising PTV or OAR dose objectives in cervical cancer.
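    The V10 and V20 metrics reported above are dose-volume statistics. The sketch below computes such Vx values from a dose grid and a structure mask; both arrays are synthetic stand-ins, not planning-system output.

    ```python
    import numpy as np

    # Dose-volume metric sketch: fraction of a structure receiving at least X Gy,
    # from a 3-D dose grid and a boolean structure mask (both synthetic here).
    rng = np.random.default_rng(2)
    dose = rng.gamma(shape=4.0, scale=4.0, size=(40, 64, 64))     # Gy, made-up distribution
    active_marrow = rng.random((40, 64, 64)) < 0.25               # pretend FLT-defined region

    def v_at_least(dose_grid, mask, threshold_gy):
        """V_x: percentage of the masked volume receiving >= threshold_gy."""
        voxels = dose_grid[mask]
        return 100.0 * np.count_nonzero(voxels >= threshold_gy) / voxels.size

    print(f"V10 = {v_at_least(dose, active_marrow, 10.0):.1f}%")
    print(f"V20 = {v_at_least(dose, active_marrow, 20.0):.1f}%")
    ```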

  1. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939

  2. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
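    A core selection step in multi-objective evolutionary optimization of this kind is extracting the non-dominated (Pareto) designs. The sketch below applies that filter to a handful of invented transformer candidates scored on cost and loss; it is a generic illustration, not the authors' algorithm.

    ```python
    # Non-dominated (Pareto) filtering, the core selection step in multi-objective
    # evolutionary optimization. Candidate "designs" and their objectives are invented.
    candidates = {
        "A": (1.00, 9.0),   # (relative cost, total loss in kW) -- both to be minimized
        "B": (1.15, 7.5),
        "C": (1.30, 7.6),   # dominated by B (costlier and lossier)
        "D": (1.45, 6.8),
        "E": (1.20, 8.8),   # dominated by B
    }

    def dominates(a, b):
        """True if design a is no worse in every objective and better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    pareto = [name for name, obj in candidates.items()
              if not any(dominates(other, obj) for o_name, other in candidates.items()
                         if o_name != name)]
    print("non-dominated designs:", pareto)   # expected: ['A', 'B', 'D']
    ```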

  3. Norms and attitudes related to alcohol usage and driving : a review of the relevant literature. Volume 3, Report of individual interviews

    DOT National Transportation Integrated Search

    1982-09-01

    This project provides information about norms and attitudes related to alcohol use and driving. This volume reports the methodology, findings, discussions, and conclusions of individual interviews conducted with early adolescents (ages 13-14), middle...

  4. Surrogate Plant Data Base : Volume 3. Appendix D : Facilities Planning Data ; Operating Manpower, Manufacturing Budgets and Pre-Production Launch ...

    DOT National Transportation Integrated Search

    1983-05-01

    This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...

  5. Evaluation of innovative devices to control traffic entering from low-volume access points within a land closure.

    DOT National Transportation Integrated Search

    2014-04-01

    This report describes the methodology and results of analyses performed to identify and evaluate alternative methods to control traffic entering a lane closure on a two-lane, two-way road from low-volume access points. Researchers documented the ...

  6. Methodology of selecting dozers for lignite open pit mines in Serbia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stojanovic, D.; Ignjatovic, D.; Kovacevic, S.

    1996-12-31

    Apart from the main production processes at open pit mines (coal and overburden mining, transportation by rail and conveyors, and storage of excavated masses), which are performed by high-capacity mechanization, there are numerous and varied auxiliary works that often have a crucial influence both on the efficiency of the main equipment and on maintaining the optimum technical condition of the machines and plants that make up the technological system of the open pit. Successful execution of this work requires a proper selection of auxiliary machines according to their type, quantity, capacity, power, etc., with due regard for the specific conditions at each open pit mine. The dozer is certainly the most important and representative auxiliary machine at an open pit mine. It is widely used in numerous works that are preconditions for the successful operation of the main mechanization, so the selection of dozers ranges among the most important steps in selecting mechanization. This paper presents a methodology for dozer selection at lignite open pit mines. A mathematical model was designed that defines the volume of work required of dozers at open pit mines and, consequently, the number of dozers needed. The model was tested in practice at large open pit mines and can be used in the design of future open pit mines.
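    The abstract does not spell out the mathematical model, so the sketch below shows only the generic kind of fleet-sizing arithmetic such a model produces: required work volume divided by the effective productivity of one dozer. Every input value is a placeholder.

    ```python
    import math

    # Generic fleet-sizing arithmetic (not the paper's model): number of dozers needed to
    # handle a monthly volume of auxiliary earthwork at an open pit, with placeholder inputs.
    monthly_work_m3 = 180_000          # required dozer work volume per month
    blade_capacity_m3 = 4.5            # effective blade load
    cycle_time_min = 2.2               # push-return cycle
    availability = 0.80                # mechanical availability
    utilization = 0.65                 # share of shift time spent pushing
    shift_hours_per_month = 2 * 8 * 25 # two 8-hour shifts, 25 working days

    productivity_m3_per_month = (blade_capacity_m3 * (60.0 / cycle_time_min)
                                 * shift_hours_per_month * availability * utilization)
    dozers_required = math.ceil(monthly_work_m3 / productivity_m3_per_month)
    print(f"one dozer moves ~{productivity_m3_per_month:,.0f} m3/month "
          f"-> {dozers_required} dozers required")
    ```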

  7. Applications of mixed-methods methodology in clinical pharmacy research.

    PubMed

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use it: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale or questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being conducted in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  8. Plasmonic metasurface cavity for simultaneous enhancement of optical electric and magnetic fields in deep subwavelength volume.

    PubMed

    Hong, Jongwoo; Kim, Sun-Je; Kim, Inki; Yun, Hansik; Mun, Sang-Eun; Rho, Junsuk; Lee, Byoungho

    2018-05-14

    It has been difficult to achieve simultaneous plasmonic enhancement of nanoscale light-matter interactions in both their electric and magnetic components with an easily reproducible fabrication method and a systematic theoretical design rule. In this paper, a novel concept of a flat nanofocusing device is proposed for simultaneously squeezing both electric and magnetic fields into a deep-subwavelength volume (~λ³/538) over a large area. Based on funneled unit-cell structures and surface-plasmon-assisted coherent interactions between them, a plasmonic metasurface cavity (an array of rectangular nanocavities, each connected to a tapered nanoantenna) is constructed by periodic arrangement of the unit cell. The average enhancement factors of the electric and magnetic field intensities reach about 60 and 22 in the nanocavities, respectively. The outstanding performance of the proposed device is verified numerically and experimentally. We expect this work to expand methodologies involving optical near-field manipulation over large areas and related potential applications, including nanophotonic sensors, nonlinear responses, and quantum interactions.

  9. Correlation between mesopore volume of carbon supports and the immobilization of laccase from Trametes versicolor for the decolorization of Acid Orange 7.

    PubMed

    Ramírez-Montoya, Luis A; Hernández-Montoya, Virginia; Montes-Morán, Miguel A; Cervantes, Francisco J

    2015-10-01

    Immobilization of laccase from Trametes versicolor was carried out using carbon supports prepared from different lignocellulosic wastes. Enzymes were immobilized by physical adsorption. Taguchi methodology was selected for the design of experiments regarding the preparation of the carbon materials, which included the use of activating agents for the promotion of mesoporosity. A good correlation between the mesopore volumes of the carbon supports and the corresponding laccase loadings attained was observed. Specifically, the chemical activation of pecan nut shell with FeCl3 led to a highly mesoporous material that also behaved as the most efficient support for the immobilization of laccase. This particular laccase/carbon support system was used as biocatalyst for the decolorization of aqueous solutions containing Acid Orange 7. Mass spectrometry coupled to a liquid chromatograph allowed us to identify the products of the dye degradation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    DTIC Science & Technology

    2016-06-01

    characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology . Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira

  11. Total System Design (TSD) Methodology Assessment.

    DTIC Science & Technology

    1983-01-01

    hardware implementation. Author: Martin Marietta Aerospace. Title: Total System Design Methodology. Source: Martin Marietta Technical Report MCR-79-646...systematic, rational approach to computer systems design is needed. Martin Marietta has produced a Total System Design Methodology to support such design...gathering and ordering. The purpose of the paper is to document the existing TSD methodology at Martin Marietta, describe the supporting tools, and

  12. Software Requirements Engineering Methodology (Development)

    DTIC Science & Technology

    1979-06-01

    Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Shneiderman charts, Top-Down Design, the Michael Jackson Design Methodology, Yourdon's Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway

  13. Design of radiation resistant metallic multilayers for advanced nuclear systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhernenkov, Mikhail, E-mail: zherne@bnl.gov, E-mail: gills@bnl.gov; Gill, Simerjeet, E-mail: zherne@bnl.gov, E-mail: gills@bnl.gov; Stanic, Vesna

    2014-06-16

    Helium implantation from transmutation reactions is a major cause of embrittlement and dimensional instability of structural components in nuclear energy systems. Development of novel materials with improved radiation resistance, which is of the utmost importance for progress in nuclear energy, requires guidelines to arrive at favorable parameters more efficiently. Here, we present a methodology that can be used for the design of radiation tolerant materials. We used synchrotron X-ray reflectivity to nondestructively study radiation effects at buried interfaces and measure swelling induced by He implantation in Cu/Nb multilayers. The results, supported by transmission electron microscopy, show a direct correlation between reduced swelling in nanoscale multilayers and increased interface area per unit volume, consistent with helium storage in Cu/Nb interfaces in forms that minimize dimensional changes. In addition, for Cu/Nb layers, a linear relationship is demonstrated between the measured depth-dependent swelling and implanted He density from simulations, making the reflectivity technique a powerful tool for heuristic material design.

  14. Temperature modelling and prediction for activated sludge systems.

    PubMed

    Lippi, S; Rosso, D; Lubello, C; Canziani, R; Stenstrom, M K

    2009-01-01

    Temperature is an important factor affecting biomass activity, which is critical for maintaining efficient biological wastewater treatment, as well as physicochemical properties of the mixed liquor such as dissolved oxygen saturation and settling velocity. Controlling temperature is not normally possible for treatment systems, but incorporating factors that affect temperature into the design process, such as the aeration system, surface-to-volume ratio, and tank geometry, can reduce the range of temperature extremes and improve the overall process performance. Determining how much these design or upgrade options affect the tank temperature requires a temperature model that can be used with existing design methodologies. This paper presents a new steady-state temperature model developed by incorporating the best aspects of previously published models, introducing new functions for selected heat-exchange paths, and improving the method for predicting the effects of covering aeration tanks. Numerical improvements with embedded reference data provide a simpler formulation, faster execution, and easier sensitivity analyses using an ordinary spreadsheet. The paper presents several cases to validate the model.
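    As a crude caricature of a steady-state tank temperature model (far simpler than the heat-exchange paths treated in the paper), the sketch below balances influent advection against lumped losses to the air and solves for the equilibrium mixed-liquor temperature; all coefficients are assumed.

    ```python
    # Crude steady-state tank temperature balance (illustrative only; the paper's model
    # tracks many more heat-exchange paths): influent advection vs. losses to the air.
    RHO_CP = 4.18e6            # J/(m^3 K), water volumetric heat capacity

    Q = 0.5                    # influent flow, m^3/s
    T_in, T_air = 22.0, 8.0    # influent and ambient air temperature, deg C
    A_surface = 2000.0         # tank surface area, m^2
    U_surface = 25.0           # lumped surface heat-loss coefficient, W/(m^2 K)  (assumed)
    K_aeration = 4.0e4         # lumped aeration-driven loss coefficient, W/K     (assumed)

    # Steady state: rho*cp*Q*(T_in - T) = (U*A + K_aer) * (T - T_air)  ->  solve for T.
    loss = U_surface * A_surface + K_aeration
    T_tank = (RHO_CP * Q * T_in + loss * T_air) / (RHO_CP * Q + loss)
    print(f"predicted mixed-liquor temperature ~ {T_tank:.1f} deg C")
    ```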

  15. Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study

    DOT National Transportation Integrated Search

    1980-02-01

    This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...

  16. Multiobjective optimization of hybrid regenerative life support technologies. Topic D: Technology Assessment

    NASA Technical Reports Server (NTRS)

    Manousiouthakis, Vasilios

    1995-01-01

    We developed simple mathematical models for many of the technologies constituting the water reclamation system in a space station. These models were employed for subsystem optimization and for the evaluation of the performance of individual water reclamation technologies, by quantifying their operational 'cost' as a linear function of weight, volume, and power consumption. Then we performed preliminary investigations on the performance improvements attainable by simple hybrid systems involving parallel combinations of technologies. We are developing a software tool for synthesizing a hybrid water recovery system (WRS) for long-term space missions. As the conceptual framework, we are employing the state space approach. Given a number of available technologies and the mission specifications, the state space approach would help design flowsheets featuring optimal process configurations, including those that feature stream connections in parallel, series, or recycles. We visualize this software tool to function as follows: given the mission duration, the crew size, water quality specifications, and the cost coefficients, the software will synthesize a water recovery system for the space station. It should require minimal user intervention. The following tasks need to be solved for achieving this goal: (1) formulate a problem statement that will be used to evaluate the advantages of a hybrid WRS over a single-technology WRS; (2) model several WRS technologies that can be employed in the space station; (3) propose a recycling network design methodology (since the WRS synthesis task is a recycling network design problem, it is essential to employ a systematic method in synthesizing this network); (4) develop a software implementation for this design methodology, design a hybrid system using this software, and compare the resulting WRS with a base-case WRS; and (5) create a user-friendly interface for this software tool.
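    A minimal sketch of the linear 'cost' scoring described above follows; the coefficients and the two candidate technologies (tech_A, tech_B) are hypothetical, not values from the study:

```python
# Illustrative equivalent-cost scoring for water-recovery technologies.
# Coefficients and technology parameters are hypothetical placeholders.
COST_COEFF = {"weight_kg": 1.0, "volume_m3": 500.0, "power_W": 0.5}

technologies = {
    "tech_A": {"weight_kg": 120.0, "volume_m3": 0.40, "power_W": 300.0},
    "tech_B": {"weight_kg": 200.0, "volume_m3": 0.25, "power_W": 150.0},
}

def equivalent_cost(params):
    """Linear 'cost' = sum over resources of coefficient * demand."""
    return sum(COST_COEFF[k] * v for k, v in params.items())

for name, params in technologies.items():
    print(name, equivalent_cost(params))

# A simple parallel hybrid: split the processed load between the two
# technologies, assuming resource demands scale linearly with the fraction.
split = 0.6
hybrid = {k: split * technologies["tech_A"][k]
             + (1.0 - split) * technologies["tech_B"][k]
          for k in COST_COEFF}
print("hybrid", equivalent_cost(hybrid))
```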

  17. Computational study of configurational and vibrational contributions to the thermodynamics of substitutional alloys: The case of Ni3Al

    NASA Astrophysics Data System (ADS)

    Michelon, M. F.; Antonelli, A.

    2010-03-01

    We have developed a methodology to study the thermodynamics of order-disorder transformations in n-component substitutional alloys that combines nonequilibrium methods, which can efficiently compute free energies, with Monte Carlo simulations, in which configurational and vibrational degrees of freedom are simultaneously considered on an equal footing. Furthermore, with this methodology one can easily perform simulations in the canonical and in the isobaric-isothermal ensembles, which allows investigation of the bulk volume effect. We have applied this methodology to calculate configurational and vibrational contributions to the entropy of the Ni3Al alloy as functions of temperature. The simulations show that when the volume of the system is kept constant, the vibrational entropy does not change upon transition, while constant-pressure calculations indicate that the volume increase at the order-disorder transition causes a vibrational entropy increase of 0.08 kB/atom. This is significant when compared to the configurational entropy increase of 0.27 kB/atom. Our calculations also indicate that including vibrations reduces the order-disorder transition temperature by about 30% relative to that determined solely from the configurational degrees of freedom.
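    As a hedged back-of-the-envelope check (not a calculation from the paper), treating the transition temperature as roughly proportional to the ratio of transition enthalpy to transition entropy illustrates why the extra vibrational entropy lowers the predicted transition temperature:

```latex
% Rough estimate only; assumes T_c ~ \Delta H / \Delta S with the transition
% enthalpy unchanged when the vibrational contribution is added.
\[
  \frac{T_c^{\mathrm{conf+vib}}}{T_c^{\mathrm{conf}}}
  \approx
  \frac{\Delta S_{\mathrm{conf}}}{\Delta S_{\mathrm{conf}} + \Delta S_{\mathrm{vib}}}
  = \frac{0.27\,k_B}{(0.27 + 0.08)\,k_B}
  \approx 0.77
\]
```

    This crude estimate gives a reduction of roughly 20-25%, the same order as the ~30% reduction obtained from the full constant-pressure simulations.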

  18. Child Language Research: Building on the Past, Looking to the Future.

    ERIC Educational Resources Information Center

    Perera, Katharine

    1994-01-01

    Outlines descriptive, theoretical, and methodological advances in child language research since the first volume of the "Journal of Child Language" was published. Papers in this volume build on earlier research, point the way to new research avenues, and open new lines of inquiry. (Contains 36 references.) (JP)

  19. Surrogate Plant Data Base : Volume 1. Introduction, Appendix A : The Development of Surrogate Plant Data ; Appendix B : Application of the Surrogate .

    DOT National Transportation Integrated Search

    1983-03-01

    This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...

  20. Norms and attitudes related to alcohol usage and driving : a review of the relevant literature. Volume 4, Report of focus groups

    DOT National Transportation Integrated Search

    1982-09-01

    This project provides information about norms and attitudes related to alcohol usage and driving. This volume reports the methodology, findings, discussion and conclusions of three focus groups: two with parents of teenaged drivers and one with adult...

  1. Air Pollution. Part A: Analysis.

    ERIC Educational Resources Information Center

    Ledbetter, Joe O.

    Two facets of the engineering control of air pollution (the analysis of possible problems and the application of effective controls) are covered in this two-volume text. Part A covers Analysis, and Part B, Prevention and Control. (This review is concerned with Part A only.) This volume deals with the terminology, methodology, and symptomatology…

  2. Language Learners in Study Abroad Contexts. Second Language Acquisition

    ERIC Educational Resources Information Center

    DuFon, Margaret A., Ed.; Churchill, Eton, Ed.

    2006-01-01

    Examining the overseas experience of language learners in diverse contexts through a variety of theoretical and methodological approaches, studies in this volume look at the acquisition of language use, socialization processes, learner motivation, identity and learning strategies. In this way, the volume offers a privileged window into learner…

  3. METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 3: GENERAL METHODOLOGY

    EPA Science Inventory

    The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...

  4. METHANE EMISSIONS FROM THE NATURAL GAS INDUSTRY VOLUME 4: STATISTICAL METHODOLOGY

    EPA Science Inventory

    The 15-volume report summarizes the results of a comprehensive program to quantify methane (CH4) emissions from the U.S. natural gas industry for the base year. The objective was to determine CH4 emissions from the wellhead and ending downstream at the customer's meter. The accur...

  5. Tax Wealth in Fifty States. 1977 Supplement.

    ERIC Educational Resources Information Center

    Halstead, D. Kent; Weldon, H. Kent

    This first supplement to the basic volume presents tax capacity, effort, and collected revenue data for state and local governments for 1977. Planned for issuance every other year, the supplement consists of computer printout tables with the earlier basic volume continuing to serve as reference for theory, analysis, and methodology. Figures for…

  6. National Data Program for the Marine Environment Technical Development Plan. Final Report, Volume Two.

    ERIC Educational Resources Information Center

    System Development Corp., Santa Monica, CA.

    A national data program for the marine environment is recommended. Volume 2 includes: (1) objectives, scope, and methodology; (2) summary of the technical development plan; (3) agency development plans - Great Lakes and coastal development and (4) marine data network development plans. (Author)

  7. Focus control enhancement and on-product focus response analysis methodology

    NASA Astrophysics Data System (ADS)

    Kim, Young Ki; Chen, Yen-Jen; Hao, Xueli; Samudrala, Pavan; Gomez, Juan-Manuel; Mahoney, Mark O.; Kamalizadeh, Ferhad; Hanson, Justin K.; Lee, Shawn; Tian, Ye

    2016-03-01

    With decreasing CDOF (Critical Depth of Focus) for 20/14 nm technology and beyond, focus errors are becoming increasingly critical for on-product performance. Current on-product focus control techniques in high-volume manufacturing are limited; it is difficult to define measurable focus error and to optimize on-product focus response with existing methods because credible focus measurement methodologies are lacking. Next to developments in imaging and focus control capability of scanners and general tool stability maintenance, on-product focus control improvements are also required to meet on-product imaging specifications. In this paper, we discuss focus monitoring, wafer (edge) fingerprint correction, and on-product focus budget analysis through a diffraction-based focus (DBF) measurement methodology. Several examples will be presented showing better focus response and control on product wafers. A method will also be discussed for a focus interlock automation system on product in a high-volume manufacturing (HVM) environment.

  8. A design methodology for nonlinear systems containing parameter uncertainty: Application to nonlinear controller design

    NASA Technical Reports Server (NTRS)

    Young, G.

    1982-01-01

    A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.

  9. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 1; Theory and Design Procedure

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes a project at the University of Washington to design a multirate suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing.

  10. System for the Analysis of Global Energy Markets - Vol. I, Model Documentation

    EIA Publications

    2003-01-01

    Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.

  11. Harnessing the Power of Education Research Databases with the Pearl-Harvesting Methodological Framework for Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter

    2010-01-01

    Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…

  12. Drug research methodology. Volume 4, Epidemiology in drugs and highway safety : the study of drug use among drivers and its role in traffic crashes

    DOT National Transportation Integrated Search

    1980-06-01

    This report presents the findings of a workshop on epidemiology in drugs and highway safety. A cross-disciplinary panel of experts (1) identified methodological issues and constraints present in research to define the nature and magnitude of the drug...

  13. A clustering approach for the analysis of solar energy yields: A case study for concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Peruchena, Carlos M. Fernández; García-Barberena, Javier; Guisado, María Vicenta; Gastón, Martín

    2016-05-01

    The design of Concentrating Solar Thermal Power (CSTP) systems requires a detailed knowledge of the dynamic behavior of the meteorology at the site of interest. Meteorological series are often condensed into one representative year, defined as a Typical Meteorological Year (TMY), with the aim of reducing data volume and speeding up energy system simulations. This approach is appropriate for rather detailed simulations of a specific plant; however, in earlier stages of the design of a power plant, especially during the optimization of the large number of plant parameters before a final design is reached, a huge number of simulations are needed. Even with today's technology, the computational effort to simulate solar energy system performance with one year of data at high frequency (such as 1-min) may become colossal if a multivariable optimization has to be performed. This work presents a simple and efficient methodology for selecting a small number of individual days able to represent the electrical production of the plant throughout the complete year. To achieve this objective, a new procedure for determining a reduced set of typical weather data for evaluating the long-term performance of a solar energy system is proposed. The proposed methodology is based on cluster analysis and drastically reduces the computational effort of calculating a CSTP plant energy yield by simulating a reduced number of days from a high-frequency TMY.
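    A minimal sketch of this idea, assuming k-means clustering of daily irradiance profiles (the cluster count, clustering variables, and data below are illustrative assumptions, not the paper's), is:

```python
# Sketch: pick representative days from a year of 1-min DNI data by clustering
# daily profiles; weights preserve each cluster's share of the year.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
minutes_per_day = 24 * 60
dni = rng.uniform(0.0, 1000.0, size=(365, minutes_per_day))  # stand-in for real TMY data

k = 12                                        # assumed number of representative days
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(dni)

rep_days, weights = [], []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    # pick the real day closest to the cluster centroid
    dists = np.linalg.norm(dni[members] - km.cluster_centers_[c], axis=1)
    rep_days.append(int(members[np.argmin(dists)]))
    weights.append(len(members) / 365.0)

print(list(zip(rep_days, [round(w, 3) for w in weights])))
# Annual yield ~ sum_i weight_i * 365 * yield(representative day i)
```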

  14. Design and implementation of a controlled clinical trial to evaluate the effectiveness and efficiency of routine opt-out rapid human immunodeficiency virus screening in the emergency department.

    PubMed

    Haukoos, Jason S; Hopkins, Emily; Byyny, Richard L; Conroy, Amy A; Silverman, Morgan; Eisert, Sheri; Thrun, Mark; Wilson, Michael; Boyett, Brian; Heffelfinger, James D

    2009-08-01

    In 2006, the Centers for Disease Control and Prevention (CDC) released revised recommendations for performing human immunodeficiency virus (HIV) testing in health care settings, including implementing routine rapid HIV screening, the use of an integrated opt-out consent, and limited prevention counseling. Emergency departments (EDs) have been a primary focus of these efforts. These revised CDC recommendations were primarily based on feasibility studies and have not been evaluated through the application of rigorous research methods. This article describes the design and implementation of a large prospective controlled clinical trial to evaluate the CDC's recommendations in an ED setting. From April 15, 2007, through April 15, 2009, a prospective quasi-experimental equivalent time-samples clinical trial was performed to compare the clinical effectiveness and efficiency of routine (nontargeted) opt-out rapid HIV screening (intervention) to physician-directed diagnostic rapid HIV testing (control) in a high-volume urban ED. In addition, three nested observational studies were performed to evaluate the cost-effectiveness and patient and staff acceptance of the two rapid HIV testing methods. This article describes the rationale, methodologies, and study design features of this program evaluation clinical trial. It also provides details regarding the integration of the principal clinical trial and its nested observational studies. Such ED-based trials are rare, but serve to provide valid comparisons between testing approaches. Investigators should consider similar methodology when performing future ED-based health services research.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tatli, Emre; Ferroni, Paolo; Mazzoccoli, Jason

    The possible use of compact heat exchangers (HXs) in sodium-cooled fast reactors (SFR) employing a Brayton cycle is promising due to their high power density and resulting small volume in comparison with conventional shell-and-tube HXs. However, the small diameter of their channels makes them more susceptible to plugging due to Na2O deposition during accident conditions. Although cold traps are designed to reduce oxygen impurity levels in the sodium coolant, their failure, in conjunction with accidental air ingress into the sodium boundary, could result in coolant oxygen levels that are above the saturation limit in the cooler parts of the HX channels. This can result in Na2O crystallization and the formation of solid deposits on cooled channel surfaces, limiting or even blocking coolant flow. The development of analysis tools capable of modeling the formation of these deposits in the presence of sodium flow will allow designers of SFRs to properly size the HX channels so that, in the scenario mentioned above, the reactor operator has sufficient time to detect and react to the affected HX. Until now, analytical methodologies to predict the formation of these deposits have been developed, but never implemented in a high-fidelity computational tool suited to modern reactor design techniques. This paper summarizes the challenges and the current status in the development of a Computational Fluid Dynamics (CFD) methodology to predict deposit formation, with particular emphasis on sensitivity studies on some parameters affecting deposition.

  16. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    depiction of the core ideas of our force design model (Figure 1: Description of Force Design Model; Figure 2: overview of our methodology). ... the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for ... designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach. We ...

  17. Combination microwave ovens: an innovative design strategy.

    PubMed

    Tinga, Wayne R; Eke, Ken

    2012-01-01

    Reducing the sensitivity of microwave oven heating and cooking performance to load volume, load placement and load properties has been a long-standing challenge for microwave and microwave-convection oven designers. Conventional design problem and solution methods are reviewed to provide greater insight into the challenge and optimum operation of a microwave oven after which a new strategy is introduced. In this methodology, a special load isolating and energy modulating device called a transducer-exciter is used containing an iris, a launch box, a phase, amplitude and frequency modulator and a coupling plate designed to provide spatially distributed coupling to the oven. This system, when applied to a combined microwave-convection oven, gives astounding performance improvements to all kinds of baked and roasted foods including sensitive items such as cakes and pastries, with the only compromise being a reasonable reduction in the maximum available microwave power. Large and small metal utensils can be used in the oven with minimal or no performance penalty on energy uniformity and cooking results. Cooking times are greatly reduced from those in conventional ovens while maintaining excellent cooking performance.

  18. Gating geometry studies of thin-walled 17-4PH investment castings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, M.C.; Zanner, F.J.

    1992-11-01

    The ability to design gating systems that reliably feed and support investment castings is often the result of "cut-and-try" methodology. Factors such as hot tearing, porosity, cold shuts, misruns, and shrink are defects often corrected by several empirical gating design iterations. Sandia National Laboratories is developing rules that aid in removing the uncertainty involved in the design of gating systems for investment castings. In this work, gating geometries used for filling of thin-walled investment-cast 17-4PH stainless steel flat plates were investigated. A full factorial experiment evaluating the influence of metal pour temperature, mold preheat temperature, and mold channel thickness was conducted for orientations that filled a horizontal flat plate from the edge. A single wedge gate geometry was used for the edge-gated configuration. Thermocouples placed along the top of the mold recorded metal front temperatures, and a real-time x-ray imaging system tracked the fluid flow behavior during filling of the casting. Data from these experiments were used to determine the terminal fill volumes and terminal fill times for each gate design.

  20. Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques

    DTIC Science & Technology

    2013-03-01

    MEMRISTOR-BASED COMPUTING ARCHITECTURE: DESIGN METHODOLOGIES AND CIRCUIT TECHNIQUES. Polytechnic Institute of New York University; technical report, dates covered Oct 2010 – Oct 2012. ... schemes for a memristor-based reconfigurable architecture design have not been fully explored yet. Therefore, in this project, we investigated ...

  1. Fuel property effects on Navy aircraft fuel systems

    NASA Technical Reports Server (NTRS)

    Moses, C. A.

    1984-01-01

    Problems of ensuring compatibility of Navy aircraft with fuels that may be different than the fuels for which the equipment was designed and qualified are discussed. To avoid expensive requalification of all the engines and airframe fuel systems, methodologies to qualify future fuels by using bench-scale and component testing are being sought. Fuel blends with increasing JP5-type aromatic concentration were seen to produce less volume swell than an equivalent aromatic concentration in the reference fuel. Furthermore, blends with naphthenes, decalin, tetralin, and naphthalenes do not deviate significantly from the correlation line of aromatic blends. Similar results are found with tensile strength and elongation. Other elastomers, sealants, and adhesives are also being tested.

  2. Cape Blanco wind farm feasibility study

    NASA Astrophysics Data System (ADS)

    1987-11-01

    The Cape Blanco Wind Farm (CBWF) Feasibility Study was undertaken as a prototype for determining the feasibility of proposals for wind energy projects at Northwest sites. It was intended to test for conditions under which wind generation of electricity could be commercially feasible, not by another abstract survey of alternative technologies, but rather through a site-specific, machine-specific analysis of one proposal. Some of the study findings would be most pertinent to the Cape Blanco site - local problems require local solutions. Other findings would be readily applicable to other sites and other machines, and study methodologies would be designed to be modified for appraisal of other proposals. This volume discusses environmental, economic, and technical issues of the Wind Farm.

  3. An Analysis of Measured Pressure Signatures From Two Theory-Validation Low-Boom Models

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.

    2003-01-01

    Two wing/fuselage/nacelle/fin concepts were designed to check the validity and the applicability of sonic-boom minimization theory, sonic-boom analysis methods, and low-boom design methodology in use at the end of the 1980s. Models of these concepts were built, and the pressure signatures they generated were measured in the wind tunnel. The results of these measurements led to three conclusions: (1) the existing methods could adequately predict sonic-boom characteristics of wing/fuselage/fin(s) configurations if the equivalent area distributions of each component were smooth and continuous; (2) these methods needed revision so the engine-nacelle volume and the nacelle-wing interference lift disturbances could be accurately predicted; and (3) current nacelle-configuration integration methods had to be updated. With these changes in place, the existing sonic-boom analysis and minimization methods could be effectively applied to supersonic-cruise concepts for acceptable/tolerable sonic-boom overpressures during cruise.

  4. Analyzing the requirements for mass production of small wind turbine generators

    NASA Astrophysics Data System (ADS)

    Anuskiewicz, T.; Asmussen, J.; Frankenfield, O.

    Mass producibility of small wind turbine generators, to give manufacturers design and cost data for profitable production operations, is discussed. A 15 kW wind turbine generator for production in annual volumes from 1,000 to 50,000 units is examined. The methodology used to cost the systems effectively is explained. The process estimate sequence followed is outlined, with emphasis on the process estimate sheets compiled for each component and subsystem. These data enabled analysts to develop cost breakdown profiles crucial in manufacturing decision-making. The appraisal also led to various design recommendations, including replacement of aluminum towers with cost-effective carbon steel towers. Extensive cost information is supplied in tables covering subassemblies, capital requirements, and levelized energy costs. The physical layout of the plant is depicted to guide manufacturers in taking advantage of the growing business opportunity now offered in conjunction with the national need for energy development.

  5. Methodology for the systems engineering process. Volume 2: Technical parameters

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A scheme based on starting the logic networks from the development and mission factors that are of primary concern in an aerospace system is described. This approach required identifying the primary states (design, design verification, premission, mission, postmission), identifying the attributes within each state (performance capability, survival, evaluation, operation, etc), and then developing the generic relationships of variables for each branch. To illustrate this concept, a system was used that involved a launch vehicle and payload for an earth orbit mission. Examination showed that this example was sufficient to illustrate the concept. A more complicated mission would follow the same basic approach, but would have more extensive sets of generic trees and more correlation points between branches. It has been shown that in each system state (production, test, and use), a logic could be developed to order and classify the parameters involved in the translation from general requirements to specific requirements for system elements.

  6. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    NASA Technical Reports Server (NTRS)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on ISS allow the acquisition of high quality ultrasound data using crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS Ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller), to broaden the spectrum of anatomical and functional information on human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede the venous return from lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are done upon volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, effects, and the indications for its use.

  7. Worldwide trends in volume and quality of published protocols of randomized controlled trials

    PubMed Central

    Alldinger, Ingo; Cieslak, Kasia P.; Wennink, Roos; Clarke, Mike; Ali, Usama Ahmed; Besselink, Marc G. H.

    2017-01-01

    Introduction: Publishing protocols of randomized controlled trials (RCT) facilitates a more detailed description of study rationale, design, and related ethical and safety issues, which should promote transparency. Little is known about how the practice of publishing protocols developed over time. Therefore, this study describes the worldwide trends in volume and methodological quality of published RCT protocols. Methods: A systematic search was performed in PubMed and EMBASE, identifying RCT protocols published over a decade from 1 September 2001. Data were extracted on quality characteristics of RCT protocols. The primary outcome, methodological quality, was assessed by individual methodological characteristics (adequate generation of allocation, concealment of allocation, and intention-to-treat analysis). A comparison was made by publication period (first, September 2001-December 2004; second, January 2005-May 2008; third, June 2008-September 2011), geographical region, and medical specialty. Results: The number of published RCT protocols increased from 69 in the first to 390 in the third period (p<0.0001). Internal medicine and paediatrics were the most common specialty topics. Whereas most published RCT protocols in the first period originated from North America (n = 30, 44%), in the second and third periods this was Europe (respectively, n = 65, 47% and n = 190, 48%, p = 0.02). Quality of RCT protocols was higher in Europe and Australasia compared to North America (OR = 0.63, CI = 0.40–0.99, p = 0.04). Adequate generation of allocation improved with time (44%, 58%, 67%, p = 0.001), as did concealment of allocation (38%, 53%, 55%, p = 0.03). Surgical protocols had the highest quality among the three specialty topics used in this study (OR = 1.94, CI = 1.09–3.45, p = 0.02). Conclusion: Publishing RCT protocols has become popular, with a five-fold increase in the past decade. The quality of published RCT protocols also improved, although variation between geographical regions and across medical specialties was seen. This emphasizes the importance of international standards of comprehensive training in RCT methodology. PMID:28296925

  8. Vapor Compression and Thermoelectric Heat Pump Heat Exchangers for a Condensate Distillation System: Design and Experiment

    NASA Technical Reports Server (NTRS)

    Erickson, Lisa R.; Ungar, Eugene K.

    2013-01-01

    Maximizing the reuse of wastewater while minimizing the use of consumables is critical in long duration space exploration. One of the more promising methods of reclaiming urine is the distillation/condensation process used in the cascade distillation system (CDS). This system accepts a mixture of urine and toxic stabilizing agents, heats it to vaporize the water and condenses and cools the resulting water vapor. The CDS wastewater flow requires heating and its condensate flow requires cooling. Performing the heating and cooling processes separately requires two separate units, each of which would require large amounts of electrical power. By heating the wastewater and cooling the condensate in a single heat pump unit, mass, volume, and power efficiencies can be obtained. The present work describes and compares two competing heat pump methodologies that meet the needs of the CDS: 1) a series of mini compressor vapor compression cycles and 2) a thermoelectric heat exchanger. In the paper, the system level requirements are outlined, the designs of the two heat pumps are described in detail, and the results of heat pump performance tests are provided. A summary is provided of the heat pump mass, volume and power trades and a selection recommendation is made.

  9. Safety Assessment of Multi Purpose Small Payload Rack(MSPR)

    NASA Astrophysics Data System (ADS)

    Mizutani, Yoshinobu; Takada, Satomi; Murata, Kosei; Ozawa, Daisaku; Kobayashi, Ryoji; Nakamura, Yasuhiro

    2010-09-01

    We report a summary of the preliminary safety assessment for the Multi Purpose Small Payload Rack (MSPR), one of the microgravity experiment facilities being developed for the 2nd phase of JEM utilization (JEM: Japanese Experiment Module), to be launched on the H-II Transfer Vehicle (HTV) 2nd flight in 2011. MSPR is used for multi-purpose microgravity experiments, providing experimental spaces and work stations. MSPR has three experimental spaces. First, there is a space called the Work Volume (WV), with a capacity of approximately 350 liters, in which multiple resources including electricity, communication, and moving-image functions can be used. Within this space, devices can be installed by simple, prompt attachment with Velcro and pins, with a high degree of flexibility. Second, there is the Small Experiment Area (SEA), with a capacity of approximately 70 liters, in which electricity, communication, and moving-image functions can also be used in the same way as in the WV. These spaces protect experiment devices and specimens from contingent loads by the crewmembers. Third, there is the Work Bench, with an area of 0.5 square meters, which can be used for maintenance, inspection, and data operations of installed devices; this bench can be stored in the rack during a contingency. The Chamber for Combustion Experiment (CCE), planned to be installed in the WV, is a pressure-resistant experimental container that can be used to seal hazardous materials from combustion experiments. The CCE has a double-sealing design in the chamber itself, which resists gas leakage under normal temperature and pressure. Electricity, communication, and moving-image functions can be used in the same way as in the WV. The JAXA Phase 2 Safety Review Panel (SRP) was held in April 2010. For the safety analysis of MSPR, hazards were identified based on Fault Tree Analysis methodology and then classified into either eight ISS standard-type hazards or eight unique-type hazards that require special controls, based on the ISS common safety assessment methodology. Safety evaluation results are reported in the Safety Assessment Report (SAR) 1). Regarding structural failure, unique hazards are evaluated considering not only the tolerance for launch loads but also loads applied by crewmembers and on-orbit loads. Regarding electrical shock, the electrical design up to secondary power is evaluated as a unique hazard from the viewpoint of design suitable for high-voltage (32 VDC or more) circuits. Regarding rupture/leakage of the pressure system, hazards of the fuel supply line, the waste line for combustion gas, and the pressure system including the CCE are evaluated. Contamination due to hazardous gas leakage from the CCE is also evaluated, as is external propagation of fire from the CCE. In this report, we show an overview of the results of the safety assessment and the future plan toward critical design phase activity.

  10. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    NASA Astrophysics Data System (ADS)

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-07-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7% across the well-sampled deposit but increases to over 30% for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus (area)^1/2 values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29%. The volume uncertainties are largest for the most proximal (s = 62%) and distal field (s = 53%) and small for the densely sampled intermediate deposit (s = 8%). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2% of the total erupted volume, whereas the near-source deposit contains 48% and the intermediate deposit 50% of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
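    For reference, the exponential-thinning variant of the fitting step can be sketched as below; the isopach values are illustrative, not the Kīlauea Iki data, and the power-law and Weibull fits follow the same pattern with different model functions:

```python
# Exponential thinning T(x) = T0 * exp(-k * x), with x = sqrt(isopach area),
# integrates to an erupted volume V = 2 * T0 / k**2.
import numpy as np
from scipy.optimize import curve_fit

thickness_m = np.array([1.0, 0.5, 0.2, 0.1, 0.05])   # illustrative isopach thicknesses
area_km2 = np.array([0.3, 0.9, 2.5, 5.0, 9.0])       # enclosed isopach areas

x = np.sqrt(area_km2 * 1e6)                           # sqrt(area) in metres

def exp_thinning(x, T0, k):
    return T0 * np.exp(-k * x)

(T0, k), _ = curve_fit(exp_thinning, x, thickness_m, p0=(1.0, 1e-3))
volume_m3 = 2.0 * T0 / k**2
print(f"T0 = {T0:.2f} m, k = {k:.2e} 1/m, volume ~ {volume_m3 / 1e6:.1f} x 10^6 m^3")
```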

  11. Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph

    2011-11-01

    Hydrocephalus is among the most common birth defects and at present can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical phase-contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented in this dissertation. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications and opportunities will be presented.
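    The general integral momentum balance underlying a control-volume analysis of this kind (stated generically here, not as the authors' exact formulation) is:

```latex
% Momentum balance over a control volume CV with bounding surface CS;
% phase-contrast MRI supplies the velocity u on the aqueduct cross-sections.
\[
  \mathbf{F}_{\mathrm{on\,fluid}}
  = \frac{d}{dt}\int_{CV} \rho\,\mathbf{u}\, dV
  \; + \; \oint_{CS} \rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\, dA
\]
```

    Evaluating the right-hand side from the measured velocity fields gives the net force on the fluid, from which pressure-force estimates of the kind mentioned above can be derived.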

  12. A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.

    PubMed

    Janols, Rebecka; Lindgren, Helena

    2017-01-01

    A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology. Two case studies applying the methodology were conducted. During and between group sessions the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behavioural change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology has strong emphasis on the target group's participation in the design process. The different aspects brought forward related to behaviour change strategies defined in literature on persuasive technology, and the dynamics of these are associated to needs and motivation defined in literature on behaviour change. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analyses and motivations of design choices.

  13. Influence of activated carbon characteristics on toluene and hexane adsorption: Application of surface response methodology

    NASA Astrophysics Data System (ADS)

    Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa

    2013-01-01

    The objective of this study was to evaluate the adsorption capacity of toluene and hexane over activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio, and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR, and acid-base titration; the different techniques give different values for the surface groups because of the limitations of each technique, but similar trends are obtained for the activated carbons studied. The exhaustive characterization of the activated carbons allows us to state that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. Low activation temperature and an intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities, depending on the activation time, which is a determining factor for maximizing toluene adsorption.
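    The response-surface step can be sketched as a quadratic fit over coded factor levels; the design points and toluene-uptake responses below are synthetic, not the measured data:

```python
# Quadratic response-surface fit (coded units) for adsorption capacity vs
# activation temperature (t), impregnation ratio (r), and activation time (h).
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[t, r, h] for t in (-1, 0, 1) for r in (-1, 0, 1) for h in (-1, 0, 1)],
             dtype=float)
# synthetic toluene uptake (mg/g) with noise; coefficients are made up
y = (200 - 30*X[:, 0] + 10*X[:, 1] - 15*X[:, 0]**2 - 8*X[:, 1]**2
     + 5*X[:, 0]*X[:, 2] + rng.normal(0, 3, len(X)))

def design_matrix(Z):
    t, r, h = Z.T
    return np.column_stack([np.ones(len(Z)), t, r, h,
                            t*t, r*r, h*h, t*r, t*h, r*h])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Evaluate the fitted surface on a grid and report the maximizing conditions.
grid = np.array([[t, r, h] for t in np.linspace(-1, 1, 21)
                           for r in np.linspace(-1, 1, 21)
                           for h in np.linspace(-1, 1, 21)])
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print("predicted optimum (coded units):", best.round(2), "uptake:", pred.max().round(1))
```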

  14. Control design for future agile fighters

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Davidson, John B.

    1991-01-01

    The CRAFT control design methodology is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The approach combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, and a graphical approach for representing control design metrics that captures numerous design goals in one composite illustration. The methodology makes use of control design metrics from four design objective areas, namely, control power, robustness, agility, and flying qualities. An example of the CRAFT methodology as well as associated design issues are presented.

  15. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume III of III: software description. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-10-29

    This volume is the software description for the National Utility Regulatory Model (NUREG). This is the third of three volumes provided by ICF under contract number DEAC-01-79EI-10579. These three volumes are: a manual describing the NUREG methodology; a users guide; and a description of the software. This manual describes the software which has been developed for NUREG. This includes a listing of the source modules. All computer code has been written in FORTRAN.

  16. Systemic Operational Design: Improving Operational Planning for the Netherlands Armed Forces

    DTIC Science & Technology

    2006-05-25

    This methodology is called Soft Systems Methodology. His methodology is a structured way of thinking in which not only a perceived problematic... Many similarities exist between Systemic Operational Design and Soft Systems Methodology; their epistemology is related. Furthermore, they both have... Systems Thinking: Managing Chaos and Complexity. Boston: Butterworth Heinemann, 1999. Checkland, Peter, and Jim Scholes. Soft Systems Methodology in

  17. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  18. Three-Dimensional Finite Element Ablative Thermal Response and Thermostructural Design of Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2011-01-01

    A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis from which an optimum design can be determined. The design process concludes with a Monte Carlo simulation in which the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
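    The closing Monte Carlo step can be sketched as follows; the input distributions, the surrogate bondline-temperature response, and the allowable are hypothetical stand-ins, not the Crew Exploration Vehicle compression-pad values:

```python
# Monte Carlo estimate of the probability of exceeding a bondline-temperature
# allowable; all numbers below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

thickness = rng.normal(0.050, 0.002, n)      # m, ablator thickness
heat_load = rng.normal(1.0, 0.08, n)         # multiplier on nominal heating
conductivity = rng.normal(0.9, 0.05, n)      # W/(m*K)

def bondline_temperature(th, q, k):
    """Hypothetical surrogate standing in for the ablation/thermal-response code."""
    return 300.0 + 12.0 * q * k / th          # K

T_allow = 560.0                               # K, assumed design specification
T_bond = bondline_temperature(thickness, heat_load, conductivity)
p_exceed = np.mean(T_bond > T_allow)
print(f"P(bondline T > {T_allow} K) ~ {p_exceed:.4f}")
```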

  19. De/signing Research in Education: Patchwork(ing) Methodologies with Theory

    ERIC Educational Resources Information Center

    Higgins, Marc; Madden, Brooke; Berard, Marie-France; Lenz Kothe, Elsa; Nordstrom, Susan

    2017-01-01

    Four education scholars extend the methodological space inspired by Jackson and Mazzei's "Thinking with Theory" through focusing on research design. The notion of de/sign is presented and employed to counter prescriptive method/ology that often sutures over pedagogical possibilities in research and educational settings. Key…

  20. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  1. 77 FR 50514 - Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ...] Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the... Administration (FDA) is announcing the following public workshop entitled ``Post-Approval Studies 2012 Workshop: Design, Methodology, and Role in Evidence Appraisal Throughout the Total Product Life Cycle.'' The topics...

  2. Multirate flutter suppression system design for the Benchmark Active Controls Technology Wing

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1994-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies will be applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing (also called the PAPA wing). Eventually, the designs will be implemented in hardware and tested on the BACT wing in a wind tunnel. This report describes a project at the University of Washington to design a multirate flutter suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing. The contributions of this project are (1) development of an algorithm for synthesizing robust low-order multirate control laws (the algorithm is capable of synthesizing a single compensator which stabilizes both the nominal plant and multiple plant perturbations); (2) development of a multirate design methodology, and supporting software, for modeling, analyzing and synthesizing multirate compensators; and (3) design of a multirate flutter suppression system for NASA's BACT wing which satisfies the specified design criteria. This report describes each of these contributions in detail. Section 2.0 discusses our design methodology. Section 3.0 details the results of our multirate flutter suppression system design for the BACT wing. Finally, Section 4.0 presents our conclusions and suggestions for future research. The body of the report focuses primarily on the results. The associated theoretical background appears in the three technical papers that are included as Attachments 1-3. Attachment 4 is a user's manual for the software that is key to our design methodology.

  3. A multifractal approach to space-filling recovery for PET quantification.

    PubMed

    Willaime, Julien M Y; Aboagye, Eric O; Tsoumpas, Charalampos; Turkheimer, Federico E

    2014-11-01

    A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without the need for a robust segmentation step, and it is used to recover accurate estimates of total lesion activity (TLA). A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with the mean standardized uptake value (SUV mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations, and tested for its robustness using a clinical (18)F-fluorothymidine PET test-retest dataset. TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. The results indicate that TLA is a more robust index than other traditional metrics such as SUV mean or TV measurements across imaging protocols. The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies, which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
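    As a simple stand-in for the space-filling index, a box-counting estimate of fractal dimension on a binary mask looks like this (synthetic disc, not PET data; the paper's multifractal formulation differs in detail):

```python
# Box-counting dimension of a binary object mask.
import numpy as np

n = 256
yy, xx = np.mgrid[0:n, 0:n]
mask = (xx - n / 2)**2 + (yy - n / 2)**2 < (n / 4)**2   # synthetic "lesion"

sizes, counts = [2, 4, 8, 16, 32], []
for s in sizes:
    # count boxes of side s that contain any part of the object
    blocks = mask.reshape(n // s, s, n // s, s)
    counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))

# dimension = -slope of log(count) vs log(box size)
slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print(f"box-counting dimension ~ {-slope:.2f}")          # ~2 for a filled disc
```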

  4. Handbook of the Economics of Education. Volume 4

    ERIC Educational Resources Information Center

    Hanushek, Erik A., Ed.; Machin, Stephen J., Ed.; Woessmann, Ludger, Ed.

    2011-01-01

    What is the value of an education? Volume 4 of the Handbooks in the Economics of Education combines recent data with new methodologies to examine this and related questions from diverse perspectives. School choice and school competition, educator incentives, the college premium, and other considerations help make sense of the investments and…

  5. Handbook of the Economics of Education. Volume 3

    ERIC Educational Resources Information Center

    Hanushek, Eric A., Ed.; Machin, Stephen J., Ed.; Woessmann, Ludger, Ed.

    2011-01-01

    How does education affect economic and social outcomes, and how can it inform public policy? Volume 3 of the Handbooks in the Economics of Education uses newly available high quality data from around the world to address these and other core questions. With the help of new methodological approaches, contributors cover econometric methods and…

  6. Race and Ethnicity in Research Methods. Sage Focus Editions, Volume 157.

    ERIC Educational Resources Information Center

    Stanfield, John H., II, Ed.; Dennis, Rutledge M., Ed.

    The contributions in this volume examine the array of methods used in quantitative, qualitative, and comparative and historical research to show how research sensitive to ethnic issues can best be conducted. Rethinking and revising traditional methodologies and applying new ones can portray racial and ethnic issues as they really exist. The…

  7. Research Informing Practice--Practice Informing Research: Innovative Teaching Methodologies for World Language Teachers. Research in Second Language Learning

    ERIC Educational Resources Information Center

    Schwarzer, David, Ed.; Petron, Mary, Ed.; Luke, Christopher, Ed.

    2011-01-01

    "Research Informing Practice--Practice Informing Research: Innovative Teaching Methodologies for World Language Educators" is an edited volume that focuses on innovative, nontraditional methods of teaching and learning world languages. Using teacher-research projects, each author in the volume guides readers through their own personal…

  8. Becoming Life-Long Learners--"A Pedagogy for Learning about Visionary Leadership"

    ERIC Educational Resources Information Center

    McNeil, Mary, Ed.; Nevin, Ann, Ed.

    2014-01-01

    In this volume we apply a personal narrative methodology to understanding what we have learned about visionary leadership. Authors in this volume developed their reflections of life-long learning as they investigated existing leadership theories and theories about future leadership. Graduate program faculty and authors read and critically reviewed…

  9. Analysis of radiation exposure for naval units of Operation Crossroads. Volume 3. (Appendix B) support ships. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitz, R.; Thomas, C.; Klemm, J.

    1982-03-03

    External radiation doses are reconstructed for crews of support and target ships of Joint Task Force One at Operation CROSSROADS, 1946. Volume I describes the reconstruction methodology, which consists of modeling the radiation environment, to include the radioactivity of lagoon water, target ships, and support ship contamination; retracing ship paths through this environment; and calculating the doses to shipboard personnel. The USS RECLAIMER, a support ship, is selected as a representative ship to demonstrate this methodology. Doses for all other ships are summarized. Volume II (Appendix A) details the results for target ship personnel. Volume III (Appendix B) details the results for support ship personnel. Calculated doses for more than 36,000 personnel aboard support ships while at Bikini range from zero to 1.7 rem. Of those, approximately 34,000 are less than 0.5 rem. From the models provided, doses due to target ship reboarding and doses accrued after departure from Bikini can be calculated, based on the individual circumstances of exposure.

  10. Analysis of radiation exposure for naval units of Operation Crossroads. Volume 2. (Appendix A) target ships. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitz, R.; Thomas, C.; Klemm, J.

    1982-03-03

    External radiation doses are reconstructed for crews of support and target ships of Joint Task Force One at Operation CROSSROADS, 1946. Volume I describes the reconstruction methodology, which consists of modeling the radiation environment, to include the radioactivity of lagoon water, target ships, and support ship contamination; retracing ship paths through this environment; and calculating the doses to shipboard personnel. The USS RECLAIMER, a support ship, is selected as a representative ship to demonstrate this methodology. Doses for all other ships are summarized. Volume II (Appendix A) details the results for target ship personnel. Volume III (Appendix B) details the results for support ship personnel. Calculated doses for more than 36,000 personnel aboard support ships while at Bikini range from zero to 1.7 rem. Of those, approximately 34,000 are less than 0.5 rem. From the models provided, doses due to target ship reboarding and doses accrued after departure from Bikini can be calculated, based on the individual circumstances of exposure.

  11. STORMWATER BEST MANAGEMENT PRACTICES DESIGN GUIDE VOLUME 1 - GENERAL CONSIDERATIONS

    EPA Science Inventory

    This document is Volume 1 of a three volume series that provides guidance on the selection and design of stormwater management Best Management Practices (BMPs). This first volume provides general considerations associated with the selection and design of BMPs.

  12. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that add assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were assessed to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks. As a result, high-density PE gives the lowest environmental impact while retaining a high maximum stress value.

  13. The development and application of a multi-criteria optimization method to the design of a 20-seat regional jet airliner

    NASA Astrophysics Data System (ADS)

    Au, How Meng

    The aircraft design process traditionally starts with a given set of top-level requirements. These requirements can be aircraft performance related, such as the fuel consumption, cruise speed, or takeoff field length, or aircraft geometry related, such as the cabin height or cabin volume. This thesis proposes a new aircraft design process in which some of the top-level requirements are not explicitly specified. Instead, these previously specified parameters are now determined through the use of the Price-Per-Value-Factor (PPVF) index. This design process is well suited for design projects where general consensus on the top-level requirements does not exist. One example is the design of small commuter airliners. The above-mentioned value factor comprises productivity, cabin volume, cabin height, cabin pressurization, mission fuel consumption, and field length, each weighted to a different exponent. The relative magnitude and positive/negative signs of these exponents are in agreement with general experience. The value factors of the commuter aircraft are shown to have improved over a period of four decades. In addition, the purchase price is shown to vary linearly with the value factor. The initial aircraft sizing process can be manpower-intensive if the calculations are done manually. By incorporating automation into the process, the design cycle can be shortened considerably. The Fortran program functions and subroutines in this dissertation, in addition to the design and optimization methodologies described above, contribute to the reduction of manpower required for the initial sizing process. By combining the new design process mentioned above and the PPVF as the objective function, an optimization study is conducted on the design of a 20-seat regional jet. Handbook methods for aircraft design are written into a Fortran code. A genetic algorithm is used as the optimization scheme. The result of the optimization shows that aircraft designed to this PPVF index can be competitive compared to existing turboprop commuter aircraft. The process developed can be applied to other classes of aircraft with the designer modifying the cost function based upon the design goals.
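
    The value-factor construction described above lends itself to a compact numerical sketch. The example below is a schematic Python rendering of a multiplicative value factor and a price-per-value-factor objective; the attribute list, the exponent values, and the two candidate designs are invented for illustration and are not the dissertation's calibrated coefficients.

```python
import numpy as np

def value_factor(productivity, cabin_volume, cabin_height, fuel_burn, field_length,
                 exponents=(1.0, 0.3, 0.2, -0.5, -0.4)):
    """Illustrative multiplicative value factor: attributes the customer values
    carry positive exponents; penalties (fuel, field length) carry negative ones."""
    attrs = np.array([productivity, cabin_volume, cabin_height, fuel_burn, field_length])
    return float(np.prod(attrs ** np.array(exponents)))

def price_per_value_factor(purchase_price, vf):
    """Objective for optimization: lower price per unit of value is better."""
    return purchase_price / vf

# Hypothetical 20-seat designs (price in $M; attributes in consistent units)
designs = {
    "A": (5.2, value_factor(4200, 28.0, 1.45, 950, 1400)),
    "B": (5.8, value_factor(4600, 30.5, 1.50, 1010, 1350)),
}
best = min(designs, key=lambda k: price_per_value_factor(*designs[k]))
print("preferred design:", best)
```

    In the thesis, an objective of this kind is minimized by a genetic algorithm over the full set of sizing variables rather than by enumerating a handful of candidates.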

  14. Assuring data transparency through design methodologies

    NASA Technical Reports Server (NTRS)

    Williams, Allen

    1990-01-01

    This paper addresses the role of design methodologies and practices in the assurance of technology transparency. The development of several subsystems on large, long life cycle government programs was analyzed to glean those characteristics in the design, development, test, and evaluation that precluded or enabled the insertion of new technology. The programs examined were Minuteman, DSP, B1-B, and space shuttle. All these were long life cycle, technology-intensive programs. The design methodologies (or lack thereof) and design practices for each were analyzed in terms of the success or failure in incorporating evolving technology. Common elements contributing to the success or failure were extracted and compared to current methodologies being proposed by the Department of Defense and NASA. The relevance of these practices to the design and deployment of Space Station Freedom were evaluated. In particular, appropriate methodologies now being used on the core development contract were examined.

  15. Remote sensing for site characterization

    USGS Publications Warehouse

    Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.; Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.

    2000-01-01

    This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods of revealing environmental-geological problems. A balanced ratio between explanations of the methodological/technical side and presentations of case studies is maintained. The comparison of case studies from North America and Germany show how the respective territorial conditions lead to distinct methodological approaches.

  16. Quality in the Basic Grant Delivery System: Volume 3, Methodology.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., McLean, VA.

    The research methodology of a study to assess 1980-1981 award accuracy of the Basic Educational Opportunity Grants (BEOG), or Pell grants, is described. The study is the first stage of a three-stage quality control project. During the spring of 1981 a nationally representative sample of 305 public, private, and proprietary institutions was…

  17. Give Design a Chance: A Case for a Human Centered Approach to Operational Art

    DTIC Science & Technology

    2017-03-30

    strategy development and operational art. This demands fuller integration of the Army Design Methodology (ADM) and the Military Decision Making Process...MDMP). This monograph proposes a way of thinking and planning that goes beyond current Army doctrinal methodologies to address the changing...between conceptual and detailed planning. 15. SUBJECT TERMS Design; Army Design Methodology (ADM); Human Centered; Strategy; Operational Art

  18. Development of tf coil support concepts by design methodology in the case of a Bitter-type magnet. [Bitter-type magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brossmann, U.B.

    1981-01-01

    The application of the methodological design is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology, all boundary conditions and design criteria are more easily satisfied in a technical and economical way.

  19. A methodological assessment of studies that use voxel-based morphometry to study neural changes in tinnitus patients.

    PubMed

    Scott-Wittenborn, Nicholas; Karadaghy, Omar A; Piccirillo, Jay F; Peelle, Jonathan E

    2017-11-01

    The scientific understanding of tinnitus and its etiology has transitioned from thinking of tinnitus as solely a peripheral auditory problem to an increasing awareness that cortical networks may play a critical role in tinnitus percept or bother. With this change, studies that seek to use structural brain imaging techniques to better characterize tinnitus patients have become more common. These studies include using voxel-based morphometry (VBM) to determine if there are differences in regional gray matter volume in individuals who suffer from tinnitus and those who do not. However, studies using VBM in patients with tinnitus have produced inconsistent and sometimes contradictory results. This paper is a systematic review of all of the studies to date that have used VBM to study regional gray matter volume in people with tinnitus, and explores ways in which methodological differences in these studies may account for their heterogeneous results. We also aim to provide guidance on how to conduct future studies using VBM to produce more reproducible results to further our understanding of disease processes such as tinnitus. Studies about tinnitus and VBM were searched for using PubMed and Embase. These returned 15 and 25 results respectively. Of these, nine met the study criteria and were included for review. An additional 5 studies were identified in the literature as pertinent to the topic at hand and were added to the review, for a total of 13 studies. There was significant heterogeneity among the studies in several areas, including inclusion and exclusion criteria, software programs, and statistical analysis. We were not able to find publicly shared data or code for any study. The differences in study design, software analysis, and statistical methodology make direct comparisons between the different studies difficult. Especially problematic are the differences in the inclusion and exclusion criteria of the study, and the statistical design of the studies, both of which could radically alter findings. Thus, heterogeneity has complicated efforts to explore the etiology of tinnitus using structural MRI. There is a pressing need to standardize the use of VBM when evaluating tinnitus patients. While some heterogeneity is expected given the rapid advances in the field, more can be done to ensure that there is internal validity between studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Generation of random microstructures and prediction of sound velocity and absorption for open foams with spherical pores.

    PubMed

    Zieliński, Tomasz G

    2015-04-01

    This paper proposes and discusses an approach for the design and quality inspection of the morphology of sound absorbing foams, using a relatively simple technique for the random generation of periodic microstructures representative of open-cell foams with spherical pores. The design is controlled by a few parameters, namely, the total open porosity and the average pore size, as well as the standard deviation of pore size. These design parameters are set up exactly and independently; however, the setting of the standard deviation of pore sizes requires some number of pores in the representative volume element (RVE); this number is a procedure parameter. Another pore structure parameter which may be indirectly affected is the average size of windows linking the pores; however, it is in fact only weakly controlled by the maximal pore-penetration factor, and moreover, it depends on the porosity and pore size. The proposed methodology for testing microstructure designs of sound absorbing porous media applies multi-scale modeling in which some important transport parameters-responsible for sound propagation in a porous medium-are calculated from the microstructure using the generated RVE, in order to estimate the sound velocity and absorption of the designed material.
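
    A minimal sketch of the random periodic microstructure generation described above is given below, assuming a simple rejection scheme: spheres with normally distributed radii are placed in a periodic unit cell until a target pore volume fraction is reached, and a maximal pore-penetration factor limits how deeply pores may overlap. The parameter values and the neglect of overlap volume in the porosity bookkeeping are simplifications, not the paper's procedure.

```python
import numpy as np

def generate_rve(target_porosity=0.6, mean_radius=0.08, radius_std=0.01,
                 box=1.0, max_penetration=0.3, max_attempts=200_000, seed=1):
    """Place spherical pores with normally distributed radii in a periodic unit cell
    until the (approximate) pore volume fraction reaches the target porosity."""
    rng = np.random.default_rng(seed)
    centers, radii, pore_volume = [], [], 0.0
    for _ in range(max_attempts):
        if pore_volume / box**3 >= target_porosity:
            break
        c = rng.uniform(0.0, box, size=3)
        r = max(rng.normal(mean_radius, radius_std), 1e-3)
        # Enforce a maximal pore-penetration factor against all existing pores,
        # using the periodic (minimum-image) center-to-center distance
        ok = all(np.linalg.norm((c - ci + box / 2) % box - box / 2) >
                 (1.0 - max_penetration) * (r + ri)
                 for ci, ri in zip(centers, radii))
        if ok:
            centers.append(c)
            radii.append(r)
            pore_volume += 4.0 / 3.0 * np.pi * r**3   # ignores overlap volume
    return np.array(centers), np.array(radii)

centers, radii = generate_rve()
print(len(radii), "pores placed")
```

    The windows linking neighbouring pores arise where spheres interpenetrate, which is why their average size is only indirectly controlled, as the abstract notes.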

  1. The Methodology of Selecting the Transport Mode for Companies on the Slovak Transport Market

    NASA Astrophysics Data System (ADS)

    Černá, Lenka; Zitrický, Vladislav; Daniš, Jozef

    2017-03-01

    Transport volume in the Slovak Republic grows continuously every year. This rising trend is driven by the development of the car industry and its suppliers. The Slovak Republic also holds a strategic geographic position in central Europe with respect to the east-west and north-south transport corridors. The growth of freight transport volume depends on the transport and business processes between the European Union and China, which gives the Slovak Republic an opportunity to attract transit transport flows. In the Slovak Republic, road transport holds a dominant position in the transport market, and its volume has gradually increased over the past years; this increase is most visible on the highways and expressways in regions with higher economic potential. The increase in rail transport on the main rail corridors is not as significant as in road transport. Trade globalization also contributes to the growth of intermodal transport, whose volume is predicted to rise from 2.3 million tonnes per year at present to 8 million tonnes by 2020. Selection of the transport mode and carrier is an important aspect of logistics management, because companies (customers) want to reduce the number of carriers with which they trade and build a system of a few key carriers. Larger transport volumes and higher-quality transport services make it possible to reduce transport costs. This trend also benefits carriers, who can focus on selected customers and provide higher-quality services. The paper focuses on the selection of the transport mode based on the proposed methodology. The aims of the paper are to define the criteria that directly influence the selection of transport modes, to determine these criteria using subjective methods, to create a process for selecting transport modes, and to apply the proposed methodology in practice.

  2. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    PubMed

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time-consuming and error-prone, especially for liqueurs that may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. The novel instrument comprises an increased steam power, a redesigned geometry of the condenser and a larger cooling coil with controllable flow, compared to previously available devices. Method optimization applying D-optimal and central composite designs showed significant influence of sample volume, distillation time and coolant flow, while other investigated parameters such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement did not significantly influence the results. The method validation was conducted using the following settings: steam power 70 %, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35 % vol, the method showed an adequate precision, with relative standard deviations below 0.4 % (intraday) and below 0.6 % (interday). The absolute standard deviations were between 0.06 % vol and 0.08 % vol (intraday) and between 0.07 % vol and 0.10 % vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only a few influences that would lead to incomplete distillation. Our validation parameters have shown that the performance of the method corresponds to the data presented for the reference method and we believe that automated steam distillation can be used for the purpose of labelling control of alcoholic beverages.

  3. An evaluation of three-dimensional photogrammetric and morphometric techniques for estimating volume and mass in Weddell seals Leptonychotes weddellii

    PubMed Central

    Ruscher-Hill, Brandi; Kirkham, Amy L.; Burns, Jennifer M.

    2018-01-01

    Body mass dynamics of animals can indicate critical associations between extrinsic factors and population vital rates. Photogrammetry can be used to estimate mass of individuals in species whose life histories make it logistically difficult to obtain direct body mass measurements. Such studies typically use equations to relate volume estimates from photogrammetry to mass; however, most fail to identify the sources of error between the estimated and actual mass. Our objective was to identify the sources of error that prevent photogrammetric mass estimation from directly predicting actual mass, and develop a methodology to correct this issue. To do this, we obtained mass, body measurements, and scaled photos for 56 sedated Weddell seals (Leptonychotes weddellii). After creating a three-dimensional silhouette in the image processing program PhotoModeler Pro, we used horizontal scale bars to define the ground plane, then removed the below-ground portion of the animal’s estimated silhouette. We then re-calculated body volume and applied an expected density to estimate animal mass. We compared the body mass estimates derived from this silhouette slice method with estimates derived from two other published methodologies: body mass calculated using photogrammetry coupled with a species-specific correction factor, and estimates using elliptical cones and measured tissue densities. The estimated mass values (mean ± standard deviation 345±71 kg for correction equation, 346±75 kg for silhouette slice, 343±76 kg for cones) were not statistically distinguishable from each other or from actual mass (346±73 kg) (ANOVA with Tukey HSD post-hoc, p>0.05 for all pairwise comparisons). We conclude that volume overestimates from photogrammetry are likely due to the inability of photo modeling software to properly render the ventral surface of the animal where it contacts the ground. Due to logistical differences between the “correction equation”, “silhouette slicing”, and “cones” approaches, researchers may find one technique more useful for certain study programs. In combination or exclusively, these three-dimensional mass estimation techniques have great utility in field studies with repeated measures sampling designs or where logistic constraints preclude weighing animals. PMID:29320573

  4. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  5. Differences in regional grey matter volumes in currently ill patients with anorexia nervosa.

    PubMed

    Phillipou, Andrea; Rossell, Susan Lee; Gurvich, Caroline; Castle, David Jonathan; Abel, Larry Allen; Nibbs, Richard Grant; Hughes, Matthew Edward

    2018-01-01

    Neurobiological findings in anorexia nervosa (AN) are inconsistent, including differences in regional grey matter volumes. Methodological limitations often contribute to the inconsistencies reported. The aim of this study was to improve on these methodologies by utilising voxel-based morphometry (VBM) analysis with the use of diffeomorphic anatomic registration through an exponentiated lie algebra algorithm (DARTEL), in a relatively large group of individuals with AN. Twenty-six individuals with AN and 27 healthy controls underwent a T1-weighted magnetic resonance imaging (MRI) scan. AN participants were found to have reduced grey matter volumes in a number of areas including regions of the basal ganglia (including the ventral striatum), and parietal and temporal cortices. Body mass index (BMI) and global scores on the Eating Disorder Examination Questionnaire (EDE-Q) were also found to correlate with grey matter volumes in a region of the brainstem (including the substantia nigra and ventral tegmental area) in AN, and predicted 56% of the variance in grey matter volumes in this area. The brain regions associated with grey matter reductions in AN are consistent with regions responsible for cognitive deficits associated with the illness including anhedonia, deficits in affect perception and saccadic eye movement abnormalities. Overall, the findings suggest reduced grey matter volumes in AN that are associated with eating disorder symptomatology. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  6. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is developed using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient approach for reliability-based finite element modeling in engineering practice.

  7. MOD-5A wind turbine generator program design report: Volume 1: Executive Summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The design, development and analysis of the 7.3 MW MOD-5A wind turbine generator covering work performed between July 1980 and June 1984 is discussed. The report is divided into four volumes: Volume 1 summarizes the entire MOD-5A program, Volume 2 discusses the conceptual and preliminary design phases, Volume 3 describes the final design of the MOD-5A, and Volume 4 contains the drawings and specifications developed for the final design. Volume 1, the Executive Summary, summarizes all phases of the MOD-5A program. The performance and cost of energy generated by the MOD-5A are presented. Each subsystem - the rotor, drivetrain, nacelle, tower and foundation, power generation, and control and instrumentation subsystems - is described briefly. The early phases of the MOD-5A program, during which the design was analyzed and optimized, and new technologies and materials were developed, are discussed. Manufacturing, quality assurance, and safety plans are presented. The volume concludes with an index of volumes 2 and 3.

  8. On the suitability of the copula types for the joint modelling of flood peaks and volumes along the Danube River

    NASA Astrophysics Data System (ADS)

    Kohnová, Silvia; Papaioannou, George; Bacigál, Tomáš; Szolgay, Ján; Hlavčová, Kamila; Loukas, Athanasios; Výleta, Roman

    2017-04-01

    Flood frequency analysis is often performed as a univariate analysis of flood peaks using a suitable theoretical probability distribution of the annual maximum flood peaks or peak-over-threshold values. However, other flood attributes, such as flood volume and duration, are often necessary for the design of hydrotechnical structures and projects. In this study, the suitability of various copula families for a bivariate analysis of peak discharges and flood volumes has been tested on the streamflow data from gauging stations along the whole Danube River. Kendall's rank correlation coefficient (tau) quantifies the dependence between flood peak discharge and flood volume. The methodology is tested on two different data samples: 1) annual maximum flood (AMF) peaks with corresponding flood volumes, which is a typical choice for engineering studies, and 2) annual maximum flood (AMF) peaks combined with annual maximum flow volumes of fixed durations at 5, 10, 15, 20, 25, 30 and 60 days, which can be regarded as a regime analysis of the dependence between the extremes of both variables in a given year. The bivariate modelling of the peak discharge - flood volume couples is achieved with the use of the following copulas: Ali-Mikhail-Haq (AMH), Clayton, Frank, Joe, Gumbel, Husler-Reiss, Galambos, Tawn, Normal, Plackett and FGM. Scatterplots of the observed and simulated peak discharge - flood volume pairs and goodness-of-fit tests have been used to assess the overall applicability of the copulas, as well as to observe any changes in suitable models along the Danube River. The results indicate that almost all of the considered Archimedean-class copulas (e.g. Frank, Clayton and Ali-Mikhail-Haq) perform better than the other copula families selected for this study, and that for the second data sample mostly the upper-tail-flat copulas were suitable.
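
    The rank-correlation step of such a bivariate analysis can be sketched as follows. The peak-volume pairs below are invented for illustration, and only the two Archimedean copulas with closed-form tau-theta relations are shown, whereas the study fits a much wider set of families with formal goodness-of-fit tests.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical annual maximum flood peaks (m^3/s) and corresponding volumes (10^6 m^3)
peaks   = np.array([1850., 2100., 1620., 2480., 1990., 2300., 1750., 2650.])
volumes = np.array([ 310.,  405.,  260.,  520.,  380.,  450.,  300.,  560.])

tau, _ = kendalltau(peaks, volumes)

# Method-of-moments estimates via the closed-form tau-theta relations
theta_gumbel  = 1.0 / (1.0 - tau)          # Gumbel:  tau = 1 - 1/theta
theta_clayton = 2.0 * tau / (1.0 - tau)    # Clayton: tau = theta / (theta + 2)

print(f"Kendall tau = {tau:.2f}, Gumbel theta = {theta_gumbel:.2f}, "
      f"Clayton theta = {theta_clayton:.2f}")
```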

  9. Research design in end-of-life research: state of science.

    PubMed

    George, Linda K

    2002-10-01

    The volume of research on end-of-life care, death, and dying has exploded during the past decade. This article reviews the conceptual and methodological adequacy of end-of-life research to date, focusing on limitations of research to date and ways of improving future research. A systematic search was conducted to identify the base of end-of-life research. Approximately 400 empirical articles were identified and are the basis of this review. Although much has been learned from research to date, limitations in the knowledge base are substantial. The most fundamental problems identified are conceptual and include failure to define dying; neglect of the distinctions among quality of life, quality of death, and quality of end-of-life care. Methodologically, the single greatest problem is the lack of longitudinal studies that cover more than the time period immediately before death. Gaps in the research base include insufficient attention to psychological and spiritual issues, the prevalence of psychiatric disorder and the effectiveness of the treatment of such disorders among dying persons, provider and health system variables, social and cultural diversity, and the effects of comorbidity on trajectories of dying.

  10. Application of response surface methodology (RSM) for optimizing coagulation process of paper recycling wastewater using Ocimum basilicum.

    PubMed

    Mosaddeghi, Mohammad Reza; Pajoum Shariati, Farshid; Vaziri Yazdi, Seyed Ali; Nabi Bidhendi, Gholamreza

    2018-06-21

    The wastewater produced in the pulp and paper industry is one of the most polluted industrial wastewaters, and therefore its treatment requires complex processes. One of the simple and feasible processes in pulp and paper wastewater treatment is coagulation and flocculation. Overusing a chemical coagulant can produce a large volume of sludge and increase costs and health concerns. Therefore, the use of natural and plant-based coagulants has recently attracted the attention of researchers. One of the advantages of using Ocimum basilicum as a coagulant is a reduction in the amount of chemical coagulant required. In this study, the effect of basil mucilage as a plant-based coagulant, used together with alum, has been investigated for the treatment of paper recycling wastewater. Response surface methodology (RSM) was used to optimize the chemical coagulation process based on a central composite rotatable design (CCRD). Quadratic models for colour reduction and TSS removal with coefficients of determination R² > 96% were obtained using analysis of variance. Under optimal conditions, removal efficiencies of colour and total suspended solids (TSS) were 85% and 82%, respectively.
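
    A central composite rotatable design of the kind mentioned above can be generated in a few lines. The sketch below lists only the coded design points for two factors, since the abstract does not give the actual factors or their level ranges; the factor labels in the comment are assumptions.

```python
import itertools
import numpy as np

def ccrd_points(n_factors=2, n_center=5):
    """Coded design points of a central composite rotatable design (CCRD):
    2^k factorial points, 2k axial points at alpha = (2^k)^(1/4), and center runs."""
    factorial = np.array(list(itertools.product([-1, 1], repeat=n_factors)), dtype=float)
    alpha = (2 ** n_factors) ** 0.25
    axial = np.vstack([alpha * np.eye(n_factors), -alpha * np.eye(n_factors)])
    center = np.zeros((n_center, n_factors))
    return np.vstack([factorial, axial, center])

# Two coded factors, e.g. coagulant dose and mucilage dose (illustrative labels)
print(ccrd_points(n_factors=2))
```

    A quadratic response surface is then fitted to the responses measured at these runs, which is what yields the models for colour and TSS removal reported in the abstract.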

  11. Employing WebGL to develop interactive stereoscopic 3D content for use in biomedical visualization

    NASA Astrophysics Data System (ADS)

    Johnston, Semay; Renambot, Luc; Sauter, Daniel

    2013-03-01

    Web Graphics Library (WebGL), the forthcoming web standard for rendering native 3D graphics in a browser, represents an important addition to the biomedical visualization toolset. It is projected to become a mainstream method of delivering 3D online content due to shrinking support for third-party plug-ins. Additionally, it provides a virtual reality (VR) experience to web users accommodated by the growing availability of stereoscopic displays (3D TV, desktop, and mobile). WebGL's value in biomedical visualization has been demonstrated by applications for interactive anatomical models, chemical and molecular visualization, and web-based volume rendering. However, a lack of instructional literature specific to the field prevents many from utilizing this technology. This project defines a WebGL design methodology for a target audience of biomedical artists with a basic understanding of web languages and 3D graphics. The methodology was informed by the development of an interactive web application depicting the anatomy and various pathologies of the human eye. The application supports several modes of stereoscopic displays for a better understanding of 3D anatomical structures.

  12. Millimeter wave satellite concepts, volume 1

    NASA Technical Reports Server (NTRS)

    Hilsen, N. B.; Holland, L. D.; Thomas, R. E.; Wallace, R. W.; Gallagher, J. G.

    1977-01-01

    The identification of technologies necessary for development of millimeter spectrum communication satellites was examined from a system point of view. Development of methodology based on the technical requirements of potential services that might be assigned to millimeter wave bands for identifying the viable and appropriate technologies for future NASA millimeter research and development programs, and testing of this methodology with selected user applications and services were the goals of the program. The entire communications network, both ground and space subsystems was studied. Cost, weight, and performance models for the subsystems, conceptual design for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and an overall link performance are discussed along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to be moderate or extensive risks. Identification of technologies for millimeter satellite communication systems, and assessment of the relative risks of these technologies, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.

  13. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    PubMed

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  14. Designing for fiber composite structural durability in hygrothermomechanical environment

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1985-01-01

    A methodology is described which can be used to design/analyze fiber composite structures subjected to complex hygrothermomechanical environments. This methodology includes composite mechanics and advanced structural analysis methods (finite element). Select examples are described to illustrate the application of the available methodology. The examples include: (1) composite progressive fracture; (2) composite design for high cycle fatigue combined with hot-wet conditions; and (3) general laminate design.

  15. Navy Community of Practice for Programmers and Developers

    DTIC Science & Technology

    2016-12-01

    execute cyber missions. The methodology employed in this research is human-centered design via a social interaction prototype, which allows us to learn...for Navy programmers and developers. Chapter V details the methodology used to design the proposed CoP. This chapter summarizes the results from...thirty years the term has evolved to incorporate ideas from numerous design methodologies and movements [57]. In the 1980s, revealed design began to

  16. Toward a Formal Model of the Design and Evolution of Software

    DTIC Science & Technology

    1988-12-20

    should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle...the future. It should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software...variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the

  17. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    USGS Publications Warehouse

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Don; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-01-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^(1/2) values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
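
    The exponential member of the three fitting functions mentioned above has a convenient closed-form volume. The sketch below fits T = T0·exp(-k·A^(1/2)) to invented isopach data and evaluates V = 2·T0/k²; the thickness-area values are illustrative only and are not the Kīlauea Iki measurements.

```python
import numpy as np

# Hypothetical isopach data: thickness (m) and enclosed area (km^2)
thickness_m = np.array([3.0, 1.5, 0.8, 0.4, 0.2, 0.05])
area_km2    = np.array([0.4, 1.2, 3.0, 7.5, 18.0, 60.0])

x = np.sqrt(area_km2)                      # square root of isopach area
# Fit the exponential thinning law T = T0 * exp(-k * sqrt(A)) in log space
slope, lnT0 = np.polyfit(x, np.log(thickness_m), 1)
k, T0 = -slope, np.exp(lnT0)

# Closed-form volume of the exponential model: V = 2 * T0 / k^2
volume_km3 = 2.0 * (T0 / 1000.0) / k**2    # convert thickness from m to km
print(f"T0 = {T0:.2f} m, k = {k:.2f} per km, V = {volume_km3 * 1e3:.1f} x 10^-3 km^3")
```

    The sensitivity of T0 and k to the most proximal and distal isopachs is exactly why the study finds the largest volume uncertainties in those zones.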

  18. Design Criteria for Microbiological Facilities at Fort Detrick. Volume II: Design Criteria

    ERIC Educational Resources Information Center

    Army Biological Labs., Fort Detrick, MD. Industrial Health and Safety Div.

    Volume II of a two-volume manual of design criteria, based primarily on biological safety considerations. It is prepared for the use of architect-engineers in designing new or modified microbiological facilities for Fort Detrick, Maryland. Volume II is divided into the following sections: (1) architectural, (2) heating, ventilating, and air…

  19. Writing Educational Biography: Explorations in Qualitative Research. Critical Education Practice; Volume 13. Garland Reference Library of Social Science, Volume 1098.

    ERIC Educational Resources Information Center

    Kridel, Craig, Ed.

    This collection examines many influences of biographical inquiry in education and discusses methodological issues from the perspectives of veteran and novice biographers. The section on qualitative research and educational biography contains the following chapters: "Musings on Life Writing: Biography and Case Studies in Teacher Education" (Robert…

  20. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 4. Noise Tests.

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  1. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 3. Ride Quality Tests.

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  2. A Study of Operator and Mechanic Training Needs in the Transit Industry. Volume I, Findings and Conclusions. Final Report.

    ERIC Educational Resources Information Center

    Henderson, Harold L.; And Others

    Surveys of 188 transit properties and on-site visits were conducted to determine training needs of operators and mechanics in the urban mass transportation industry. Volume I presents findings and conclusions of the study with reference to survey methodology, site visit interviews and observations, questionnaire results, and specific…

  3. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 6. SOAC Instrumentation System.

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  4. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 2. Performance Tests.

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  5. An Introduction to Advertising Research; A Report from the Communications Research Center.

    ERIC Educational Resources Information Center

    Haskins, Jack B.

    The purpose of this volume is to present, in nontechnical language, most of the basic concepts of advertising research. Since the volume is intended to be comprehensible to the lay person, discussion does not go too deeply into the technical details of advertising or research methodology. However, used as an introduction and outline to be…

  6. Methodological Research on Knowledge Use and School Improvement. Volume III. Measuring Knowledge Use: A Procedural Inventory.

    ERIC Educational Resources Information Center

    Dunn, William N.; And Others

    This volume presents in one collection a systematic inventory of research and analytic procedures appropriate for generating information on knowledge production, diffusion, and utilization, gathered by the University of Pittsburgh Program for the Study of Knowledge Use. The main concern is with those procedures that focus on the utilization of…

  7. A Compatible Stem Taper-Volume-Weight System For Intensively Managed Fast Growing Loblolly Pine

    Treesearch

    Yugia Zhang; Bruce E. Borders; Robert L Bailey

    2002-01-01

    A geometry-oriented methodology yielded a compatible taper-volume-weight system of models whose parameters were estimated using data from intensively managed loblolly pine (Pinus taeda L.) plantations in the lower coastal plain of Georgia. Data analysis showed that fertilization has significantly reduced taper (inside and outside bark) on the upper...

  8. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 1. Program Description and Test Summary

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  9. Haptic Technologies for MEMS Design

    NASA Astrophysics Data System (ADS)

    Calis, Mustafa; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents for the first time a design methodology for MEMS/NEMS based on haptic sensing technologies. The software tool created as a result of this methodology will enable designers to model and interact in real time with their virtual prototype. One of the main advantages of haptic sensing is the ability to bring unusual microscopic forces back to the designer's world. Other significant benefits of developing such a methodology include gains in productivity and the capability to include manufacturing costs within the design cycle.

  10. Proceedings of the Seminar on the DOD Computer Security Initiative (4th) Held at the National Bureau of Standards, Gaithersburg, Maryland on August 10-12, 1981.

    DTIC Science & Technology

    1981-01-01

    comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD...computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf... Methodologies ] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes

  11. Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience

    ERIC Educational Resources Information Center

    Zanotti, Francesco

    2012-01-01

    Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…

  12. TRAC Innovative Visualization Techniques

    DTIC Science & Technology

    2016-11-14

    Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology ...to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective...visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and

  13. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model, a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  14. Army Training Study: Battalion Training Survey. Volumes 1 and 2.

    DTIC Science & Technology

    1978-08-08

    mathematical logic in the methodology. II. MAGNITUDE-ESTIMATION SCALING. A. General Description. A unique methodology, Magnitude-Estimation...to 142.) The base condition (represented in IA, IIA, and IIIA

  15. Space Engineering Projects in Design Methodology

    NASA Technical Reports Server (NTRS)

    Crawford, R.; Wood, K.; Nichols, S.; Hearn, C.; Corrier, S.; DeKunder, G.; George, S.; Hysinger, C.; Johnson, C.; Kubasta, K.

    1993-01-01

    NASA/USRA is an ongoing sponsor of space design projects in the senior design courses of the Mechanical Engineering Department at The University of Texas at Austin. This paper describes the UT senior design sequence, focusing on the first-semester design methodology course. The philosophical basis and pedagogical structure of this course is summarized. A history of the Department's activities in the Advanced Design Program is then presented. The paper includes a summary of the projects completed during the 1992-93 Academic Year in the methodology course, and concludes with an example of two projects completed by student design teams.

  16. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water-trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program through estimating the water volume being released through trading in a long-term view. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.

  17. On the Design of Attitude-Heading Reference Systems Using the Allan Variance.

    PubMed

    Hidalgo-Carrió, Javier; Arnold, Sascha; Poulakis, Pantelis

    2016-04-01

    The Allan variance is a method to characterize stochastic random processes. The technique was originally developed to characterize the stability of atomic clocks and has also been successfully applied to the characterization of inertial sensors. Inertial navigation systems (INS) can provide accurate results in a short time, which tend to rapidly degrade in longer time intervals. During the last decade, the performance of inertial sensors has significantly improved, particularly in terms of signal stability, mechanical robustness, and power consumption. The mass and volume of inertial sensors have also been significantly reduced, offering system-level design and accommodation advantages. This paper presents a complete methodology for the characterization and modeling of inertial sensors using the Allan variance, with direct application to navigation systems. Although the concept of sensor fusion is relatively straightforward, accurate characterization and sensor-information filtering is not a trivial task, yet they are essential for good performance. A complete and reproducible methodology utilizing the Allan variance, including all the intermediate steps, is described. An end-to-end (E2E) process for sensor-error characterization and modeling up to the final integration in the sensor-fusion scheme is explained in detail. The strength of this approach is demonstrated with representative tests on novel, high-grade inertial sensors. Experimental navigation results are presented from two distinct robotic applications: a planetary exploration rover prototype and an autonomous underwater vehicle (AUV).
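
    A minimal sketch of the Allan variance computation described above is given below, using a simple non-overlapping estimator on a synthetic gyro signal (white noise plus a random-walk bias). The sampling rate, noise amplitudes, and averaging times are invented, and the paper's end-to-end procedure additionally uses the resulting curve to fit noise-model parameters for the sensor-fusion filter.

```python
import numpy as np

def allan_variance(rate, fs, m_list=(1, 2, 4, 8, 16, 32, 64)):
    """Non-overlapping Allan variance of a rate signal sampled at fs (Hz)
    for averaging times tau = m / fs."""
    taus, avars = [], []
    for m in m_list:
        n_bins = len(rate) // m
        if n_bins < 3:
            break
        # Average the signal over consecutive clusters of m samples
        y_bar = rate[:n_bins * m].reshape(n_bins, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(y_bar) ** 2)   # (1/2) <(y_{i+1} - y_i)^2>
        taus.append(m / fs)
        avars.append(avar)
    return np.array(taus), np.array(avars)

# Illustrative gyro signal: white noise plus a slow random walk of the bias
rng = np.random.default_rng(0)
fs = 100.0
white = 0.05 * rng.standard_normal(200_000)
bias_walk = np.cumsum(1e-5 * rng.standard_normal(200_000))
taus, avars = allan_variance(white + bias_walk, fs)
print(np.sqrt(avars))   # Allan deviation; slope changes reveal the noise terms
```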

  18. A Methodology for the Assessment of Experiential Learning Lean: The Lean Experience Factory Study

    ERIC Educational Resources Information Center

    De Zan, Giovanni; De Toni, Alberto Felice; Fornasier, Andrea; Battistella, Cinzia

    2015-01-01

    Purpose: The purpose of this paper is to present a methodology to assess the experiential learning processes of learning lean in an innovative learning environment: the lean model factories. Design/methodology/approach: A literature review on learning and lean management literatures was carried out to design the methodology. Then, a case study…

  19. Solar energy program evaluation: an introduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    deLeon, P.

    The Program Evaluation Methodology provides an overview of the practice and methodology of program evaluation and defines more precisely the evaluation techniques and methodologies that would be most appropriate to government organizations which are actively involved in the research, development, and commercialization of solar energy systems. Formal evaluation cannot be treated as a single methodological approach for assessing a program. There are four basic types of evaluation designs - the pre-experimental design; the quasi-experimental design based on time series; the quasi-experimental design based on comparison groups; and the true experimental design. This report is organized to first introduce the role and issues of evaluation. This is to provide a set of issues to organize the subsequent sections detailing the national solar energy programs. Then, these two themes are integrated by examining the evaluation strategies and methodologies tailored to fit the particular needs of the various individual solar energy programs. (MCW)

  20. Plasma volume methodology: Evans blue, hemoglobin-hematocrit, and mass density transformations

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Hinghofer-Szalkay, H.

    1985-01-01

    Methods for measuring absolute levels and changes in plasma volume are presented along with derivations of pertinent equations. Reduction in variability of the Evans blue dye dilution technique using chromatographic column purification suggests that the day-to-day variability in the plasma volume in humans is less than + or - 20 ml. Mass density determination using the mechanical-oscillator technique provides a method for measuring vascular fluid shifts continuously for assessing the density of the filtrate, and for quantifying movements of protein across microvascular walls. Equations for the calculation of volume and density of shifted fluid are presented.
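
    One common hemoglobin-hematocrit transformation for the percent change in plasma volume is the Dill-and-Costill form sketched below; whether this exact variant matches the equations derived in the report is an assumption, and the input values are purely illustrative.

```python
def percent_delta_plasma_volume(hb_before, hb_after, hct_before, hct_after):
    """Percent change in plasma volume from hemoglobin (g/dL) and hematocrit
    (as fractions), using the classic Dill-and-Costill style transformation."""
    return 100.0 * ((hb_before / hb_after) * (1.0 - hct_after) / (1.0 - hct_before) - 1.0)

# Illustrative pre/post values (e.g., before and after an orthostatic stress)
print(round(percent_delta_plasma_volume(15.0, 15.8, 0.44, 0.46), 1), "%")
```

    A negative result indicates hemoconcentration, i.e., a reduction in plasma volume relative to the baseline measurement.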

  1. DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS

    DTIC Science & Technology

    2017-10-01

    DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS. Final report, University of Southern California, October 2017 (contract FA8750-15-C-0203). The goal of this project was to investigate the state of the art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ.

  2. Enhancing the Front-End Phase of Design Methodology

    ERIC Educational Resources Information Center

    Elias, Erasto

    2006-01-01

    Design methodology (DM) is defined by the procedural path, expressed in design models, and techniques or methods used to untangle the various activities within a design model. Design education in universities is mainly based on descriptive design models. Much knowledge and organization have been built into DM to facilitate design teaching.…

  3. Simulation and optimization of volume holographic imaging systems in Zemax.

    PubMed

    Wissmann, Patrick; Oh, Se Baek; Barbastathis, George

    2008-05-12

    We present a new methodology for ray-tracing analysis of volume holographic imaging (VHI) systems. Using the k-sphere formulation, we apply geometrical relationships to describe the volumetric diffraction effects imposed on rays passing through a volume hologram. We explain the k-sphere formulation in conjunction with the ray-tracing process and describe its implementation in a Zemax UDS (User Defined Surface). We conclude with examples of simulation and optimization results that demonstrate the consistency and usefulness of the proposed model.

  4. Higher order solution of the Euler equations on unstructured grids using quadratic reconstruction

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Frederickson, Paul O.

    1990-01-01

    High order accurate finite-volume schemes for solving the Euler equations of gasdynamics are developed. Central to the development of these methods are the construction of a k-exact reconstruction operator given cell-averaged quantities and the use of high order flux quadrature formulas. General polygonal control volumes (with curved boundary edges) are considered. The formulations presented make no explicit assumption as to complexity or convexity of control volumes. Numerical examples are presented for Ringleb flow to validate the methodology.

  5. Methodologies for Root Locus and Loop Shaping Control Design with Comparisons

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    2017-01-01

    This paper describes some basics for the root locus controls design method as well as for loop shaping, and establishes approaches to expedite the application of these two design methodologies to easily obtain control designs that meet requirements with superior performance. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of these designs is covered as it pertains to these control methodologies and for the example problems.
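
    As a small numerical illustration of the root locus idea (not the expedited procedures developed in the paper), the sketch below sweeps a proportional gain K and computes the closed-loop pole locations satisfying 1 + K*L(s) = 0 for a hypothetical third-order plant; gains for which all poles lie in the left half-plane are closed-loop stable.

      import numpy as np

      def root_locus_points(num, den, gains):
          """Closed-loop pole locations of 1 + K*L(s) = 0 over a gain sweep, with
          L(s) = num(s)/den(s) given as polynomial coefficients (highest power first).
          This is the computation underlying a root-locus plot."""
          num = np.asarray(num, dtype=float)
          den = np.asarray(den, dtype=float)
          num = np.concatenate([np.zeros(len(den) - len(num)), num])  # pad to equal length
          return np.array([np.roots(den + k * num) for k in gains])

      # Hypothetical open-loop plant L(s) = 1 / (s (s + 2)(s + 5)) under proportional gain K
      num = [1.0]
      den = np.polymul([1.0, 0.0], np.polymul([1.0, 2.0], [1.0, 5.0]))
      gains = np.linspace(0.0, 100.0, 400)
      locus = root_locus_points(num, den, gains)
      stable_gains = [k for k, p in zip(gains, locus) if np.all(p.real < 0)]  # roughly 0 < K < 70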

  6. Dark Energy Survey Year 1 Results: galaxy mock catalogues for BAO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avila, S.; et al.

    Mock catalogues are a crucial tool in the analysis of galaxy surveys data, both for the accurate computation of covariance matrices, and for the optimisation of analysis methodology and validation of data sets. In this paper, we present a set of 1800 galaxy mock catalogues designed to match the Dark Energy Survey Year-1 BAO sample (Crocce et al. 2017) in abundance, observational volume, redshift distribution and uncertainty, and redshift dependent clustering. The simulated samples were built upon HALOGEN (Avila et al. 2015) halo catalogues, based on a 2LPT density field with an exponential bias. For each of them, a lightcone is constructed by the superposition of snapshots in the redshift range 0.45 …

  7. Life sciences payload definition and integration study. Volume 4: Appendix, costs, and data management requirements of the dedicated 30-day laboratory. [carry-on laboratory for Spacelab

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The results of the updated 30-day life sciences dedicated laboratory scheduling and costing activities are documented, and the 'low cost' methodology used to establish individual equipment item costs is explained in terms of its allowances for equipment that is commercial off-the-shelf, modified commercial, and laboratory prototype; a method which significantly lowers program costs. The costs generated include estimates for non-recurring development, recurring production, and recurring operations costs. A cost for a biomedical emphasis laboratory and a delta cost to provide a bioscience and technology laboratory were also generated. All costs reported are commensurate with the design and schedule definitions available.

  8. Critical Evaluation of Low-Energy Collision Damage Theories and Design Methodologies. Volume II. Literature Search and Review.

    DTIC Science & Technology

    1978-07-01

    … except that a number of … tanks would require protection. The entire length of an oil ta… plate perforated by a circular drift. However, it is quite clear that the behavior of the shell plating of oil tankers assumes vit…

  9. Tailored metal matrix composites for high-temperature performance

    NASA Technical Reports Server (NTRS)

    Morel, M. R.; Saravanos, D. A.; Chamis, C. C.

    1992-01-01

    A multi-objective tailoring methodology is presented to maximize stiffness and load carrying capacity of a metal matrix cross-ply laminate at elevated temperatures. The fabrication process and fiber volume ratio are used as the design variables. A unique feature is the concurrent treatment of effects from fabrication, residual stresses, material nonlinearity, and thermo-mechanical loading on the laminate properties at the post-fabrication phase. For a (0/90)_s graphite/copper laminate, strong coupling was observed between the fabrication process, laminate characteristics, and thermo-mechanical loading. The multi-objective tailoring was found to be more effective than single objective tailoring. Results indicate the potential to increase laminate stiffness and load carrying capacity by controlling the critical parameters of the fabrication process and the laminate.

  10. Multi-physics optimization of three-dimensional microvascular polymeric components

    NASA Astrophysics Data System (ADS)

    Aragón, Alejandro M.; Saksena, Rajat; Kozola, Brian D.; Geubelle, Philippe H.; Christensen, Kenneth T.; White, Scott R.

    2013-01-01

    This work discusses the computational design of microvascular polymeric materials, which aim at mimicking the behavior found in some living organisms that contain a vascular system. The optimization of the topology of the embedded three-dimensional microvascular network is carried out by coupling a multi-objective constrained genetic algorithm with a finite-element based physics solver, the latter validated through experiments. The optimization is carried out on multiple conflicting objective functions, namely the void volume fraction left by the network, the energy required to drive the fluid through the network and the maximum temperature when the material is subjected to thermal loads. The methodology presented in this work results in a viable alternative for the multi-physics optimization of these materials for active-cooling applications.

  11. Two-Nucleon Systems in a Finite Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briceno, Raul

    2014-11-01

    I present the formalism and methodology for determining the nucleon-nucleon scattering parameters from the finite volume spectra obtained from lattice quantum chromodynamics calculations. Using the recently derived energy quantization conditions and the experimentally determined scattering parameters, the bound state spectra for finite volume systems with overlap with the 3S1-3D1 channel are predicted for a range of volumes. It is shown that the extractions of the infinite-volume deuteron binding energy and the low-energy scattering parameters, including the S-D mixing angle, are possible from Lattice QCD calculations of two-nucleon systems with boosts of |P| ≤ 2π√3/L in volumes with spatial extents L satisfying … fm ≲ L ≲ 14 fm.

  12. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 4: Conceptual design report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. In the first step of this task, a methodology was developed to ensure that all relevant design dimensions were addressed and that all feasible designs could be considered. The development effort yielded the following method for generating and comparing designs in task 4: (1) Extract SCS system requirements (functions) from the system specification; (2) Develop design evaluation criteria; (3) Identify system architectural dimensions relevant to SCS system designs; (4) Develop conceptual designs based on the system requirements and architectural dimensions identified in steps 1 and 3 above; (5) Evaluate the designs with respect to the design evaluation criteria developed in step 2 above. The results of the method detailed in these five steps are discussed. The task 4 work provides the set of designs from which two or three candidate designs are to be selected by MSFC as input to task 5, refine SCS conceptual designs. The designs selected for refinement will be developed to a lower level of detail, and further analyses will be done to begin to determine the size and speed of the components required to implement these designs.

  13. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  14. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    ERIC Educational Resources Information Center

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology which helps to avoid a great amount of costly experimental research. This methodology includes thermo-gas dynamic design of an engine and its mounts, the profiling of compressor flow path and cascade design of guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…

  15. Partial volume correction of brain perfusion estimates using the inherent signal data of time-resolved arterial spin labeling.

    PubMed

    Ahlgren, André; Wirestam, Ronnie; Petersen, Esben Thade; Ståhlberg, Freddy; Knutsson, Linda

    2014-09-01

    Quantitative perfusion MRI based on arterial spin labeling (ASL) is hampered by partial volume effects (PVEs), arising due to voxel signal cross-contamination between different compartments. To address this issue, several partial volume correction (PVC) methods have been presented. Most previous methods rely on segmentation of a high-resolution T1-weighted morphological image volume that is coregistered to the low-resolution ASL data, making the result sensitive to errors in the segmentation and coregistration. In this work, we present a methodology for partial volume estimation and correction, using only low-resolution ASL data acquired with the QUASAR sequence. The methodology consists of a T1-based segmentation method, with no spatial priors, and a modified PVC method based on linear regression. The presented approach thus avoids prior assumptions about the spatial distribution of brain compartments, while also avoiding coregistration between different image volumes. Simulations based on a digital phantom as well as in vivo measurements in 10 volunteers were used to assess the performance of the proposed segmentation approach. The simulation results indicated that QUASAR data can be used for robust partial volume estimation, and this was confirmed by the in vivo experiments. The proposed PVC method yielded plausible perfusion maps, comparable to a reference method based on segmentation of a high-resolution morphological scan. Corrected gray matter (GM) perfusion was 47% higher than uncorrected values, suggesting a significant amount of PVEs in the data. Whereas the reference method failed to completely eliminate the dependence of perfusion estimates on the volume fraction, the novel approach produced GM perfusion values independent of GM volume fraction. The intra-subject coefficient of variation of corrected perfusion values was lowest for the proposed PVC method. As shown in this work, low-resolution partial volume estimation in connection with ASL perfusion estimation is feasible, and provides a promising tool for decoupling perfusion and tissue volume.
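
    The paper's modified regression method is not reproduced in this record; the following is a generic sketch of the linear-regression family of PVC methods it builds on, in which the ASL difference signal inside a local kernel is modelled as a mixture of GM and WM contributions weighted by their partial volume fractions. Array names and the kernel size are illustrative assumptions.

      import numpy as np

      def lr_partial_volume_correction(asl, p_gm, p_wm, kernel=5):
          """Regression-based partial volume correction (one slice): within a local
          kernel around each voxel, solve asl ~ p_gm * m_gm + p_wm * m_wm in the
          least-squares sense and assign the estimates to the central voxel.

          asl, p_gm, p_wm : 2-D float arrays of ASL signal and tissue fractions
          Returns per-voxel GM and WM magnetization (perfusion-weighted) estimates."""
          ny, nx = asl.shape
          r = kernel // 2
          m_gm = np.zeros_like(asl, dtype=float)
          m_wm = np.zeros_like(asl, dtype=float)
          for y in range(ny):
              for x in range(nx):
                  ys = slice(max(0, y - r), y + r + 1)
                  xs = slice(max(0, x - r), x + r + 1)
                  A = np.column_stack([p_gm[ys, xs].ravel(), p_wm[ys, xs].ravel()])
                  b = asl[ys, xs].ravel()
                  if np.linalg.matrix_rank(A) < 2:      # not enough local tissue contrast
                      continue
                  coef, *_ = np.linalg.lstsq(A, b, rcond=None)
                  m_gm[y, x], m_wm[y, x] = coef
          return m_gm, m_wm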

  16. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented with brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  17. A multifractal approach to space-filling recovery for PET quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O.; Tsoumpas, Charalampos

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV_mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic 18F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical 18F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV_mean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
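
    A generic box-counting estimate of the space-filling (fractal) dimension of a binary volume is sketched below to illustrate the underlying idea; it is not the paper's multifractal implementation and omits the PET-specific handling of nonzero background and the TLA correction.

      import numpy as np

      def box_counting_dimension(mask, box_sizes):
          """Box-counting dimension of a binary 3-D mask (e.g., a thresholded lesion
          volume): count occupied boxes at several scales and fit the slope of
          log N versus log(1/s)."""
          counts = []
          for s in box_sizes:
              pad = [(0, (-d) % s) for d in mask.shape]     # pad so boxes tile the volume
              m = np.pad(mask, pad)
              view = m.reshape(m.shape[0] // s, s, m.shape[1] // s, s, m.shape[2] // s, s)
              counts.append(view.any(axis=(1, 3, 5)).sum())
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
          return slope

      # Sanity check on a solid sphere, whose dimension should come out close to 3
      z, y, x = np.mgrid[:64, :64, :64]
      sphere = ((z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2) < 20 ** 2
      print(box_counting_dimension(sphere, [2, 4, 8, 16]))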

  18. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
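
    A minimal sketch of the inversion-level bookkeeping described above is given below, using the EKV-style specific current and a commonly used interpolation for transconductance efficiency; the numeric parameter values (slope factor n, mobility-capacitance product, thermal voltage) are illustrative assumptions rather than values from the paper.

      import numpy as np

      def inversion_coefficient(i_d, w, l, n=1.3, mu_cox=300e-6, ut=0.0258):
          """Inversion coefficient IC = I_D / I_spec with the EKV-style specific
          current I_spec = 2 * n * (mu*Cox) * (W/L) * U_T**2. IC << 1 indicates weak
          inversion, IC ~ 1 moderate inversion, IC >> 1 strong inversion."""
          i_spec = 2.0 * n * mu_cox * (w / l) * ut ** 2
          return i_d / i_spec

      def gm_over_id(ic, n=1.3, ut=0.0258):
          """Common EKV interpolation for transconductance efficiency,
          gm/I_D = 1 / (n * U_T * (0.5 + sqrt(0.25 + IC))), spanning weak,
          moderate and strong inversion."""
          return 1.0 / (n * ut * (0.5 + np.sqrt(0.25 + ic)))

      ic = inversion_coefficient(i_d=1e-6, w=10e-6, l=1e-6)   # ~0.19: moderate inversion
      print(ic, gm_over_id(ic))                               # gm/I_D ~ 26 S/A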

  19. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.

  20. Mod-5A wind turbine generator program design report. Volume 3: Final design and system description, book 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The design, development and analysis of the 7.3 MW MOD-5A wind turbine generator is documented. The report is divided into four volumes: Volume 1 summarizes the entire MOD-5A program, Volume 2 discusses the conceptual and preliminary design phases, Volume 3 describes the final design of the MOD-5A, and Volume 4 contains the drawings and specifications developed for the final design. Volume 3, book 2 describes the performance and characteristics of the MOD-5A wind turbine generator in its final configuration. The power generation, control, and instrumentation subsystems are described in detail. The manufacturing and construction plans, and the preparation of a potential site on Oahu, Hawaii, are documented. The quality assurance and safety plan, and analyses of failure modes and effects, and reliability, availability and maintainability are presented.

  1. Mechanistic-empirical Pavement Design Guide Implementation

    DOT National Transportation Integrated Search

    2010-06-01

    The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provides a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...

  2. Evaluation of methodologies for interpolation of data for hydrological modeling in glacierized basins with limited information

    NASA Astrophysics Data System (ADS)

    Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier

    2017-04-01

    The availability and consistency of data are determining factors for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data is not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables until it ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when dealing with the assessment of current and future water resources. In this context, we present an evaluation of different methodologies for interpolating temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited information. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data have been available from nearby monitoring stations since 1983. The study period was 1983-1999, with a validation period of 1993-1999. For temperature series, the aim was to extend the observed data and interpolate it. NCEP reanalysis data were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very similar results; by parsimony, simple correlation is the preferable choice. For precipitation series, the aim was to interpolate observed data. Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate precipitation in high-altitude zones, and 2) ordinary kriging (OK), whose variograms were calculated from the multi-annual monthly mean precipitation and applied to the whole study period. OK leads to better results in both low- and high-altitude zones. For ice volume, the aim was to estimate values from historical data: 1) with the GlabTop algorithm, which requires digital elevation models that are only available at an appropriate scale since 2009, and 2) with a widely applied but controversially discussed glacier area-volume relation whose parameters were calibrated against results from the GlabTop model. Both methodologies provide reasonable results, but for historical data the area-volume scaling only requires the glacier area, which is easy to calculate from satellite images available since 1986. In conclusion, simple correlation, OK, and the calibrated area-volume relation proved to be the best ways to interpolate glacio-climatic information. However, these methods must be applied carefully and revisited for each specific, highly complex situation. This is a first step toward identifying the most appropriate methods to interpolate and extend observed data in glacierized basins with limited information. Further research should evaluate other methodologies and meteorological datasets in order to improve hydrological models and water management policies.
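
    A minimal sketch of two of the ingredients discussed above is given below: the glacier volume-area scaling V = c * A**gamma (with commonly quoted literature coefficients that, as in the study, would be recalibrated against GlabTop results for the basin of interest) and a basic Inverse Distance Weighting interpolator. Both are generic illustrations rather than the study's calibrated implementations, and all numbers are assumptions.

      import numpy as np

      def glacier_volume_from_area(area_km2, c=0.034, gamma=1.375):
          """Glacier ice volume (km^3) from glacier area (km^2) using the widely
          applied volume-area scaling V = c * A**gamma; the c and gamma shown are
          commonly quoted literature values used here only as placeholders."""
          return c * np.asarray(area_km2, dtype=float) ** gamma

      def idw(xy_stations, values, xy_target, power=2.0):
          """Inverse Distance Weighting of station values to a target point, the
          first of the two precipitation interpolation options discussed above."""
          d = np.linalg.norm(np.asarray(xy_stations) - np.asarray(xy_target), axis=1)
          w = 1.0 / np.maximum(d, 1e-9) ** power
          return float(np.sum(w * np.asarray(values)) / np.sum(w))

      print(glacier_volume_from_area([1.0, 5.0, 20.0]))                      # km^3
      print(idw([(0, 0), (10, 0), (0, 10)], [800.0, 950.0, 700.0], (3, 3)))  # mm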

  3. Miniaturized Air-to-Refrigerant Heat Exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radermacher, Reinhard; Bacellar, Daniel; Aute, Vikrant

    Air-to-refrigerant Heat eXchangers (HX) are an essential component of Heating, Ventilation, Air-Conditioning, and Refrigeration (HVAC&R) systems, serving as the main heat transfer component. The major limiting factor to HX performance is the large airside thermal resistance. Recent literature aims at improving heat transfer performance by utilizing enhancement methods such as fins and small tube diameters; this has led to almost exhaustive research on the microchannel HX (MCHX). The objective of this project is to develop a miniaturized air-to-refrigerant HX with at least 20% reduction in volume, material volume, and approach temperature compared to current state-of-the-art multiport flat tube designs and also be capable of production within five years. Moreover, the proposed HX’s are expected to have good water drainage and should succeed in both evaporator and condenser applications. The project leveraged Parallel-Parametrized Computational Fluid Dynamics (PPCFD) and Approximation-Assisted Optimization (AAO) techniques to perform multi-scale analysis and shape optimization with the intent of developing novel HX designs whose thermal-hydraulic performance exceeds that of state-of-the-art MCHX. Nine heat exchanger geometries were initially chosen for detailed analysis, selected from 35+ geometries which were identified in previous work at the University of Maryland, College Park. The newly developed optimization framework was exercised for three design optimization problems: (DP I) 1.0kW radiator, (DP II) 10kW radiator and (DP III) 10kW two-phase HX. DP I consisted of the design and optimization of 1.0kW air-to-water HX’s which exceeded the project requirements of 20% volume/material reduction and 20% better performance. Two prototypes for the 1.0kW HX were prototyped, tested and validated using newly-designed airside and refrigerant side test facilities. DP II, a scaled version of DP I for 10kW air-to-water HX applications, also yielded optimized HX designs which met project requirements. Attempts to prototype a 10kW HX have presented unique manufacturing challenges, especially regarding tube blockages and structural stability. DP III comprised optimizing two-phase HX’s for a 3.0 ton capacity in a heat pump / air-conditioning unit for cooling mode application using R410A as the working fluid. The HX’s theoretically address the project requirements. System-level analysis showed the HX’s achieved up to 15% improvement in COP while also reducing overall unit charge by 30-40%. The project methodology was capable of developing HX’s which can outperform current state-of-the-art MCHX by at least 20% reduction in volume, material volume, and approach temperature. Additionally, the capability for optimization using refrigerant charge as an objective function was developed. The five-year manufacturing feasibility of the proposed HX’s was shown to have a good outlook. Successful prototyping through both conventional manufacturing methods and next generation methods such as additive manufacturing was achieved.

  4. Results, Knowledge, and Attitudes Regarding an Incentive Compensation Plan in a Hospital-Based, Academic, Employed Physician Multispecialty Group.

    PubMed

    Dolan, Robert W; Nesto, Richard; Ellender, Stacey; Luccessi, Christopher

    Hospitals and healthcare systems are introducing incentive metrics into compensation plans that align with value-based payment methodologies. These incentive measures should be considered a practical application of the transition from volume to value and will likely replace traditional productivity-based compensation in the future. During the transition, there will be provider resistance and implementation challenges. This article examines a large multispecialty group's experience with a newly implemented incentive compensation plan including the structure of the plan, formulas for calculation of the payments, the mix of quality and productivity metrics, and metric threshold achievement. Three rounds of surveys with comments were collected to measure knowledge and attitudes regarding the plan. Lessons learned and specific recommendations for success are described. The participant's knowledge and attitudes regarding the plan are important considerations and affect morale and engagement. Significant provider dissatisfaction with the plan was found. Careful metric selection, design, and management are critical activities that will facilitate provider acceptance and support. Improvements in data collection and reporting will be needed to produce reliable metrics that can supplant traditional volume-based productivity measures.

  5. Quality characteristics of gluten free bread from barnyard millet-soy flour blends.

    PubMed

    Chakraborty, Subir K; Gupta, Saumya; Kotwaliwale, Nachiket

    2016-12-01

    The effects of leavened-bread formulation, using varying levels (per 50 g of base flour) of soy flour in soy flour-barnyard millet blends (5.74, 6.25, 7, 7.75 and 8.26 g of soy flour), yeast (1.83, 2, 2.25, 2.5 and 2.67 g) and salt (0.63, 0.8, 1.05, 1.30 and 1.47 g), on textural, colour and specific volume properties were determined. A central composite rotatable design of response surface methodology was used to plan the experiments. The second order models obtained were observed to be statistically significant and capable of demonstrating the effects of input variables on the responses. All the textural properties were affected significantly by the amount of soy flour and yeast in the dough. Soy flour had a significant effect on the colour of the bread, making it browner. The interaction of soy flour and yeast affected the specific volume to the greatest extent. A two-tailed t-test established the efficacy of the models, as no significant difference was observed between the predicted and the actual values.
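
    As an illustration of the experimental plan, the sketch below generates the coded design matrix of a rotatable central composite design for three factors; with a step of 0.75 g about a 7 g centre point, the coded soy flour levels map onto the 5.74-8.26 g range quoted above (alpha = 8**0.25, approximately 1.682). The number of centre-point replicates is an assumption.

      import numpy as np
      from itertools import product

      def central_composite_rotatable(k, n_center=6):
          """Coded design matrix of a rotatable CCD in k factors: 2**k factorial
          points at +/-1, 2*k axial (star) points at +/-alpha with
          alpha = (2**k)**0.25 (rotatability condition), plus centre points."""
          alpha = (2 ** k) ** 0.25
          factorial = np.array(list(product([-1.0, 1.0], repeat=k)))
          axial = np.zeros((2 * k, k))
          for i in range(k):
              axial[2 * i, i] = -alpha
              axial[2 * i + 1, i] = alpha
          center = np.zeros((n_center, k))
          return np.vstack([factorial, axial, center])

      design = central_composite_rotatable(3)        # 8 + 6 + 6 = 20 runs
      soy_flour_g = 7.0 + 0.75 * design[:, 0]        # coded level -> grams of soy flour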

  6. Recent Progress in Synthesis and Application of Low-Dimensional Silicon Based Anode Material for Lithium Ion Battery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yuandong; Liu, Kewei; Zhu, Yu

    Silicon is regarded as the next generation anode material for LIBs with its ultra-high theoretical capacity and abundance. Nevertheless, the severe capacity degradation resulting from the huge volume change and accumulative solid-electrolyte interphase (SEI) formation hinders the silicon based anode material for further practical applications. Hence, a variety of methods have been applied to enhance electrochemical performances in terms of the electrochemical stability and rate performance of the silicon anodes such as designing nanostructured Si, combining with carbonaceous material, exploring multifunctional polymer binders, and developing artificial SEI layers. Silicon anodes with low-dimensional structures (0D, 1D, and 2D), compared with bulky silicon anodes, are strongly believed to have several advanced characteristics including larger surface area, fast electron transfer, and shortened lithium diffusion pathway as well as better accommodation with volume changes, which leads to improved electrochemical behaviors. Finally, in this review, recent progress of silicon anode synthesis methodologies generating low-dimensional structures for lithium ion batteries (LIBs) applications is listed and discussed.

  7. Recent Progress in Synthesis and Application of Low-Dimensional Silicon Based Anode Material for Lithium Ion Battery

    DOE PAGES

    Sun, Yuandong; Liu, Kewei; Zhu, Yu

    2017-07-31

    Silicon is regarded as the next generation anode material for LIBs with its ultra-high theoretical capacity and abundance. Nevertheless, the severe capacity degradation resulting from the huge volume change and accumulative solid-electrolyte interphase (SEI) formation hinders the silicon based anode material for further practical applications. Hence, a variety of methods have been applied to enhance electrochemical performances in terms of the electrochemical stability and rate performance of the silicon anodes such as designing nanostructured Si, combining with carbonaceous material, exploring multifunctional polymer binders, and developing artificial SEI layers. Silicon anodes with low-dimensional structures (0D, 1D, and 2D), compared with bulky silicon anodes, are strongly believed to have several advanced characteristics including larger surface area, fast electron transfer, and shortened lithium diffusion pathway as well as better accommodation with volume changes, which leads to improved electrochemical behaviors. Finally, in this review, recent progress of silicon anode synthesis methodologies generating low-dimensional structures for lithium ion batteries (LIBs) applications is listed and discussed.

  8. Contrast media extravasations in patients undergoing computerized tomography scanning: a systematic review and meta-analysis of risk factors and interventions.

    PubMed

    Ding, Sandrine; Meystre, Nicole Richli; Campeanu, Cosmin; Gullo, Giuseppe

    2018-01-01

    To identify risk factors and interventions preventing or reducing contrast medium extravasation. Computed tomography (CT) is a radiological examination essential for the diagnosis and monitoring of many diseases. It is often performed with the intravenous (IV) injection of contrast agents. Use of these products can result in a significant complication, extravasation, which is the accidental leakage of IV material into the surrounding tissue. Patients may feel a sharp pain and skin ulceration or necrosis may develop. This review considered studies that included patients (adults and children) undergoing a CT with IV administration of contrast media. The risk factors considered were patient demographics, comorbidities and medication history. This review also investigated any strategies related to: contrast agent, injection per se, material used for injection, apparatus used, healthcare professionals involved, and patient risk assessment performed by the radiology personnel. The comparators were other interventions or usual care. This review investigated randomized controlled trials and non-randomized controlled trials. When neither of these were available, other study designs, such as prospective and retrospective cohort studies, case-control studies and case series, were considered for inclusion. Primary outcomes considered were: extravasation frequency, volume, severity and complications. The databases PubMed, CINAHL, Embase, the Cochrane Register of Controlled Trials, Web of Science PsycINFO, ProQuest Dissertations and Theses A&I, TRIP Database and ClinicalTrials.gov were searched to find both published and unpublished studies from 1980 to September 2016. Papers were assessed by two independent reviewers for methodological validity using the Joanna Briggs Institute System for the Unified Management, Assessment and Review of Information (JBI SUMARI). Data were extracted using the standardized data extraction tool from JBI SUMARI. In one case, quantitative data from two cohort studies were pooled in a statistical meta-analysis. However, generally, statistical pooling was not possible due to heterogeneity of the interventions, populations of interest or outcomes. Accordingly, the findings have been presented in narrative form. Fifteen articles were selected from a total of 2151 unique studies identified. Two were randomized controlled trials and 13 were quasi-experimental and observational studies. The quality of these studies was judged to be low to moderate. Some patient characteristics, such as female sex and inpatient status, appeared to be risk factors for extravasation. Additionally, injection rate, venous access site and catheter dwelling time could affect the volume extravasated. Preliminary studies seemed to indicate the potential of extravasation detection accessories to identify extravasation and reduce the volume extravasated. The other interventions either did not result in significant reduction in the frequency/volume of extravasation, or the results were mixed across the studies. The majority of the studies included in this review evaluated the outcomes of extravasation frequency and volume. Given the quality of the primary studies, this systematic review identified only potential risk factors and interventions. It further highlighted the research gap in this area and the importance of conducting trials with solid methodological designs.

  9. A blind trial evaluation of a crime scene methodology for deducing impact velocity and droplet size from circular bloodstains.

    PubMed

    Hulse-Smith, Lee; Illes, Mike

    2007-01-01

    In a previous study, mechanical engineering models were utilized to deduce impact velocity and droplet volume of circular bloodstains by measuring stain diameter and counting spines radiating from their outer edge. A blind trial study was subsequently undertaken to evaluate the accuracy of this technique, using an applied, crime scene methodology. Calculations from bloodstains produced on paper, drywall, and wood were used to derive surface-specific equations to predict 39 unknown mock crime scene bloodstains created over a range of impact velocities (2.2-5.7 m/sec) and droplet volumes (12-45 microL). Strong correlations were found between expected and observed results, with correlation coefficients ranging between 0.83 and 0.99. The 95% confidence limit associated with predictions of impact velocity and droplet volume was calculated for paper (0.28 m/sec, 1.7 microL), drywall (0.37 m/sec, 1.7 microL), and wood (0.65 m/sec, 5.2 microL).
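
    The surface-specific regression equations themselves are not given in this record; the sketch below only illustrates the inversion step, solving a pair of calibrated power-law relations of the kind used in this line of work (stain diameter versus Reynolds number, spine count versus Weber number) for impact velocity and droplet size. The coefficients a, b and c are hypothetical placeholders standing in for the calibrated, surface-specific constants, and the fluid properties are approximate values for blood.

      import numpy as np
      from scipy.optimize import fsolve

      RHO, MU, SIGMA = 1060.0, 4.0e-3, 0.056   # approx. blood density, viscosity, surface tension (SI)

      def predict_velocity_and_volume(stain_d, n_spines, a=0.5, b=0.25, c=1.14):
          """Invert two calibrated relations, D_stain = a * d0 * Re**b and
          N_spines = c * sqrt(We), with Re = rho*v*d0/mu and We = rho*v**2*d0/sigma,
          for impact velocity v [m/s], droplet diameter d0 [m] and droplet volume.
          The coefficients are placeholders and must be calibrated per surface."""
          def residuals(x):
              v, d0 = np.exp(x)                       # solve in log space to keep v, d0 positive
              re = RHO * v * d0 / MU
              we = RHO * v ** 2 * d0 / SIGMA
              return [a * d0 * re ** b - stain_d, c * np.sqrt(we) - n_spines]
          v, d0 = np.exp(fsolve(residuals, x0=np.log([3.0, 3.0e-3])))
          volume_uL = (np.pi / 6.0) * d0 ** 3 * 1e9   # sphere volume, m^3 -> microlitres
          return v, d0, volume_uL

      print(predict_velocity_and_volume(stain_d=14e-3, n_spines=30))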

  10. Systematic review of communication partner training in aphasia: methodological quality.

    PubMed

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.

  11. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  12. Dual-mode plasmonic nanorod type antenna based on the concept of a trapped dipole.

    PubMed

    Panaretos, Anastasios H; Werner, Douglas H

    2015-04-06

    In this paper we theoretically investigate the feasibility of creating a dual-mode plasmonic nanorod antenna. The proposed design methodology relies on adapting to optical wavelengths the principles of operation of trapped dipole antennas, which have been widely used in the low MHz frequency range. This type of antenna typically employs parallel LC circuits, also referred to as "traps", which are connected along the two arms of the dipole. By judiciously choosing the resonant frequency of these traps, as well as their position along the arms of the dipole, it is feasible to excite the λ/2 resonance of both the original dipole as well as the shorter section defined by the length of wire between the two traps. This effectively enables the dipole antenna to have a dual-mode of operation. Our analysis reveals that the implementation of this concept at the nanoscale requires that two cylindrical pockets (i.e. loading volumes) be introduced along the length of the nanoantenna, inside which plasmonic core-shell particles are embedded. By properly selecting the geometry and constitution of the core-shell particle as well as the constitution of the host material of the two loading volumes and their position along the nanorod, the equivalent effect of a resonant parallel LC circuit can be realized. This effectively enables a dual-mode operation of the nanorod antenna. The proposed methodology introduces a compact approach for the realization of dual-mode optical sensors while at the same time it clearly illustrates the inherent tuning capabilities that core-shell particles can offer in a practical framework.

  13. Analysis of on-line clinical laboratory manuals and practical recommendations.

    PubMed

    Beckwith, Bruce; Schwartz, Robert; Pantanowitz, Liron

    2004-04-01

    On-line clinical laboratory manuals are a valuable resource for medical professionals. To our knowledge, no recommendations currently exist for their content or design. To analyze publicly accessible on-line clinical laboratory manuals and to propose guidelines for their content. We conducted an Internet search for clinical laboratory manuals written in English with individual test listings. Four individual test listings in each manual were evaluated for 16 data elements, including sample requirements, test methodology, units of measure, reference range, and critical values. Web sites were also evaluated for supplementary information and search functions. We identified 48 on-line laboratory manuals, including 24 academic or community hospital laboratories and 24 commercial or reference laboratories. All manuals had search engines and/or test indices. No single manual contained all 16 data elements evaluated. An average of 8.9 (56%) elements were present (range, 4-14). Basic sample requirements (specimen and volume needed) were the elements most commonly present (98% of manuals). The frequency of the remaining data elements varied from 10% to 90%. On-line clinical laboratory manuals originate from both hospital and commercial laboratories. While most manuals were user-friendly and contained adequate specimen-collection information, other important elements, such as reference ranges, were frequently absent. To ensure that clinical laboratory manuals are of maximal utility, we propose the following 13 data elements be included in individual test listings: test name, synonyms, test description, test methodology, sample requirements, volume requirements, collection guidelines, transport guidelines, units of measure, reference range, critical values, test availability, and date of latest revision.
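
    A minimal sketch of how the 13 proposed data elements could be captured as a structured record is shown below; the field names follow the list in the abstract, and the class itself is purely illustrative.

      from dataclasses import dataclass
      from typing import List, Optional

      @dataclass
      class TestListing:
          """One individual test listing carrying the 13 proposed data elements."""
          test_name: str
          synonyms: List[str]
          test_description: str
          test_methodology: str
          sample_requirements: str          # specimen type
          volume_requirements: str
          collection_guidelines: str
          transport_guidelines: str
          units_of_measure: str
          reference_range: str
          critical_values: Optional[str]    # not every analyte has critical values
          test_availability: str
          date_of_latest_revision: str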

  14. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.

  15. Modeling and characterization of through-the-thickness properties of 3D woven composites

    NASA Technical Reports Server (NTRS)

    Hartranft, Dru; Pravizi-Majidi, Azar; Chou, Tsu-Wei

    1995-01-01

    The through-the-thickness properties of three-dimensionally (3D) woven carbon/epoxy composites have been studied. The investigation aimed at the evaluation and development of test methodologies for property characterization in the thickness direction, and at establishing the influence of fiber architecture. The fiber architectures studied included layer-to-layer Angle Interlock and through-the-thickness Orthogonal weaves; an Orthogonal woven preform with surface pile was also designed and manufactured for the fabrication of tensile test coupons with integrated grips. All the preforms were infiltrated by the resin transfer molding technique. The microstructures of the composites were characterized along the warp and fill (weft) directions to determine the degree of yarn undulation, yarn cross-sectional shapes, and microstructural dimensions. These parameters were correlated to the fiber architecture. Specimens were designed and tested for the direct measurement of the through-the-thickness tensile, compressive, and shear properties of the composites. Design optimization was conducted through the analysis of the stress fields within the specimen coupled with experimental verification. The experimentally derived elastic properties in the thickness direction compared well with analytical predictions obtained from a volume averaging model.

  16. Thermal power systems, small power systems application project. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    Current small power system technology as applied to power plants up to 10 MWe in size was assessed. Markets for small power systems were characterized and cost goals were established. Candidate power plant system design concepts were selected for evaluation and preliminary performance and cost assessments were made. Economic studies were conducted and breakeven capital costs were determined for leading contenders among the candidate systems. An application study was made of the potential use of small power systems in providing part of the demand for pumping power by the extensive aqueduct system of California, estimated to be 1000 MWe by 1985. Criteria and methodologies were developed for application to the ranking of candidate power plant system design concepts. Experimental power plants concepts of 1 MWe rating were studied leading toward the definition of a power plant configuration for subsequent detail design, construction, testing and evaluation as Engineering Experiment No. 1 (EE No. 1). Site selection criteria and ground rules for the solicitation of EE No. 1 site participation proposals by DOE were developed.

  17. STORMWATER BEST MANAGEMENT PRACTICES DESIGN GUIDE VOLUME 2 - VEGETATIVE BIOFILTERS

    EPA Science Inventory

    This document is Volume 2 of a three volume document that provides guidance on the selection and design of stormwater management Best Management Practices (BMPs). This second volume provides specific design guidance for a group of onsite BMP control practices that are referred t...

  18. Experimental Validation of an Integrated Controls-Structures Design Methodology

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.

    1996-01-01

    The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.

  19. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  20. Towards Methodologies for Building Knowledge-Based Instructional Systems.

    ERIC Educational Resources Information Center

    Duchastel, Philippe

    1992-01-01

    Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…

  1. One Controller at a Time (1-CAT): A mimo design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  2. General Methodology for Designing Spacecraft Trajectories

    NASA Technical Reports Server (NTRS)

    Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.

    2012-01-01

    A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
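
    As an illustration of the "system of nonlinear equations" form of the framework described above (not the methodology's actual tool chain), the sketch below poses a toy single-shooting targeting problem: find the departure velocity that carries a spacecraft from a circular orbit to a specified position after a fixed time of flight. All numbers are illustrative.

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import fsolve

      MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2 (illustrative setting)

      def two_body(t, s):
          """Point-mass two-body dynamics; state s = [x, y, z, vx, vy, vz] in km, km/s."""
          r, v = s[:3], s[3:]
          return np.concatenate([v, -MU * r / np.linalg.norm(r) ** 3])

      def final_position(v0, r0, tof):
          sol = solve_ivp(two_body, (0.0, tof), np.concatenate([r0, v0]), rtol=1e-10, atol=1e-10)
          return sol.y[:3, -1]

      r0 = np.array([7000.0, 0.0, 0.0])                      # departure position on a circular orbit
      v_guess = np.array([0.0, np.sqrt(MU / 7000.0), 0.0])   # circular-orbit velocity as initial guess
      r_target = np.array([-3000.0, 7000.0, 0.0])            # position to reach
      tof = 1800.0                                           # time of flight, s

      # Boundary condition expressed as a system of nonlinear equations in the decision variables (v0)
      residual = lambda v0: final_position(v0, r0, tof) - r_target
      v0 = fsolve(residual, v_guess)
      print(v0, np.linalg.norm(v0 - v_guess))   # required departure velocity and delta-v
      # Wrapping the same residual as an equality constraint of an optimizer, with an objective
      # such as total delta-v, gives the parameter-optimization form of the same framework.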

  3. Study of Manpower Requirements by Occupation for Alternative Technologies in the Energy-Related Industries, 1970-1990. Volumes I, IIA, and III.

    ERIC Educational Resources Information Center

    Gutmanis, Ivars; And Others

    The report presents the methodology used by the National Planning Association (NPA), under contract to the Federal Energy Administration (FEA), to estimate direct labor usage coefficients in some sixty different occupational categories involved in construction, operation, and maintenance of energy facilities. Volume 1 presents direct labor usage…

  4. Discourse Formation in Comparative Education. 4th, Revised Edition. Comparative Studies Series. Volume 10

    ERIC Educational Resources Information Center

    Schriewer, Jurgen, Ed.

    2012-01-01

    New theories and theory-based methodological approaches have found their way into Comparative Education--just as into Comparative Social Science more generally--in increasing number in the recent past. The essays of this volume express and critically discuss quite a range of these positions such as, inter alia, the theory of self-organizing social…

  5. Critical Behaviors in Psychiatric-Mental Health Nursing. Volume 1: A Survey of Mental Health Nursing Practices.

    ERIC Educational Resources Information Center

    Jacobs, Angeline Marchese; And Others

    This document describes the methodology followed in obtaining abstracts (see volumes 2 and 3) of more than 8,000 critical behaviors of nurses and attendants in delivering care in 50 psychiatric and mental health facilities throughout the country. The abstracts were derived from reports of actual observations by 1,785 mental health practitioners in…

  6. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 5. Structural, Voltage, and Radio Frequency Interference Tests

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  7. Curriculum: Toward New Identities. Critical Education Practice, Volume 12. Garland Reference Library of Social Science, Volume 1135.

    ERIC Educational Resources Information Center

    Pinar, William F., Ed.

    This collection of essays draws upon research in political, feminist, theological, literary, and racial theory to examine research methodologies relating to curriculum studies. The essays are: (1) "Storying the Self: Life Politics and the Study of the Teacher's Life and Work" (Ivor F. Goodson); (2) "Curriculum, Transcendence, and Zen/Taoism:…

  8. Proceedings of the Workshop on Identification and Control of Flexible Space Structures, Volume 3

    NASA Technical Reports Server (NTRS)

    Rodriguez, G. (Editor)

    1985-01-01

    The results of a workshop on identification and control of flexible space structures are reported. This volume deals mainly with control theory and methodologies as they apply to space stations and large antennas. Experimental findings on integration and on dynamics and control are reported. Among the areas of control theory discussed were feedback, optimization, and parameter identification.

  9. Cuadernos de Autoformacion en Participacion Social. Principios y Valores. Volumen 1 (Self Instructional Notebooks on Social Participation. Principles and Values. Volume 1).

    ERIC Educational Resources Information Center

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The "Self-instructional Notes on Social Participation" series comprises six volumes intended as teaching aids for adult educators. The theoretical, methodological, informative, and practical elements of the series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  10. Detection and quantification of MS lesions using fuzzy topological principles

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.

    1996-04-01

    Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.
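
    The fuzzy-connectedness step can be illustrated generically: affinity between neighboring pixels falls off with their intensity difference, the strength of a path is its weakest affinity, and a pixel's connectedness to a seed is the strongest such path, computable with a Dijkstra-like max-min propagation. The sketch below runs this on a toy 2-D image; it is not the authors' 3-D dual-echo system, and the Gaussian affinity and final threshold are assumptions.

    ```python
    # Generic 2-D sketch of fuzzy connectedness (not the authors' 3-D dual-echo
    # MS system): path strength = weakest link, connectedness = strongest path
    # to a seed, computed by Dijkstra-like max-min propagation. The Gaussian
    # affinity and the final threshold are illustrative assumptions.
    import heapq
    import numpy as np

    def affinity(a, b, sigma=20.0):
        """Fuzzy affinity in [0, 1]; high when neighboring intensities are similar."""
        return float(np.exp(-((float(a) - float(b)) ** 2) / (2.0 * sigma ** 2)))

    def fuzzy_connectedness(image, seed):
        """Connectedness of every pixel to `seed` (max over paths of min affinity)."""
        h, w = image.shape
        conn = np.zeros((h, w))
        conn[seed] = 1.0
        heap = [(-1.0, seed)]                      # max-heap via negated strengths
        while heap:
            neg_s, (r, c) = heapq.heappop(heap)
            s = -neg_s
            if s < conn[r, c]:
                continue                           # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w:
                    cand = min(s, affinity(image[r, c], image[rr, cc]))
                    if cand > conn[rr, cc]:
                        conn[rr, cc] = cand
                        heapq.heappush(heap, (-cand, (rr, cc)))
        return conn

    # Toy image: a bright 4x4 "lesion" on a darker background plus noise.
    rng = np.random.default_rng(0)
    img = rng.normal(50, 3, (16, 16))
    img[6:10, 6:10] += 60
    conn = fuzzy_connectedness(img, seed=(8, 8))
    mask = conn > 0.5                              # fuzzy connected object
    print("segmented pixel count:", int(mask.sum()), "(true object size = 16 pixels)")
    ```

    The same propagation extends to 3-D by adding the through-slice neighbors, which is how a lesion volume rather than an area is accumulated.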

  11. Modeling of the effect of freezer conditions on the hardness of ice cream using response surface methodology.

    PubMed

    Inoue, K; Ochi, H; Habara, K; Taketsuka, M; Saito, H; Ichihashi, N; Iwatsuki, K

    2009-12-01

    The effect of conventional continuous freezer parameters [mix flow (L/h), overrun (%), drawing temperature (°C), cylinder pressure (kPa), and dasher speed (rpm)] on the hardness of ice cream under varying measured temperatures (-5, -10, and -15°C) was investigated systematically using response surface methodology (central composite face-centered design), and the relationships were expressed as statistical models. The range (maximum and minimum values) of each freezer parameter was set according to the actual capability of the conventional freezer and applicability to the manufacturing process. Hardness was measured using a penetrometer. These models showed that overrun and drawing temperature had significant effects on hardness. The models can be used to optimize freezer conditions to make ice cream of the least possible hardness under the highest overrun (120%) and a drawing temperature of approximately -5.5°C (slightly warmer than the lowest drawing temperature of -6.5°C) within the range of this study. With reference to the structural elements of the ice cream, we suggest that the volume of overrun and ice crystal content, ice crystal size, and fat globule destabilization affect the hardness of ice cream. In addition, the combination of a simple instrumental parameter and response surface methodology allows us to show the relation between freezer conditions and one of the most important properties, hardness, visually and quantitatively on the practical level.
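
    The modeling step can be sketched generically: lay out a face-centered central composite design in coded factors, fit a full quadratic model by least squares, and search the fitted surface for the softest setting. The two factors, the synthetic "hardness" response, and its coefficients below are assumptions, not the paper's data.

    ```python
    # Generic response-surface sketch (not the paper's data): fit a full quadratic
    # model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2 to a
    # face-centered central composite design in two coded factors.
    import numpy as np

    # Face-centered CCD in coded units: factorial, axial (alpha = 1), center points.
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [-1, 0], [1, 0], [0, -1], [0, 1],
                  [0, 0], [0, 0], [0, 0]], dtype=float)
    x1, x2 = X[:, 0], X[:, 1]          # e.g. coded overrun and drawing temperature

    # Synthetic "hardness": rises with x2, falls with x1, mild curvature + noise.
    rng = np.random.default_rng(1)
    y = 30 - 6 * x1 + 9 * x2 + 1.5 * x1 * x2 + 2 * x2**2 + rng.normal(0, 0.5, len(x1))

    # Design matrix of the full quadratic model, fitted by ordinary least squares.
    D = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    for name, c in zip(["b0", "x1", "x2", "x1*x2", "x1^2", "x2^2"], coef):
        print(f"{name:>6}: {c: .2f}")

    # Predicted hardness on a grid, to locate the softest setting in the coded region.
    g = np.linspace(-1, 1, 21)
    G1, G2 = np.meshgrid(g, g)
    pred = (coef[0] + coef[1] * G1 + coef[2] * G2 + coef[3] * G1 * G2
            + coef[4] * G1**2 + coef[5] * G2**2)
    i = np.unravel_index(pred.argmin(), pred.shape)
    print("softest predicted setting (coded x1, x2):", (G1[i], G2[i]))
    ```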

  12. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  13. 42 CFR 413.312 - Methodology for calculating rates.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES; OPTIONAL PROSPECTIVELY DETERMINED PAYMENT RATES FOR SKILLED NURSING FACILITIES Prospectively Determined Payment Rates for Low-Volume Skilled Nursing Facilities, for Cost Reporting Periods Beginning...

  14. Analysis of wind tunnel test results for a 9.39-per cent scale model of a VSTOL fighter/attack aircraft. Volume 2: Evaluation of prediction methodologies

    NASA Technical Reports Server (NTRS)

    Lummus, J. R.; Joyce, G. T.; O'Malley, C. D.

    1980-01-01

    An evaluation of current prediction methodologies to estimate the aerodynamic uncertainties identified for the E205 configuration is presented. This evaluation was accomplished by comparing predicted and wind tunnel test data in three major categories: untrimmed longitudinal aerodynamics; trimmed longitudinal aerodynamics; and lateral-directional aerodynamic characteristics.

  15. Proceedings of the tenth annual DOE low-level waste management conference: Session 2: Site performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-01

    This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)

  16. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology was developed for designing aging tests in which life prediction is paramount. The methodology builds upon experience with aging behavior in the material classes expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. This experience is reviewed, and the results are discussed in detail.

  17. The Applications of NASA Mission Technologies to the Greening of Human Impact

    NASA Technical Reports Server (NTRS)

    Sims, Michael H.

    2009-01-01

    I will give an overview talk about flight software systems, robotics technologies, and modeling for energy minimization as applied to vehicles and building infrastructure. A dominant issue in both the design and the operation of robotic spacecraft is the minimization of energy use. In the design and building of spacecraft, increased power is acquired only at the cost of additional mass and volume, and ultimately cost. Consequently, interplanetary spacecraft are designed with the minimum essential power, and those designs often incorporate careful timing of all power use. Operationally, the availability of power is the most influential constraint on the use of planetary surface robots such as the Mars Exploration Rovers. The amount of driving done, the amount of science accomplished, and indeed the survivability of the spacecraft itself are determined by the power available for use. For the Mars Exploration Rovers, four tools are used: (1) models of the rover and its thermal and power use; (2) predictive environmental models of power input and thermal environment; (3) fine-grained manipulation of power use; and (4) optimization, modeling, and planning tools. In this talk I will discuss possible applications of this methodology to minimizing power use on Earth, especially in buildings.

  18. On-Chip Magnetic Bead Manipulation and Detection Using a Magnetoresistive Sensor-Based Micro-Chip: Design Considerations and Experimental Characterization

    PubMed Central

    Gooneratne, Chinthaka P.; Kodzius, Rimantas; Li, Fuquan; Foulds, Ian G.; Kosel, Jürgen

    2016-01-01

    The remarkable advantages micro-chip platforms offer over cumbersome, time-consuming equipment currently in use for bio-analysis are well documented. In this research, a micro-chip that includes a unique magnetic actuator (MA) for the manipulation of superparamagnetic beads (SPBs), and a magnetoresistive sensor for the detection of SPBs is presented. A design methodology, which takes into account the magnetic volume of SPBs, diffusion and heat transfer phenomena, is presented with the aid of numerical analysis to optimize the parameters of the MA. The MA was employed as a magnetic flux generator and experimental analysis with commercially available COMPEL™ and Dynabeads® demonstrated the ability of the MA to precisely transport a small number of SPBs over long distances and concentrate SPBs to a sensing site for detection. Moreover, the velocities of COMPEL™ and Dynabead® SPBs were correlated to their magnetic volumes and were in good agreement with numerical model predictions. We found that 2.8 μm Dynabeads® travel faster, and can be attracted to a magnetic source from a longer distance, than 6.2 μm COMPEL™ beads at magnetic flux magnitudes of less than 10 mT. The micro-chip system could easily be integrated with electronic circuitry and microfluidic functions, paving the way for an on-chip biomolecule quantification device. PMID:27571084
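
    The reported link between bead velocity and magnetic volume follows from a simple force balance that is easy to sketch: the magnetophoretic force scales with the bead's magnetic volume and the field-gradient product, while Stokes drag scales with its radius. The bead susceptibilities, field, and gradient below are illustrative assumptions, not the paper's measured values.

    ```python
    # Back-of-the-envelope sketch of the force balance behind the velocity vs.
    # magnetic-volume correlation: magnetophoretic force F = (V * chi_eff / mu0) * B * dB/dx
    # balanced against Stokes drag 6*pi*eta*r*v. All bead and field numbers are
    # illustrative assumptions, not the paper's measured values.
    import math

    MU0 = 4e-7 * math.pi          # vacuum permeability [T*m/A]
    ETA = 1e-3                    # viscosity of water [Pa*s]

    def terminal_velocity(radius_m, chi_eff, B, dBdx):
        """Steady bead velocity where the magnetic force equals Stokes drag."""
        volume = (4.0 / 3.0) * math.pi * radius_m**3
        f_mag = (volume * chi_eff / MU0) * B * dBdx      # low-field, linear regime
        return f_mag / (6.0 * math.pi * ETA * radius_m)  # v = F / (6*pi*eta*r)

    B, dBdx = 5e-3, 100.0          # 5 mT field with a 100 T/m gradient (assumed)

    # Assumed effective susceptibilities; in this toy the smaller bead carries
    # more magnetic material per unit volume, so it ends up moving faster.
    beads = {
        "2.8 um bead (chi_eff = 0.8) ": (1.4e-6, 0.8),
        "6.2 um bead (chi_eff = 0.05)": (3.1e-6, 0.05),
    }
    for name, (r, chi) in beads.items():
        v = terminal_velocity(r, chi, B, dBdx)
        print(f"{name}: v ~ {v * 1e6:.1f} um/s")
    ```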

  19. Vapor Compression and Thermoelectric Heat Pumps for a Cascade Distillation Subsystem: Design and Experiment

    NASA Technical Reports Server (NTRS)

    Erickson, Lisa R.; Ungar, Eugene K.

    2012-01-01

    Humans on a spacecraft require significant amounts of water for drinking, food, hydration, and hygiene. Maximizing the reuse of wastewater while minimizing the use of consumables is critical for long-duration space exploration. One of the more promising consumable-free methods of reclaiming wastewater is the distillation/condensation process used in the Cascade Distillation Subsystem (CDS). The CDS heats wastewater to the point of vaporization, then condenses and cools the resulting water vapor. The CDS wastewater flow requires heating for evaporation, and the product water flow requires cooling for condensation. Performing the heating and cooling processes separately would require two separate units, each of which would demand large amounts of electrical power. Mass, volume, and power efficiencies can be obtained by heating the wastewater and cooling the condensate in a single heat pump unit. The present work describes and compares two competing heat pump methodologies that meet the needs of the CDS: 1) a series of mini-compressor vapor compression cycles and 2) a thermoelectric heat exchanger. In the paper, the CDS system-level requirements are outlined, the designs of the two heat pumps are described in detail, and the results of heat pump analysis and performance tests are provided. The mass, volume, and power requirements for each heat pump option are compared, and the advantages and disadvantages of each system are listed.
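
    The trade between the two options can be framed with a first-order energy balance: the evaporator duty for a given wastewater processing rate is set by the latent heat of vaporization, and each heat pump's electrical draw follows from its coefficient of performance. The processing rate and COP values in the sketch below are assumptions for illustration only, not the CDS requirements or test results.

    ```python
    # First-order sketch of the heat-pump trade (not the paper's test data): the
    # evaporator duty follows from the wastewater processing rate and latent heat,
    # and each option's electrical draw follows from an assumed coefficient of
    # performance (COP). Processing rate and COPs are illustrative assumptions.
    LATENT_HEAT = 2.26e6          # J/kg, latent heat of vaporization of water
    PROCESS_RATE_KG_PER_HR = 5.0  # assumed wastewater processing rate

    duty_w = PROCESS_RATE_KG_PER_HR / 3600.0 * LATENT_HEAT   # evaporator duty [W]

    # Assumed cooling COPs for the two candidate heat pumps.
    options = {"mini-compressor vapor compression": 2.5,
               "thermoelectric heat exchanger": 0.8}

    print(f"evaporator duty: {duty_w:.0f} W")
    for name, cop in options.items():
        power = duty_w / cop      # electrical power to move the duty at this COP
        print(f"{name:>35}: ~{power:.0f} W electrical")
    ```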

  20. On-Chip Magnetic Bead Manipulation and Detection Using a Magnetoresistive Sensor-Based Micro-Chip: Design Considerations and Experimental Characterization.

    PubMed

    Gooneratne, Chinthaka P; Kodzius, Rimantas; Li, Fuquan; Foulds, Ian G; Kosel, Jürgen

    2016-08-26

    The remarkable advantages micro-chip platforms offer over cumbersome, time-consuming equipment currently in use for bio-analysis are well documented. In this research, a micro-chip that includes a unique magnetic actuator (MA) for the manipulation of superparamagnetic beads (SPBs), and a magnetoresistive sensor for the detection of SPBs is presented. A design methodology, which takes into account the magnetic volume of SPBs, diffusion and heat transfer phenomena, is presented with the aid of numerical analysis to optimize the parameters of the MA. The MA was employed as a magnetic flux generator and experimental analysis with commercially available COMPEL™ and Dynabeads® demonstrated the ability of the MA to precisely transport a small number of SPBs over long distances and concentrate SPBs to a sensing site for detection. Moreover, the velocities of COMPEL™ and Dynabead® SPBs were correlated to their magnetic volumes and were in good agreement with numerical model predictions. We found that 2.8 μm Dynabeads® travel faster, and can be attracted to a magnetic source from a longer distance, than 6.2 μm COMPEL™ beads at magnetic flux magnitudes of less than 10 mT. The micro-chip system could easily be integrated with electronic circuitry and microfluidic functions, paving the way for an on-chip biomolecule quantification device.
