Science.gov

Sample records for risk-informed design methods

  1. Risk-Informed Monitoring, Verification and Accounting (RI-MVA). An NRAP White Paper Documenting Methods and a Demonstration Model for Risk-Informed MVA System Design and Operations in Geologic Carbon Sequestration

    SciTech Connect

    Unwin, Stephen D.; Sadovsky, Artyom; Sullivan, E. C.; Anderson, Richard M.

    2011-09-30

    This white paper accompanies a demonstration model that implements methods for the risk-informed design of monitoring, verification and accounting (RI-MVA) systems in geologic carbon sequestration projects. The intent is that this model will ultimately be integrated with, or interfaced with, the National Risk Assessment Partnership (NRAP) integrated assessment model (IAM). The RI-MVA methods described here apply optimization techniques in the analytical environment of NRAP risk profiles to allow systematic identification and comparison of the risk and cost attributes of MVA design options.

  2. An Example of Risk Informed Design

    NASA Technical Reports Server (NTRS)

    Banke, Rick; Grant, Warren; Wilson, Paul

    2014-01-01

    NASA Engineering requested a Probabilistic Risk Assessment (PRA) to compare the difference in the risk of Loss of Crew (LOC) and Loss of Mission (LOM) between different designs of a fluid assembly. They were concerned that the configuration favored by the design team was more susceptible to leakage than a second proposed design, but realized that a quantitative analysis comparing the risks of the two designs might strengthen their argument. The analysis showed that while the second design did improve the probability of LOC, it did not help from a probability-of-LOM perspective. This drove the analysis team to propose a minor design change that would drive the probability of LOM down considerably. The analysis also revealed another major risk driver that was not immediately obvious from a typical engineering study of the design and was therefore unexpected; none of the proposed alternatives addressed this risk. This type of trade study demonstrates the importance of performing a PRA in order to completely understand a system's design. It allows managers to use risk as another one of the commodities (e.g., mass, cost, schedule, fault tolerance) that can be traded early in the design of a new system.
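The kind of LOC/LOM trade described above can be sketched with a toy fault-probability comparison. Everything below is illustrative: the event probabilities, the two designs, and the grouping into LOC and LOM contributors are assumptions, not the NASA analysis.

```python
# Toy PRA-style trade study: combine independent basic-event probabilities
# into loss-of-mission (LOM) and loss-of-crew (LOC) estimates per design.
# All numbers are hypothetical.

def p_any(probs):
    """Probability that at least one of several independent failures occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Design A: baseline seal configuration. Design B: redundant seal that halves
# the leak contribution to LOC but leaves the LOM contributors unchanged.
designs = {
    "A": {"LOM": [1e-3, 5e-4], "LOC": [1e-4]},
    "B": {"LOM": [1e-3, 5e-4], "LOC": [5e-5]},
}

for name, events in designs.items():
    lom, loc = p_any(events["LOM"]), p_any(events["LOC"])
    print(f"Design {name}: P(LOM) ~ {lom:.2e}, P(LOC) ~ {loc:.2e}")
```

The sketch makes the abstract's point in miniature: a design change can lower one figure of merit (LOC) while leaving the other (LOM) essentially unchanged, which only becomes visible once both are quantified.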

  3. Risk Informed Design as Part of the Systems Engineering Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This slide presentation reviews the importance of Risk Informed Design (RID) as an important feature of the systems engineering process. RID is based on the principle that risk is a design commodity such as mass, volume, cost or power. It also reviews Probabilistic Risk Assessment (PRA) as it is used in the product life cycle in the development of NASA's Constellation Program.

  4. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants - Final Technical Report

    SciTech Connect

    Ritterbusch, Stanley; Golay, Michael; Duran, Felicia; Galyean, William; Gupta, Abhinav; Dimitrijevic, Vesna; Malsch, Marty

    2003-01-29

    OAK B188. This report summarizes methods proposed for risk-informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment as the primary decision-making tool.

  5. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk informed analysis are closely coupled with a firm understanding of the capability of NDE.

  6. Risk Informed Design and Analysis Criteria for Nuclear Structures

    SciTech Connect

    Salmon, Michael W.

    2015-06-17

    Target performance can be achieved by defining the design basis earthquake (DBE) ground motion from the results of a probabilistic seismic hazard assessment and by introducing known levels of conservatism in the design above the DBE. ASCE 4, ASCE 43, and DOE-STD-1020 define the DBE at an annual exceedance frequency of 4x10-4 and introduce only slight levels of conservatism in response; they assume code capacities target approximately a 98% non-exceedance probability (NEP). There is a need for a uniform target (98% NEP) for code developers (ACI, AISC, etc.) to aim for. In considering strengthening options, one must also weigh the cost against the risk reduction achieved.
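The risk logic behind these criteria can be illustrated numerically: convolve a power-law seismic hazard curve with a lognormal component fragility to obtain an annual failure frequency, and observe how a capacity set at roughly 98% NEP at the DBE pushes that frequency well below the DBE exceedance frequency itself. All parameter values below (hazard slope, fragility beta, DBE level) are assumed for illustration, not taken from the standards.

```python
# Illustrative seismic risk integral: annual P(fail) = integral over ground
# motion a of P(fail | a) * |dH/da|, with H(a) a power-law hazard curve and
# P(fail | a) a lognormal fragility. Parameters are assumptions.
import math

def lognorm_cdf(x, median, beta):
    """P(capacity <= x) for a lognormal fragility with given median and beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_failure_freq(k0, slope, median_cap, beta, a_lo=0.01, a_hi=10.0, n=20000):
    """Numerically integrate fragility(a) * |dH/da| for H(a) = k0 * a**-slope."""
    total, da = 0.0, (a_hi - a_lo) / n
    for i in range(n):
        a = a_lo + (i + 0.5) * da
        dH = k0 * slope * a ** (-slope - 1)          # |dH/da|
        total += lognorm_cdf(a, median_cap, beta) * dH * da
    return total

# Hazard anchored so that H(DBE = 0.5 g) = 4e-4 per year, with slope k = 2.
k = 2.0
k0 = 4e-4 * 0.5 ** k
# Capacity with ~98% NEP at the DBE: median = DBE * exp(2.054 * beta), beta = 0.4.
median = 0.5 * math.exp(2.054 * 0.4)
print(f"annual P(fail) ~ {annual_failure_freq(k0, k, median, 0.4):.2e}")
```

With these assumed parameters the computed failure frequency comes out a factor of a few below the 4x10-4 hazard frequency, which is the sense in which conservatism above the DBE buys margin.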

  7. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity conducted during fiscal year 2014 within the Risk Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in probabilistic risk assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor (PWR) system. Lastly, we present some conceptual ideas about extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an on-line tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.
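The simulation-based PRA idea can be reduced to a minimal sketch: Monte Carlo-sample the uncertain parameters of a station blackout sequence and run a "simulator" for each sample, counting core-damage outcomes. Here the thermal-hydraulic code is replaced by a trivial analytic stand-in, and all distributions and numbers are assumptions for illustration.

```python
# Minimal simulation-based PRA sketch for a station blackout (SBO): sample
# uncertain parameters, run a stand-in "simulator" per sample, count failures.
import random

random.seed(42)

def core_damage(sample):
    """Stand-in for a thermal-hydraulic code: core damage occurs if offsite AC
    power is not recovered before the DC batteries are exhausted."""
    return sample["ac_recovery_h"] > sample["battery_life_h"]

n = 100_000
hits = 0
for _ in range(n):
    sample = {
        "battery_life_h": random.normalvariate(4.0, 0.5),  # uncertain battery duration
        "ac_recovery_h": random.expovariate(1.0 / 2.0),    # mean 2 h power recovery
    }
    hits += core_damage(sample)

print(f"conditional core damage probability ~ {hits / n:.3f}")
```

In the real RISMC workflow the stand-in function is a full system simulator (e.g., RELAP-5/RELAP-7 driven by RAVEN), but the sampling-and-counting structure is the same.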

  8. Risk-informed assessment of regulatory and design requirements for future nuclear power plants. Annual report

    SciTech Connect

    2000-08-01

    OAK B188 The overall goal of this research project is to support innovation in new nuclear power plant designs. This project is examining the implications, for future reactors and future safety regulation, of utilizing a new risk-informed regulatory system as a replacement for the current system. This innovation will be made possible through development of a scientific, highly risk-informed approach for the design and regulation of nuclear power plants. This approach will include the development and/or confirmation of corresponding regulatory requirements and industry standards. The major impediment to long term competitiveness of new nuclear plants in the U.S. is the capital cost component--which may need to be reduced on the order of 35% to 40% for Advanced Light Water Reactors (ALWRs) such as System 80+ and the Advanced Boiling Water Reactor (ABWR). The required cost reduction for an ALWR such as AP600 or AP1000 would be expected to be less. Such reductions in capital cost will require a fundamental reevaluation of the industry standards and regulatory bases under which nuclear plants are designed and licensed. Fortunately, there is now an increasing awareness that many of the existing regulatory requirements and industry standards are not significantly contributing to safety and reliability and, therefore, are unnecessarily adding to nuclear plant costs. Not only does this degrade the economic competitiveness of nuclear energy, it results in unnecessary costs to the American electricity consumer. While addressing these concerns, this research project will be coordinated with current efforts of industry and NRC to develop risk-informed, performance-based regulations that affect the operation of the existing nuclear plants; however, this project will go further by focusing on the design of new plants.

  9. Integrating Safety Assessment Methods using the Risk Informed Safety Margins Characterization (RISMC) Approach

    SciTech Connect

    Curtis Smith; Diego Mandelli

    2013-03-01

    Safety is central to the design, licensing, operation, and economics of nuclear power plants (NPPs). As the current light water reactor (LWR) NPPs age beyond 60 years, there are possibilities for increased frequency of systems, structures, and components (SSC) degradations or failures that initiate safety significant events, reduce existing accident mitigation capabilities, or create new failure modes. Plant designers commonly “over-design” portions of NPPs and provide robustness in the form of redundant and diverse engineered safety features to ensure that, even in the case of well-beyond design basis scenarios, public health and safety will be protected with a very high degree of assurance. This form of defense-in-depth is a reasoned response to uncertainties and is often referred to generically as “safety margin.” Historically, specific safety margin provisions have been formulated primarily based on engineering judgment backed by a set of conservative engineering calculations. The ability to better characterize and quantify safety margin is important to improved decision making about LWR design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development (R&D) in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the RISMC Pathway provides methods and tools that enable mitigation options known as margins management strategies. The purpose of the RISMC Pathway R&D is to support plant decisions for risk-informed margins management.

  10. Application of damage mechanism-specific NDE methods in support of risk-informed inspections

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk-informed inservice inspection (RISI) programs effectively concentrate limited and costly examination resources on systems and locations most relevant to plant safety. The thought process used in the selection of nondestructive evaluation (NDE) methods and procedures in a RISI program is expected to change toward integrating NDE into integrity management, with a concentration on understanding failure mechanisms. Identifying which damage mechanisms may be operative in specific locations and applying appropriate NDE methods to detect the presence of these damage mechanisms is fundamental to effective RISI application. Considerable information is already available on inspection for damage mechanisms such as intergranular stress corrosion cracking (IGSCC), thermal fatigue, and erosion-corrosion. Similar procedures are under development for other damage mechanisms that may occur individually or in combination with other mechanisms. Guidance is provided on application of NDE procedures in an RISI framework to facilitate implementation by utility staff (Gosselin, 1996).

  11. Nuclear Energy Research Initiative. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants. Annual Report

    SciTech Connect

    Ritterbusch, S.E.

    2000-08-01

    The overall goal of this research project is to support innovation in new nuclear power plant designs. This project is examining the implications, for future reactors and future safety regulation, of utilizing a new risk-informed regulatory system as a replacement for the current system. This innovation will be made possible through development of a scientific, highly risk-informed approach for the design and regulation of nuclear power plants. This approach will include the development and/or confirmation of corresponding regulatory requirements and industry standards. The major impediment to long term competitiveness of new nuclear plants in the U.S. is the capital cost component--which may need to be reduced on the order of 35% to 40% for Advanced Light Water Reactors (ALWRs) such as System 80+ and the Advanced Boiling Water Reactor (ABWR). The required cost reduction for an ALWR such as AP600 or AP1000 would be expected to be less. Such reductions in capital cost will require a fundamental reevaluation of the industry standards and regulatory bases under which nuclear plants are designed and licensed. Fortunately, there is now an increasing awareness that many of the existing regulatory requirements and industry standards are not significantly contributing to safety and reliability and, therefore, are unnecessarily adding to nuclear plant costs. Not only does this degrade the economic competitiveness of nuclear energy, it results in unnecessary costs to the American electricity consumer. While addressing these concerns, this research project will be coordinated with current efforts of industry and NRC to develop risk-informed, performance-based regulations that affect the operation of the existing nuclear plants; however, this project will go farther by focusing on the design of new plants.

  12. A Design Heritage-Based Forecasting Methodology for Risk Informed Management of Advanced Systems

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Fragola, Joseph R.

    1999-01-01

    The development of next-generation systems often carries with it the promise of improved performance, greater reliability, and reduced operational costs. These expectations arise from the use of novel designs, new materials, and advanced integration and production technologies intended to replace the functionality of the previous generation. However, the novelty of these nascent technologies is accompanied by a lack of operational experience and, in many cases, of actual testing as well. Therefore some of the enthusiasm surrounding most new technologies may be due to inflated aspirations arising from lack of knowledge rather than from realistic expectations. This paper proposes a design heritage approach for improved reliability forecasting of advanced system components. The basis of the design heritage approach is to relate advanced system components to similar designs currently in operation. The demonstrated performance of these components can then be used to forecast the expected performance and reliability of comparable advanced-technology components. In this approach, the greater the divergence of the advanced component designs from the current systems, the higher the uncertainty that accompanies the associated failure estimates. Designers of advanced systems are faced with many difficult decisions. One of the most common and more difficult types of these decisions is the choice between design alternatives. In the past, decision-makers have found these decisions extremely difficult to make because they often involve a trade-off between a known, performing, fielded design and a promising paper design. When it comes to expected reliability performance, the paper design always looks better because it is on paper and it addresses all the known failure modes of the fielded design. On the other hand, there is a long, and sometimes very difficult, road between the promise of a paper design and its fulfillment; with the possibility that sometimes the reliability

  13. Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology

    SciTech Connect

    Price, Joseph Daniel; Anderson, Robert Stephen

    2015-06-01

    Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.

  14. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area at no increase in total delineated area. This is achieved by simply swapping delineated low-risk areas for previously non-delineated high-risk areas. Also, we show that further improvements may often be available at only a small increase in delineated area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits, if the land is priced, or if land owners need to be compensated for land use restrictions.
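The probabilistic delineation logic can be sketched in a few lines. This is a toy stand-in for the MODFLOW/PEST/Monte Carlo workflow, with made-up parcels and velocity statistics: for each parcel, estimate the probability that advective travel time to the well falls within the protection horizon, then delineate the parcels whose risk exceeds a chosen reliability threshold.

```python
# Toy probabilistic wellhead delineation: Monte Carlo over an uncertain
# seepage velocity, per-parcel probability of arrival within the horizon.
# Parcels, distributions, and thresholds are hypothetical.
import math
import random

random.seed(0)
distances_m = [50, 100, 200, 400, 800]   # hypothetical parcels, by distance to well
horizon_days = 50.0                       # e.g., a 50-day protection criterion
n = 5000

def travel_time_days(distance_m, velocity_m_per_day):
    return distance_m / velocity_m_per_day

risk = {}
for d in distances_m:
    hits = 0
    for _ in range(n):
        # uncertain seepage velocity: lognormal with median 5 m/day
        v = math.exp(random.normalvariate(math.log(5.0), 0.5))
        hits += travel_time_days(d, v) <= horizon_days
    risk[d] = hits / n

delineated = [d for d in distances_m if risk[d] > 0.05]  # delineate if risk > 5%
for d in distances_m:
    print(f"{d:4d} m: P(arrival within {horizon_days:.0f} d) = {risk[d]:.3f}")
print("delineate parcels at distances:", delineated)
```

The reliability threshold (5% here) is exactly the kind of binary decision rule the abstract argues can be laid transparently on top of fuzzy uncertainty bounds.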

  15. A pilot application of risk-informed methods to establish inservice inspection priorities for nuclear components at Surry Unit 1 Nuclear Power Station. Revision 1

    SciTech Connect

    Vo, T.V.; Phan, H.K.; Gore, B.F.; Simonen, F.A.; Doctor, S.R.

    1997-02-01

    As part of the Nondestructive Evaluation Reliability Program sponsored by the US Nuclear Regulatory Commission, the Pacific Northwest National Laboratory has developed risk-informed approaches for inservice inspection plans of nuclear power plants. This method uses probabilistic risk assessment (PRA) results to identify and prioritize the most risk-important components for inspection. The Surry Nuclear Power Station Unit 1 was selected for pilot application of this methodology. This report, which incorporates more recent plant-specific information and improved risk-informed methodology and tools, is Revision 1 of the earlier report (NUREG/CR-6181). The methodology discussed in the original report is no longer current and a preferred methodology is presented in this Revision. This report, NUREG/CR-6181, Rev. 1, therefore supersedes the earlier NUREG/CR-6181 published in August 1994. The specific systems addressed in this report are the auxiliary feedwater, the low-pressure injection, and the reactor coolant systems. The results provide a risk-informed ranking of components within these systems.
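The prioritization step can be sketched as a simple risk-importance ranking: score each inspectable location by its estimated failure frequency times the conditional core damage probability (CCDP) given that failure, then inspect in descending order of that product. The component names and all numbers below are hypothetical, not from the Surry study.

```python
# Toy risk-informed inspection ranking: risk importance = failure frequency
# times conditional core damage probability. All values are illustrative.
components = {
    "RCS hot-leg weld":        {"fail_per_yr": 1e-6, "ccdp": 1e-1},
    "AFW pump discharge pipe": {"fail_per_yr": 5e-5, "ccdp": 1e-3},
    "LPI injection line":      {"fail_per_yr": 1e-5, "ccdp": 2e-3},
}

ranking = sorted(components.items(),
                 key=lambda kv: kv[1]["fail_per_yr"] * kv[1]["ccdp"],
                 reverse=True)

for name, v in ranking:
    print(f"{name:26s} risk ~ {v['fail_per_yr'] * v['ccdp']:.1e} /yr")
```

Note how a rarely failing component (the hot-leg weld) can still top the list because its consequence term dominates; this is the effect that makes PRA-based ranking differ from frequency-based ranking.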

  16. Risk Informed Margins Management as part of Risk Informed Safety Margin Characterization

    SciTech Connect

    Curtis Smith

    2014-06-01

    The ability to better characterize and quantify safety margin is important to improved decision making about Light Water Reactor (LWR) design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of physical processes that govern the aging and degradation of plant SSCs, needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the Risk Informed Safety Margin Characterization (RISMC) Pathway provides methods and tools that enable mitigation options known as risk informed margins management (RIMM) strategies.

  17. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants (Cooperative Agreement DE-FC03-99SF21902, Am. M004) Final Technical Report

    SciTech Connect

    Ritterbusch, Stanley E., et al.

    2003-01-29

    OAK-B135 Research under this project addresses the barriers to long term use of nuclear-generated electricity in the United States. It was agreed that a very basic and significant change to the current method of design and regulation was needed. That is, it was believed that the cost reduction goal could not be met by fixing the current system (i.e., an evolutionary approach), and that a new, more advanced approach would be needed for this project. It is believed that a completely new design and regulatory process would have to be developed--a "clean sheet of paper" approach. This new approach would start with risk-based methods, would establish probabilistic design criteria, and would implement defense-in-depth only when necessary (1) to meet public policy issues (e.g., use of a containment building no matter how low the probability of a large release) and (2) to address uncertainties in probabilistic methods and equipment performance. This new approach is significantly different from the Nuclear Regulatory Commission's (NRC) current risk-informed program for operating plants. For our new approach, risk-based methods are the primary means of assuring plant safety, whereas in the NRC's current approach, defense-in-depth remains the primary means of assuring safety. The primary accomplishments in the first year--Phase 1--were (1) the establishment of a new, highly risk-informed design and regulatory framework, (2) the establishment of the preliminary version of the new, highly risk-informed design process, (3) core damage frequency predictions showing that, based on new, lower pipe rupture probabilities, the design of the emergency core cooling system equipment can be simplified without reducing plant safety, and (4) the initial development of methods for including uncertainties in a new integrated structures-systems design model. Under the new regulatory framework, options for the use of "design basis accidents" were evaluated. It is expected that design basis

  18. Designing ROW Methods

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.

    1996-01-01

    There are many aspects to consider when designing a Rosenbrock-Wanner-Wolfbrandt (ROW) method for the numerical integration of ordinary differential equations (ODEs) that solve initial value problems (IVPs). The process can be simplified by constructing ROW methods around good Runge-Kutta (RK) methods. The formulation of a new, simple, embedded, third-order ROW method demonstrates this design approach.
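As a minimal sketch of what distinguishes a ROW method, the one-stage linearly implicit (Rosenbrock) scheme below replaces the nonlinear solve of a fully implicit RK step with a single linear solve involving the Jacobian. This is the simplest possible ROW scheme, not the embedded third-order method formulated in the paper.

```python
# One-stage Rosenbrock (linearly implicit Euler) for a scalar stiff ODE:
# each step solves the linear equation (1 - h*gamma*J) * k = h * f(y).
import math

def rosenbrock_step(f, dfdy, y, h, gamma=1.0):
    """One linearly implicit step; dfdy supplies the (scalar) Jacobian."""
    k = h * f(y) / (1.0 - h * gamma * dfdy(y))
    return y + k

# Stiff test problem y' = -50 y, y(0) = 1; exact solution exp(-50 t).
f = lambda y: -50.0 * y
dfdy = lambda y: -50.0

y, h = 1.0, 0.05          # explicit Euler is unstable at this step size
for _ in range(20):       # integrate to t = 1
    y = rosenbrock_step(f, dfdy, y, h)
print(f"y(1) ~ {y:.2e} (exact {math.exp(-50.0):.2e})")
```

Because the Jacobian appears in the step, the scheme remains stable on this stiff problem at a step size where explicit Euler diverges; the higher-order embedded constructions in the paper refine accuracy and error estimation on top of this same linear-solve structure.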

  19. Progress toward risk informed regulation

    SciTech Connect

    Rogers, K.C.

    1997-01-01

    For the last several years, the NRC, with encouragement from the industry, has been moving in the direction of risk informed regulation. This is consistent with the regulatory principle of efficiency, formally adopted by the Nuclear Regulatory Commission in 1991, which requires that regulatory activities be consistent with the degree of risk reduction they achieve. Probabilistic risk analysis has become the tool of choice for selecting the best of several alternatives. Closely related to risk informed regulation is the development of performance based rules. Such rules focus on the end result to be achieved. They do not specify the process, but instead establish the goals to be reached and how the achievement of those goals is to be judged. The inspection and enforcement activity is based on whether or not the goals have been met. The author goes on to offer comments on the history of the development of this process and its probable development in the future. He also addresses some issues which must be resolved or at least acknowledged. The success of risk informed regulation ultimately depends on having sufficiently reliable data to allow quantification of regulatory alternatives in terms of relative risk. Perhaps the area of human reliability and organizational performance has the greatest potential for improvement in reactor safety. The ability to model human performance is significantly less developed than the ability to model mechanical or electrical systems. The move toward risk informed, performance based regulation provides an unusual, perhaps unique, opportunity to establish a more rational, more effective basis for regulation.

  20. Using quantitative risk information in decisions about statins: a qualitative study in a community setting

    PubMed Central

    Polak, Louisa; Green, Judith

    2015-01-01

    Background A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. Aim To understand the role of quantitative risk information in patients’ accounts of decisions about taking statins. Design and setting This was a qualitative study, with participants recruited and interviewed in community settings. Method Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Results Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as ‘necessary’ either to treat test results, or because of personalised, unequivocal advice from a doctor. Conclusion This study’s findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. PMID:25824187

  1. Communicating risk information and warnings

    USGS Publications Warehouse

    Mileti, D. S.

    1990-01-01

    Major advances have occurred over the last 20 years in how to effectively communicate risk information and warnings to the public. These lessons have been hard won. Knowledge has mounted from the findings of social scientific studies of risk communication failures, successes, and those which fell somewhere in between. Moreover, the last two decades have borne witness to the birth, cultivation, and blossoming of information sharing between those physical scientists who discover new information about risk and those communication scientists who trace its diffusion and then measure public reaction.

  2. Air Risk Information Support Center

    SciTech Connect

    Shoaf, C.R.; Guth, D.J.

    1990-12-31

    The Air Risk Information Support Center (Air RISC) was initiated in early 1988 by the US Environmental Protection Agency's (EPA) Office of Health and Environmental Assessment (OHEA) and the Office of Air Quality Planning and Standards (OAQPS) as a technology transfer effort that would focus on providing information to state and local environmental agencies and to EPA Regional Offices in the areas of health, risk, and exposure assessment for toxic air pollutants. Technical information is fostered and disseminated by Air RISC's three primary activities: (1) a "hotline", (2) quick turn-around technical assistance projects, and (3) general technical guidance projects.

  3. Control system design method

    DOEpatents

    Wilson, David G.; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.
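A hedged sketch of this design style on a mass-spring-damper: write the stored energy as a Hamiltonian, classify the power flows (the damper dissipates; the control input can generate or dissipate), and choose a control whose power flow is strictly dissipative so that dH/dt <= 0 certifies stability. The model and gains below are illustrative assumptions, not taken from the patent.

```python
# Hamiltonian-based control sketch for a mass-spring-damper.
# Power flows: u*v from the control (made dissipative), -c*v^2 from the damper.
m, k, c, kd = 1.0, 4.0, 0.1, 1.0   # mass, spring, damping, control gain (assumed)

def hamiltonian(x, v):
    return 0.5 * m * v * v + 0.5 * k * x * x   # stored energy H(x, v)

def step(x, v, dt):
    u = -kd * v                     # dissipative control: u*v = -kd*v**2 <= 0
    a = (-k * x - c * v + u) / m    # Newton's law with spring, damper, control
    return x + dt * v, v + dt * a   # explicit Euler with a small step

x, v, dt = 1.0, 0.0, 1e-3
h0 = hamiltonian(x, v)
for _ in range(10_000):             # simulate 10 s
    x, v = step(x, v, dt)
print(f"H(0) = {h0:.3f}, H(10 s) = {hamiltonian(x, v):.5f}")
```

Since dH/dt = u*v - c*v**2 = -(kd + c)*v**2 <= 0 along trajectories, the Hamiltonian is a Lyapunov function for the closed loop, which is the stability argument the patent's power-flow classification is designed to produce.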

  4. Integrated risk information system (IRIS)

    SciTech Connect

    Tuxen, L.

    1990-12-31

    The Integrated Risk Information System (IRIS) is an electronic information system developed by the US Environmental Protection Agency (EPA) containing information related to health risk assessment. IRIS is the Agency's primary vehicle for communication of chronic health hazard information that represents Agency consensus following comprehensive review by intra-Agency work groups. The original purpose for developing IRIS was to provide guidance to EPA personnel in making risk management decisions; this role has expanded and evolved with wider access and use of the system. IRIS contains chemical-specific information in summary format for approximately 500 chemicals. IRIS is available to the general public on the National Library of Medicine's Toxicology Data Network (TOXNET) and on diskettes through the National Technical Information Service (NTIS).

  5. The FEM-2 design method

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.; Adams, L. M.; Mehrotra, P.; Vanrosendale, J.; Voigt, R. G.; Patrick, M.

    1983-01-01

    The FEM-2 parallel computer is designed using methods differing from those ordinarily employed in parallel computer design. The major distinguishing aspects are: (1) a top-down rather than bottom-up design process; (2) the design considers the entire system structure in terms of layers of virtual machines; and (3) each layer of virtual machine is defined formally during the design process. The result is a complete hardware/software system design. The basic design method is discussed and the advantages of the method are considered. A status report on the FEM-2 design is included.

  6. NASA Risk-Informed Decision Making Handbook

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Stamatelatos, Michael; Maggio, Gaspare; Everett, Christopher; Youngblood, Robert; Rutledge, Peter; Benjamin, Allan; Williams, Rodney; Smith, Curtis; Guarro, Sergio

    2010-01-01

    This handbook provides guidance for conducting risk-informed decision making in the context of NASA risk management (RM), with a focus on the types of direction-setting key decisions that are characteristic of the NASA program and project life cycles, and which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy. The guidance in this handbook is not meant to be prescriptive. Instead, it is meant to be general enough, and contain a sufficient diversity of examples, to enable the reader to adapt the methods as needed to the particular decision problems that he or she faces. The handbook highlights major issues to consider when making decisions in the presence of potentially significant uncertainty, so that the user is better able to recognize and avoid pitfalls that might otherwise be experienced.

  7. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories: those where the design is done in the continuous domain (or s plane) and those where the design is done in the discrete domain (or z plane). Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the uncompensated s plane design method, which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
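
    The fidelity comparison can be made concrete on a single continuous pole. A continuous-domain design discretized with the bilinear (Tustin) map approximates the exact matched pole e^(sT) well at fast sample rates and degrades at slow ones, which is the trend the evaluation above reports. A short Python sketch (the pole value and sample rates are illustrative choices, not values from the study):

```python
import math

def exact_pole(s, T):
    """Matched discrete pole for continuous pole s at sample period T."""
    return math.exp(s * T)

def tustin_pole(s, T):
    """Discrete pole under the bilinear (Tustin) map s -> (2/T)*(z-1)/(z+1)."""
    return (1 + s * T / 2) / (1 - s * T / 2)

s = -5.0  # stable continuous pole with a 0.2 s time constant
for fs in (50.0, 10.0, 5.0):  # sample rates in Hz
    T = 1.0 / fs
    ze, zt = exact_pole(s, T), tustin_pole(s, T)
    print(f"fs={fs:5.1f} Hz  exact={ze:.4f}  tustin={zt:.4f}  error={abs(zt - ze):.4f}")
```

    At 50 Hz the two pole locations agree to about 1e-4, while at 5 Hz (one sample per time constant) the error grows to roughly 0.035, mirroring the finding that uncompensated continuous-domain designs need faster sampling to stay acceptable.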

  8. RISK-INFORMED SAFETY MARGIN CHARACTERIZATION

    SciTech Connect

    Nam Dinh; Ronaldo Szilard

    2009-07-01

    The concept of safety margins has served as a fundamental principle in the design and operation of commercial nuclear power plants (NPPs). With safety margin defined as the minimum distance between a system’s “loading” and its “capacity”, plant design and operation are predicated on ensuring that an adequate safety margin for safety-significant parameters (e.g., fuel cladding temperature, containment pressure) is provided over the spectrum of anticipated plant operating, transient and accident conditions. To meet the anticipated challenges associated with extending the operational lifetimes of the current fleet of operating NPPs, the United States Department of Energy (USDOE), the Idaho National Laboratory (INL) and the Electric Power Research Institute (EPRI) have established a collaboration to conduct coordinated research to identify and address the technological challenges and opportunities that would likely affect the safe and economic operation of the existing NPP fleet over the postulated long-term time horizons. In this paper we describe a framework for developing and implementing a Risk-Informed Safety Margin Characterization (RISMC) approach to evaluate and manage changes in plant safety margins over long time horizons.

  9. Risk-Informed Assessment Methodology Development and Application

    SciTech Connect

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-07-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for an NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best-estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  10. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Tashker, M. G.; Powell, J. D.

    1975-01-01

    Investigations were conducted in two main areas: the first area is control system design, and the goals were to define the limits of 'digitized S-Plane design techniques' vs. sample rate, to show the results of a 'direct digital design technique', and to compare the two methods; the second area was to evaluate the roughness of autopilot designs parametrically versus sample rate. Goals of the first area were addressed by (1) an analysis of a 2nd order example using both design methods, (2) a linear analysis of the complete 737 aircraft with an autoland obtained using the digitized S-plane technique, (3) linear analysis of a high frequency 737 approximation with the autoland from a direct digital design technique, and (4) development of a simulation for evaluation of the autopilots with disturbances and nonlinearities included. Roughness evaluation was studied by defining an experiment to be carried out on the Langley motion simulator and coordinated with analysis at Stanford.

  11. PRISM: a planned risk information seeking model.

    PubMed

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone. PMID:20512716

  12. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and the roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations needed to reach an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, the FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, and HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, from this academic/industrial project, technology has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
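
    The core of SA fits in a few lines. The following Python sketch (a generic textbook implementation with illustrative parameters, not the modified SA of the study) minimizes a rough one-dimensional objective with many local minima; uphill moves are accepted with probability exp(-dE/T) under a geometric cooling schedule:

```python
import math, random

def simulated_annealing(f, x0, step=0.5, t0=10.0, cooling=0.999, iters=4000, seed=1):
    """Minimize f(x): accept uphill moves with probability exp(-dE/T),
    lowering the temperature T geometrically each iteration."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        xn = x + rng.uniform(-step, step)   # random neighbour
        fn = f(xn)
        if fn < fx or rng.random() < math.exp((fx - fn) / t):
            x, fx = xn, fn                  # move accepted
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                        # geometric cooling schedule
    return best, fbest

# Rough multimodal objective: many local minima, global minimum at x = 0.
f = lambda x: x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
best, fbest = simulated_annealing(f, x0=4.3)
```

    A plain sketch like this spends thousands of objective evaluations; the study's modifications were aimed precisely at cutting that count to the order of 100.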

  13. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  14. Design method of supercavitating pumps

    NASA Astrophysics Data System (ADS)

    Kulagin, V.; Likhachev, D.; Li, F. C.

    2016-05-01

    The problem of effective supercavitating (SC) pump is solved, and optimum load distribution along the radius of the blade is found taking into account clearance, degree of cavitation development, influence of finite number of blades, and centrifugal forces. Sufficient accuracy can be obtained using the equivalent flat SC-grid for design of any SC-mechanisms, applying the “grid effect” coefficient and substituting the skewed flow calculated for grids of flat plates with the infinite attached cavitation caverns. This article gives the universal design method and provides an example of SC-pump design.

  15. Risk-informed Maintenance for Non-coherent Systems

    NASA Astrophysics Data System (ADS)

    Tao, Ye

    Probabilistic Safety Assessment (PSA) is a systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity. The information provided by PSA has been increasingly used for regulatory purposes but rarely to inform operation and maintenance activities. As one of the key parts of PSA, Fault Tree Analysis (FTA) attempts to model and analyze failure processes of engineering and biological systems. The fault trees are composed of logic diagrams that display the state of the system and are constructed using graphical design techniques. Risk Importance Measures (RIMs) are information that can be obtained from both qualitative and quantitative aspects of FTA. Components within a system can be ranked with respect to each specific criterion defined by each RIM. Through a RIM, a ranking of the components or basic events can be obtained, providing valuable information for risk-informed decision making. In order to provide a thorough understanding of RIMs and interpret the results, they are categorized in this thesis with respect to risk significance (RS) and safety significance (SS). This categorization also ties them to different maintenance activities. When RIMs are used for maintenance purposes, this is called risk-informed maintenance. On the other hand, the majority of work produced on the FTA method has concentrated on failure logic diagrams restricted to the direct or implied use of AND and OR operators. Such systems are considered coherent systems. However, the NOT logic can also contribute to the information produced by PSA. The importance analysis of non-coherent systems is rather limited, even though the field has received more and more attention over the years. Non-coherent systems introduce difficulties in both the qualitative and quantitative assessment of the fault tree compared with coherent systems.
In this thesis, a set
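
    To make the RIM idea concrete, here is a small worked example (illustrative only, not taken from the thesis): the Birnbaum importance of a basic event is P(TOP | event failed) - P(TOP | event working), computed below by exact enumeration on a coherent tree TOP = A OR (B AND C):

```python
from itertools import product

# Example coherent fault tree: TOP = A OR (B AND C)
top = lambda a, b, c: a or (b and c)
p = {"a": 0.01, "b": 0.1, "c": 0.2}  # basic-event failure probabilities

def prob_top(fixed=None):
    """Exact P(TOP) by enumerating the free basic events (fine for small trees)."""
    fixed = fixed or {}
    free = [e for e in "abc" if e not in fixed]
    total = 0.0
    for states in product([0, 1], repeat=len(free)):
        s = dict(fixed, **dict(zip(free, states)))
        if top(s["a"], s["b"], s["c"]):
            w = 1.0
            for e in free:
                w *= p[e] if s[e] else 1.0 - p[e]
            total += w
    return total

def birnbaum(event):
    """Birnbaum importance: P(TOP | event failed) - P(TOP | event working)."""
    return prob_top({event: 1}) - prob_top({event: 0})

ranking = sorted("abc", key=birnbaum, reverse=True)
```

    With these probabilities the ranking is A, then B, then C, even though A is the least likely event to fail - exactly the kind of ordering that redirects maintenance effort. For a non-coherent tree (one containing NOT logic) the same enumeration still works, but the coherent-tree shortcuts discussed in the abstract do not.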

  16. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    SciTech Connect

    Hofmayer, C.; Miller, C.; Wang, Y.; Costello, J.

    2003-07-15

    A research effort was undertaken to determine the need for any changes to USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement based seismic design methods, such as given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results between the analyses, and to develop insights regarding the work that would be needed before the displacement based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.

  17. A historical perspective of risk-informed regulation

    SciTech Connect

    Campbell, P.L.

    1996-12-01

    In Federal studies, the process of using risk information is described as having two general components: (1) risk assessment - the application of credible scientific principles and statistical methods to develop estimates of the likely effects of natural phenomena and human factors, and the characterization of these estimates in a form appropriate for the intended audience (e.g., agency decisionmakers, public); and (2) risk management - the process of weighing policy alternatives and selecting the most appropriate regulatory action, integrating the results of risk assessment and engineering data with social, economic, and political concerns to reach a decision. This paper discusses largely the second component.

  18. Design of diffractive optical surfaces within the SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João; Benítez, Pablo; Miñano, Juan C.

    2015-08-01

    The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in Nonimaging Optics; later, the method was extended for designing Imaging Optics. We present the extension of the SMS method to design diffractive optical surfaces. This method involves the simultaneous calculation of N/2 diffractive surfaces, using the phase-shift properties of diffractive surfaces as an extra degree of freedom, such that N one-parameter wavefronts can be perfectly coupled. Moreover, the SMS method for diffractive surfaces is a direct method, i.e., it is not based on multi-parametric optimization techniques. Representative diffractive systems designed by the SMS method are presented.

  19. An airfoil design method for viscous flows

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Narramore, J. C.; Sankar, L. N.

    1990-01-01

    An airfoil design procedure is described that has been incorporated into an existing two-dimensional Navier-Stokes airfoil analysis method. The resulting design method, an iterative procedure based on a residual-correction algorithm, permits the automated design of airfoil sections with prescribed surface pressure distributions. This paper describes the inverse design method and the technique used to specify target pressure distributions. An example airfoil design problem is described to demonstrate application of the inverse design procedure. It shows that this inverse design method develops useful airfoil configurations with a reasonable expenditure of computer resources.
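
    The residual-correction idea can be caricatured in one dimension (an illustrative analogue, not the paper's Navier-Stokes procedure): treat the analysis code as a black-box map from a geometry parameter to a surface pressure, then repeatedly correct the geometry in proportion to the pressure residual:

```python
def inverse_design(p_of_g, p_target, g0=0.2, relax=0.5, iters=100):
    """Residual-correction iteration: analyze, compare with the target,
    correct the geometry parameter, repeat."""
    g = g0
    for _ in range(iters):
        residual = p_of_g(g) - p_target  # 'analysis' step vs. target pressure
        g += relax * residual            # correction proportional to residual
    return g

# Hypothetical monotone pressure-vs-geometry relation, for illustration only.
p_of_g = lambda g: 1.0 - g * g
g = inverse_design(p_of_g, p_target=0.5)
```

    In the actual method the correction relates a pressure difference to a change in surface curvature at each surface station, and the under-relaxation factor plays the same stabilizing role it does in this toy loop.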

  20. RISK-INFORMED BALANCING OF SAFETY, NONPROLIFERATION, AND ECONOMICS FOR THE SFR

    SciTech Connect

    Apostolakis, George; Driscoll, Michael; Golay, Michael; Kadak, Andrew; Todreas, Neil; Aldmir, Tunc; Denning, Richard; Lineberry, Michael

    2011-10-20

    A substantial barrier to the implementation of Sodium-cooled Fast Reactor (SFR) technology in the short term is the perception that such plants would not be economically competitive with advanced light water reactors. With increased acceptance of risk-informed regulation, the opportunity exists to reduce the costs of a nuclear power plant at the design stage without applying excessive conservatism that is not needed in treating low-risk events. In the report NUREG-1860, the U.S. Nuclear Regulatory Commission describes developmental activities associated with a risk-informed, scenario-based technology neutral framework (TNF) for regulation. It provides quantitative yardsticks against which the adequacy of safety risks can be judged. We extend these concepts to the treatment of proliferation risks. The objective of our project is to develop a risk-informed design process for minimizing the cost of electricity generation within constraints of adequate safety and proliferation risks. This report describes the design and use of this design optimization process within the context of reducing the capital cost and levelized cost of electricity production for a small (possibly modular) SFR. Our project provides not only an evaluation of the feasibility of a risk-informed design process but also a practical test of the applicability of the TNF to an actual advanced, non-LWR design. The report provides results of five safety-related and one proliferation-related case studies of innovative design alternatives, applied to previously proposed SFR nuclear energy system concepts. We find that the TNF provides a feasible initial basis for licensing new reactors; however, it is incomplete. We recommend improvements in terms of requiring acceptance standards for total safety risks, and we propose a framework for regulation of proliferation risks. We also demonstrate methods for evaluation of proliferation risks and suggest revisions to scenario-specific safety risk acceptance standards.

  1. Review of freeform TIR collimator design methods

    NASA Astrophysics Data System (ADS)

    Talpur, Taimoor; Herkommer, Alois

    2016-04-01

    Total internal reflection (TIR) collimators are essential illumination components providing high efficiency and uniformity in a compact geometry. Various illumination design methods have been developed for designing such collimators, including tailoring methods, design via optimization, the mapping and feedback method, and the simultaneous multiple surface (SMS) method. This paper provides an overview of the different methods and compares the performance of the methods along with their advantages and their limitations.

  2. Computational methods for stealth design

    SciTech Connect

    Cable, V.P.

    1992-08-01

    A review is presented of the utilization of computer models for stealth design toward the ultimate goal of designing and fielding an aircraft that remains undetected at any altitude and any range. Attention is given to the advancements achieved in computational tools and their utilization. Consideration is given to the development of supercomputers for large-scale scientific computing and the development of high-fidelity, 3D, radar-signature-prediction tools for complex shapes with nonmetallic and radar-penetrable materials.

  3. The value of personalised risk information: a qualitative study of the perceptions of patients with prostate cancer

    PubMed Central

    Han, Paul K J; Hootsmans, Norbert; Neilson, Michael; Roy, Bethany; Kungel, Terence; Gutheil, Caitlin; Diefenbach, Michael; Hansen, Moritz

    2013-01-01

    Objective To explore the experiences of patients with prostate cancer with risk information and their perceptions of the value of personalised risk information in treatment decisions. Design A qualitative study was conducted using focus groups. Semistructured interviews explored participants’ experiences with using risk information, and their perceptions of the potential value of personalised risk information produced by clinical prediction models. Participants English-speaking patients, ages 54–82, diagnosed with prostate cancer within the past 3 years, residing in rural and non-rural geographic locations in Maine (USA), and attending prostate cancer patient support groups. Setting 6 focus groups were conducted with 27 patients; separate groups were held for patients with low-risk, medium-risk and high-risk disease defined by National Comprehensive Cancer Network guidelines. Results Several participants reported receiving risk information that was imprecise rather than precise, qualitative rather than quantitative, indirect rather than direct and focused on biomarker values rather than clinical outcomes. Some participants felt that personalised risk information could be useful in helping them make better informed decisions, but expressed scepticism about its value. Many participants favoured decision-making strategies that were heuristic-based and intuitive rather than risk-based and deliberative, and perceived other forms of evidence—emotions, recommendations of trusted physicians, personal narratives—as more reliable and valuable in treatment decisions. Conclusions Patients with prostate cancer appear to have little experience using personalised risk information, may favour heuristic-based over risk-based decision-making strategies and may perceive personalised risk information as less valuable than other types of evidence. These decision-making approaches and perceptions represent potential barriers to the clinical use of personalised risk information.

  4. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  5. Computational Methods in Nanostructure Design

    NASA Astrophysics Data System (ADS)

    Bellesia, Giovanni; Lampoudi, Sotiria; Shea, Joan-Emma

    Self-assembling peptides can serve as building blocks for novel biomaterials. Replica exchange molecular dynamics simulations are a powerful means to probe the conformational space of these peptides. We discuss the theoretical foundations of this enhanced sampling method and its use in biomolecular simulations. We then apply this method to determine the monomeric conformations of the Alzheimer amyloid-β(12-28) peptide that can serve as initiation sites for aggregation.
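
    Replica exchange can be sketched with a Monte Carlo analogue (the molecular dynamics is replaced by Metropolis moves for brevity; the potential and temperatures are illustrative stand-ins, not the peptide force field): several replicas sample the same landscape at different temperatures and periodically attempt to swap configurations, so the cold replica inherits barrier crossings made at high temperature:

```python
import math, random

def replica_exchange(u, temps, sweeps=5000, step=0.3, swap_every=10, seed=2):
    """Parallel tempering with Metropolis moves on a 1-D potential u(x).

    Each replica samples at its own temperature; neighbouring replicas
    periodically attempt a configuration swap, so the cold chain inherits
    barrier crossings made by the hot chains."""
    rng = random.Random(seed)
    xs = [1.0] * len(temps)               # all replicas start in one well
    trace = {t: [] for t in temps}
    for sweep in range(sweeps):
        for i, t in enumerate(temps):     # local Metropolis move per replica
            xn = xs[i] + rng.uniform(-step, step)
            if rng.random() < math.exp(min(0.0, -(u(xn) - u(xs[i])) / t)):
                xs[i] = xn
            trace[t].append(xs[i])
        if sweep % swap_every == 0:       # attempt a neighbour swap
            i = rng.randrange(len(temps) - 1)
            d = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (u(xs[i]) - u(xs[i + 1]))
            if rng.random() < math.exp(min(0.0, d)):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return trace

u = lambda x: (x * x - 1.0) ** 2  # double well: minima at +/-1, barrier height 1
trace = replica_exchange(u, temps=[0.05, 0.2, 1.0])
```

    The swap acceptance probability min(1, exp((1/T_i - 1/T_j)(U_i - U_j))) preserves each replica's Boltzmann distribution, which is what makes the enhanced sampling statistically valid.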

  6. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

    Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and that formal methods will play a major role in the development of future high-reliability digital systems.

  7. Communicating Cancer Risk Information: The Challenges of Uncertainty.

    ERIC Educational Resources Information Center

    Bottorff, Joan L.; Ratner, Pamela A.; Johnson, Joy L.; Lovato, Chris Y.; Joab, S. Amanda

    1998-01-01

    Accurate and sensitive communication of cancer-risk information is important. Based on a literature review of 75 research reports, expert opinion papers, and clinical protocols, a synthesis of what is known about the communication of cancer-risk information is presented. Relevance of information to those not tested is discussed. (Author/EMK)

  8. Analyses to support development of risk-informed separation distances for hydrogen codes and standards.

    SciTech Connect

    LaChance, Jeffrey L.; Houf, William G.; Fluer, Inc., Paso Robels, CA; Fluer, Larry; Middleton, Bobby

    2009-03-01

    The development of a set of safety codes and standards for hydrogen facilities is necessary to ensure they are designed and operated safely. To help ensure that a hydrogen facility meets an acceptable level of risk, code and standard development organizations are utilizing risk-informed concepts in developing hydrogen codes and standards.

  9. HTGR analytical methods and design verification

    SciTech Connect

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier.

  10. Applications of a transonic wing design method

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Smith, Leigh A.

    1989-01-01

    A method for designing wings and airfoils at transonic speeds using a predictor/corrector approach was developed. The procedure iterates between an aerodynamic code, which predicts the flow about a given geometry, and the design module, which compares the calculated and target pressure distributions and modifies the geometry using an algorithm that relates differences in pressure to a change in surface curvature. The modular nature of the design method makes it relatively simple to couple it to any analysis method. The iterative approach allows the design process and aerodynamic analysis to converge in parallel, significantly reducing the time required to reach a final design. Viscous and static aeroelastic effects can also be accounted for during the design or as a post-design correction. Results from several pilot design codes indicated that the method accurately reproduced pressure distributions as well as the coordinates of a given airfoil or wing by modifying an initial contour. The codes were applied to supercritical as well as conventional airfoils, forward- and aft-swept transport wings, and moderate-to-highly swept fighter wings. The design method was found to be robust and efficient, even for cases having fairly strong shocks.

  11. Impeller blade design method for centrifugal compressors

    NASA Technical Reports Server (NTRS)

    Jansen, W.; Kirschner, A. M.

    1974-01-01

    The design of a centrifugal impeller with blades that are aerodynamically efficient, easy to manufacture, and mechanically sound is discussed. The blade design method described here satisfies the first two criteria and with a judicious choice of certain variables will also satisfy stress considerations. The blade shape is generated by specifying surface velocity distributions and consists of straight-line elements that connect points at hub and shroud. The method may be used to design radially elemented and backward-swept blades. The background, a brief account of the theory, and a sample design are described.

  12. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    NASA Astrophysics Data System (ADS)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta Program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA describes stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation the origin, goal, steps and practical tools available at each step of CRIDA will be explained. In two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al.; "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.), the application of CRIDA to cases is explained.

  13. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  14. Mixed Methods Research Designs in Counseling Psychology

    ERIC Educational Resources Information Center

    Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.

    2005-01-01

    With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…

  15. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  16. Risk-informed inservice test activities at the NRC

    SciTech Connect

    Fischer, D.; Cheok, M.; Hsia, A.

    1996-12-01

    The operational readiness of certain safety-related components is vital to the safe operation of nuclear power plants. Inservice testing (IST) is one of the mechanisms used by licensees to ensure this readiness. In the past, the type and frequency of IST have been based on the collective best judgment of the NRC and industry in an ASME Code consensus process and NRC rulemaking process. Furthermore, IST requirements have not explicitly considered unique component and system designs or their contribution to overall plant risk. Because of the general nature of ASME Code test requirements and their non-reliance on risk estimates, current IST requirements may not adequately emphasize testing of those components that are most important to safety and may overemphasize testing of less safety-significant components. Nuclear power plant licensees are currently interested in optimizing testing by applying resources in more safety-significant areas and, where appropriate, reducing measures in less safety-significant areas. They are interested in maintaining system availability and reducing overall maintenance costs in ways that do not adversely affect safety. The NRC has been interested in using probabilistic techniques, as an adjunct to deterministic ones, to help define the scope, type, and frequency of IST. The development of risk-informed IST programs has the potential to optimize the use of NRC and industry resources without adverse effect on safety.

  17. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
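The Bezier-based parametric definition the abstract describes can be illustrated with a minimal de Casteljau evaluator; the control points standing in for a blade-angle distribution below are purely illustrative, not the paper's runner geometry.

```python
# Sketch: de Casteljau evaluation of a Bezier curve, the kind of parametric
# definition used above for the meridional contour and blade-angle
# distribution. Repeated linear interpolation of the control polygon
# converges to a single point on the curve.

def bezier(ctrl, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    pts = [list(p) for p in ctrl]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return tuple(pts[0])

# Hypothetical blade-angle distribution beta(m) along meridional length m:
ctrl = [(0.0, 60.0), (0.4, 45.0), (0.8, 30.0), (1.0, 18.0)]
start = bezier(ctrl, 0.0)  # Bezier curves interpolate their endpoints
end = bezier(ctrl, 1.0)
```

A handful of such control points per curve is what keeps the total number of design parameters small enough for the stochastic optimizer mentioned in the abstract.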

  18. Preliminary aerothermodynamic design method for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Petrie, S. L.

    1987-01-01

    Preliminary design methods are presented for vehicle aerothermodynamics. Predictions are made for Shuttle orbiter, a Mach 6 transport vehicle and a high-speed missile configuration. Rapid and accurate methods are discussed for obtaining aerodynamic coefficients and heat transfer rates for laminar and turbulent flows for vehicles at high angles of attack and hypersonic Mach numbers.

  19. Combinatorial protein design strategies using computational methods.

    PubMed

    Kono, Hidetoshi; Wang, Wei; Saven, Jeffery G

    2007-01-01

    Computational methods continue to facilitate efforts in protein design. Most of this work has focused on searching sequence space to identify one or a few sequences compatible with a given structure and functionality. Probabilistic computational methods provide information regarding the range of amino acid variability permitted by desired functional and structural constraints. Such methods may be used to guide the construction of both individual sequences and combinatorial libraries of proteins. PMID:17041256

  20. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture that involves a tight coupling between optimization and analysis is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  1. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.
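As a toy illustration of sizing an equivalent minimum shell thickness at one analysis station, the sketch below applies the standard thin-shell hoop-stress relation t = p·r/σ_allow; the hammershock pressure, radius, allowable stress, and minimum-gauge values are all hypothetical, not values from the method.

```python
# Sketch: equivalent minimum shell thickness at one analysis station.
# The hoop-stress relation t = p * r / sigma_allow is a standard thin-shell
# formula; the numbers below are illustrative only.

def min_shell_thickness(p, r, sigma_allow, t_gauge=0.0005):
    """Thickness (m) to carry internal pressure p (Pa) at radius r (m)."""
    t_stress = p * r / sigma_allow
    return max(t_stress, t_gauge)  # never thinner than the minimum gauge

# Hammershock overpressure often sets the critical design load condition:
t = min_shell_thickness(p=350e3, r=0.8, sigma_allow=280e6)
```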

  2. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables, so that it is possible to readily, yet defensibly, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down on the inefficient use of resources, the method identifies those areas that are of sufficient promise to merit a higher grade of analysis, as well as the linkages at issue between operations and other factors. Ultimately, the method is designed to save resources and time, and to allow for the evolution of operable space transportation system technology and of design and conceptual system targets.

  3. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating, and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost, and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the principal conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.
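The three-variable optimization setup described above can be sketched with a brute-force grid search. The cost model and peak-power constraint below are made-up smooth stand-ins, not the paper's vehicle simulation; the point is the shape of the search, not the numbers.

```python
# Sketch: search over the three design parameters named in the abstract
# (battery weight, heat-engine rating, power split). The cost and
# constraint expressions are hypothetical but deliberately smooth, per
# the paper's fourth conclusion.
import itertools

def life_cycle_cost(battery_kg, engine_kw, split):
    """Hypothetical life-cycle cost: acquisition plus petroleum use."""
    acquisition = 0.02 * battery_kg + 0.5 * engine_kw
    petroleum = 40.0 * (1.0 - split) / (1.0 + 0.001 * engine_kw)
    return acquisition + petroleum

designs = itertools.product(
    range(100, 501, 100),          # battery weight, kg
    range(20, 81, 10),             # heat-engine rating, kW
    [i / 10 for i in range(11)],   # power split (fraction from battery)
)
# Constraint (fifth conclusion): sources must meet a 60 kW peak demand.
feasible = (d for d in designs if d[1] + 0.1 * d[0] >= 60)
best = min(feasible, key=lambda d: life_cycle_cost(*d))
```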

  4. Effects of baseline risk information on social and individual choices.

    PubMed

    Gyrd-Hansen, Dorte; Kristiansen, Ivar Sønbø; Nexøe, Jørgen; Nielsen, Jesper Bo

    2002-01-01

    This article analyzes preferences for risk reductions in the context of individual and societal decision making. The effect of information on baseline risk is analyzed in both contexts. The results indicate that if individuals are to imagine that they suffer from 1 low-risk and 1 high-risk ailment, and are offered a specified identical absolute risk reduction, a majority will ceteris paribus opt for treatment of the low-risk ailment. A different preference structure is elicited when priority questions are framed as social choices. Here, a majority will prefer to treat the high-risk group of patients. The preference reversal demonstrates the extent to which baseline risk information can influence preferences in different choice settings. It is argued that presentation of baseline risk information may induce framing effects that lead to nonoptimal resource allocations. A solution to this problem may be to not present group-specific baseline risk information when eliciting preferences. PMID:11833667

  5. A sociotechnical method for designing work systems.

    PubMed

    Waterson, Patrick E; Older Gray, Melanie T; Clegg, Chris W

    2002-01-01

    The paper describes a new method for allocating work between and among humans and machines. The method consists of a series of stages, which cover how the overall work system should be organized and designed; how tasks within the work system should be allocated (human-human allocations); and how tasks involving the use of technology should be allocated (human-machine allocations). The method makes use of a series of decision criteria that allow end users to consider a range of factors relevant to function allocation, including aspects of job, organizational, and technological design. The method is described in detail using an example drawn from a workshop involving the redesign of a naval command and control (C2) subsystem. We also report preliminary details of the evaluation of the method, based on the views of participants at the workshop. A final section outlines the contribution of the work in terms of current theoretical developments within the domain of function allocation. The method has been applied to the domain of naval C2 systems; however, it is also designed for generic use within function allocation and sociotechnical work systems. PMID:12502156

  6. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  7. MAST Propellant and Delivery System Design Methods

    NASA Technical Reports Server (NTRS)

    Nadeem, Uzair; Mc Cleskey, Carey M.

    2015-01-01

    A Mars Aerospace Taxi (MAST) concept and propellant storage and delivery case study is undergoing investigation by NASA's Element Design and Architectural Impact (EDAI) design and analysis forum. The MAST lander concept envisions landing with its ascent propellant storage tanks empty and supplying these reusable Mars landers with propellant that is generated and transferred while on the Mars surface. The report provides an overview of the data derived from modeling different methods of propellant line routing (or "lining") and differentiates the resulting design and operations complexity of fluid and gaseous paths for a given set of fluid sources and destinations. The EDAI team desires a rough-order-of-magnitude algorithm for estimating the lining characteristics (i.e., the plumbing mass and complexity) associated with different numbers of vehicle propellant sources and destinations. This paper explores the feasibility of preparing a mathematically sound algorithm for this purpose and offers a method for the EDAI team to implement.

  8. Approaches to cancer assessment in EPA's Integrated Risk Information System

    SciTech Connect

    Gehlhaus, Martin W.; Gift, Jeffrey S.; Hogan, Karen A.; Kopylev, Leonid; Schlosser, Paul M.; Kadry, Abdel-Razak

    2011-07-15

    The U.S. Environmental Protection Agency's (EPA) Integrated Risk Information System (IRIS) Program develops assessments of health effects that may result from chronic exposure to chemicals in the environment. The IRIS database contains more than 540 assessments. When supported by available data, IRIS assessments provide quantitative analyses of carcinogenic effects. Since publication of EPA's 2005 Guidelines for Carcinogen Risk Assessment, IRIS cancer assessments have implemented new approaches recommended in these guidelines and expanded the use of complex scientific methods to perform quantitative dose-response assessments. Two case studies of the application of the mode of action framework from the 2005 Cancer Guidelines are presented in this paper. The first is a case study of 1,2,3-trichloropropane, as an example of a chemical with a mutagenic mode of carcinogenic action thus warranting the application of age-dependent adjustment factors for early-life exposure; the second is a case study of ethylene glycol monobutyl ether, as an example of a chemical with a carcinogenic action consistent with a nonlinear extrapolation approach. The use of physiologically based pharmacokinetic (PBPK) modeling to quantify interindividual variability and account for human parameter uncertainty as part of a quantitative cancer assessment is illustrated using a case study involving probabilistic PBPK modeling for dichloromethane. We also discuss statistical issues in assessing trends and model fit for tumor dose-response data, analysis of the combined risk from multiple types of tumors, and application of life-table methods for using human data to derive cancer risk estimates. These issues reflect the complexity and challenges faced in assessing the carcinogenic risks from exposure to environmental chemicals, and provide a view of the current trends in IRIS carcinogenicity risk assessment.
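The age-dependent adjustment factors mentioned above for a mutagenic mode of action can be sketched as a weighted lifetime-risk sum. The 10x/3x/1x factors and age windows follow EPA's 2005 Supplemental Guidance for early-life exposure; the slope factor and daily dose below are hypothetical, not values from any IRIS assessment.

```python
# Sketch: age-dependent adjustment factors (ADAFs) for a chemical with a
# mutagenic mode of carcinogenic action. Risk in each age window is the
# slope factor times dose, scaled by the window's ADAF and its share of
# the assumed 70-year lifetime.

ADAF_WINDOWS = [    # (start age, end age, adjustment factor)
    (0, 2, 10.0),   # birth to <2 years
    (2, 16, 3.0),   # 2 to <16 years
    (16, 70, 1.0),  # 16 years and older
]

def lifetime_risk(slope_factor, daily_dose, lifetime=70):
    """ADAF-weighted lifetime cancer risk for constant lifetime exposure."""
    return sum(adaf * (end - start) / lifetime * slope_factor * daily_dose
               for start, end, adaf in ADAF_WINDOWS)

# Hypothetical slope factor (mg/kg-day)^-1 and dose (mg/kg-day):
risk = lifetime_risk(slope_factor=0.03, daily_dose=0.001)
```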

  9. Acoustic Treatment Design Scaling Methods. Phase 2

    NASA Technical Reports Server (NTRS)

    Clark, L. (Technical Monitor); Parrott, T. (Technical Monitor); Jones, M. (Technical Monitor); Kraft, R. E.; Yu, J.; Kwan, H. W.; Beer, B.; Seybert, A. F.; Tathavadekar, P.

    2003-01-01

    The ability to design, build and test miniaturized acoustic treatment panels on scale model fan rigs representative of full scale engines provides not only cost-savings, but also an opportunity to optimize the treatment by allowing multiple tests. To use scale model treatment as a design tool, the impedance of the sub-scale liner must be known with confidence. This study was aimed at developing impedance measurement methods for high frequencies. A normal incidence impedance tube method that extends the upper frequency range to 25,000 Hz without grazing flow effects was evaluated. The free field method was investigated as a potential high frequency technique. The potential of the two-microphone in-situ impedance measurement method was evaluated in the presence of grazing flow. Difficulties in achieving the high frequency goals were encountered in all methods. Results of developing a time-domain finite difference resonator impedance model indicated that a re-interpretation of the empirical fluid mechanical models used in the frequency domain model for nonlinear resistance and mass reactance may be required. A scale model treatment design that could be tested on the Universal Propulsion Simulator vehicle was proposed.

  10. 3.6 Simplified methods for design

    SciTech Connect

    Nickell, R.E.; Yahr, G.T.

    1981-01-01

    Simplified design analysis methods for elevated temperature construction are classified and reviewed. Because the major impetus for developing elevated temperature design methodology during the past ten years has been the LMFBR program, considerable emphasis is placed upon results from this source. The operating characteristics of the LMFBR are such that cycles of severe transient thermal stresses can be interspersed with normal elevated temperature operational periods of significant duration, leading to a combination of plastic and creep deformation. The various simplified methods are organized into two general categories, depending upon whether it is the material, or constitutive, model that is reduced, or the geometric modeling that is simplified. Because the elastic representation of material behavior is so prevalent, an entire section is devoted to elastic analysis methods. Finally, the validation of the simplified procedures is discussed.

  11. Geometric methods for the design of mechanisms

    NASA Astrophysics Data System (ADS)

    Stokes, Ann Westagard

    1993-01-01

    Challenges posed by the process of designing robotic mechanisms have provided a new impetus to research in the classical subjects of kinematics, elastic analysis, and multibody dynamics. Historically, mechanism designers have considered these areas of analysis to be generally separate and distinct sciences. However, there are significant classes of problems which require a combination of these methods to arrive at a satisfactory solution. For example, both the compliance and the inertia distribution strongly influence the performance of a robotic manipulator. In this thesis, geometric methods are applied to the analysis of mechanisms where kinematics, elasticity, and dynamics play fundamental and interactive roles. Tools for the mathematical analysis, design, and optimization of a class of holonomic and nonholonomic mechanisms are developed. Specific contributions of this thesis include a network theory for elasto-kinematic systems. The applicability of the network theory is demonstrated by employing it to calculate the optimal distribution of joint compliance in a serial manipulator. In addition, the advantage of applying Lie group theoretic approaches to mechanisms requiring specific dynamic properties is demonstrated by extending Brockett's product of exponentials formula to the domain of dynamics. Conditions for the design of manipulators having inertia matrices which are constant in joint angle coordinates are developed. Finally, analysis and design techniques are developed for a class of mechanisms which rectify oscillations into secular motions. These techniques are applied to the analysis of free-floating chains that can reorient themselves in zero angular momentum processes and to the analysis of rattleback tops.

  12. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used, and the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  13. A risk-informed approach to safety margins analysis

    SciTech Connect

    Curtis Smith; Diego Mandelli

    2013-07-01

    The Risk Informed Safety Margins Characterization (RISMC) Pathway is a systematic approach developed to characterize and quantify safety margins of nuclear power plant structures, systems and components. The model has been tested on the Advanced Test Reactor (ATR) at Idaho National Lab.

  14. Waterflooding injectate design systems and methods

    SciTech Connect

    Brady, Patrick V.; Krumhansl, James L.

    2014-08-19

    A method of designing an injectate to be used in a waterflooding operation is disclosed. One aspect includes specifying data representative of chemical characteristics of a liquid hydrocarbon, a connate, and a reservoir rock, of a subterranean reservoir. Charged species at an interface of the liquid hydrocarbon are determined based on the specified data by evaluating at least one chemical reaction. Charged species at an interface of the reservoir rock are determined based on the specified data by evaluating at least one chemical reaction. An extent of surface complexation between the charged species at the interfaces of the liquid hydrocarbon and the reservoir rock is determined by evaluating at least one surface complexation reaction. The injectate is designed and is operable to decrease the extent of surface complexation between the charged species at interfaces of the liquid hydrocarbon and the reservoir rock. Other methods, apparatus, and systems are disclosed.

  15. An improved design method for EPC middleware

    NASA Astrophysics Data System (ADS)

    Lou, Guohuan; Xu, Ran; Yang, Chunming

    2014-04-01

    To address the problems and difficulties that small and medium enterprises currently face when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, an improved design method for EPC middleware is presented, based on an analysis of the principles of EPC middleware. The method exploits the capabilities of a MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application program interface, to achieve middleware with equivalent functionality. The resulting structure is simple and easy to implement and maintain. Under this structure, different types of reader-writers can be added and configured conveniently, and the expandability of the system is improved.
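A minimal sketch of the database-mediated structure described above, using Python's sqlite3 as a stand-in for MySQL: readers insert tag events into a table, and the upper application queries the same table, so no ALE interface layer is needed between them.

```python
# Sketch: decoupling reader-writers from the upper application through a
# shared database table (sqlite3 here stands in for the MySQL database
# named in the abstract; the schema and EPC value are illustrative).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tag_events (epc TEXT, reader TEXT, ts REAL)")

def reader_report(epc, reader, ts):
    """Called by the reader-writer side: record one tag observation."""
    db.execute("INSERT INTO tag_events VALUES (?, ?, ?)", (epc, reader, ts))

def app_poll(since):
    """Called by the upper application: fetch observations after `since`."""
    cur = db.execute("SELECT epc, reader FROM tag_events WHERE ts > ?",
                     (since,))
    return cur.fetchall()

reader_report("urn:epc:id:sgtin:0614141.107346.2017", "dock-door-1", 100.0)
events = app_poll(since=0.0)
```

Adding a new reader type then only requires a driver that performs the insert, which is the expandability claim the abstract makes.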

  16. Design methods of rhombic tensegrity structures

    NASA Astrophysics Data System (ADS)

    Feng, Xi-Qiao; Li, Yue; Cao, Yan-Ping; Yu, Shou-Wen; Gu, Yuan-Tong

    2010-08-01

    As a special type of novel flexible structures, tensegrity holds promise for many potential applications in such fields as materials science, biomechanics, civil and aerospace engineering. Rhombic systems are an important class of tensegrity structures, in which each bar constitutes the longest diagonal of a rhombus of four strings. In this paper, we address the design methods of rhombic structures based on the idea that many tensegrity structures can be constructed by assembling one-bar elementary cells. By analyzing the properties of rhombic cells, we first develop two novel schemes, namely, direct enumeration scheme and cell-substitution scheme. In addition, a facile and efficient method is presented to integrate several rhombic systems into a larger tensegrity structure. To illustrate the applications of these methods, some novel rhombic tensegrity structures are constructed.

  17. Direct optimization method for reentry trajectory design

    NASA Astrophysics Data System (ADS)

    Jallade, S.; Huber, P.; Potti, J.; Dutruel-Lecohier, G.

    The software package called 'Reentry and Atmospheric Transfer Trajectory' (RATT) was developed under ESA contract for the design of atmospheric trajectories. It includes four programs: TOP (Trajectory OPtimization), which optimizes reentry and aeroassisted transfer trajectories; 6FD and 3FD (6 and 3 degrees of freedom Flight Dynamics), which simulate the trajectory; and SCA (Sensitivity and Covariance Analysis), which performs covariance analysis on a given trajectory with respect to different uncertainties and error sources. TOP provides the optimum guidance law for three degree of freedom reentry or aeroassisted transfer (AAOT) trajectories. Deorbit and reorbit impulses (if necessary) can be taken into account in the optimization. A wide choice of cost functions is available to the user, such as the integrated heat flux, the sum of the velocity impulses, or a linear combination of both, for trajectory and vehicle design. The crossrange and the downrange can be maximized during the reentry trajectory. Path constraints are available on the load factor, the heat flux, and the dynamic pressure. Results for these proposed options are presented. TOPPHY is the part of the TOP software corresponding to the definition and computation of the optimization problem physics. TOPPHY can interface with several optimizers and dynamic solvers: TOPOP and TROPIC, using direct collocation methods, and PROMIS, using a direct multiple shooting method. TOPOP was developed in the frame of this contract; it uses Hermite polynomials for the collocation method and the NPSOL optimizer from the NAG library. Both TROPIC and PROMIS were developed by DLR (Deutsche Forschungsanstalt fuer Luft und Raumfahrt) and use the SLSQP optimizer. For the resolution of the dynamic equations, TROPIC uses a collocation method with splines and PROMIS uses a multiple shooting method with finite differences. The three different optimizers, including dynamics, were tested on the reentry trajectory of the

  18. Methods for structural design at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.

    1973-01-01

    A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is a description, user's manual, and listing for the creep analysis program. The program predicts the time to a given creep level or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.

  19. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state-of-the-practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods have built upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and detailed look at the SAPHIRE software, we give a snapshot in time for the current use of software tools in a risk-informed decision arena.

  20. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume comprises papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993, in Denver, Colorado. The papers were prepared for presentation in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  1. Pharmaceutical websites and the communication of risk information.

    PubMed

    Davis, Joel J; Cross, Emily; Crowley, John

    2007-01-01

    This study examines the pharmaceutical websites of 44 leading direct-to-consumer (DTC) advertised drugs to determine the extent to which risk information was completely communicated. Three operational definitions of "completeness" were used: communication of the single highest incidence side effect, communication of top three highest incidence side effects, and communication of side effects with incidence of ≥ 10% (all measured in terms of absolute percentage). Results indicated that regardless of the measures used, pharmaceutical websites are unlikely to completely communicate risk information. About two thirds of all sites communicated the single highest incidence side effect or all top three side effects. For drugs with side effects at ≥ 10% incidence, only about half of their websites fully reported all effects at this level of incidence. Implications for advertisers and regulatory agencies are presented. PMID:17365347

  2. Needs for Risk Informing Environmental Cleanup Decision Making - 13613

    SciTech Connect

    Zhu, Ming; Moorer, Richard

    2013-07-01

    This paper discusses the needs for risk informing decision making by the U.S. Department of Energy (DOE) Office of Environmental Management (EM). The mission of the DOE EM is to complete the safe cleanup of the environmental legacy brought about from the nation's five decades of nuclear weapons development and production and nuclear energy research. This work represents some of the most technically challenging and complex cleanup efforts in the world and is projected to require the investment of billions of dollars and several decades to complete. Quantitative assessments of health and environmental risks play an important role in work prioritization and cleanup decisions of these challenging environmental cleanup and closure projects. The risk assessments often involve evaluation of performance of integrated engineered barriers and natural systems over a period of hundreds to thousands of years, when subject to complex geo-environmental transformation processes resulting from remediation and disposal actions. The requirement of resource investments for the cleanup efforts and the associated technical challenges have subjected the EM program to continuous scrutiny by oversight entities. Recent DOE reviews recommended application of a risk-informed approach throughout the EM complex for improved targeting of resources. The idea behind this recommendation is that by using risk-informed approaches to prioritize work scope, the available resources can be best utilized to reduce environmental and health risks across the EM complex, while maintaining the momentum of the overall EM cleanup program at a sustainable level. In response to these recommendations, EM is re-examining its work portfolio and key decision making with risk insights for the major sites. This paper summarizes the review findings and recommendations from the DOE internal reviews, discusses the needs for risk informing the EM portfolio and makes an attempt to identify topics for R and D in integrated

  3. Design Method and Calibration of Moulinet

    NASA Astrophysics Data System (ADS)

    Itoh, Hirokazu; Yamada, Hirokazu; Udagawa, Sinsuke

    The formula for obtaining the absorption horsepower of a Moulinet was rewritten, and the physical meaning of the constant in the formula was clarified. Based on this study, the design method of the Moulinet and the calibration method performed after manufacture were verified experimentally. Consequently, the following was clarified: (1) If the propeller power coefficient was taken to be the proportionality constant, the absorption horsepower of the Moulinet was proportional to the cube of the revolution speed and the fifth power of the Moulinet diameter. (2) If the Moulinet design was geometrically similar to the standard dimensions of the Aviation Technical Research Center's type-6 Moulinet, the proportionality constant C1 given in the reference could be used, and the absorption horsepower of the Moulinet was proportional to the cube of the revolution speed, the cube of the Moulinet diameter, and the side projection area of the Moulinet. (3) The proportionality constant C1 was proportional to the propeller power coefficient CP.
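
    The proportionality in finding (1) matches the conventional propeller power relation P = C_P ρ n³ D⁵. A minimal sketch (function name and example values are illustrative, not from the paper):

```python
# Sketch of the absorption-power relation in finding (1) above: power
# proportional to the cube of the revolution speed and the fifth power of
# the diameter, with the propeller power coefficient as the constant.

def moulinet_power(c_p, rho, n, d):
    """Absorbed power for power coefficient c_p, air density rho (kg/m^3),
    revolution speed n (rev/s), and Moulinet diameter d (m)."""
    return c_p * rho * n**3 * d**5
```

    For example, doubling the diameter at fixed revolution speed multiplies the absorbed power by 2⁵ = 32, while doubling the speed multiplies it by 2³ = 8.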

  4. 10 CFR 50.69 - Risk-informed categorization and treatment of structures, systems and components for nuclear...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... whose expertise includes, at a minimum, PRA, safety analysis, plant operation, design engineering, and..., systems and components for nuclear power reactors. 50.69 Section 50.69 Energy NUCLEAR REGULATORY..., systems and components for nuclear power reactors. (a) Definitions. Risk-Informed Safety Class...

  5. 10 CFR 50.69 - Risk-informed categorization and treatment of structures, systems and components for nuclear...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... whose expertise includes, at a minimum, PRA, safety analysis, plant operation, design engineering, and..., systems and components for nuclear power reactors. 50.69 Section 50.69 Energy NUCLEAR REGULATORY..., systems and components for nuclear power reactors. (a) Definitions. Risk-Informed Safety Class...

  6. A structural design decomposition method utilizing substructuring

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1994-01-01

    A new method of design decomposition for structural analysis and optimization is described. For this method, the structure is divided into substructures where each substructure has its structural response described by a structural-response subproblem, and its structural sizing determined from a structural-sizing subproblem. The structural responses of substructures that have rigid body modes when separated from the remainder of the structure are further decomposed into displacements that have no rigid body components, and a set of rigid body modes. The structural-response subproblems are linked together through forces determined within a structural-sizing coordination subproblem, which also determines the magnitude of any rigid body displacements. Structural-sizing subproblems having constraints local to the substructures are linked together through penalty terms that are determined by a structural-sizing coordination subproblem. All the substructure structural-response subproblems are totally decoupled from each other, as are all the substructure structural-sizing subproblems; thus there is significant potential for the use of parallel solution methods for these subproblems.

  7. Method for designing gas tag compositions

    DOEpatents

    Gross, Kenny C.

    1995-01-01

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node #1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node #2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred.

  8. Method for designing gas tag compositions

    DOEpatents

    Gross, K.C.

    1995-04-11

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node No. 1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node No. 2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred. 5 figures.
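
    The node-placement idea in the two records above (each new tag composition placed to be maximally distinguishable from those already fixed) resembles greedy farthest-point selection. A hedged sketch of that generic idea, not the patented lattice procedure:

```python
import numpy as np

# Illustrative greedy farthest-point placement (not the patented method):
# starting from the measured composition of the first canister (node 1),
# each subsequent node is the candidate composition whose minimum distance
# to all already-chosen nodes is largest, keeping tags distinguishable.

def place_nodes(candidates, first, count):
    """candidates: (M, k) array of possible tag compositions;
    first: index of node 1 (the measured first canister);
    count: total number of nodes wanted. Returns chosen indices."""
    chosen = [first]
    for _ in range(count - 1):
        pts = candidates[chosen]
        # distance from every candidate to its nearest already-chosen node
        d = np.linalg.norm(candidates[:, None, :] - pts[None, :, :],
                           axis=2).min(axis=1)
        chosen.append(int(d.argmax()))   # farthest candidate becomes next node
    return chosen
```

    On evenly spaced one-dimensional candidates 0..10 starting from 0, the sketch picks the endpoints first and then the midpoint, as expected of a farthest-point rule.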

  9. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are one of the most important security issues in network communication systems, affecting the security and privacy of Internet users. By exploiting back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers and easily capture passwords and message traffic to and from those computers. As rootkit technology has developed, its applications have become more and more extensive and it has become increasingly difficult to detect. In addition, for various reasons such as trade secrets and the difficulty of development, information on rootkit detection technology and effective tools remain relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software based on the proposed structure is much more efficient than other rootkit detection software.

  10. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  11. A rainfall design method for spatial flood risk assessment: considering multiple flood sources

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Tatano, H.

    2015-08-01

    Information about the spatial distribution of flood risk is important for integrated urban flood risk management. Focusing on urban areas, spatial flood risk assessment must reflect all risk information derived from multiple flood sources: rivers, drainage, coastal flooding etc. that may affect the area. However, conventional flood risk assessment deals with each flood source independently, which leads to an underestimation of flood risk in the floodplain. Even in floodplains that have no risk from coastal flooding, flooding from river channels and inundation caused by insufficient drainage capacity should be considered simultaneously. For integrated flood risk management, it is necessary to establish a methodology to estimate flood risk distribution across a floodplain. In this paper, a rainfall design method for spatial flood risk assessment, which considers the joint effects of multiple flood sources, is proposed. The concept of critical rainfall duration determined by the concentration time of flooding is introduced to connect response characteristics of different flood sources with rainfall. A copula method is then adopted to capture the correlation of rainfall amount with different critical rainfall durations. Rainfall events are designed taking advantage of the copula structure of correlation and marginal distribution of rainfall amounts within different critical rainfall durations. A case study in the Otsu River Basin, Osaka prefecture, Japan was conducted to demonstrate this methodology.
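
    The core step above, capturing the correlation of rainfall amounts across different critical rainfall durations with a copula and then sampling design events, can be sketched with a Gaussian copula. Everything below (the copula family, the exponential marginals, the parameter values) is an illustrative assumption, not the paper's fitted model:

```python
import math
import numpy as np

# Hedged sketch of copula-based rainfall design: draw correlated standard
# normals, map them to uniform margins (Gaussian copula), then map each
# margin through an assumed exponential marginal distribution of rainfall
# amount for its critical rainfall duration.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_rainfall(n, corr, mean_short, mean_long, seed=0):
    """Sample n pairs of rainfall amounts (short/long critical duration)."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, corr], [corr, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated normals
    u = np.vectorize(norm_cdf)(z)                         # uniform margins
    # inverse CDF of the assumed exponential marginals
    short = -mean_short * np.log(1.0 - u[:, 0])
    long_ = -mean_long * np.log(1.0 - u[:, 1])
    return short, long_
```

    Sampled pairs inherit the copula's positive dependence while each margin keeps its own mean, which is the property the designed rainfall events rely on.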

  12. Effects of racial and ethnic group and health literacy on responses to genomic risk information in a medically underserved population

    PubMed Central

    Kaphingst, Kimberly A.; Stafford, Jewel D.; McGowan, Lucy D’Agostino; Seo, Joann; Lachance, Christina R.; Goodman, Melody S.

    2015-01-01

    Objective Few studies have examined how individuals respond to genomic risk information for common, chronic diseases. This randomized study examined differences in responses by type of genomic information [genetic test/family history] and disease condition [diabetes/heart disease] and by race/ethnicity in a medically underserved population. Methods 1057 English-speaking adults completed a survey containing one of four vignettes (two-by-two randomized design). Differences in dependent variables (i.e., interest in receiving genomic assessment, discussing with doctor or family, changing health habits) by experimental condition and race/ethnicity were examined using chi-squared tests and multivariable regression analysis. Results No significant differences were found in dependent variables by type of genomic information or disease condition. In multivariable models, Hispanics were more interested in receiving a genomic assessment than Whites (OR=1.93; p<0.0001); respondents with marginal (OR=1.54; p=0.005) or limited (OR=1.85; p=0.009) health literacy had greater interest than those with adequate health literacy. Blacks (OR=1.78; p=0.001) and Hispanics (OR=1.85; p=0.001) had greater interest in discussing information with family than Whites. Non-Hispanic Blacks (OR=1.45; p=0.04) had greater interest in discussing genomic information with a doctor than Whites. Blacks (β= −0.41; p<0.001) and Hispanics (β= −0.25; p=0.033) intended to change fewer health habits than Whites; health literacy was negatively associated with number of health habits participants intended to change. Conclusions Findings suggest that race/ethnicity may affect responses to genomic risk information. Additional research could examine how cognitive representations of this information differ across racial/ethnic groups. Health literacy is also critical to consider in developing approaches to communicating genomic information. PMID:25622080

  13. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  14. An inverse design method for 2D airfoil

    NASA Astrophysics Data System (ADS)

    Liang, Zhi-Yong; Cui, Peng; Zhang, Gen-Bao

    2010-03-01

    Computational methods for the aerodynamic design of aircraft are applied more universally than before, and airfoil design is a central problem. Most papers discuss the forward problem, but the inverse method is more useful in practical design. In this paper, the inverse design of a 2D airfoil was investigated using a finite element method based on the variational principle. The simulation showed that the method is well suited to the design.

  15. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  16. Materials Reliability Program: Risk-Informed Revision of ASME Section XI Appendix G - Proof of Concept (MRP-143)

    SciTech Connect

    B. Bishop; et al

    2005-03-30

    This study indicates that risk-informed methods can be used to significantly relax the current ASME and NRC Appendix G requirements while still maintaining satisfactory levels of reactor vessel structural integrity. This relaxation in Appendix G requirements directly translates into significant improvements in operational flexibility.

  17. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum for Problem P. Necessary and sufficient conditions are available for local optimality, but a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, existence of a global minimum is guaranteed. However, because no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy the inequality. The exhaustive search can be organized so that the entire design space need not be searched, which reduces the computational burden somewhat. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods, though more testing is needed and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations; since the feasible set keeps shrinking, a good algorithm to find an initial feasible point is required. Such algorithms need to be developed and evaluated.
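
    The organized search the abstract describes can be illustrated in miniature by multistart local minimization: restart a local minimizer from points spread over the design space and keep the best result. The two-well objective and plain gradient-descent minimizer below are toy stand-ins, not IDESIGN or the zooming algorithm:

```python
import numpy as np

# Toy multistart global search: a local minimizer restarted from a grid of
# points; the best local result is taken as the global minimum estimate.

def f(x):
    """Two-well objective: minima near x = ±1, tilted so the left is global."""
    return (x**2 - 1.0)**2 + 0.3 * x

def fprime(x):
    return 4.0 * x * (x**2 - 1.0) + 0.3

def local_min(x, lr=0.01, steps=2000):
    for _ in range(steps):          # plain gradient descent as local minimizer
        x -= lr * fprime(x)
    return x

def multistart(starts):
    return min((local_min(s) for s in starts), key=f)

best = multistart(np.linspace(-2.0, 2.0, 9))   # global minimum near x ≈ -1.04
```

    A single local run started in the right-hand well would stop at the inferior local minimum; the restarts are what recover the global one.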

  18. An Efficient Inverse Aerodynamic Design Method For Subsonic Flows

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II

    2000-01-01

    Computational Fluid Dynamics based design methods are maturing to the point that they are beginning to be used in the aircraft design process. Many design methods however have demonstrated deficiencies in the leading edge region of airfoil sections. The objective of the present research is to develop an efficient inverse design method which is valid in the leading edge region. The new design method is a streamline curvature method, and a new technique is presented for modeling the variation of the streamline curvature normal to the surface. The new design method allows the surface coordinates to move normal to the surface, and has been incorporated into the Constrained Direct Iterative Surface Curvature (CDISC) design method. The accuracy and efficiency of the design method is demonstrated using both two-dimensional and three-dimensional design cases.

  19. Design optimization method for Francis turbine

    NASA Astrophysics Data System (ADS)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

    This paper presents a design optimization system coupled with CFD. The optimization algorithm of the system employs particle swarm optimization (PSO). Blade shape design is carried out with a NURBS curve defined by a series of control points. The system was applied to designing the stationary vanes and the runner of a higher-specific-speed Francis turbine. As the first step, single-objective optimization was performed on the stay vane profile; the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for developing hydro turbines.
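
    A bare-bones version of the PSO algorithm named in the abstract, shown on a toy objective; the real system evaluates CFD results for NURBS-defined blade shapes, and the constants below are conventional defaults rather than the paper's settings:

```python
import numpy as np

# Minimal particle swarm optimization: each particle tracks its personal
# best, the swarm tracks a global best, and velocities blend inertia with
# pulls toward both bests.

def pso(objective, dim, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

best_x, best_f = pso(lambda p: float(np.sum(p**2)), dim=3)
```

    In the real system each call to `objective` would be a CFD evaluation of a candidate vane or runner shape, which is why the population-based, derivative-free character of PSO is attractive there.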

  20. Alternative methods for the design of jet engine control systems

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Leake, R. J.; Basso, R.; Gejji, R.; Maloney, A.; Seshadri, V.

    1976-01-01

    Various alternatives to linear quadratic design methods for jet engine control systems are discussed. The main alternatives are classified into two broad categories: nonlinear global mathematical programming methods and linear local multivariable frequency domain methods. Specific studies within these categories include model reduction, the eigenvalue locus method, the inverse Nyquist method, polynomial design, dynamic programming, and conjugate gradient approaches.

  1. Demystifying Mixed Methods Research Design: A Review of the Literature

    ERIC Educational Resources Information Center

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research.…

  2. Airfoil design method using the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Narramore, J. C.; Sankar, L. N.

    1991-01-01

    An airfoil design procedure is described that was incorporated into an existing 2-D Navier-Stokes airfoil analysis method. The resulting design method, an iterative procedure based on a residual-correction algorithm, permits the automated design of airfoil sections with prescribed surface pressure distributions. The inverse design method and the technique used to specify target pressure distributions are described, and several example problems demonstrate application of the design procedure. The results show that this inverse design method develops useful airfoil configurations with a reasonable expenditure of computer resources.

  3. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

    Much product design is executed concurrently these days. Such concurrent design requires a method by which various kinds of design information can be shared and reused among designers. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model that makes use of designers' intentions is proposed, along with a method to combine design process information and design object information. We introduce how to describe designers' intentions by providing several databases: the Keyword Database consists of ontological data related to design objects and activities. Designers select suitable keywords from the Keyword Database and explain the reasons and ideas behind their design activities in descriptions that use those keywords. We also developed an integrated design information management system architecture using this method of integrated description with designers' intentions. The system connects information related to the design process with information related to the design object through designers' intentions, so designers can communicate with each other and understand how others make decisions in design. Designers can also reuse both design process information and design object information through the database management subsystem.

  4. Light Water Reactor Sustainability Program Risk Informed Safety Margin Characterization (RISMC) Advanced Test Reactor Demonstration Case Study

    SciTech Connect

    Curtis Smith; David Schwieder; Cherie Phelan; Anh Bui; Paul Bayless

    2012-08-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about LWR design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the RISMC Pathway R&D is to support plant decisions for risk-informed margins management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. This report describes the RISMC methodology demonstration where the Advanced Test Reactor (ATR) was used as a test-bed for purposes of determining safety margins. As part of the demonstration, we describe how both the thermal-hydraulics and probabilistic safety calculations are integrated and used to quantify margin management strategies.

  5. JASMINE design and method of data reduction

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Niwa, Yoshito

    2008-07-01

    The Japan Astrometry Satellite Mission for Infrared Exploration (JASMINE) aims to construct a map of the Galactic bulge with 10 micro-arcsecond accuracy. We use a z-band CCD to avoid dust absorption, and observe an area of about 10 × 20 degrees around the Galactic bulge region. Because the stellar density is very high, the fields of view can be combined with high accuracy; with 5 years of observation, we will construct a map accurate to 10 micro-arcseconds. In this poster, I show the observation strategy, the design of the JASMINE hardware, the reduction scheme, and the error budget. We have also constructed simulation software named the JASMINE Simulator, and we show the simulation results and the design of the software.

  6. Lithography aware overlay metrology target design method

    NASA Astrophysics Data System (ADS)

    Lee, Myungjun; Smith, Mark D.; Lee, Joonseuk; Jung, Mirim; Lee, Honggoo; Kim, Youngsik; Han, Sangjun; Adel, Michael E.; Lee, Kangsan; Lee, Dohwa; Choi, Dongsub; Liu, Zephyr; Itzkovich, Tal; Levinski, Vladimir; Levy, Ady

    2016-03-01

    We present a metrology target design (MTD) framework based on co-optimizing lithography and metrology performance. The overlay metrology performance is strongly related to the target design and optimizing the target under different process variations in a high NA optical lithography tool and measurement conditions in a metrology tool becomes critical for sub-20nm nodes. The lithography performance can be quantified by device matching and printability metrics, while accuracy and precision metrics are used to quantify the metrology performance. Based on using these metrics, we demonstrate how the optimized target can improve target printability while maintaining the good metrology performance for rotated dipole illumination used for printing a sub-100nm diagonal feature in a memory active layer. The remaining challenges and the existing tradeoff between metrology and lithography performance are explored with the metrology target designer's perspective. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.

  7. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results demonstrate that the method is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  8. A comparison of digital flight control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Many variations in design methods for aircraft digital flight control have been proposed in the literature. In general, the methods fall into two categories: those where the design is done in the continuous domain (or s-plane), and those where the design is done in the discrete domain (or z-plane). This paper evaluates several variations of each category and compares them for various flight control modes of the Langley TCV Boeing 737 aircraft. Design method fidelity is evaluated by examining closed-loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps, except the 'uncompensated s-plane design' method, which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
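
    The s-plane vs. z-plane distinction can be illustrated with a toy example (not taken from the paper): discretizing a continuous first-order lag 1/(τs + 1) with the bilinear (Tustin) substitution s = (2/T)(z − 1)/(z + 1) and comparing its z-plane pole with the exact mapping z = e^(sT). The gap between the two grows as the sample period T lengthens, mirroring the fidelity loss at slow sample rates reported above:

```python
import math

# z-plane pole of the lag 1/(tau*s + 1) under the bilinear (Tustin)
# transform, versus the exact pole mapping z = exp(s*T) for s = -1/tau.

def tustin_pole(tau, T):
    a = 2.0 * tau / T
    return (a - 1.0) / (a + 1.0)

def exact_pole(tau, T):
    return math.exp(-T / tau)
```

    At T = τ/10 the two poles agree to better than 10⁻³; at T = τ they differ by a few percent, which is the kind of discrepancy an uncompensated s-plane design carries into the discrete implementation.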

  9. Soft Computing Methods in Design of Superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
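
    A schematic of the two-stage approach above: a surrogate stands in for the trained neural network predicting the oxidation parameter Ka from composition, and a tiny genetic algorithm searches for a composition minimizing it. The surrogate function and all GA settings below are invented for illustration; the real model was trained on NASA Lewis cyclic oxidation test data:

```python
import numpy as np

# Stand-in for the neural-network prediction of Ka as a function of a
# normalized 4-component "composition" vector (hypothetical surrogate).
def surrogate_ka(x):
    return float(np.sum((x - 0.3)**2) + 1.0)

# Tiny genetic algorithm: truncation selection, blend crossover, Gaussian
# mutation, compositions clipped to [0, 1].
def ga_minimize(fitness, dim, pop=40, gens=100, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.random((pop, dim))
    for _ in range(gens):
        f = np.array([fitness(p) for p in P])
        elite = P[f.argsort()[: pop // 2]]                 # selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        alpha = rng.random((pop, dim))
        P = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # crossover
        P += rng.normal(0.0, 0.02, P.shape)                # mutation
        P = np.clip(P, 0.0, 1.0)
    f = np.array([fitness(p) for p in P])
    return P[f.argmin()], float(f.min())

best_comp, best_ka = ga_minimize(surrogate_ka, dim=4)
```

    Swapping `surrogate_ka` for a trained network prediction gives the structure the abstract describes: the GA never touches the test data directly, only the learned model of Ka.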

  10. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Different approaches for higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  11. The Triton: Design concepts and methods

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Singer, Michael; Vanryn, Percy; Brown, Rhonda; Tella, Gustavo; Harvey, Bob

    1992-01-01

    During the design of the C & P Aerospace Triton, a few problems were encountered that necessitated changes in the configuration. After the initial concept phase, the aspect ratio was increased from 7 to 7.6 to produce a greater lift to drag ratio (L/D = 13) which satisfied the horsepower requirements (118 hp using the Lycoming O-235 engine). The initial concept had a wing planform area of 134 sq. ft. Detailed wing sizing analysis enlarged the planform area to 150 sq. ft., without changing its layout or location. The most significant changes, however, were made just prior to inboard profile design. The fuselage external diameter was reduced from 54 to 50 inches to reduce drag to meet the desired cruise speed of 120 knots. Also, the nose was extended 6 inches to accommodate landing gear placement. Without the extension, the nosewheel received an unacceptable percentage (25 percent) of the landing weight. The final change in the configuration was made in accordance with the stability and control analysis. In order to reduce the static margin from 20 to 13 percent, the horizontal tail area was reduced from 32.02 to 25.0 sq. ft. The Triton meets all the specifications set forth in the design criteria. If time permitted another iteration of the calculations, two significant changes would be made. The vertical stabilizer area would be reduced to decrease the aircraft lateral stability slope since the current value was too high in relation to the directional stability slope. Also, the aileron size would be decreased to reduce the roll rate below the current 106 deg/second. Doing so would allow greater flap area (increasing CL(sub max)) and thus reduce the overall wing area. C & P would also recalculate the horsepower and drag values to further validate the 120 knot cruising speed.

  12. The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia

    NASA Astrophysics Data System (ADS)

    Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo

    2016-04-01

    This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, the Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches to increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements for a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following the general principles described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods, a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities, and the uncertainty in the climate states that produce them, are analyzed as part of a "Level of Concern" analysis. CRIDA principles are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goal of water resource management is to transform variable, uncertain hydrology into dependable services (e.g. water supply, flood risk reduction, ecosystem benefits, hydropower production, etc…). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little to guide decision makers on how this information should impact decision making. In
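    The bootstrapping step described above can be sketched in a few lines: resample the historical record with replacement to generate many equally plausible future sequences, then summarize each one. The inflow numbers below are hypothetical, not the Kafue record:

```python
import random
import statistics

# Hypothetical annual inflow record (arbitrary units); the real study would
# use the observed hydrologic record for the basin.
observed = [812, 745, 930, 688, 1020, 760, 655, 890, 905, 700, 840, 770]

def bootstrap_climate_states(record, n_states=500, horizon=30, seed=7):
    """Resample the historical record with replacement to build many
    equally plausible future sequences; summarize each by its mean."""
    rng = random.Random(seed)
    states = []
    for _ in range(n_states):
        future = rng.choices(record, k=horizon)   # one synthetic 30-yr future
        states.append(statistics.mean(future))
    return states

states = bootstrap_climate_states(observed)
print(f"range of plausible 30-yr mean inflows: {min(states):.0f}-{max(states):.0f}")
```

Each resampled mean is one "climate state" whose consequences can then be scored in the Level of Concern analysis, with no downscaled GCM output required.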

  13. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most attractive integration methods across most fields of engineering activity, such as design modelling, process planning, or production scheduling. One of the most important tasks realized in the integration of design and planning functions is design translation, meant as the mapping of design data into data that are important from the process planning point of view, that is, manufacturing data. A design geometrical shape translation process can be realized with one of the following strategies: (i) designing with a previously prepared design features library, also known as the DBF (design by feature) method; (ii) interactive design features recognition (IFR); (iii) automatic design features recognition (AFR). In the DBF method the design geometrical shape is created with design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part design is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, in which the part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists of an autonomous search of a product model, represented with a specific design representation method, in order to find those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm to be prepared that allows the whole recognition process to be carried out without user supervision. Currently there are many AFR methods. These methods most often need the product model to be represented with a B-Rep representation, rarely CSG, and very rarely wireframe. In the IFR method potential features are recognized by a user. This process is most often realized by a user who points out those surfaces which seem to belong to a

  14. A flexible layout design method for passive micromixers.

    PubMed

    Deng, Yongbo; Liu, Zhenyu; Zhang, Ping; Liu, Yongshun; Gao, Qingyong; Wu, Yihui

    2012-10-01

    This paper discusses a flexible layout design method for passive micromixers based on the topology optimization of fluidic flows. Unlike the trial-and-error method, this method obtains the detailed layout of a passive micromixer according to the desired mixing performance by solving a topology optimization problem. Therefore, the dependence on the experience of the designer is weakened when this method is used to design a passive micromixer with acceptable mixing performance. Several design disciplines for passive micromixers are considered to demonstrate the flexibility of the layout design method. These design disciplines include the approximation of the real 3D micromixer, manufacturing feasibility, the spatial periodic design, and the effects of the Péclet number and Reynolds number on the designs obtained by this layout design method. The capability of this design method is validated by several comparisons between the obtained layouts and the optimized designs in the recently published literature, where the value of the mixing measurement is improved by up to 40.4% for one cycle of the micromixer. PMID:22736305

  15. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features; it uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon two competing aircraft performance objectives. The second area has been titled morphing as an independent variable and formulates the sizing of a morphing aircraft as an optimization problem in which the amount of geometric morphing for various aircraft parameters is included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.

  16. Defining resilience within a risk-informed assessment framework

    SciTech Connect

    Coles, Garill A.; Unwin, Stephen D.; Holter, Gregory M.; Bass, Robert B.; Dagle, Jeffery E.

    2011-08-01

    The concept of resilience is the subject of considerable discussion in academic, business, and governmental circles. The United States Department of Homeland Security, for one, has emphasised the need to consider resilience in safeguarding critical infrastructure and key resources. The concept of resilience is complex, multidimensional, and defined differently by different stakeholders. The authors contend that there is a benefit in moving from discussing resilience as an abstraction to defining resilience as a measurable characteristic of a system. This paper proposes defining resilience measures using elements of a traditional risk assessment framework to help clarify the concept of resilience and as a way to provide non-traditional risk information. The authors show that various, diverse dimensions of resilience can be quantitatively defined in a common risk assessment framework based on the concept of loss of service. This allows the comparison of options for improving the resilience of infrastructure and presents a means to perform cost-benefit analysis. This paper discusses definitions and key aspects of resilience, presents equations for the risk of loss of infrastructure function that incorporate four key aspects of resilience that could prevent or mitigate that loss, describes proposed resilience factor definitions based on those risk impacts, and provides an example that illustrates how resilience factors would be calculated using a hypothetical scenario.
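    The loss-of-service framing above reduces to a small worked example. The scenario frequencies, durations, and impact fractions below are hypothetical, and the linear risk sum is a minimal sketch in the spirit of such equations, not a reproduction of the paper's:

```python
# Each scenario: (annual frequency, outage duration in hours, fraction of service lost).
def annual_loss_of_service(scenarios):
    # expected service-hours lost per year, summed over scenarios
    return sum(f * d * frac for f, d, frac in scenarios)

baseline = [
    (0.10, 48.0, 1.0),   # major event: full loss for 2 days, once per decade
    (2.00,  4.0, 0.5),   # minor event: half capacity lost for 4 h, twice a year
]
# Hypothetical mitigation (e.g. backup power) shortens the major outage
# and halves its impact; the minor scenario is unchanged.
improved = [
    (0.10, 12.0, 0.5),
    (2.00,  4.0, 0.5),
]

r0 = annual_loss_of_service(baseline)
r1 = annual_loss_of_service(improved)
resilience_gain = 1.0 - r1 / r0
print(r0, r1, resilience_gain)
```

Because both options are scored in the same service-hours-lost currency, the gain can be set directly against the mitigation's cost in a cost-benefit comparison.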

  17. Background risk information to assist in risk management decision making

    SciTech Connect

    Hammonds, J.S.; Hoffman, F.O.; White, R.K.; Miller, D.B.

    1992-10-01

    The evaluation of the need for remedial activities at hazardous waste sites requires quantification of risks of adverse health effects to humans and the ecosystem resulting from the presence of chemical and radioactive substances at these sites. The health risks from exposure to these substances are in addition to risks encountered because of the virtually unavoidable exposure to naturally occurring chemicals and radioactive materials that are present in air, water, soil, building materials, and food products. To provide a frame of reference for interpreting risks quantified for hazardous waste sites, it is useful to identify the relative magnitude of risks of both a voluntary and involuntary nature that are ubiquitous throughout east Tennessee. In addition to discussing risks from the ubiquitous presence of background carcinogens in the east Tennessee environment, this report also presents risks resulting from common, everyday activities. Such information should not be used to discount or trivialize risks from hazardous waste contamination, but rather to create a sensitivity to general risk issues, thus providing a context for better interpretation of risk information.

  18. Comparison of Traditional Design Nonlinear Programming Optimization and Stochastic Methods for Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    Structural designs generated by the traditional method, the optimization method, and the stochastic design concept are compared. In the traditional method, the constraints are manipulated to obtain the design and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, with constraints imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters, and the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions were produced by all three methods. The variation in the weight calculated by the methods was modest. Some variation was noticed in the designs calculated by the methods; the variation may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to its fabrication. The traditional design method can be improved when the simplified sensitivities of the behavior constraints are used. Such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for the most failure-prone design. Probabilistic modeling of load and material properties remained a challenge.
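    As a rough illustration of the weight-versus-reliability trade described above, the sketch below sizes a tension member for a target failure probability under a normally distributed load. The load statistics, allowable stress, and member dimensions are all invented for illustration:

```python
from statistics import NormalDist

# Size a tension member so that P(stress > allowable) equals a target
# failure probability pf; lower pf demands more area, hence more weight.
load = NormalDist(mu=100e3, sigma=10e3)    # applied load, N
allowable = 250e6                          # allowable stress, Pa
density, length = 7850.0, 2.0              # steel, kg/m^3; member length, m

def weight_for(pf):
    design_load = load.inv_cdf(1.0 - pf)   # load exceeded with probability pf
    area = design_load / allowable         # m^2 required to stay below allowable
    return density * length * area         # kg

for pf in (1e-1, 1e-3, 1e-6):
    print(pf, round(weight_for(pf), 3))
```

Sweeping `pf` from near 0.5 toward zero reproduces the qualitative behavior noted in the abstract: weight grows without bound as the failure rate approaches zero, and shrinks for failure-prone designs.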

  19. Analytical techniques for instrument design - matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-09-01

    We take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ) and 2 dummy variables), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's Law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. We will argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We will also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.
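    The "integration of one or more variables" step in the toolbox above has a compact linear-algebra form: marginalizing a variable of a Gaussian exp(-x^T M x / 2) replaces M by the Schur complement of the block being integrated out. A minimal sketch on an arbitrary symmetric 3x3 matrix (not a real spectrometer matrix), integrating out the last "dummy" variable:

```python
# Integrating one variable out of a Gaussian exp(-1/2 x^T M x): the marginal
# precision of the kept variables is the Schur complement
#   M_kk - M_kd * M_dd^{-1} * M_dk .
M = [
    [4.0, 1.0, 0.5],
    [1.0, 3.0, 0.2],
    [0.5, 0.2, 2.0],
]

def marginalize_last(M):
    n = len(M)
    d = n - 1
    inv_dd = 1.0 / M[d][d]                 # scalar inverse of the 1x1 "dummy" block
    return [
        [M[i][j] - M[i][d] * inv_dd * M[d][j] for j in range(d)]
        for i in range(d)
    ]

reduced = marginalize_last(M)
print(reduced)
```

Repeating this per dummy variable, together with linear coordinate changes and a final inversion, yields the variance-covariance (resolution) matrix the abstract refers to.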

  20. Design Method for Single-Blade Centrifugal Pump Impeller

    NASA Astrophysics Data System (ADS)

    Nishi, Yasuyuki; Fujiwara, Ryota; Fukutomi, Junichiro

    Sewage pumps are required to provide high pump efficiency and good performance in passing foreign bodies. The impeller used in these applications therefore requires a large passed particle size (the minimum particle size in the pump). However, because the conventional design method for pump impellers results in a small impeller exit width, it is difficult to apply it to the design of the single-blade centrifugal pump impeller used as a sewage pump. This paper proposes a design method for single-blade centrifugal pump impellers. As a result, the head curve of the impeller designed by the proposed method satisfied the design specifications, and pump efficiency was over 62%, higher than that of a conventional single-blade centrifugal pump impeller. Comparing design values with CFD analysis values, the suction velocity ratio of the design parameters agreed well, but the relative velocity ratio did not, owing to the influence of backflow at the impeller entrance.

  1. Methods for very high temperature design

    SciTech Connect

    Blass, J.J.; Corum, J.M.; Chang, S.J.

    1989-01-01

    Design rules and procedures for high-temperature, gas-cooled reactor components are being formulated as an ASME Boiler and Pressure Vessel Code Case. A draft of the Case, patterned after Code Case N-47, and limited to Inconel 617 and temperatures of 982°C (1800°F) or less, will be completed in 1989 for consideration by relevant Code committees. The purpose of this paper is to provide a synopsis of the significant differences between the draft Case and N-47, and to provide more complete accounts of the development of allowable stress and stress rupture values and the development of isochronous stress vs strain curves, in both of which Oak Ridge National Laboratory (ORNL) played a principal role. The isochronous curves, which represent average behavior for many heats of Inconel 617, were based in part on a unified constitutive model developed at ORNL. Details are also provided of this model of inelastic deformation behavior, which does not distinguish between rate-dependent plasticity and time-dependent creep, along with comparisons between calculated and observed results of tests conducted on a typical heat of Inconel 617 by the General Electric Company for the Department of Energy. 4 refs., 15 figs., 1 tab.

  2. Analytical techniques for instrument design -- Matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-12-31

    The authors take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalization to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, they discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix. They show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. They will argue that a generalized program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. They also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.

  3. Perspectives toward the stereotype production method for public symbol design: a case study of novice designers.

    PubMed

    Ng, Annie W Y; Siu, Kin Wai Michael; Chan, Chetwyn C H

    2013-01-01

    This study investigated the practices and attitudes of novice designers toward user involvement in public symbol design at the conceptual design stage, i.e. the stereotype production method. Differences between male and female novice designers were examined. Forty-eight novice designers (24 male, 24 female) were asked to design public symbol referents based on suggestions made by a group of users in a previous study and provide feedback with regard to the design process. The novice designers were receptive to the adoption of user suggestions in the conception of the design, but tended to modify the pictorial representations generated by the users to varying extents. It is also significant that the male and female novice designers appeared to emphasize different aspects of user suggestions, and the female novice designers were more positive toward these suggestions than their male counterparts. The findings should aid the optimization of the stereotype production method for user-involved symbol design. PMID:22632980

  4. Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards

    SciTech Connect

    Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine; LaChance, Jeffrey L.; Horne, Douglas B.

    2014-03-01

    Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that require quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.

  5. Risk-Informing Safety Reviews for Non-Reactor Nuclear Facilities

    SciTech Connect

    Mubayi, V.; Azarm, A.; Yue, M.; Mukaddam, W.; Good, G.; Gonzalez, F.; Bari, R.A.

    2011-03-13

    This paper describes a methodology used to model potential accidents in fuel cycle facilities that employ chemical processes to separate and purify nuclear materials. The methodology is illustrated with an example that uses event and fault trees to estimate the frequency of a specific energetic reaction that can occur in nuclear material processing facilities. The methodology used probabilistic risk assessment (PRA)-related tools as well as information about the chemical reaction characteristics, information on plant design and operational features, and generic data about component failure rates and human error rates. The accident frequency estimates for the specific reaction help to risk-inform the safety review process and assess compliance with regulatory requirements.
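    A minimal sketch of the event/fault-tree arithmetic the methodology relies on; the gate structure, event names, and probabilities are invented for illustration, and the OR gate uses the rare-event approximation:

```python
# Minimal fault-tree evaluation: OR gates sum (rare-event approximation),
# AND gates multiply independent failure probabilities.
def or_gate(*p):
    return sum(p)

def and_gate(*p):
    prod = 1.0
    for x in p:
        prod *= x
    return prod

initiator_per_yr = 0.5                    # hypothetical process upsets per year
p_cooling_fails = and_gate(1e-2, 5e-2)    # both redundant cooling trains fail
p_operator_misses = 1e-2                  # human error probability
p_mitigation_fails = or_gate(p_cooling_fails, p_operator_misses)

# Event-tree product: initiator frequency times failure of all mitigation.
energetic_reaction_per_yr = initiator_per_yr * p_mitigation_fails
print(energetic_reaction_per_yr)
```

The resulting frequency is the quantity compared against regulatory acceptance criteria in the safety review; a real PRA would also propagate uncertainties in each input.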

  6. HEALTHY study rationale, design and methods

    PubMed Central

    2009-01-01

    The HEALTHY primary prevention trial was designed and implemented in response to the growing numbers of children and adolescents being diagnosed with type 2 diabetes. The objective was to moderate risk factors for type 2 diabetes. Modifiable risk factors measured were indicators of adiposity and glycemic dysregulation: body mass index ≥85th percentile, fasting glucose ≥5.55 mmol l-1 (100 mg per 100 ml) and fasting insulin ≥180 pmol l-1 (30 μU ml-1). A series of pilot studies established the feasibility of performing data collection procedures and tested the development of an intervention consisting of four integrated components: (1) changes in the quantity and nutritional quality of food and beverage offerings throughout the total school food environment; (2) physical education class lesson plans and accompanying equipment to increase both participation and number of minutes spent in moderate-to-vigorous physical activity; (3) brief classroom activities and family outreach vehicles to increase knowledge, enhance decision-making skills and support and reinforce youth in accomplishing goals; and (4) communications and social marketing strategies to enhance and promote changes through messages, images, events and activities. Expert study staff provided training, assistance, materials and guidance for school faculty and staff to implement the intervention components. A cohort of students were enrolled in sixth grade and followed to end of eighth grade. They attended a health screening data collection at baseline and end of study that involved measurement of height, weight, blood pressure, waist circumference and a fasting blood draw. Height and weight were also collected at the end of the seventh grade. The study was conducted in 42 middle schools, six at each of seven locations across the country, with 21 schools randomized to receive the intervention and 21 to act as controls (data collection activities only). Middle school was the unit of sample size and

  7. Experimental design for improved ceramic processing, emphasizing the Taguchi Method

    SciTech Connect

    Weiser, M.W. (Mechanical Engineering Dept.); Fong, K.B.

    1993-12-01

    Ceramic processing often requires substantial experimentation to produce acceptable product quality and performance. This is a consequence of ceramic processes depending upon a multitude of factors, some of which can be controlled and others that are beyond the control of the manufacturer. Statistical design of experiments is a procedure that allows quick, economical, and accurate evaluation of processes and products that depend upon several variables. Designed experiments are sets of tests in which the variables are adjusted methodically. A well-designed experiment yields unambiguous results at minimal cost. A poorly designed experiment may reveal little information of value even with complex analysis, wasting valuable time and resources. This article will review the most common experimental designs. This will include both nonstatistical designs and the much more powerful statistical experimental designs. The Taguchi Method developed by Genichi Taguchi will be discussed in some detail. The Taguchi method, based upon fractional factorial experiments, is a powerful tool for optimizing product and process performance.
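    A fractional factorial design of the Taguchi type can be illustrated with the smallest orthogonal array, L4, which screens three two-level factors in only four runs; the responses below are hypothetical:

```python
# L4 orthogonal array: 3 two-level factors (0 = low, 1 = high) in 4 runs.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
# Hypothetical measured responses (e.g. sintered density, %) for the 4 runs.
response = [92.0, 95.0, 88.0, 97.0]

def main_effect(factor):
    # average response at the high level minus average at the low level
    lo = [r for row, r in zip(L4, response) if row[factor] == 0]
    hi = [r for row, r in zip(L4, response) if row[factor] == 1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(f) for f in range(3)]
print(effects)
```

Because the array is balanced, each factor appears at each level equally often against all settings of the others, so the four runs cleanly separate the three main effects; a full factorial would need eight runs.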

  8. A new interval optimization method considering tolerance design

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Xie, H. C.; Zhang, Z. G.; Han, X.

    2015-12-01

    This study considers the design variable uncertainty in the actual manufacturing process for a product or structure and proposes a new interval optimization method based on tolerance design, which can provide not only an optimal design but also the allowable maximal manufacturing errors that the design can bear. The design variables' manufacturing errors are depicted using the interval method, and an interval optimization model for the structure is constructed. A dimensionless design tolerance index is defined to describe the overall uncertainty of all design variables, and by combining the nominal objective function, a deterministic two-objective optimization model is built. The possibility degree of interval is used to represent the reliability of the constraints under uncertainty, through which the model is transformed to a deterministic optimization problem. Three numerical examples are investigated to verify the effectiveness of the present method.
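    The "possibility degree of interval" used above to represent constraint reliability can be sketched as follows, using one common width-ratio definition from the interval-analysis literature (the paper's exact definition may differ):

```python
def possibility_degree(a, b):
    """Possibility degree that interval a = [aL, aU] is <= interval b = [bL, bU],
    using a common width-ratio definition (clipped to [0, 1])."""
    aL, aU = a
    bL, bU = b
    wa, wb = aU - aL, bU - bL
    if wa + wb == 0:                       # both intervals degenerate (crisp numbers)
        return 1.0 if aL <= bL else 0.0
    return max(0.0, min(1.0, (bU - aL) / (wa + wb)))

# A constraint response g(x) in [8, 12] against an allowable limit in [10, 14]:
print(possibility_degree((8, 12), (10, 14)))
```

Requiring the possibility degree of each constraint to exceed a prescribed level is what turns the interval optimization model into the deterministic problem solved in the paper.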

  9. An analytical method for designing low noise helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.

    1978-01-01

    The development and experimental validation of a method for analytically modeling the noise mechanism in the helicopter geared power transmission systems is described. This method can be used within the design process to predict interior noise levels and to investigate the noise reducing potential of alternative transmission design details. Examples are discussed.

  10. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  11. Artificial Intelligence Methods: Challenge in Computer Based Polymer Design

    NASA Astrophysics Data System (ADS)

    Rusu, Teodora; Pinteala, Mariana; Cartwright, Hugh

    2009-08-01

    This paper deals with the use of Artificial Intelligence Methods (AI) in the design of new molecules possessing desired physical, chemical and biological properties. This is an important and difficult problem in the chemical, material and pharmaceutical industries. Traditional methods involve a laborious and expensive trial-and-error procedure, but computer-assisted approaches offer many advantages in the automation of molecular design.

  12. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
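    Two of the propagation methods mentioned above, the first-order method of moments and Monte Carlo simulation, can be contrasted on a toy weight model; the model and input statistics are invented and far simpler than an aircraft sizing code:

```python
import math
import random
import statistics

# Toy weight model: W = a*x^2 + b*y, with uncertain design inputs x and y.
a, b = 2.0, 5.0
mu_x, sd_x = 3.0, 0.1
mu_y, sd_y = 4.0, 0.2

def weight(x, y):
    return a * x * x + b * y

# First-order method of moments: propagate via local derivatives at the mean.
dW_dx = 2 * a * mu_x
dW_dy = b
mom_mean = weight(mu_x, mu_y)
mom_sd = math.sqrt((dW_dx * sd_x) ** 2 + (dW_dy * sd_y) ** 2)

# Monte Carlo propagation of the same input uncertainties.
rng = random.Random(0)
samples = [weight(rng.gauss(mu_x, sd_x), rng.gauss(mu_y, sd_y))
           for _ in range(20000)]
mc_mean = statistics.mean(samples)
mc_sd = statistics.stdev(samples)

print(mom_mean, mom_sd, mc_mean, mc_sd)
```

On this smooth model the two methods agree closely; the abstract's point is that discontinuous design spaces break the derivative-based method, while sampling-based methods remain applicable at higher cost.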

  13. Expanding color design methods for architecture and allied disciplines

    NASA Astrophysics Data System (ADS)

    Linton, Harold E.

    2002-06-01

The color design processes of visual artists, architects, designers, and theoreticians included in this presentation reflect the practical role of color in architecture. What the color design professional brings to the architectural design team is an expertise and rich sensibility made up of a broad awareness and a finely tuned visual perception. This includes a knowledge of design and its history, expertise with industrial color materials and their methods of application, an awareness of design context and cultural identity, a background in physiology and psychology as it relates to human welfare, and an ability to problem-solve and respond creatively to design concepts with innovative ideas. The broadening of the definition of the colorist's role in architectural design provides architects, artists and designers with significant opportunities for continued professional and educational development.

  14. Design methods for fault-tolerant finite state machines

    NASA Technical Reports Server (NTRS)

    Niranjan, Shailesh; Frenzel, James F.

    1993-01-01

    VLSI electronic circuits are increasingly being used in space-borne applications where high levels of radiation may induce faults, known as single event upsets. In this paper we review the classical methods of designing fault tolerant digital systems, with an emphasis on those methods which are particularly suitable for VLSI-implementation of finite state machines. Four methods are presented and will be compared in terms of design complexity, circuit size, and estimated circuit delay.

  15. Aerodynamic design optimization by using a continuous adjoint method

    NASA Astrophysics Data System (ADS)

    Luo, JiaQi; Xiong, JunTao; Liu, Feng

    2014-07-01

This paper presents the fundamentals of a continuous adjoint method and the applications of this method to the aerodynamic design optimization of both external and internal flows. The general formulation of the continuous adjoint equations and the corresponding boundary conditions is derived. With the adjoint method, the complete gradient information needed in the design optimization can be obtained by solving the governing flow equations and the corresponding adjoint equations only once for each cost function, regardless of the number of design parameters. An inverse design of an airfoil is first performed to study the accuracy of the adjoint gradient and the effectiveness of the adjoint method as an inverse design method. The method is then used to perform a series of single and multiple point design optimization problems involving the drag reduction of an airfoil, a wing, and a wing-body configuration, and the aerodynamic performance improvement of turbine and compressor blade rows. The results demonstrate that the continuous adjoint method can efficiently and significantly improve the aerodynamic performance of the design in a shape optimization problem.
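The economy the abstract describes, one adjoint solve yielding the gradient with respect to every design parameter, can be illustrated on a toy linear model. Everything below (the 2x2 system, the cost coefficients, the parameter dependence) is invented for the sketch; a real application replaces the linear solve with the governing flow equations.

```python
# Toy linear "flow" model A u = b(p) with cost J = c . u.
# One adjoint solve A^T lam = c gives dJ/dp_k = lam . (db/dp_k)
# for all design parameters p_k at once.

def solve2(M, rhs):
    # Direct solve of a 2x2 linear system by Cramer's rule.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * rhs[0] - M[0][1] * rhs[1]) / det,
            (-M[1][0] * rhs[0] + M[0][0] * rhs[1]) / det]

A = [[4.0, 1.0], [2.0, 3.0]]        # "flow" operator (hypothetical)
c = [1.0, 2.0]                      # cost weights: J = c . u
db_dp = [[1.0, 3.0], [2.0, -1.0]]   # db/dp_k for parameters p0, p1

At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]  # transpose of A
lam = solve2(At, c)                 # the single adjoint solve
grad = [lam[0] * d[0] + lam[1] * d[1] for d in db_dp]  # dJ/dp_k
```

With N design parameters, finite differencing needs on the order of N extra flow solves; the adjoint route needs one adjoint solve per cost function, which is the scaling the abstract highlights.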

  16. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

Optimum engineering design problems are usually formulated as non-convex optimization problems of continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and that no derivative information is necessary. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum-weight design examples of a three-bar truss, coil springs, a Z-section, and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent over the weight of the SUMT method.
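A minimal sketch of tabu search with random moves, in Python, on a one-dimensional multimodal test function. The neighbourhood size, tabu radius, and iteration counts are illustrative choices, not the paper's settings.

```python
import math
import random

def rastrigin(x):
    # Standard multimodal test function; global minimum f(0) = 0.
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def tabu_search(f, x0, step=0.5, iters=300, tabu_len=10, radius=0.2, seed=1):
    rng = random.Random(seed)
    x, best_x = x0, x0
    tabu = []                          # recently visited points
    for _ in range(iters):
        # Random moves: sample neighbours, discard those near tabu points.
        cands = [x + rng.uniform(-step, step) for _ in range(20)]
        cands = [c for c in cands if all(abs(c - t) > radius for t in tabu)]
        if not cands:
            continue
        x = min(cands, key=f)          # accept best admissible move, even uphill
        tabu.append(x)
        if len(tabu) > tabu_len:
            tabu.pop(0)                # fixed-length tabu list
        if f(x) < f(best_x):
            best_x = x                 # track best point seen so far
    return best_x

best = tabu_search(rastrigin, x0=3.7)
```

Accepting the best admissible move even when it is uphill, while the tabu list forbids immediate backtracking, is what lets the search leave local minima that would trap a pure descent method.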

  17. An inverse method with regularity condition for transonic airfoil design

    NASA Technical Reports Server (NTRS)

    Zhu, Ziqiang; Xia, Zhixun; Wu, Liyi

    1991-01-01

    It is known from Lighthill's exact solution of the incompressible inverse problem that in the inverse design problem, the surface pressure distribution and the free stream speed cannot both be prescribed independently. This implies the existence of a constraint on the prescribed pressure distribution. The same constraint exists at compressible speeds. Presented here is an inverse design method for transonic airfoils. In this method, the target pressure distribution contains a free parameter that is adjusted during the computation to satisfy the regularity condition. Some design results are presented in order to demonstrate the capabilities of the method.

  18. An artificial viscosity method for the design of supercritical airfoils

    NASA Technical Reports Server (NTRS)

    Mcfadden, G. B.

    1979-01-01

A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H, which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow; the computational procedure and results; the design procedure; a convergence theorem; and a description of the code.

  19. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods Research Perspective

    ERIC Educational Resources Information Center

    Hitchcock, John H.; Nastasi, Bonnie K.; Summerville, Meredith

    2010-01-01

    The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature. These two…

  20. 77 FR 55832 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of a New Equivalent Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

... made under the provisions of 40 CFR part 53, as amended on August 31, 2011 (76 FR 54326-54341). The... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of a New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of a new equivalent method...

  1. 77 FR 60985 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ... 53, as amended on August 31, 2011 (76 FR 54326-54341). The new equivalent methods are automated... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of three new...

  2. Investigating the Use of Design Methods by Capstone Design Students at Clemson University

    ERIC Educational Resources Information Center

    Miller, W. Stuart; Summers, Joshua D.

    2013-01-01

    The authors describe a preliminary study to understand the attitude of engineering students regarding the use of design methods in projects to identify the factors either affecting or influencing the use of these methods by novice engineers. A senior undergraduate capstone design course at Clemson University, consisting of approximately fifty…

  3. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  4. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings of the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations into expert systems.

  5. Approximate method of designing a two-element airfoil

    NASA Astrophysics Data System (ADS)

    Abzalilov, D. F.; Mardanov, R. F.

    2011-09-01

    An approximate method is proposed for designing a two-element airfoil. The method is based on reducing an inverse boundary-value problem in a doubly connected domain to a problem in a singly connected domain located on a multisheet Riemann surface. The essence of the method is replacement of channels between the airfoil elements by channels of flow suction and blowing. The shape of these channels asymptotically tends to the annular shape of channels passing to infinity on the second sheet of the Riemann surface. The proposed method can be extended to designing multielement airfoils.

  6. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that provides an objective metric and a selection basis for DR knowledge reuse during the product design process. In addition, the method is shown to give more effective guidance and support for the application and management of DR knowledge.

  7. Design method for four-reflector type beam waveguide systems

    NASA Technical Reports Server (NTRS)

    Betsudan, S.; Katagi, T.; Urasaki, S.

    1986-01-01

Discussed is a method for the design of four-reflector type beam waveguide feed systems, comprised of a conical horn and four focused reflectors, which are widely used as the primary reflector systems for communications satellite Earth station antennas. The design parameters for these systems are clarified, the relations between the parameters are derived from the beam-mode expansion, and the independent design parameters are specified. The characteristics of these systems, namely spillover loss, crosspolarization components, and frequency characteristics, and their relation to the design parameters, are also shown. It is also indicated that the design parameters which determine the dimensions of the conical horn or the shape of the focused reflectors can be established unambiguously once the design standard for the system has been selected as either (1) minimizing the crosspolarization component while keeping the spillover loss within acceptable limits, or (2) minimizing the spillover loss while maintaining the crosspolarization components below an acceptable level, and the independent design parameters, such as the respective sizes of the focused reflectors and the distances between them, have been established according to mechanical restrictions. A sample design is also shown. In addition to clarifying the effects of each of the design parameters on the system and improving insight into these systems, this method will also increase the efficiency of designing such systems.

  8. A multidisciplinary optimization method for designing boundary layer ingesting inlets

    NASA Astrophysics Data System (ADS)

    Rodriguez, David Leonard

    2001-07-01

The Blended-Wing-Body is a conceptual aircraft design with rear-mounted, over-wing engines. Two types of engine installations have been considered for this aircraft. One installation is quite conventional with podded engines mounted on pylons. The other installation has partially buried engines with boundary layer ingesting inlets. Although ingesting the low-momentum flow in a boundary layer can improve propulsive efficiency, poor inlet performance can offset and even overwhelm this potential advantage. For both designs, the tight coupling between the aircraft aerodynamics and the propulsion system poses a difficult design integration problem. This dissertation presents a design method that solves the problem using multidisciplinary optimization. A Navier-Stokes flow solver, an engine analysis method, and a nonlinear optimizer are combined into a design tool that correctly addresses the tight coupling of the problem. The method is first applied to a model 2D problem to expedite development and thoroughly test the scheme. The low computational cost of the 2D method allows for several inlet installations to be optimized and analyzed. The method is then upgraded by using a validated 3D Navier-Stokes solver. The two candidate engine installations are analyzed and optimized using this inlet design method. The method is shown to be quite effective at integrating the propulsion and aerodynamic systems of the Blended-Wing-Body for both engine installations by improving overall performance and satisfying any specified design constraints. By comparing the two optimized designs, the potential advantages of ingesting boundary layer flow for this aircraft are demonstrated.

  9. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    PubMed

    Andrews, Nick

    2012-09-01

    Three commonly used designs for vaccine safety assessment post licensure are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues. PMID:21985898
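The self-controlled case series estimate reduces, in its simplest form, to a ratio of event rates within and outside the post-exposure risk window. A sketch with invented numbers:

```python
def sccs_relative_incidence(events_risk, days_risk, events_control, days_control):
    # Self-controlled case series: each case acts as its own control.
    # Compare the event rate inside the post-vaccination risk window
    # with the rate over the remaining (control) person-time.
    return (events_risk / days_risk) / (events_control / days_control)

# Hypothetical illustration: 30 events during 42-day risk windows
# versus 90 events during 323 days of control time.
ri = sccs_relative_incidence(30, 42.0, 90, 323.0)
```

Because the comparison is within-person, fixed confounders such as genetics or chronic conditions cancel out, which is why the design is attractive for routine health databases.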

  10. Design of diffractive optical surfaces within the nonimaging SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João.; Benítez, Pablo; Miñano, Juan C.

    2015-09-01

The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in Nonimaging Optics and was later extended to designing Imaging Optics. We show an extension of the SMS method to diffractive surfaces. Using this method, diffractive kinoform surfaces are calculated simultaneously and through a direct method, i.e., one not based on multi-parametric optimization techniques. Using the phase-shift properties of diffractive surfaces as an extra degree of freedom, only N/2 surfaces are needed to perfectly couple N one-parameter wavefronts. Wavefronts of different wavelengths can also be coupled, hence chromatic aberration can be corrected in SMS-based systems. The method can combine and calculate simultaneously reflective, refractive, and diffractive surfaces, through direct calculation of the phase and refractive/reflective profiles. Representative diffractive systems designed by the SMS method are presented.

  11. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. PMID:26059362
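A minimal genetic algorithm of the kind the review describes can be sketched in a few lines. Here candidate molecules are stood in for by bitstrings of "fragments" and fitness by a trivial count; every name and setting below is illustrative, not from the paper.

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=60, seed=7):
    # Minimal generational GA: tournament selection, one-point
    # crossover, per-bit mutation. Bitstrings stand in for candidate
    # molecules (presence/absence of fragments).
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            parent_a = max(rng.sample(pop, 3), key=fitness)  # tournament of 3
            parent_b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)
            child = parent_a[:cut] + parent_b[cut:]          # one-point crossover
            child = [bit ^ (rng.random() < 0.02) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

# Toy fitness: number of "beneficial fragments" present in the string.
best = evolve(sum)
```

In the drug-design setting the fitness call is the expensive step, an assay or a predictive model, which is why blending the evolutionary loop with automated synthesis and characterization is attractive.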

  12. A comparison of methods for DPLL loop filter design

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.

    1986-01-01

Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second approach designs a filter that minimizes, in discrete time, a weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase error component; the third method uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare the different designs, and includes stability, steady state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
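The behaviour such loop filters are designed for can be seen in a toy simulation: a digital PLL with a proportional-plus-integral (second-order) loop filter tracking a constant frequency offset. The gains and the offset below are arbitrary illustrative values, not any of the four designs in the article.

```python
import math

def run_dpll(freq_offset=0.02, kp=0.1, ki=0.01, steps=2000):
    # Digital phase-locked loop with a proportional-plus-integral loop
    # filter (a type-2 loop): for a constant input frequency offset the
    # integrator absorbs the offset and the phase error decays to zero.
    theta_in = 0.0    # incoming carrier phase
    theta_nco = 0.0   # local oscillator (NCO) phase
    integrator = 0.0
    err = 0.0
    for _ in range(steps):
        theta_in += freq_offset
        err = math.sin(theta_in - theta_nco)   # phase detector
        integrator += ki * err                 # integral path of loop filter
        theta_nco += kp * err + integrator     # proportional path + NCO update
    return err

final_err = run_dpll()
```

With these gains the linearized loop has poles of magnitude about 0.95, so the 2000-step run settles long before it ends; a purely proportional (first-order) loop would instead leave a constant residual phase error.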

  13. Novel parameter-based flexure bearing design method

    NASA Astrophysics Data System (ADS)

    Amoedo, Simon; Thebaud, Edouard; Gschwendtner, Michael; White, David

    2016-06-01

    A parameter study was carried out on the design variables of a flexure bearing to be used in a Stirling engine with a fixed axial displacement and a fixed outer diameter. A design method was developed in order to assist identification of the optimum bearing configuration. This was achieved through a parameter study of the bearing carried out with ANSYS®. The parameters varied were the number and the width of the arms, the thickness of the bearing, the eccentricity, the size of the starting and ending holes, and the turn angle of the spiral. Comparison was made between the different designs in terms of axial and radial stiffness, the natural frequency, and the maximum induced stresses. Moreover, the Finite Element Analysis (FEA) was compared to theoretical results for a given design. The results led to a graphical design method which assists the selection of flexure bearing geometrical parameters based on pre-determined geometric and material constraints.

  14. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

Design knowledge of modern mechatronics products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and of its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for the expression of product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.

  15. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

Design knowledge of modern mechatronics products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronics product design knowledge and of its information management features, a unified XML-based product information processing model is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for the expression of product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
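A minimal sketch of the kind of XML model these records describe, using Python's standard library. The element names, ids, and the parallel-friction-roller content are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical unified model: function elements, structure elements,
# and the function-structure mapping, all expressed in one XML tree.
product = ET.Element("product", name="parallel_friction_roller")
functions = ET.SubElement(product, "functions")
structures = ET.SubElement(product, "structures")
mappings = ET.SubElement(product, "mappings")

ET.SubElement(functions, "function", id="F1").text = "transmit torque"
ET.SubElement(structures, "structure", id="S1").text = "roller pair"
ET.SubElement(mappings, "map", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")
```

Keeping the function-structure mapping as explicit elements, rather than implicit in document nesting, is what lets downstream tools query either view of the product independently.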

  16. A computational design method for transonic turbomachinery cascades

    NASA Technical Reports Server (NTRS)

    Sobieczky, H.; Dulikravich, D. S.

    1982-01-01

This paper describes a systematic computational procedure for finding the configuration changes necessary to make the resulting flow past turbomachinery cascades, channels, and nozzles shock-free at prescribed transonic operating conditions. The method is based on a finite-area transonic analysis technique and the fictitious gas approach. This design scheme has two major areas of application. First, it can be used for the design of supercritical cascades, with applications mainly in compressor blade design. Second, it provides subsonic inlet shapes, including sonic surfaces, with suitable initial data for the design of supersonic (accelerated) exits, such as nozzles and turbine cascade shapes. This fast, accurate and economical method, with a proven potential for applications to three-dimensional flows, is illustrated by some design examples.

  17. Risk-based methods applicable to ranking conceptual designs

    SciTech Connect

    Breeding, R.J.; Ortiz, K.; Ringland, J.T.; Lim, J.J.

    1993-11-01

In Genichi Taguchi's latest book on quality engineering, an emphasis is placed on robust design processes in which quality engineering techniques are brought "upstream," that is, they are utilized as early as possible, preferably in the conceptual design stage. This approach was used in a study of possible future safety system designs for weapons. As an experiment, a method was developed for using probabilistic risk analysis (PRA) techniques to rank conceptual designs for performance against a safety metric, for ultimate incorporation into a Pugh matrix evaluation. This represents a high-level UW application of PRA methods to weapons. As with most conceptual designs, details of the implementation were not yet developed; many of the components had never been built, let alone tested. Therefore, our application of risk assessment methods was forced to be at such a high level that the entire evaluation could be performed on a spreadsheet. Nonetheless, the method produced numerical estimates of safety in a manner that was consistent, reproducible, and scrutable. The results enabled us to rank designs to identify areas where returns on research efforts would be the greatest. The numerical estimates were calibrated against what is achievable by current weapon safety systems. The use of expert judgement is inescapable, but these judgements are explicit and the method is easily implemented in a spreadsheet computer program.
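A spreadsheet-level PRA ranking of the sort described can be reproduced in a few lines: each conceptual design gets a system failure probability from expert-judged component probabilities, assuming independent components in series, and the designs are ranked on it. All numbers below are hypothetical.

```python
def p_system_fails(component_fail_probs):
    # Series system of independent safety components: the system fails
    # if any component fails. Inputs are expert-judged probabilities.
    p_all_work = 1.0
    for p in component_fail_probs:
        p_all_work *= 1.0 - p
    return 1.0 - p_all_work

# Hypothetical conceptual designs and component failure probabilities.
designs = {
    "design_A": [1e-3, 5e-4, 2e-3],
    "design_B": [1e-3, 1e-3],
    "design_C": [5e-3, 5e-4, 1e-4],
}
ranked = sorted(designs, key=lambda d: p_system_fails(designs[d]))  # safest first
```

Even at this coarse level the calculation is consistent and scrutable: every input judgement is explicit, and re-ranking after a changed estimate is immediate.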

  18. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  19. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  20. FRP bolted flanged connections -- Modern design and fabrication methods

    SciTech Connect

    Blach, A.E.; Sun, L.

    1995-11-01

Bolted flanged connections for fiber reinforced plastic (FRP) pipes and pressure vessels are of great importance for any user of FRP material in fluid containment applications. At present, no dimensional standards or design rules exist for FRP flanges. Most often, flanges are fabricated to dimensional standards for metallic flanges without questioning their applicability to FRP materials. This paper discusses simplified and exact design methods for composite flanges, based on isotropic material design and on laminate theory design. Both exact and simplified methods are included. Results of various design methods are then compared with experimental results from strain gage measurements on test pressure vessels. Methods of flange fabrication, such as hand lay-up, injection molding, filament winding, and others, are discussed for their relative merits in pressure vessel and piping applications. Both integral and bonded flanges are covered as applicable to the various methods of fabrication, as are the economic implications of these methods. Also treated are the problems of gasket selection, bolting and overbolting, gasket stresses, and leakage of flanged connections.

  1. Achieving a Risk-Informed Decision-Making Environment at NASA: The Emphasis of NASA's Risk Management Policy

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

This slide presentation reviews the evolution of risk management (RM) at NASA. The aim of the RM approach at NASA is to promote an approach that is heuristic, proactive, and coherent across all of NASA. Risk-Informed Decision Making (RIDM) is a decision-making process that uses a diverse set of performance measures, along with other considerations, within a deliberative process to inform decision making. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The RIDM process and its relation to the Continuous Risk Management (CRM) process are reviewed.

  2. Optimal Input Signal Design for Data-Centric Estimation Methods

    PubMed Central

    Deshpande, Sunil; Rivera, Daniel E.

    2013-01-01

    Data-centric estimation methods such as Model-on-Demand and Direct Weight Optimization form attractive techniques for estimating unknown functions from noisy data. These methods rely on generating a local function approximation from a database of regressors at the current operating point with the process repeated at each new operating point. This paper examines the design of optimal input signals formulated to produce informative data to be used by local modeling procedures. The proposed method specifically addresses the distribution of the regressor vectors. The design is examined for a linear time-invariant system under amplitude constraints on the input. The resulting optimization problem is solved using semidefinite relaxation methods. Numerical examples show the benefits in comparison to a classical PRBS input design. PMID:24317042

  3. Optimal Input Signal Design for Data-Centric Estimation Methods.

    PubMed

    Deshpande, Sunil; Rivera, Daniel E

    2013-01-01

    Data-centric estimation methods such as Model-on-Demand and Direct Weight Optimization form attractive techniques for estimating unknown functions from noisy data. These methods rely on generating a local function approximation from a database of regressors at the current operating point with the process repeated at each new operating point. This paper examines the design of optimal input signals formulated to produce informative data to be used by local modeling procedures. The proposed method specifically addresses the distribution of the regressor vectors. The design is examined for a linear time-invariant system under amplitude constraints on the input. The resulting optimization problem is solved using semidefinite relaxation methods. Numerical examples show the benefits in comparison to a classical PRBS input design. PMID:24317042

  4. Test methods and design allowables for fibrous composites. Volume 2

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Editor)

    1989-01-01

Topics discussed include extreme/hostile environment testing, establishing design allowables, and property/behavior specific testing. Papers are presented on environmental effects on the high strain rate properties of graphite/epoxy composite, the low-temperature performance of short-fiber reinforced thermoplastics, the abrasive wear behavior of unidirectional and woven graphite fiber/PEEK, test methods for determining design allowables for fiber reinforced composites, and statistical methods for calculating material allowables for MIL-HDBK-17. Attention is also given to a test method to measure the response of composite materials under reversed cyclic loads, a through-the-thickness strength specimen for composites, the use of torsion tubes to measure in-plane shear properties of filament-wound composites, the influence of test fixture design on the Iosipescu shear test for fiber composite materials, and a method for monitoring in-plane shear modulus in fatigue testing of composites.

  5. Tradeoff methods in multiobjective insensitive design of airplane control systems

    NASA Technical Reports Server (NTRS)

    Schy, A. A.; Giesy, D. P.

    1984-01-01

The latest results of an ongoing study of computer-aided design of airplane control systems are given. Constrained minimization algorithms are used, with the design objectives in the constraint vector. The concept of Pareto optimality is briefly reviewed. It is shown how an experienced designer can use it to find designs which are well balanced in all objectives. Then the problem of finding designs which are insensitive to uncertainty in system parameters is discussed, introducing a probabilistic vector definition of sensitivity which is consistent with the deterministic Pareto optimal problem. Insensitivity is important in any practical design, but it is particularly important in the design of feedback control systems, since it is considered to be the most important distinctive property of feedback control. Methods of tradeoff between deterministic and stochastic-insensitive (SI) design are described, and tradeoff design results are presented for the example of a Shuttle lateral stability augmentation system. This example is used because careful studies have been made of the uncertainty in Shuttle aerodynamics. Finally, since accurate statistics of uncertain parameters are usually not available, the effects of crude statistical models on SI designs are examined.

  6. A new method of VLSI conform design for MOS cells

    NASA Astrophysics Data System (ADS)

    Schmidt, K. H.; Wach, W.; Mueller-Glaser, K. D.

    An automated method for the design of specialized SSI/LSI-level MOS cells suitable for incorporation in VLSI chips is described. The method uses the symbolic-layout features of the CABBAGE computer program (Hsueh, 1979; De Man et al., 1982), but restricted by a fixed grid system to facilitate compaction procedures. The techniques used are shown to significantly speed the processes of electrical design, layout, design verification, and description for subsequent CAD/CAM application. In the example presented, a 211-transistor, parallel-load, synchronous 4-bit up/down binary counter cell was designed in 9 days, as compared to 30 days for a manually-optimized-layout version and 3 days for a larger, less efficient cell designed by a programmable logic array; the cell areas were 0.36, 0.21, and 0.79 sq mm, respectively. The primary advantage of the method is seen in the extreme ease with which the cell design can be adapted to new parameters or design rules imposed by improvements in technology.

  7. Computer method for design of acoustic liners for turbofan engines

    NASA Technical Reports Server (NTRS)

    Minner, G. L.; Rice, E. J.

    1976-01-01

    A design package is presented for the specification of acoustic liners for turbofans. An estimate of the noise generation was made based on modifications of existing noise correlations, for which the inputs are basic fan aerodynamic design variables. The method does not predict multiple pure tones. A target attenuation spectrum was calculated which was the difference between the estimated generation spectrum and a flat annoyance-weighted goal attenuated spectrum. The target spectrum was combined with a knowledge of acoustic liner performance as a function of the liner design variables to specify the acoustic design. The liner design method at present is limited to annular duct configurations. The detailed structure of the liner was specified by combining the required impedance (which is a result of the previous step) with a mathematical model relating impedance to the detailed structure. The design procedure was developed for a liner constructed of perforated sheet placed over honeycomb backing cavities. A sample calculation was carried through in order to demonstrate the design procedure, and experimental results presented show good agreement with the calculated results of the method.

  8. Method for Enzyme Design with Genetically Encoded Unnatural Amino Acids.

    PubMed

    Hu, C; Wang, J

    2016-01-01

    We describe the methodologies for the design of artificial enzymes with genetically encoded unnatural amino acids. Genetically encoded unnatural amino acids offer great promise for constructing artificial enzymes with novel activities. In our studies, the design of artificial enzymes was divided into two steps. First, we considered the unnatural amino acids and the protein scaffold separately. The scaffold is designed by traditional protein design methods. The unnatural amino acids are inspired by natural structure and organic chemistry methods, and synthesized by either organic chemistry methods or enzymatic conversion. With the increasing number of published unnatural amino acids with various functions, we described an unnatural amino acid toolkit containing metal chelators, redox mediators, and click chemistry reagents. These efforts enable a researcher to search the toolkit for appropriate unnatural amino acids for the study, rather than design and synthesize the unnatural amino acids from the beginning. After the first step, the model enzyme was optimized by computational methods and directed evolution. Lastly, we describe a general method for evolving aminoacyl-tRNA synthetases and expressing proteins with incorporated unnatural amino acids. PMID:27586330

  9. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.
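
    The Design of Experiments step described above can be sketched in miniature. The following is a hedged illustration (the toy `engine_performance` function and its coefficients are assumptions, not the propulsion code used in the paper): a two-level full factorial design is run over two coded flowpath variables, and main-effect and interaction regression coefficients are recovered from the orthogonal design by contrast averages.

```python
# Minimal DOE/response-surface sketch (toy stand-in for the propulsion code).
def engine_performance(x1, x2):
    """Hypothetical toy model standing in for the hypersonic propulsion code."""
    return 5.0 + 2.0 * x1 - 1.5 * x2 + 0.3 * x1 * x2

# 2^2 full factorial in coded units (-1/+1)
design = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]
y = [engine_performance(x1, x2) for x1, x2 in design]

# With an orthogonal +/-1 design, regression coefficients are contrast averages.
n = len(design)
b0 = sum(y) / n                                                  # intercept
b1 = sum(yi * x1 for yi, (x1, _) in zip(y, design)) / n          # x1 main effect
b2 = sum(yi * x2 for yi, (_, x2) in zip(y, design)) / n          # x2 main effect
b12 = sum(yi * x1 * x2 for yi, (x1, x2) in zip(y, design)) / n   # interaction

print(b0, b1, b2, b12)  # recovers 5.0, 2.0, -1.5, 0.3
```

    In the paper's workflow these regression equations would then be combined with an aerodynamic database and optimized at the vehicle level; here they simply recover the toy model's coefficients exactly.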

  10. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis (SCS) is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the substructural controller synthesis method is that it relieves the computational burden associated with dimensionality. Besides that, the SCS design scheme is also a highly adaptable controller synthesis method for structures with varying configuration, or varying mass
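
    The decentralized idea of designing each subcontroller independently can be illustrated on the smallest possible case. The sketch below is an assumption-laden toy (scalar subsystems, not the paper's flexible-structure finite element models): each "substructure" gets its own LQ gain from the closed-form scalar Riccati solution, and each resulting closed loop is stable.

```python
# Toy decentralized LQ sketch: one scalar LQR design per substructure.
import math

def scalar_lq_gain(a, b, q, r):
    """Closed-form scalar LQR: p solves 2*a*p - (b*p)**2/r + q = 0, gain k = b*p/r."""
    p = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)
    return b * p / r

subsystems = [(-0.5, 1.0), (0.2, 2.0)]  # hypothetical (a_i, b_i) per substructure
gains = [scalar_lq_gain(a, b, q=1.0, r=1.0) for a, b in subsystems]

# Each closed-loop pole a_i - b_i*k_i must be stable (negative).
closed = [a - b * k for (a, b), k in zip(subsystems, gains)]
print(gains, closed)
```

    In the full method the subcontroller gains would be assembled into a global block-diagonal controller using the same assembling scheme as the structure matrices; here the "assembly" is simply the list of independent gains.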

  11. COMMUNICATING RISK INFORMATION TO STATE AND LOCAL AIR POLLUTION CONTROL AGENCIES VIA U.S. EPA'S AIR RISK INFORMATION SUPPORT CENTER (AIR RISC)

    EPA Science Inventory

    The Air Risk Information Support Center (Air RISC) has been organized by U.S. EPA's offices of Air Quality Planning and Standards and Health and Environmental Assessment. The center has been developed in cooperation with the State and Territorial air Pollution Control Program Adm...

  12. 75 FR 76982 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ... AGENCY Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches for... literature searches for IRIS assessments; request for information. SUMMARY: The U.S. Environmental Protection Agency (EPA) is announcing the availability of literature searches for four IRIS...

  13. A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, typical risk management process, origins of risk matrix, performance measures, performance objectives and Bayes theorem.
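
    The role of Bayes theorem in probabilistic risk assessment can be shown with a small worked example. The numbers below are illustrative assumptions, not from the presentation: a prior over two component states is updated with observed demand data via Bayes theorem.

```python
# Hypothetical Bayesian update of a component failure state, as used in PRA.
# Prior belief: component is "good" (per-demand failure prob 0.01) with
# probability 0.9, or "degraded" (failure prob 0.1) with probability 0.1.
# Evidence: 2 failures observed in 50 demands.
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k failures in n demands, failure prob p each."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

prior = {"good": 0.9, "degraded": 0.1}
p_fail = {"good": 0.01, "degraded": 0.1}
k, n = 2, 50  # observed failures / demands

# Bayes theorem: P(state | data) is proportional to P(data | state) * P(state)
likelihood = {s: binom_pmf(k, n, p_fail[s]) for s in prior}
evidence = sum(likelihood[s] * prior[s] for s in prior)
posterior = {s: likelihood[s] * prior[s] / evidence for s in prior}

print(posterior)
```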

  14. The conditional risk probability-based seawall height design method

    NASA Astrophysics Data System (ADS)

    Yang, Xing; Hu, Xiaodong; Li, Zhiqing

    2015-11-01

    The determination of the required seawall height is usually based on the combination of wind speed (or wave height) and still water level according to a specified return period, e.g., 50-year return period wind speed and 50-year return period still water level. In reality, the two variables are partially correlated. This may lead to over-design (and excess cost) of seawall structures. The above-mentioned return period for the design of a seawall depends on economy, society and natural environment in the region. This means a specified risk level of overtopping or damage of a seawall structure is usually allowed. The aim of this paper is to present a conditional risk probability-based seawall height design method which incorporates the correlation of the two variables. For purposes of demonstration, the wind speeds and water levels collected from Jiangsu of China are analyzed. The results show this method can improve seawall height design accuracy.
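
    The effect of partial correlation on the joint design event can be sketched with a Monte Carlo experiment. All numbers below are assumptions for illustration (a correlation of 0.5 and Gaussian marginals, not the Jiangsu data): the joint probability that both variables exceed their 50-year levels lies between the independence product and the single-variable exceedance probability, which is why naively stacking two 50-year values is conservative.

```python
# Monte Carlo sketch of joint exceedance for two partially correlated variables.
import random, math
random.seed(1)

rho = 0.5           # assumed correlation between wind speed and water level
p_marginal = 0.02   # 50-year return period => 2% annual exceedance probability
z_thresh = 2.054    # standard-normal quantile with exceedance prob 0.02

N = 200_000
joint = 0
for _ in range(N):
    u = random.gauss(0, 1)
    v = rho * u + math.sqrt(1 - rho**2) * random.gauss(0, 1)  # corr(u, v) = rho
    if u > z_thresh and v > z_thresh:
        joint += 1

p_joint = joint / N
p_indep = p_marginal**2  # what a full-independence assumption would give
print(p_joint, p_indep)
```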

  15. Study on Communication System of Social Risk Information on Nuclear Energy

    SciTech Connect

    Hidekazu Yoshikawa; Toshio Sugiman; Yasunaga Wakabayashi; Hiroshi Shimoda; Mika Terado; Mariko Akimoto; Yoshihiko Nagasato

    2004-07-01

    As a new risk communication method for the construction of effective knowledge bases about 'safety and non-anxiety for nuclear energy', a study on a new method of communicating social risk information by electronic means has been started, in view of the rapid expansion of internet usage in society. The purpose of this research is to enhance public acceptance of nuclear power in Japan in the following two respects. The first is to develop a mutual communication system among the people involved in operation and maintenance activities for nuclear power plants, through which they will exchange their daily experiences to improve safety-conscious activities and foster a 'safety culture' attitude. The other is the development of an effective risk communication system between the nuclear community and the general public about the hot issues of 'what are the concerns involved in the final disposal of high-level radioactive waste?' and 'what should we do to reach social consensus on this issue in the future?'. The authors' research plan for the above purpose is summarized in Table 1. As the first step of the authors' three-year research project, which started in August 2003, a social investigation by questionnaires, via internet and postal mail, has recently been conducted on risk perception of nuclear power among people engaged in nuclear business and among women in the metropolitan area, respectively, in order to obtain relevant information on how and what should be considered in constructing effective methods of communicating social risk information between people within the nuclear industries and the general public in society. Although further discussion is needed, the results of the social investigation depict contrasting risk images (Fig. 1) between nuclear professionals and the general public in present-day Japan. As the conclusion of the authors' study thus far conducted, the

  16. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    NASA Astrophysics Data System (ADS)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks.
Through the case study, we will demonstrate how a stakeholder driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically

  17. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena. PMID:21880844

  18. Rotordynamics and Design Methods of an Oil-Free Turbocharger

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    1999-01-01

    The feasibility of supporting a turbocharger rotor on air foil bearings is investigated based upon predicted rotordynamic stability, load accommodations, and stress considerations. It is demonstrated that foil bearings offer a plausible replacement for oil-lubricated bearings in diesel truck turbochargers. Also, two different rotor configurations are analyzed and the design is chosen which best optimizes the desired performance characteristics. The method of designing machinery for foil bearing use and the assumptions made are discussed.

  19. Design of large Francis turbine using optimal methods

    NASA Astrophysics Data System (ADS)

    Flores, E.; Bornard, L.; Tomas, L.; Liu, J.; Couston, M.

    2012-11-01

    Among a high number of Francis turbine references all over the world, covering the whole market range of heads, Alstom has especially been involved in the development and equipment of the largest power plants in the world: Three Gorges (China - 32×767 MW - 61 to 113 m), Itaipu (Brazil - 20x750 MW - 98.7 m to 127 m) and Xiangjiaba (China - 8x812 MW - 82.5 m to 113.6 m - in erection). Many new projects are under study to equip new power plants with Francis turbines in order to meet an increasing demand for renewable energy. In this context, Alstom Hydro is carrying out many developments to answer those needs, especially for jumbo units such as the planned 1 GW units in China. The turbine design for such units requires specific care by using the state of the art in computation methods and the latest technologies in model testing as well as the maximum feedback from Jumbo plants already in operation. We present in this paper how a large Francis turbine can be designed using specific design methods, including global and local optimization methods. The spiral case, the tandem cascade profiles, the runner and the draft tube are designed with optimization loops involving a blade design tool, an automatic meshing software and a Navier-Stokes solver, piloted by a genetic algorithm. These automated optimization methods, presented in different papers over the last decade, are nowadays widely used, thanks to the growing computation capacity of HPC clusters: the intensive use of such optimization methods at the turbine design stage allows very high levels of performance to be reached, while the hydraulic flow characteristics are carefully studied over the whole water passage to avoid any unexpected hydraulic phenomena.
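
    The structure of such a genetic-algorithm-piloted optimization loop can be sketched schematically. This is not Alstom's toolchain: the `evaluate` function below is an assumed toy objective standing in for the blade design tool, automatic mesher and Navier-Stokes solver, and all parameters are hypothetical.

```python
# Schematic GA optimization loop: propose geometry parameters, "solve", select.
import random
random.seed(0)

def evaluate(params):
    """Stand-in for mesh + CFD solve: toy efficiency peaking at (0.3, 0.7)."""
    x, y = params
    return 1.0 - (x - 0.3) ** 2 - (y - 0.7) ** 2

def ga_optimize(pop_size=30, generations=40, mut=0.1):
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = tuple((ai + bi) / 2 + random.gauss(0, mut)   # crossover
                          for ai, bi in zip(a, b))               # + mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=evaluate)

best = ga_optimize()
print(best, evaluate(best))
```

    In the industrial setting each `evaluate` call is an expensive meshing-plus-CFD run on an HPC cluster, which is why the population size and generation count are chosen far more carefully than in this toy.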

  20. A finite-difference method for transonic airfoil design.

    NASA Technical Reports Server (NTRS)

    Steger, J. L.; Klineberg, J. M.

    1972-01-01

    This paper describes an inverse method for designing transonic airfoil sections or for modifying existing profiles. Mixed finite-difference procedures are applied to the equations of transonic small disturbance theory to determine the airfoil shape corresponding to a given surface pressure distribution. The equations are solved for the velocity components in the physical domain and flows with embedded shock waves can be calculated. To facilitate airfoil design, the method allows alternating between inverse and direct calculations to obtain a profile shape that satisfies given geometric constraints. Examples are shown of the application of the technique to improve the performance of several lifting airfoil sections. The extension of the method to three dimensions for designing supercritical wings is also indicated.

  1. Improved method for transonic airfoil design-by-optimization

    NASA Technical Reports Server (NTRS)

    Kennelly, R. A., Jr.

    1983-01-01

    An improved method for use of optimization techniques in transonic airfoil design is demonstrated. FLO6QNM incorporates a modified quasi-Newton optimization package, and is shown to be more reliable and efficient than the method developed previously at NASA-Ames, which used the COPES/CONMIN optimization program. The design codes are compared on a series of test cases with known solutions, and the effects of problem scaling, proximity of initial point to solution, and objective function precision are studied. In contrast to the older method, well-converged solutions are shown to be attainable in the context of engineering design using computational fluid dynamics tools, a new result. The improvements are due to better performance by the optimization routine and to the use of problem-adaptive finite difference step sizes for gradient evaluation.

  3. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

    The development of Riccati iteration, a tool for the design and analysis of linear control systems, is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
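
    The flavor of Riccati iteration can be shown in the scalar case, where the algebraic Riccati equation has a closed-form root to check against. The example below is a generic Newton/Kleinman-style iteration on assumed toy data, not the paper's implementation: each step solves the (scalar) Lyapunov equation for a fixed stabilizing gain, then updates the gain.

```python
# Scalar Riccati iteration: solve 2*a*p - (b*p)**2/r + q = 0 by Lyapunov steps.
import math

a, b, q, r = 1.0, 1.0, 1.0, 1.0  # toy scalar LQR problem data

k = 3.0  # initial stabilizing gain: a - b*k < 0
for _ in range(20):
    # Lyapunov step for fixed k: 2*(a - b*k)*p + q + k**2 * r = 0
    p = -(q + k**2 * r) / (2.0 * (a - b * k))
    k = b * p / r  # gain update

exact = 1.0 + math.sqrt(2.0)  # closed-form positive root of the scalar ARE
print(p, exact)
```

    The iteration converges quadratically from any stabilizing initial gain; in the matrix case each step is a Lyapunov equation solve rather than a scalar division.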

  4. Risk Information Exposure and Direct to Consumer Genetic Testing for BRCA Mutations among Women with a Personal or Family History of Breast or Ovarian Cancer

    PubMed Central

    Gray, Stacy W.; O’Grady, Cristin; Karp, Lauren; Smith, Daniel; Schwartz, J. Sanford; Hornik, Robert C.; Armstrong, Katrina

    2009-01-01

    Background Direct to consumer (DTC) BRCA testing may expand access to genetic testing and enhance cancer prevention efforts. However, it is not known if current DTC websites provide adequate risk information for informed medical decision-making. Methods 284 women with a personal or family history of breast/ovarian cancer were randomly assigned to view a “mock” DTC commercial website (control condition: CC, n=93) or the same “mock” website that included information on the potential risks of obtaining genetic testing online. Risk information was framed two ways: risk information attributed to expert sources (ES, n=98) and unattributed risk information (URI, n=93). Participants completed an online survey. Endpoints were intentions to get BRCA testing, testing site preference and beliefs about DTC BRCA testing. Results Sample characteristics: mean age 39 (range 18–70), 82% white, mean education 3 yrs. college. Women exposed to risk information had lower intentions to get BRCA testing than women in the CC (adjusted odds ratio (OR) 0.48; 95% confidence interval (CI) 0.26–0.87, p=0.016), and less positive beliefs about online BRCA testing (adjusted OR 0.48; 95% CI 0.27–0.86, p=0.014). Women in the ES condition were more likely to prefer clinic based testing than women in the CC (adjusted OR 2.05; 95% CI 1.07–3.90, p=0.030). Conclusion Exposing women to information on the potential risks of online BRCA testing altered their intentions, beliefs and preferences for BRCA testing. Policy makers may want to consider content and framing of risk information on DTC websites as they formulate regulation for this rapidly growing industry. PMID:19318436

  5. A method for the probabilistic design assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    A formal procedure for the probabilistic design assessment of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the assessment. The probabilistic assessment consists of design criteria, modeling of composite structures and uncertainties, simulation methods, and the decision making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically assessed with accuracy and efficiency.
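
    The kind of simulation method such a probabilistic assessment relies on can be sketched with a minimal Monte Carlo example. The distributions and numbers below are illustrative assumptions, not the paper's composite model: uncertain material strength and service stress are sampled, and the probability that stress exceeds strength is estimated.

```python
# Minimal Monte Carlo probabilistic assessment sketch (toy numbers).
import random
random.seed(42)

N = 100_000
failures = 0
for _ in range(N):
    strength = random.gauss(600.0, 40.0)      # MPa, uncertain material strength
    load_stress = random.gauss(450.0, 50.0)   # MPa, uncertain service stress
    if load_stress > strength:
        failures += 1

p_fail = failures / N
print(f"estimated failure probability: {p_fail:.4f}")
```

    A real composite assessment would propagate many more uncertain inputs (constituent properties, fabrication variables, geometry, environment) through a structural model rather than comparing two scalars, but the estimator is the same.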

  6. Structure design: an artificial intelligence-based method for the design of molecules under geometrical constraints.

    PubMed

    Cohen, A A; Shatzmiller, S E

    1993-09-01

    This study presents an algorithm that implements artificial-intelligence techniques for automated, site-directed drug design. The aim of the method is to link two or more predetermined functional groups into a sensible molecular structure. The proposed design process mimics the classical manual design method, in which the drug designer sits in front of the computer screen and with the aid of computer graphics attempts to design the new drug. Therefore, the key principle of the algorithm is the parameterization of some criteria that affect the decision-making process carried out by the drug designer. This parameterization is based on the generation of weighting factors that reflect the knowledge and knowledge-based intuition of the drug designer, and thus add further rationalization to the drug design process. The proposed algorithm has been shown to yield a large variety of different structures, of which the drug designer may choose the most sensible. Performance tests indicate that with the proper set of parameters, the method generates a new structure within a short time. PMID:8110662

  7. Inverse design of airfoils using a flexible membrane method

    NASA Astrophysics Data System (ADS)

    Thinsurat, Kamon

    The Modified Garabedian Mc-Fadden (MGM) method is used to inversely design airfoils. The Finite Difference Method (FDM) for non-uniform grids was developed to discretize the MGM equation for numerical solution. This discretization has the advantage of being used flexibly with unstructured airfoil grids. The commercial software FLUENT is used as the flow solver. Several conditions are set in FLUENT, such as subsonic inviscid flow, subsonic viscous flow, transonic inviscid flow, and transonic viscous flow, to test the inverse design code for each condition. A moving grid program is used to create a mesh for new airfoils prior to importing meshes into FLUENT for the analysis of flows. For validation, an iterative process is used so the Cp distribution of the initial airfoil, the NACA0011, achieves the Cp distribution of the target airfoil, the NACA2315, for the subsonic inviscid case at M=0.2. Three other cases were carried out to validate the code. After the code validations, the inverse design method was used to design a shock free airfoil in the transonic condition and to design a separation free airfoil at a high angle of attack in the subsonic condition.

  8. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  9. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the area of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant has had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on the comparison and contrasting between three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regards to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and

  10. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-17

    ... COMMISSION An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes... Assessment in Risk- Informed Decisions on Plant-Specific Changes to the Licensing Basis,'' (proposed Revision... Assessment Results for Risk-Informed Activities'' and the references were updated. It is the intent of...

  11. A PDE Sensitivity Equation Method for Optimal Aerodynamic Design

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1996-01-01

    The use of gradient based optimization algorithms in inverse design is well established as a practical approach to aerodynamic design. A typical procedure uses a simulation scheme to evaluate the objective function (from the approximate states) and its gradient, then passes this information to an optimization algorithm. Once the simulation scheme (CFD flow solver) has been selected and used to provide approximate function evaluations, there are several possible approaches to the problem of computing gradients. One popular method is to differentiate the simulation scheme and compute design sensitivities that are then used to obtain gradients. Although this black-box approach has many advantages in shape optimization problems, one must compute mesh sensitivities in order to compute the design sensitivity. In this paper, we present an alternative approach using the PDE sensitivity equation to develop algorithms for computing gradients. This approach has the advantage that mesh sensitivities need not be computed. Moreover, when it is possible to use the CFD scheme for both the forward problem and the sensitivity equation, then there are computational advantages. An apparent disadvantage of this approach is that it does not always produce consistent derivatives. However, for a proper combination of discretization schemes, one can show asymptotic consistency under mesh refinement, which is often sufficient to guarantee convergence of the optimal design algorithm. In particular, we show that when asymptotically consistent schemes are combined with a trust-region optimization algorithm, the resulting optimal design method converges. We denote this approach as the sensitivity equation method. The sensitivity equation method is presented, convergence results are given and the approach is illustrated on two optimal design problems involving shocks.
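    The consistency point above can be made concrete on a toy problem. The sketch below is an assumed ODE stand-in (not the paper's PDE/CFD setting): it integrates y' = -a*y together with its sensitivity equation s' = -a*s - y using the same forward-Euler scheme, so the computed sensitivity agrees with a finite-difference gradient of the discrete solution.

    ```python
    import numpy as np

    def solve_with_sensitivity(a, t_end=1.0, n=1000):
        """Integrate y' = -a*y, y(0) = 1, together with its sensitivity
        s = dy/da, which satisfies s' = -a*s - y, s(0) = 0 (forward Euler)."""
        dt = t_end / n
        y, s = 1.0, 0.0
        for _ in range(n):
            # Both updates use the old (y, s), matching the discrete scheme
            y, s = y + dt * (-a * y), s + dt * (-a * s - y)
        return y, s

    a = 2.0
    y, s = solve_with_sensitivity(a)

    # Finite-difference check: discretizing the sensitivity equation with the
    # same scheme as the state yields the exact derivative of the discrete solution
    eps = 1e-6
    y_eps, _ = solve_with_sensitivity(a + eps)
    fd = (y_eps - y) / eps
    print(y, s, fd)
    ```

    Because the sensitivity equation is discretized consistently with the state equation, the gradient matches the discrete solution's derivative, which mirrors the asymptotic-consistency argument sketched in the abstract.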

  12. Taguchi method of experimental design in materials education

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
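    As a minimal classroom-style illustration (the array assignment and responses below are made up, not from the paper), the snippet uses the L4 orthogonal array to estimate main effects of three two-level factors from only four trials:

    ```python
    import numpy as np

    # L4 orthogonal array: 4 trials cover 3 two-level factors (levels coded 0/1),
    # versus 2**3 = 8 trials for a full factorial
    L4 = np.array([[0, 0, 0],
                   [0, 1, 1],
                   [1, 0, 1],
                   [1, 1, 0]])

    # Hypothetical measured responses for the four trials
    y = np.array([20.0, 24.0, 30.0, 34.0])

    # Main effect of each factor: mean response at level 1 minus mean at level 0
    for j in range(3):
        effect = y[L4[:, j] == 1].mean() - y[L4[:, j] == 0].mean()
        print(f"factor {j}: effect = {effect:+.1f}")
    ```

    With these hypothetical responses the estimated main effects are +10, +4, and 0 for factors 0, 1, and 2. The price of the fractional design is that factor interactions are aliased into these estimates, which is exactly the main disadvantage noted in the abstract.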

  13. Function combined method for design innovation of children's bike

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoli; Qiu, Tingting; Chen, Huijuan

    2013-03-01

    As children mature, bike products for children in the market develop at the same time, and the conditions are frequently updated. Certain problems occur when using a bike, such as cycle overlapping, repeated functions, and short life cycles, which run counter to the principles of energy conservation and the environmentally protective, intensive design concept. In this paper, a rational multi-function method of design through functional superposition, transformation, and technical implementation is proposed. An organic combination of a frog-style scooter and a children's tricycle is developed using the multi-function method. From the ergonomic perspective, the paper elaborates on the body size of children aged 5 to 12 and effectively extracts data for a multi-function children's bike, which can be used for gliding and riding. By inverting the body, parts can be interchanged between the handles and the pedals of the bike. Finally, the paper provides a detailed analysis of the components and structural design, body material, and processing technology of the bike. The study of Industrial Product Innovation Design provides an effective design method that solves the bicycle problems, extends the functions, improves the product market situation, and enhances the energy-saving feature while implementing intensive product development effectively at the same time.

  14. Novel kind of DSP design method based on IP core

    NASA Astrophysics Data System (ADS)

    Yu, Qiaoyan; Liu, Peng; Wang, Weidong; Hong, Xiang; Chen, Jicheng; Yuan, Jianzhong; Chen, Keming

    2004-04-01

    Under pressure from design-productivity demands and various special applications, the original design method for DSPs can no longer keep up with the required speed. A novel design method is needed urgently. Intellectual Property (IP) reuse is a trend in DSP design, but simple plug-and-play approaches to IP cores almost never work. Therefore, appropriate control strategies are needed to connect all the IP cores used and coordinate the whole DSP. This paper presents a new DSP design procedure, which refers to System-on-a-Chip practice, and then introduces a novel control strategy named DWC to implement the DSP based on IP cores. The most important part of this novel control strategy, the pipeline control unit (PCU), is described in detail. Because a great number of data hazards occur in most computation-intensive scientific applications, a new, effective algorithm for checking data hazards is employed in the PCU. Following this strategy, the design of a general- or special-purpose DSP can be finished in a shorter time, and the DSP has the potential to improve performance with little modification of the basic function units. This DWC strategy has been implemented successfully in a 16-bit fixed-point DSP.
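    The data-hazard check mentioned above can be sketched generically. The paper's actual PCU algorithm is not reproduced here, so the following is only an assumed read-after-write (RAW) scan over a toy instruction list:

    ```python
    # Each instruction: (destination_register, [source_registers]) -- a toy
    # representation, not the paper's instruction encoding
    program = [
        ("r1", ["r2", "r3"]),   # r1 <- r2 op r3
        ("r4", ["r1", "r5"]),   # reads r1 right after it is written: RAW hazard
        ("r6", ["r2", "r7"]),
    ]

    def raw_hazards(instrs, window=1):
        """Report read-after-write hazards within `window` following instructions,
        i.e. cases where a pipeline without forwarding would have to stall."""
        hazards = []
        for i, (dest, _) in enumerate(instrs):
            for j in range(i + 1, min(i + 1 + window, len(instrs))):
                if dest in instrs[j][1]:
                    hazards.append((i, j, dest))
        return hazards

    print(raw_hazards(program))   # [(0, 1, 'r1')]
    ```

    A real PCU would resolve such hazards with stalls or forwarding paths; the scan above only detects them.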

  15. System Synthesis in Preliminary Aircraft Design Using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and early preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically Design of Experiments (DOE) and Response Surface Methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an Overall Evaluation Criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a High Speed Civil Transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
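    A minimal sketch of the response-surface step, assuming a generic second-order polynomial in two design variables and synthetic noiseless data (not the paper's aerodynamic analyses):

    ```python
    import numpy as np

    # Hypothetical DOE sample: two design variables x1, x2 and a noiseless response
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(15, 2))
    y = 3 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] + X[:, 0] ** 2

    # Second-order response surface:
    # y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    def features(X):
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

    beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    print(np.round(beta, 3))   # recovers [3, 2, -1, 0.5, 1, 0]
    ```

    Because the synthetic response lies exactly in the model class, least squares recovers the coefficients; with real analysis data the fit quality would have to be checked before the polynomial is used for system-level optimization.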

  16. System Synthesis in Preliminary Aircraft Design using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).

  17. A Simple Method for High-Lift Propeller Conceptual Design

    NASA Technical Reports Server (NTRS)

    Patterson, Michael; Borer, Nick; German, Brian

    2016-01-01

    In this paper, we present a simple method for designing propellers that are placed upstream of the leading edge of a wing in order to augment lift. Because the primary purpose of these "high-lift propellers" is to increase lift rather than produce thrust, these props are best viewed as a form of high-lift device; consequently, they should be designed differently than traditional propellers. We present a theory that describes how these props can be designed to provide a relatively uniform axial velocity increase, which is hypothesized to be advantageous for lift augmentation based on a literature survey. Computational modeling indicates that such propellers can generate the same average induced axial velocity while consuming less power and producing less thrust than conventional propeller designs. For an example problem based on specifications for NASA's Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) flight demonstrator, a propeller designed with the new method requires approximately 15% less power and produces approximately 11% less thrust than one designed for minimum induced loss. Higher-order modeling and/or wind tunnel testing are needed to verify the predicted performance.
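    For context, the relation between thrust and average induced axial velocity can be estimated from classical actuator-disc momentum theory. This is a standard textbook baseline, not the paper's design method, and every numerical value below is an assumption:

    ```python
    import math

    # Actuator-disc momentum theory: thrust T = 2 * rho * A * vi * (V + vi);
    # solve the quadratic for the induced axial velocity vi at the disc
    rho, V, R, T = 1.225, 30.0, 0.5, 400.0   # air density, airspeed, radius, thrust (SI)
    A = math.pi * R ** 2                     # disc area
    vi = (-V + math.sqrt(V ** 2 + 2 * T / (rho * A))) / 2
    print(round(vi, 2))                      # induced axial velocity, m/s
    ```

    The abstract's point is that two propellers with the same average induced velocity can differ in thrust and power, so the average alone does not fix the design.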

  18. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.

  19. New Methods and Transducer Designs for Ultrasonic Diagnostics and Therapy

    NASA Astrophysics Data System (ADS)

    Rybyanets, A. N.; Naumenko, A. A.; Sapozhnikov, O. A.; Khokhlova, V. A.

    Recent advances in the fields of physical acoustics, imaging technologies, piezoelectric materials, and ultrasonic transducer design have led to the emergence of novel methods and apparatus for ultrasonic diagnostics, therapy, and body aesthetics. The paper presents the results of development and experimental study of different high intensity focused ultrasound (HIFU) transducers. Technological peculiarities of HIFU transducer design as well as theoretical and numerical models of such transducers and the corresponding HIFU fields are discussed. Several HIFU transducers of different designs have been fabricated using different advanced piezoelectric materials. Acoustic field measurements for those transducers have been performed using a calibrated fiber optic hydrophone and an ultrasonic measurement system (UMS). The results of ex vivo experiments with different tissues as well as in vivo experiments with blood vessels are presented that prove the efficacy, safety and selectivity of the developed HIFU transducers and methods.

  20. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  1. Using Propensity Score Methods to Approximate Factorial Experimental Designs

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2011-01-01

    The purpose of this study is through Monte Carlo simulation to compare several propensity score methods in approximating factorial experimental design and identify best approaches in reducing bias and mean square error of parameter estimates of the main and interaction effects of two factors. Previous studies focused more on unbiased estimates of…
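    As a hedged sketch of the propensity-score machinery itself (not the study's Monte Carlo design for approximating factorial experiments), the snippet below fits a logistic propensity model by plain gradient ascent and performs 1:1 nearest-neighbour matching; all data and settings are synthetic assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    x = rng.normal(size=n)                        # a single observed confounder
    t = (rng.uniform(size=n) < 1 / (1 + np.exp(-x))).astype(float)  # treatment

    # Fit a logistic propensity model P(t = 1 | x) by plain gradient ascent
    w, b = 0.0, 0.0
    for _ in range(2000):
        z = 1 / (1 + np.exp(-(w * x + b)))
        w += 0.1 * np.mean((t - z) * x)
        b += 0.1 * np.mean(t - z)
    ps = 1 / (1 + np.exp(-(w * x + b)))           # estimated propensity scores

    # 1:1 nearest-neighbour matching of each treated unit to a control unit
    treated = np.where(t == 1)[0]
    control = np.where(t == 0)[0]
    matches = [control[np.argmin(np.abs(ps[control] - ps[i]))] for i in treated]
    print(len(treated), len(matches))
    ```

    Matching is only one of the propensity-score methods the study compares; weighting and stratification variants share the same fitted scores.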

  2. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Designation of noise description...

  3. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description...

  4. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Designation of noise description...

  5. Analytical methods of electrode design for a relativistic electron gun

    SciTech Connect

    Caporaso, G.J.; Cole, A.G.; Boyd, J.K.

    1985-05-09

    The standard paraxial ray equation method for the design of electrodes for an electrostatically focused gun is extended to include relativistic effects and the effects of the beam's azimuthal magnetic field. Solutions for parallel and converging beams are obtained and the predicted currents are compared against those measured on the High Brightness Test Stand. 4 refs., 2 figs.

  6. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and offers a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  7. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77 De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68 Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
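    A minimal sketch of Fisher-information-based sampling-time selection (here the classical D-optimal criterion rather than the paper's SE-optimal one), using the Verhulst-Pearl logistic model mentioned above. The parameter values, the unit-Gaussian-noise assumption, and the finite-difference sensitivities are all illustrative assumptions:

    ```python
    import numpy as np
    from itertools import combinations

    def logistic(t, K, r, y0=1.0):
        """Verhulst-Pearl logistic growth curve."""
        return K / (1 + (K / y0 - 1) * np.exp(-r * t))

    def fisher_det(times, K=10.0, r=0.8, eps=1e-6):
        """det of the Fisher information for (K, r) with unit Gaussian noise;
        sensitivities approximated by central finite differences."""
        t = np.asarray(times, dtype=float)
        J = np.column_stack([
            (logistic(t, K + eps, r) - logistic(t, K - eps, r)) / (2 * eps),
            (logistic(t, K, r + eps) - logistic(t, K, r - eps)) / (2 * eps),
        ])
        return np.linalg.det(J.T @ J)

    # Exhaustive D-optimal choice of two sampling times from a candidate grid
    candidates = np.linspace(0.5, 10.0, 20)
    best = max(combinations(candidates, 2), key=fisher_det)
    print(best)
    ```

    The search typically places one time in the steep growth phase (informative about r) and one near saturation (informative about K); the paper's framework generalizes this from finite time sets to distributions of sampling times.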

  8. Polypharmacology: in silico methods of ligand design and development.

    PubMed

    McKie, Samuel A

    2016-04-01

    How to design a ligand to bind multiple targets, rather than a single target, is the focus of this review. Rational polypharmacology draws on knowledge that is both broad ranging and hierarchical. Computer-aided multitarget ligand design methods are described according to their nested knowledge level. Ligand-only and then receptor-ligand strategies are described first, followed by the metabolic network viewpoint. Subsequently, strategies that view infectious diseases as multigenomic targets are discussed, and finally the disease-level interpretation of medicinal therapy is considered. As yet there is no consensus on how best to proceed in designing a multitarget ligand. The current methodologies are brought together in an attempt to give a practical overview of how polypharmacology design might best be initiated. PMID:27105127

  9. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures; and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  10. Computational methods for aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Peeters, M. F.

    1983-01-01

    Five methods to increase the computational efficiency of aerodynamic design using numerical optimization, by reducing the computer time required to perform gradient calculations, are examined. The most promising method consists of drastically reducing the size of the computational domain on which aerodynamic calculations are made during gradient calculations. Since a gradient calculation requires the solution of the flow about an airfoil whose geometry was slightly perturbed from a base airfoil, the flow about the base airfoil is used to determine boundary conditions on the reduced computational domain. This method worked well in subcritical flow.

  11. A robust inverse inviscid method for airfoil design

    NASA Astrophysics Data System (ADS)

    Chaviaropoulos, P.; Dedoussis, V.; Papailiou, K. D.

    An irrotational inviscid compressible inverse design method for two-dimensional airfoil profiles is described. The method is based on the potential streamfunction formulation, where the physical space on which the boundaries of the airfoil are sought is mapped onto the (phi, psi) space via a body-fitted coordinate transformation. A novel procedure based on differential geometry arguments is employed to derive the governing equations for the inverse problem, by requiring the curvature of the flat 2-D Euclidean space to be zero. An auxiliary coordinate transformation permits the definition of C-type computational grids on the (phi, psi) plane, resulting in a more accurate description of the leading edge region. Geometry is determined by integrating the Frenet equations along the grid lines. To validate the method, inverse calculation results are compared with direct (`reproduction') calculation results. The design procedure for a new airfoil shape is also presented.

  12. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  13. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and timely. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis, and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, optimal solutions for multiple instances were found efficiently in preliminary results.
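    The objective that the tuned GA optimizes can be sketched directly; the three-job instance below is made up for illustration and is small enough to solve by enumeration:

    ```python
    from itertools import permutations

    def total_weighted_tardiness(sequence, proc, due, weight):
        """Single-machine objective: sum of w_j * max(0, C_j - d_j)."""
        t = total = 0
        for j in sequence:
            t += proc[j]                      # completion time C_j
            total += weight[j] * max(0, t - due[j])
        return total

    # Three hypothetical jobs: processing times, due dates, weights
    proc, due, weight = [3, 2, 4], [2, 5, 6], [2, 1, 3]

    print(total_weighted_tardiness([0, 1, 2], proc, due, weight))   # -> 11

    # Tiny instances can be solved exactly by enumeration -- the yardstick
    # against which a tuned GA would be judged
    best = min(permutations(range(3)),
               key=lambda s: total_weighted_tardiness(s, proc, due, weight))
    print(best, total_weighted_tardiness(best, proc, due, weight))  # -> (0, 2, 1) 9
    ```

    On realistic benchmark sizes enumeration is hopeless, which is why the GA, and hence the DOE-based tuning of its parameters, matters.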

  14. Behavioral response to contamination risk information in a spatially explicit groundwater environment: Experimental evidence

    NASA Astrophysics Data System (ADS)

    Li, Jingyuan; Michael, Holly A.; Duke, Joshua M.; Messer, Kent D.; Suter, Jordan F.

    2014-08-01

    This paper assesses the effectiveness of aquifer monitoring information in achieving more sustainable use of a groundwater resource in the absence of management policy. Groundwater user behavior in the face of an irreversible contamination threat is studied by applying methods of experimental economics to scenarios that combine a physics-based, spatially explicit, numerical groundwater model with different representations of information about an aquifer and its risk of contamination. The results suggest that the threat of catastrophic contamination affects pumping decisions: pumping is significantly reduced in experiments where contamination is possible compared to those where pumping cost is the only factor discouraging groundwater use. The level of information about the state of the aquifer also affects extraction behavior. Pumping rates differ when information that synthesizes data on aquifer conditions (a "risk gauge") is provided, despite invariant underlying economic incentives, and this result does not depend on whether the risk information is location-specific or from a whole aquifer perspective. Interestingly, users increase pumping when the risk gauge signals good aquifer status compared to a no-gauge treatment. When the gauge suggests impending contamination, however, pumping declines significantly, resulting in a lower probability of contamination. The study suggests that providing relatively simple aquifer condition guidance derived from monitoring data can lead to more sustainable use of groundwater resources.

  15. Non-Contact Electromagnetic Exciter Design with Linear Control Method

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Xiong, Xianzhi; Xu, Hua

    2016-04-01

    A non-contact type force actuator is necessary for studying the dynamic performance of a high-speed spindle system owing to its high-speed operating conditions. A non-contact electromagnetic exciter is designed for identifying the dynamic coefficients of journal bearings in high-speed grinding spindles. A linear force control method is developed based on a PID controller. The influence of amplitude and frequency of current, misalignment and rotational speed on the magnetic field and excitation force is investigated based on two-dimensional finite element analysis. The electromagnetic excitation force is measured with the auxiliary coils and calibrated by load cells. The design is validated by the experimental results. Theoretical and experimental investigations show that the proposed design can accurately generate linear excitation force with sufficiently large amplitude and a higher signal-to-noise ratio. Moreover, the fluctuations in force amplitude are greatly reduced with the designed linear control method even when the air gap changes due to rotor vibration at high-speed conditions. In addition, it is possible to apply various types of excitation: constant, synchronous, and non-synchronous excitation forces based on the proposed linear control method. This exciter can be used as a linear-force excitation and control system for dynamic performance studies of different high-speed rotor-bearing systems.
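    A minimal sketch of a PID-based force control loop, with assumed gains and an assumed first-order lag model of the exciter's force response (the paper's actual plant model and gains are not given here):

    ```python
    class PID:
        """Discrete PID force controller; gains and plant below are assumptions."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint, measured):
            err = setpoint - measured
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    # Drive a first-order lag model of the excitation force toward a 10 N setpoint
    dt = 1e-3
    pid = PID(kp=2.0, ki=5.0, kd=0.01, dt=dt)
    force = 0.0
    for _ in range(10_000):                 # 10 s of simulated time
        u = pid.update(10.0, force)
        force += dt * (u - force)           # lag between command and output force
    print(round(force, 2))                  # settles at the 10 N setpoint
    ```

    The integral term is what drives the steady-state force error to zero; in the actual exciter the loop would additionally compensate for air-gap variation caused by rotor vibration.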

  16. Risk informed resource allocation policy: safety can save costs.

    PubMed

    Pasman, H J

    2000-01-01

    During economic doldrums, decision making on investments for safety is even more difficult than it already is when funds are abundant. This paper attempts to offer some guidance. After stating the present challenge to prevention of losses in the process industries, the systematic approach of quantified risk assessment is briefly reviewed and improvements in the methodology are mentioned. In addition, attention is given to the use of a risk matrix to survey a plant and to derive a plan of action. Subsequently, the reduction of risk is reviewed. Measures for prevention, protection, and mitigation are discussed. The organization of safety has become at least as important as technical safety of equipment and standards. It is reflected in the introduction of a safety management system. Furthermore, the design process in a pro-active approach is described and the concept of inherent safety is briefly addressed. The concept of Layer of Protection Analysis is explained and also the reason why it is relevant to provide a cost-benefit analysis. Finally, after comments regarding the cost of accidents, the basics of costing and profitability are summarized and a way is suggested to apply this approach to risk-reducing measures. An example is provided on how a selection can be made from a number of alternatives. PMID:10677670
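    The risk-matrix survey mentioned above can be sketched as a simple classification; the 3x3 bands below are hypothetical, not the paper's calibration:

    ```python
    # Hypothetical 3x3 risk matrix: likelihood and severity each rated 1-3;
    # the action bands are illustrative only
    def risk_class(likelihood, severity):
        score = likelihood * severity          # 1..9
        if score <= 2:
            return "accept"
        if score <= 4:
            return "monitor"
        return "reduce"

    for like in range(1, 4):
        print(like, [risk_class(like, sev) for sev in range(1, 4)])
    ```

    In the paper's framing, the cells landing in the "reduce" band are the ones fed into the cost-benefit comparison of prevention, protection, and mitigation measures.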

  17. Material Design, Selection, and Manufacturing Methods for System Sustainment

    SciTech Connect

    David Sowder, Jim Lula, Curtis Marshall

    2010-02-18

    This paper describes a material selection and validation process proven to be successful for manufacturing high-reliability, long-life products. The National Secure Manufacturing Center business unit of the Kansas City Plant (herein called KCP) designs and manufactures complex electrical and mechanical components used in extreme environments. The material manufacturing heritage is founded in the systems design-to-manufacturing practices that support the U.S. Department of Energy’s National Nuclear Security Administration (DOE/NNSA). Material engineers at KCP work with the systems designers to recommend materials, develop test methods, perform analytical analysis of test data, define cradle-to-grave needs, and present final selection and fielding. The KCP material engineers typically maintain cost control by utilizing commercial products when possible, but have the resources to develop and produce unique formulations as necessary. This approach is currently being used to mature technologies to manufacture materials with improved characteristics using nano-composite filler materials that will enhance system design and production. For some products the engineers plan and carry out science-based life-cycle material surveillance processes. Recent examples of the approach include refurbished manufacturing of the high voltage power supplies for cockpit displays in operational aircraft; dry film lubricant application to improve bearing life for guided munitions gyroscope gimbals; ceramic substrate design for electrical circuit manufacturing; and tailored polymeric materials for various systems. The following examples show evidence of KCP concurrent design-to-manufacturing techniques used to achieve system solutions that satisfy or exceed demanding requirements.

  18. Denoising Sparse Images from GRAPPA using the Nullspace Method (DESIGN)

    PubMed Central

    Weller, Daniel S.; Polimeni, Jonathan R.; Grady, Leo; Wald, Lawrence L.; Adalsteinsson, Elfar; Goyal, Vivek K

    2011-01-01

    To accelerate magnetic resonance imaging using uniformly undersampled (nonrandom) parallel imaging beyond what is achievable with GRAPPA alone, the Denoising of Sparse Images from GRAPPA using the Nullspace method (DESIGN) is developed. The trade-off between denoising and smoothing the GRAPPA solution is studied for different levels of acceleration. Several brain images reconstructed from uniformly undersampled k-space data using DESIGN are compared against reconstructions using existing methods in terms of difference images (a qualitative measure), PSNR, and noise amplification (g-factors) as measured using the pseudo-multiple replica method. Effects of smoothing, including contrast loss, are studied in synthetic phantom data. In the experiments presented, the contrast loss and spatial resolution are competitive with existing methods. Results for several brain images demonstrate significant improvements over GRAPPA at high acceleration factors in denoising performance with limited blurring or smoothing artifacts. In addition, the measured g-factors suggest that DESIGN mitigates noise amplification better than both GRAPPA and L1 SPIR-iT (the latter limited here by uniform undersampling). PMID:22213069

  19. Online Guidance Law of Missile Using Multiple Design Point Method

    NASA Astrophysics Data System (ADS)

    Yamaoka, Seiji; Ueno, Seiya

    This paper deals with the design procedure of an online guidance law for future missiles that are required to have agile maneuverability. For this purpose, the authors propose mounting high-power side-thrusters on the missile. The guidance law for such missiles is discussed from the viewpoint of optimal control theory. A minimum-time problem is solved for the approximated system, and the necessary conditions of the optimal solution show that bang-bang control is the optimal input. Feedback guidance without iterative calculation is useful for actual systems. The multiple design point method is applied to design the feedback gains and feedforward inputs of the guidance law. Numerical results show the good performance of the proposed guidance law.
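    The bang-bang structure derived from the necessary conditions can be illustrated on a textbook double integrator (a simplifying assumption for illustration; the paper's missile dynamics and multiple-design-point gains are not reproduced here):

```python
def bang_bang(x, v, umax=1.0):
    """Minimum-time bang-bang feedback for a double integrator x'' = u.
    The switching curve is x = -v*|v|/(2*umax)."""
    s = x + v * abs(v) / (2.0 * umax)
    if s > 0:
        return -umax
    if s < 0:
        return umax
    # On the switching curve: brake toward the origin.
    return -umax if v > 0 else (umax if v < 0 else 0.0)

def simulate(x, v, umax=1.0, dt=1e-3, t_max=20.0):
    """Euler-integrate the closed loop until the state nears the origin."""
    t = 0.0
    while (x * x + v * v) > 1e-4 and t < t_max:
        u = bang_bang(x, v, umax)
        x += v * dt
        v += u * dt
        t += dt
    return x, v, t
```

    The control saturates at one bound, switches once on the switching curve, and drives the state to the origin in minimum time (about 2 s from rest at x = 1 with umax = 1).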

  20. Smokers' sources of e-cigarette awareness and risk information

    PubMed Central

    Wackowski, Olivia A.; Bover Manderski, Michelle T.; Delnevo, Cristine D.

    2015-01-01

    Introduction Few studies have explored sources of e-cigarette awareness and people's e-cigarette information needs, interests, or behaviors. This study contributes to both domains of e-cigarette research. Methods Results are based on a 2014 e-cigarette-focused survey of 519 current smokers from a nationally representative research panel. Results Smokers most frequently reported seeing e-cigarettes in stores (86.4%) and seeing them used in person (83%). Many (73%) had also heard about e-cigarettes from known users, broadcast media ads (68%), other (print, online) advertisements (71.5%), and/or the news (60.9%); sources of awareness varied by e-cigarette experience. Most smokers (59.9%) believed e-cigarettes are less harmful than regular cigarettes, a belief attributed to “common sense” (76.4%), the news (39.2%), and advertisements (37.2%). However, 79.5% felt e-cigarette safety information was important. Over one-third said they would turn first to a doctor for e-cigarette safety information, although almost a quarter said they would turn first to the Internet or product packaging. Most (59.6%) ranked doctors as the most trustworthy risk source, and 6.8% had asked a health professional about e-cigarettes. Conclusions Future research should explore the content of e-cigarette information sources, their potential impact, and ways they might be strengthened or changed through regulatory and/or educational efforts. PMID:26576338

  1. Risk-informed regulation and safety management of nuclear power plants--on the prevention of severe accidents.

    PubMed

    Himanen, Risto; Julin, Ari; Jänkälä, Kalle; Holmberg, Jan-Erik; Virolainen, Reino

    2012-11-01

    There are four operating nuclear power plant (NPP) units in Finland. The Teollisuuden Voima (TVO) power company has two 840 MWe BWR units supplied by Asea-Atom at the Olkiluoto site. The Fortum corporation (formerly IVO) has two 500 MWe VVER 440/213 units at the Loviisa site. In addition, a 1600 MWe European Pressurized Water Reactor supplied by AREVA NP (formerly the Framatome ANP--Siemens AG Consortium) is under construction at the Olkiluoto site. Recently, the Finnish Parliament ratified the government Decision in Principle that the utilities' applications to build two new NPP units are in line with the overall good of society. The Finnish utilities, the Fenno power company and TVO, are in the process of qualifying the reactor types for the new builds. In Finland, risk-informed applications are formally integrated into the regulatory process of NPPs beginning in the early design phase and continuing through the construction and operation phases over the entire plant service time. A plant-specific full-scope probabilistic risk assessment (PRA) is required for each NPP. PRAs shall cover internal events, area events (fires, floods), and external events such as harsh weather conditions and seismic events in all operating modes. Special attention is devoted to the use of various risk-informed PRA applications in the licensing of the Olkiluoto 3 NPP. PMID:23035957

  2. COMPSIZE - PRELIMINARY DESIGN METHOD FOR FIBER REINFORCED COMPOSITE STRUCTURES

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1994-01-01

    The Composite Structure Preliminary Sizing program, COMPSIZE, is an analytical tool which structural designers can use when doing approximate stress analysis to select or verify preliminary sizing choices for composite structural members. It is useful in the beginning stages of design concept definition, when it is helpful to have quick and convenient approximate stress analysis tools available so that a wide variety of structural configurations can be sketched out and checked for feasibility. At this stage of the design process the stress/strain analysis does not need to be particularly accurate because any configurations tentatively defined as feasible will later be analyzed in detail by stress analysis specialists. The emphasis is on fast, user-friendly methods so that rough but technically sound evaluation of a broad variety of conceptual designs can be accomplished. Analysis equations used are, in most cases, widely known basic structural analysis methods. All the equations used in this program assume elastic deformation only. The default material selection is intermediate strength graphite/epoxy laid up in a quasi-isotropic laminate. A general flat laminate analysis subroutine is included for analyzing arbitrary laminates. However, COMPSIZE should be sufficient for most users to presume a quasi-isotropic layup and use the familiar basic structural analysis methods for isotropic materials, after estimating an appropriate elastic modulus. Homogeneous materials can be analyzed as simplified cases. The COMPSIZE program is written in IBM BASICA. The program format is interactive. It was designed on an IBM Personal Computer operating under DOS with a central memory requirement of approximately 128K. It has been implemented on an IBM compatible with GW-BASIC under DOS 3.2. COMPSIZE was developed in 1985.

  3. Numerical design method for thermally loaded plate-cylinder intersections

    SciTech Connect

    Baldur, R.; Laberge, C.A.; Lapointe, D.

    1988-11-01

    This paper is an extension of the authors' previous work on stresses in corner radii. Whereas the original study concerned itself with pressure effects only and the second reference gave the initial version of the work dealing with thermal effects, this report gives more recent results concerning specifically thermal loads. As before, the results are limited to inside corner radii between cylinders and flat head closures. Similarly, the analysis is based on a systematic series of finite element calculations with the significant parameters covering the field of useful design boundaries. The results are condensed into a rapid method for determining the peak stresses needed to perform fatigue analysis of pressure vessels subjected to a significant, variable thermal load. The paper takes into account the influence of the film coefficient, temporal temperature variations, and material properties. A set of coefficients provides a convenient method of stress evaluation suitable for design purposes.

  4. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

    Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain of the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors combined with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
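    The link between eigenvector orthogonality and eigenvalue sensitivity can be made concrete with per-eigenvalue condition numbers, 1/|y_i^H x_i| for unit left and right eigenvectors: values near 1 mean nearly orthogonal eigenvectors and insensitive eigenvalues. A minimal sketch (an illustration, not the report's programs):

```python
import numpy as np

def eig_condition_numbers(A):
    """Per-eigenvalue condition numbers 1/|y_i^H x_i| using unit right (x_i)
    and left (y_i) eigenvectors of A; large values flag sensitive eigenvalues."""
    w, X = np.linalg.eig(A)
    Y = np.linalg.inv(X).conj().T  # columns are (unnormalized) left eigenvectors
    conds = []
    for i in range(len(w)):
        x = X[:, i] / np.linalg.norm(X[:, i])
        y = Y[:, i] / np.linalg.norm(Y[:, i])
        conds.append(1.0 / abs(np.vdot(y, x)))
    return w, np.array(conds)
```

    A symmetric matrix has orthogonal eigenvectors and condition numbers of 1, while a nearly defective matrix (nearly parallel eigenvectors) has very large ones, which is exactly why the robust design method pushes the closed-loop eigenvectors apart.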

  5. National Tuberculosis Genotyping and Surveillance Network: Design and Methods

    PubMed Central

    Braden, Christopher R.; Schable, Barbara A.; Onorato, Ida M.

    2002-01-01

    The National Tuberculosis Genotyping and Surveillance Network was established in 1996 to perform a 5-year, prospective study of the usefulness of genotyping Mycobacterium tuberculosis isolates to tuberculosis control programs. Seven sentinel sites identified all new cases of tuberculosis, collected information on patients and contacts, and obtained patient isolates. Seven genotyping laboratories performed DNA fingerprinting analysis by the international standard IS6110 method. BioImage Whole Band Analyzer software was used to analyze patterns, and distinct patterns were assigned unique designations. Isolates with six or fewer bands on IS6110 patterns were also spoligotyped. Patient data and genotyping designations were entered in a relational database and merged with selected variables from the national surveillance database. In two related databases, we compiled the results of routine contact investigations and the results of investigations of the relationships of patients who had isolates with matching genotypes. We describe the methods used in the study. PMID:12453342

  6. Optical design and active optics methods in astronomy

    NASA Astrophysics Data System (ADS)

    Lemaitre, Gerard R.

    2013-03-01

    Optical designs for astronomy involve implementation of active optics and adaptive optics from the X-ray to the infrared. Developments and results of active optics methods for telescopes, spectrographs and coronagraph planet finders are presented. The high accuracy and remarkable smoothness of surfaces generated by active optics methods also allow elaborating new optical design types with highly aspheric and/or non-axisymmetric surfaces. Depending on the goal and performance requested for a deformable optical surface, analytical investigations are carried out with one of the various facets of elasticity theory: small-deformation thin plate theory, large-deformation thin plate theory, shallow spherical shell theory, and weakly conical shell theory. The resulting thickness distribution and associated bending force boundaries can be refined further with finite element analysis.

  7. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adapted into code language.

  8. Helicopter flight-control design using an H(2) method

    NASA Technical Reports Server (NTRS)

    Takahashi, Marc D.

    1991-01-01

    Rate-command and attitude-command flight-control designs for a UH-60 helicopter in hover are presented and were synthesized using an H(2) method. Using weight functions, this method allows the direct shaping of the singular values of the sensitivity, complementary sensitivity, and control input transfer-function matrices to give acceptable feedback properties. The designs were implemented on the Vertical Motion Simulator, and four low-speed hover tasks were used to evaluate the control system characteristics. The pilot comments from the accel-decel, bob-up, hovering turn, and side-step tasks indicated good decoupling and quick response characteristics. However, an underlying roll PIO tendency was found to exist away from the hover condition, which was caused by a flap regressing mode with insufficient damping.

  9. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.

  10. Application of an optimization method to high performance propeller designs

    NASA Technical Reports Server (NTRS)

    Li, K. C.; Stefko, G. L.

    1984-01-01

    The application of an optimization method to determine the propeller blade twist distribution which maximizes propeller efficiency is presented. The optimization employs a previously developed method which has been improved to include the effects of blade drag, camber and thickness. Before the optimization portion of the computer code is used, comparisons of calculated propeller efficiencies and power coefficients are made with experimental data for one NACA propeller at Mach numbers in the range of 0.24 to 0.50 and another NACA propeller at a Mach number of 0.71 to validate the propeller aerodynamic analysis portion of the computer code. Then comparisons of calculated propeller efficiencies for the optimized and the original propellers show the benefits of the optimization method in improving propeller performance. This method can be applied to the aerodynamic design of propellers having straight, swept, or nonplanar propeller blades.

  11. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization into several subtask optimizations that may be executed concurrently, plus a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
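    The decomposition idea, concurrent subtask optimizations coordinated by a system-level step, can be sketched with a deliberately simple separable problem (an illustration of the pattern only, not the BLISS algorithm itself; the quadratic subproblems are assumptions):

```python
from concurrent.futures import ThreadPoolExecutor

def subtask(args):
    """Local optimization: min over x of (x - z)^2 + (x - target)^2,
    solved here in closed form. Returns the optimal local variable."""
    z, target = args
    x = 0.5 * (z + target)
    return x

def system_optimize(targets, iters=50):
    """System level coordinates the shared variable z while the subtask
    optimizations run concurrently, then updates z from their results."""
    z = 0.0
    with ThreadPoolExecutor() as pool:
        for _ in range(iters):
            xs = list(pool.map(subtask, [(z, t) for t in targets]))
            z = sum(xs) / len(xs)  # coordination step toward subtask optima
    return z, xs
```

    Each pass, the subtasks optimize their local variables against the current shared value in parallel, and the coordination step moves the shared value toward their consensus; here the iteration converges geometrically to z = mean(targets).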

  12. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate the structural sizing and associated active control system that is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  13. Design Methods for Load-bearing Elements from Crosslaminated Timber

    NASA Astrophysics Data System (ADS)

    Vilguts, A.; Serdjuks, D.; Goremikins, V.

    2015-11-01

    Cross-laminated timber is an environmentally friendly material, which possesses a decreased level of anisotropy in comparison with solid and glued timber. Cross-laminated timber can be used for load-bearing walls and slabs of multi-storey timber buildings as well as decking structures of pedestrian and road bridges. Design methods for cross-laminated timber elements subjected to bending, and to compression with bending, were considered. The presented methods were experimentally validated and verified by FEM. Two cross-laminated timber slabs were tested under static load. Pine wood was chosen as the board material. The design scheme of the considered plates was a simply supported beam with a span of 1.9 m loaded by a uniformly distributed load. The width of the plates was equal to 1 m. The considered cross-laminated timber plates were also analysed by FEM. A comparison of the stresses acting in the edge fibres of the plates and the maximum vertical displacements shows that both considered methods can be used for engineering calculations. The difference between the results obtained experimentally and analytically ranges from 2 to 31%. The difference between results obtained by the effective strength and stiffness method and the transformed sections method was not significant.

  14. Asymmetric MRI magnet design using a hybrid numerical method.

    PubMed

    Zhao, H; Crozier, S; Doddrell, D M

    1999-12-01

    This paper describes a hybrid numerical method for the design of asymmetric magnetic resonance imaging magnet systems. The problem is formulated as a field synthesis and the desired current density on the surface of a cylinder is first calculated by solving a Fredholm equation of the first kind. Nonlinear optimization methods are then invoked to fit practical magnet coils to the desired current density. The field calculations are performed using a semi-analytical method. A new type of asymmetric magnet is proposed in this work. The asymmetric MRI magnet allows the diameter spherical imaging volume to be positioned close to one end of the magnet. The main advantages of making the magnet asymmetric include the potential to reduce the perception of claustrophobia for the patient, better access to the patient by attending physicians, and the potential for reduced peripheral nerve stimulation due to the gradient coil configuration. The results highlight that the method can be used to obtain an asymmetric MRI magnet structure and a very homogeneous magnetic field over the central imaging volume in clinical systems of approximately 1.2 m in length. Unshielded designs are the focus of this work. This method is flexible and may be applied to magnets of other geometries. PMID:10579958
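    The field-synthesis step, solving a first-kind Fredholm equation for a surface current density, is ill-posed and is typically stabilized by regularization. A minimal Tikhonov-regularized sketch on a toy smoothing kernel (the kernel and discretization are assumptions for illustration, not the magnet's field equations):

```python
import numpy as np

def solve_fredholm(K, b, alpha=1e-6):
    """Tikhonov-regularized least-squares solution of the discretized
    first-kind Fredholm equation K @ x = b (ill-posed without regularization)."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ b)

# Toy example: Gaussian smoothing kernel on [0, 1].
n = 60
t = np.linspace(0.0, 1.0, n)
K = np.exp(-30.0 * (t[:, None] - t[None, :]) ** 2) * (t[1] - t[0])
x_true = np.sin(2 * np.pi * t)
b = K @ x_true
x = solve_fredholm(K, b, alpha=1e-8)
```

    The regularization parameter alpha trades fidelity against stability; in the paper, a fitted practical coil layout (the nonlinear optimization stage) would then approximate the recovered current density.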

  15. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities. PMID:21280050

  16. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  17. Design and implementation of visualization methods for the CHANGES Spatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Cristal, Irina; van Westen, Cees; Bakker, Wim; Greiving, Stefan

    2014-05-01

    The CHANGES Spatial Decision Support System (SDSS) is a web-based system for risk assessment and the evaluation of optimal risk reduction alternatives at the local level, serving as a decision support tool in long-term natural risk management. The SDSS uses multidimensional information, integrating thematic, spatial, temporal and documentary data. The role of visualization in this context becomes of vital importance for efficiently representing each dimension. This multidimensional aspect of the risk information required by the system, combined with the diversity of the end-users, imposes the use of sophisticated visualization methods and tools. The key goal of the present work is to exploit the large amount of data efficiently in relation to the needs of the end-user, utilizing proper visualization techniques. Three main tasks have been accomplished for this purpose: categorization of the end-users, definition of the system's modules, and data definition. The graphical representation of the data and the visualization tools were designed to be relevant to the data type and the purpose of the analysis. Depending on the end-user category, each user should have access to different modules of the system and thus to the proper visualization environment. The technologies used for the development of the visualization component combine the latest and most innovative open-source JavaScript frameworks, such as OpenLayers 2.13.1, ExtJS 4 and GeoExt 2. Moreover, the model-view-controller (MVC) pattern is used in order to ensure flexibility of the system at the implementation level. Using the above technologies, the visualization techniques implemented so far offer interactive map navigation, querying and comparison tools.
The map comparison tools are of great importance within the SDSS and include the following: swiping tool for comparison of different data of the same location; raster subtraction for comparison of the same phenomena varying in time; linked views for comparison
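    Of the comparison tools listed, raster subtraction is the simplest to sketch: a cell-wise difference of two co-registered grids of the same phenomenon at two times, with nodata cells propagated (a minimal illustration in Python, not the SDSS's JavaScript implementation):

```python
def raster_subtract(a, b):
    """Cell-wise difference of two equally sized raster grids; None marks
    nodata cells and propagates into the output."""
    out = []
    for row_a, row_b in zip(a, b):
        out.append([None if (x is None or y is None) else x - y
                    for x, y in zip(row_a, row_b)])
    return out
```

    Positive and negative cells in the difference grid then highlight where the mapped phenomenon has grown or shrunk between the two dates.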

  18. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2006-01-01

    An evolutionary method (differential evolution, DE) is first used to solve a relatively difficult problem in extended-surface heat transfer, wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil, the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and maximization of the trailing-edge wedge angle with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.
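    Multiple-objective evolutionary methods of this kind rank candidate designs by Pareto dominance rather than a single merit value. A minimal sketch of the nondominated filter at the heart of that ranking (illustrative; minimization of both objectives is assumed):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (minimization assumed)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

    The surviving points form the trade-off curve between the conflicting objectives, e.g. base-temperature range versus fin heat transfer in the first example above.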

  19. Treatment of Passive Component Reliability in Risk-Informed Safety Margin Characterization FY 2010 Report

    SciTech Connect

    Robert W Youngblood

    2010-09-01

    The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy (DOE) Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A technical challenge at the core of this effort is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, is founded on probabilistic characterizations of SSC performance.

  20. Methods of compliance evaluation for ocean outfall design and analysis.

    PubMed

    Mukhtasor; Lye, L M; Sharp, J J

    2002-10-01

    Sewage discharge from an ocean outfall is subject to water quality standards, which are often stated in probabilistic terms. Monte Carlo simulation (MCS) has been used in the past to evaluate the ability of a designed outfall to meet water quality standards or compliance guidelines associated with sewage discharges. In this study, simpler and less computer-intensive probabilistic methods are considered: the popular mean first-order second-moment (MFOSM) method and the advanced first-order second-moment (AFOSM) method. Available data from the Spaniard's Bay outfall located on the east coast of Newfoundland, Canada, were used as inputs for a case study. Both methods were compared with results given by MCS. It was found that AFOSM gave a good approximation of the failure probability for total coliform concentration at points remote from the outfall. However, MFOSM was found to be better when considering only the initial dilutions between the discharge point and the surface. A reason for the differing results may be the difference in complexity of the performance function in the two cases. This study does not recommend the use of AFOSM for failure analysis in ocean outfall design and analysis, because the analysis requires computational effort similar to MCS. With the advancement of computer technology, simulation techniques, available software, and its flexibility in handling complex situations, MCS is still the best choice for failure analysis of ocean outfalls when data or estimates of the parameters involved are available or can be assumed. PMID:12481920
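    For a linear performance function g = R - S with independent normal resistance R and load S, the MFOSM estimate Pf = Phi(-beta) with beta = mu_g/sigma_g is exact, so it can be checked directly against MCS. A minimal sketch of both estimators (the example numbers below are assumptions for illustration, not the Spaniard's Bay data):

```python
import math
import random

def mfosm_pf(mu_g, sigma_g):
    """Mean first-order second-moment estimate: Pf = Phi(-beta),
    with reliability index beta = mu_g / sigma_g."""
    beta = mu_g / sigma_g
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def mcs_pf(mu_r, sr, mu_s, ss, n=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for the linear margin g = R - S."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if rng.gauss(mu_r, sr) - rng.gauss(mu_s, ss) < 0.0)
    return fails / n
```

    For nonlinear performance functions the two diverge, which is where AFOSM (expansion at the design point rather than the mean) or full MCS becomes necessary, as the abstract discusses.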

  1. Examining trust factors in online food risk information: The case of unpasteurized or 'raw' milk.

    PubMed

    Sillence, Elizabeth; Hardy, Claire; Medeiros, Lydia C; LeJeune, Jeffrey T

    2016-04-01

    The internet has become an increasingly important way of communicating with consumers about food risk information. However, relatively little is known about how consumers evaluate and come to trust the information they encounter online. Using the example of unpasteurized or raw milk, this paper presents two studies exploring the trust factors associated with online information about the risks and benefits of raw milk consumption. In the first study, eye-tracking data were collected from 33 pasteurised-milk consumers whilst they viewed six different milk-related websites. A descriptive analysis of the eye-tracking data was conducted to explore viewing patterns. The results revealed the importance of images as a way of capturing initial attention and foregrounding other features, and highlighted the significance of introductory text within a homepage. In the second, qualitative study, 41 consumers, some of whom drank raw milk, viewed a selection of milk-related websites before participating in either a group discussion or an interview. Seventeen of the participants also took part in a follow-up telephone interview 2 weeks later. The qualitative data support the importance of good design whilst noting that balance, authorship agenda, the nature of evidence and personal relevance were also key factors affecting consumers' trust judgements. The results of both studies provide support for a staged approach to online trust, in which consumers engage in a rapid, heuristic assessment of a site before moving on to a more in-depth evaluation of the information available. Findings are discussed in relation to the development of trustworthy online food safety resources. PMID:26792772

  2. Improved Method of Design for Folding Inflatable Shells

    NASA Technical Reports Server (NTRS)

    Johnson, Christopher J.

    2009-01-01

    An improved method of designing complexly shaped inflatable shells to be assembled from gores was conceived for original application to the inflatable outer shell of a developmental habitable spacecraft module having a cylindrical mid-length section with toroidal end caps. The method is also applicable to inflatable shells of various shapes for terrestrial use. The method addresses problems associated with the assembly, folding, transport, and deployment of inflatable shells that may comprise multiple layers and have complex shapes that can include such doubly curved surfaces as toroids and spheres. One particularly difficult problem is that of mathematically defining fold lines on a gore pattern in a double-curvature region. Moreover, because the fold lines in a double-curvature region tend to be curved, there is a practical problem of how to implement the folds. Another problem is that of modifying the basic gore shapes and sizes for the various layers so that when they are folded as part of the integral structure, they do not mechanically interfere with each other at the fold lines. Heretofore, it has been a common practice to design an inflatable shell to be assembled in the deployed configuration, without regard for the need to fold it into compact form. Typically, the result has been that folding is a difficult, time-consuming process.

  3. A geometric design method for side-stream distillation columns

    SciTech Connect

    Rooks, R.E.; Malone, M.F.; Doherty, M.F.

    1996-10-01

    A side-stream distillation column may replace two simple columns for some applications, sometimes at considerable savings in energy and investment. This paper describes a geometric method for the design of side-stream columns; the method provides rapid estimates of equipment size and utility requirements. Unlike previous approaches, the geometric method is applicable to nonideal and azeotropic mixtures. Several example problems for both ideal and nonideal mixtures, including azeotropic mixtures containing distillation boundaries, are given. The authors make use of the fact that azeotropes or pure components whose classification in the residue curve map is a saddle can be removed as side-stream products. Significant process simplifications are found among some alternatives in example problems, leading to flow sheets with fewer units and a substantial savings in vapor rate.
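
    The saddle classification the authors exploit can be sketched numerically: under a constant-relative-volatility model (a simplification; the paper's geometric method also handles nonideal and azeotropic mixtures), forward Euler integration of the residue-curve equation dx/dξ = x − y runs toward the stable node (the heaviest component) and backward integration toward the unstable node (the lightest), leaving the intermediate boiler as the saddle. The α values below are illustrative only:

```python
def vapor_comp(x, alpha):
    """Vapor in equilibrium with liquid x under constant relative volatilities."""
    s = sum(a * xi for a, xi in zip(alpha, x))
    return [a * xi / s for a, xi in zip(alpha, x)]

def residue_curve(x0, alpha, step=0.01, n_steps=2000, forward=True):
    """Euler-integrate the residue-curve equation dx/dxi = x - y.

    Forward integration converges to the stable node (heaviest component);
    backward integration converges to the unstable node (lightest).
    """
    sign = 1.0 if forward else -1.0
    x = list(x0)
    for _ in range(n_steps):
        y = vapor_comp(x, alpha)
        x = [xi + sign * step * (xi - yi) for xi, yi in zip(x, y)]
        total = sum(x)          # renormalize against drift
        x = [xi / total for xi in x]
    return x
```

    Starting from an interior composition, the trajectory identifies the heavy component as the stable node and the light one as the unstable node; the intermediate boiler, reachable only along special directions, is the saddle that can be drawn off as a side stream.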

  4. Sequence design in lattice models by graph theoretical methods

    NASA Astrophysics Data System (ADS)

    Sanjeev, B. S.; Patra, S. M.; Vishveshwara, S.

    2001-01-01

    A general strategy has been developed based on graph theoretical methods, for finding amino acid sequences that take up a desired conformation as the native state. This problem of inverse design has been addressed by assigning topological indices for the monomer sites (vertices) of the polymer on a 3×3×3 cubic lattice. This is a simple design strategy, which takes into account only the topology of the target protein and identifies the best sequence for a given composition. The procedure allows the design of a good sequence for a target native state by assigning weights for the vertices on a lattice site in a given conformation. It is seen across a variety of conformations that the predicted sequences perform well both in sequence and in conformation space, in identifying the target conformation as native state for a fixed composition of amino acids. Although the method is tested in the framework of the HP model [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] it can be used in any context if proper potential functions are available, since the procedure derives unique weights for all the sites (vertices, nodes) of the polymer chain of a chosen conformation (graph).
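
    A minimal sketch of topology-only sequence design in the HP model (not the paper's exact vertex-weighting scheme, which derives unique weights per vertex): count each monomer's non-bonded lattice contacts in the target conformation and place the hydrophobic (H) residues at the most buried sites:

```python
def contact_counts(coords):
    """Count non-bonded lattice contacts for each monomer of a self-avoiding
    walk given as a list of integer (x, y, z) tuples on the cubic lattice."""
    occupied = {c: i for i, c in enumerate(coords)}
    counts = []
    for i, (x, y, z) in enumerate(coords):
        n = 0
        for dx, dy, dz in [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)]:
            j = occupied.get((x + dx, y + dy, z + dz))
            if j is not None and abs(j - i) > 1:  # skip chain-bonded neighbors
                n += 1
        counts.append(n)
    return counts

def design_hp_sequence(coords, n_h):
    """Assign 'H' to the n_h most buried (highest-contact) sites,
    'P' everywhere else, for a fixed composition."""
    counts = contact_counts(coords)
    ranked = sorted(range(len(coords)), key=lambda i: -counts[i])
    h_sites = set(ranked[:n_h])
    return ''.join('H' if i in h_sites else 'P' for i in range(len(coords)))
```

    Because H–H contacts are the only favorable interactions in the HP model, burying the H residues at high-contact vertices biases the target conformation toward being the lowest-energy (native) state for that composition.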

  5. Modified method to improve the design of Petlyuk distillation columns

    PubMed Central

    2014-01-01

    Background A response surface analysis was performed to study the effect of the composition and feed thermal condition of ternary mixtures on the number of theoretical stages and the energy consumption of Petlyuk columns. A modification of the pre-design algorithm was necessary for this purpose. Results The modified algorithm provided feasible results in 100% of the studied cases, compared with only 8.89% for the current algorithm. The proposed algorithm allowed us to attain the desired separations regardless of the type of mixture and the operating conditions in the feed stream, something that was not possible with the traditional pre-design method. The results showed that the type of mixture had a strong influence on the number of stages and on energy consumption. A higher number of stages and a lower consumption of energy were attained with mixtures rich in the light component, while higher energy consumption occurred when the mixture was rich in the heavy component. Conclusions The proposed strategy expands the search for an optimal design of Petlyuk columns within a feasible region, which allows us to find a feasible design that meets output specifications with low thermal loads. PMID:25061476
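
    The response-surface step itself is standard: fit a second-order polynomial in the design factors by least squares and interrogate the fitted surface. A minimal sketch (the factor names x1 for feed composition and x2 for thermal condition are generic stand-ins, not the paper's variables):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of the full quadratic response surface
    y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(beta, x1, x2):
    """Evaluate the fitted surface at one design point."""
    return beta @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])
```

    With the surface in hand, stage counts or thermal loads can be screened over the whole feasible composition region instead of case by case.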

  6. A geometric method for optimal design of color filter arrays.

    PubMed

    Hao, Pengwei; Li, Yan; Lin, Zhouchen; Dubois, Eric

    2011-03-01

    A color filter array (CFA) used in a digital camera is a mosaic of spectrally selective filters, which allows only one color component to be sensed at each pixel. The missing two components of each pixel have to be estimated by methods known as demosaicking. The demosaicking algorithm and the CFA design are crucial for the quality of the output images. In this paper, we present a CFA design methodology in the frequency domain. The frequency structure, which is shown to be just the symbolic DFT of the CFA pattern (one period of the CFA), is introduced to represent images sampled with any rectangular CFAs in the frequency domain. Based on the frequency structure, the CFA design involves the solution of a constrained optimization problem that aims at minimizing the demosaicking error. To decrease the number of parameters and speed up the parameter searching, the optimization problem is reformulated as the selection of geometric points on the boundary of a convex polygon or the surface of a convex polyhedron. Using our methodology, several new CFA patterns are found, which outperform the currently commercialized and published ones. Experiments demonstrate the effectiveness of our CFA design methodology and the superiority of our new CFA patterns. PMID:20858581
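
    The frequency structure described above can be computed directly as the (normalized) 2-D DFT of each channel's indicator pattern over one CFA period. A sketch using the familiar 2×2 Bayer pattern (the paper optimizes more general rectangular patterns):

```python
import numpy as np

# One 2x2 period of the Bayer CFA, as indicator arrays per channel.
bayer = {
    'R': np.array([[1, 0], [0, 0]], dtype=float),
    'G': np.array([[0, 1], [1, 0]], dtype=float),
    'B': np.array([[0, 0], [0, 1]], dtype=float),
}

def frequency_structure(cfa):
    """2-D DFT of each channel's indicator pattern, normalized by the period
    size; entry (k, l) gives that channel's weight in the component modulated
    to spatial frequency (k/rows, l/cols)."""
    n = next(iter(cfa.values())).size
    return {c: np.fft.fft2(p) / n for c, p in cfa.items()}

fs = frequency_structure(bayer)
```

    The baseband entry (0, 0) recovers the luma weights (R + 2G + B)/4, and entry (1, 1) the chroma combination (R − 2G + B)/4, matching the classical frequency-domain view of the Bayer CFA.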

  7. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
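
    For a linear limit state with independent normal variables, the reliability index and FORM-style sensitivity factors have closed forms, which makes the abstract's observation easy to reproduce: shrinking the scatter of the variable with the largest sensitivity factor (absolute value) raises the reliability index and lowers the failure probability. A sketch with purely illustrative numbers, not the smart-wing analysis itself:

```python
import math

def reliability_index(mu_c, sig_c, mu_l, sig_l):
    """Reliability index beta and FORM sensitivity factors for the linear
    limit state g = C - L with independent normal capacity C and load L."""
    sd = math.hypot(sig_c, sig_l)          # standard deviation of g
    beta = (mu_c - mu_l) / sd
    alpha_c = sig_c / sd                   # sensitivity of beta to C's scatter
    alpha_l = -sig_l / sd                  # negative: load scatter hurts
    return beta, alpha_c, alpha_l

def failure_probability(beta):
    """P(g < 0) from the standard normal CDF."""
    return 0.5 * math.erfc(beta / math.sqrt(2))
```

    Halving the scatter of the variable with the dominant sensitivity factor gives the largest drop in failure probability, which is exactly the manufacturing-control lever the abstract describes.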

  8. Risk-Informed Safety Margin Characterization Case Study: Selection of Electrical Equipment to Be Subjected to Environmental Qualification

    SciTech Connect

    D. P. Blanchard; R. W. Youngblood

    2014-06-01

    The Risk-Informed Safety Margin Characterization (RISMC) pathway of the DOE’s Light Water Reactor Sustainability (LWRS) program focuses on advancing the state of the art in safety analysis and risk assessment to support decision-making on nuclear power plant operation well beyond the originally designed lifetime of the plants (i.e., beyond 60 years). Among the issues being addressed in RISMC is the significance of aging of structures, systems, and components (SSCs), and how confident we are in our understanding of its impact on the margin between the loads SSCs are expected to see during normal operation and accident conditions and the SSC capacities (their ability to resist those loads) as the SSCs age. In this paper, a summary is provided of a case study that examines SSC aging from an environmental qualification (EQ) perspective. The case study illustrates how the state of knowledge regarding SSC margin can be characterized given the overall integrated plant design, and was developed to demonstrate a method for deciding which cables to focus on, which cables are less important from an environmental qualification margin standpoint, and what plant design features or operating characteristics determine the role that environmental qualification plays in establishing a safety case on which decisions regarding margin can be made. The selection of cables for which margin with respect to aging and environmental challenges must be demonstrated uses a technique known as Prevention Analysis. Prevention Analysis is a Boolean method for optimal selection of SSCs (that is, those combinations of SSCs both necessary and sufficient to meet a predetermined selection criterion) in a manner that allows demonstration that plant-level safety is achieved by the collection of selected SSCs alone. Choosing the set of SSCs that is necessary and sufficient to satisfy the safety objectives, and demonstrating that those objectives can be met, effectively determines where resources are best allocated to assure SSC …
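
    At its core, the Prevention Analysis summarized above is a Boolean selection: find the minimal sets of SSCs whose combined success defeats every minimal cut set. A brute-force minimal-hitting-set sketch (exponential in the number of components, so an illustration of the idea only, not the paper's optimized method):

```python
from itertools import chain, combinations

def minimal_prevention_sets(cut_sets):
    """Enumerate minimal hitting sets: each returned set contains at least one
    component from every minimal cut set, and no proper subset of it does.

    cut_sets: iterable of sets of component identifiers.
    """
    components = sorted(set(chain.from_iterable(cut_sets)))
    hits = []
    # Enumerate candidates smallest-first so supersets of earlier hits
    # can be rejected, guaranteeing minimality.
    for r in range(1, len(components) + 1):
        for combo in combinations(components, r):
            s = set(combo)
            if all(s & cs for cs in cut_sets):        # hits every cut set
                if not any(h <= s for h in hits):     # keep only minimal sets
                    hits.append(s)
    return hits
```

    Each returned set is a candidate "prevention set": protecting those SSCs alone is sufficient to block every modeled failure path, which is precisely the necessary-and-sufficient selection criterion the abstract describes.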

  9. A formal method for early spacecraft design verification

    NASA Astrophysics Data System (ADS)

    Fischer, P. M.; Ludtke, D.; Schaus, V.; Gerndt, A.

    In the early design phase of a spacecraft, various aspects of the system under development are described and modeled using parameters such as masses, power consumption, or data rates. Power and data parameters are special in that their values can change depending on the spacecraft's operational mode. These mode-dependent parameters can easily be verified against static requirements such as a maximum data rate. Such quick verifications allow the engineers to check the design after every change they apply. In contrast, requirements concerning the mission lifetime, such as the amount of data downlinked during the whole mission, demand a more complex procedure. We propose an executable model together with a simulation framework to evaluate complex mission scenarios. In conjunction with a formalized specification of mission requirements, it allows a quick verification by means of formal methods.
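
    The two kinds of checks described, static per-mode verification versus execution of a mission timeline, can be sketched as follows; all mode names, rates, and limits are hypothetical, not from the paper:

```python
# Hypothetical mode table: (name, duration_s, recorder_rate_bit_per_s, power_W).
# A negative recorder rate means data leaves the recorder during downlink.
modes = [
    ("safe",      3600,    1_000,  90),
    ("science",   5400,   50_000, 180),
    ("downlink",  1800, -200_000, 150),
]

MAX_RECORDER_BITS = 400e6   # assumed onboard storage requirement
MAX_POWER_W = 200           # assumed static power requirement

def verify_static(modes):
    """Static check: every mode individually respects the power limit."""
    return all(power <= MAX_POWER_W for _, _, _, power in modes)

def simulate_recorder(modes, fill=0.0):
    """Dynamic check: execute the mode timeline and track recorder fill.
    Returns the final fill level, or None if the limit is ever exceeded."""
    for _, duration, rate, _ in modes:
        fill = max(fill + duration * rate, 0.0)   # recorder cannot go negative
        if fill > MAX_RECORDER_BITS:
            return None                            # requirement violated
    return fill
```

    The static check is cheap enough to rerun after every design change, while lifetime requirements such as total downlinked data need the executed timeline, which is the distinction the abstract draws.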

  10. Collocation methods for distillation design. 2: Applications for distillation

    SciTech Connect

    Huss, R.S.; Westerberg, A.W.

    1996-05-01

    The authors present applications of a collocation method for modeling distillation columns that they developed in a companion paper. They discuss implementation of the model, including the ASCEND (Advanced System for Computations in ENgineering Design) system, which enables one to create complex models from simple building blocks and interactively learn to solve them. They first apply the model to compute minimum reflux for a given separation task, solving nonsharp minimum reflux problems exactly and sharp-split problems approximately. They then illustrate the use of the collocation model to optimize the design of a single column capable of carrying out a prescribed set of separation tasks. The optimization picks the best column diameter and total number of trays, as well as the feed tray for each of the prescribed separations.
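
    Collocation replaces a stage-by-stage profile with a low-order polynomial constrained to satisfy the governing equations at a few points. The idea in miniature, applied to a simple ODE rather than the column equations of the paper:

```python
import numpy as np

def collocation_solve(n=8):
    """Solve y' = -y, y(0) = 1 on [0, 1] by polynomial collocation:
    approximate y(x) ~ sum_j c_j x^j and enforce the ODE at n Chebyshev
    points, plus the boundary condition, giving n+1 linear equations."""
    k = np.arange(1, n + 1)
    x = 0.5 * (1 - np.cos(np.pi * k / n))     # Chebyshev points in (0, 1]
    A = np.zeros((n + 1, n + 1))
    b = np.zeros(n + 1)
    A[0, 0], b[0] = 1.0, 1.0                  # boundary condition y(0) = 1
    for row, xk in enumerate(x, start=1):
        for j in range(n + 1):
            dyj = j * xk**(j - 1) if j > 0 else 0.0
            A[row, j] = dyj + xk**j           # residual y'(xk) + y(xk) = 0
    c = np.linalg.solve(A, b)
    return np.polynomial.polynomial.Polynomial(c)

y = collocation_solve()
```

    A degree-8 polynomial reproduces the exact solution exp(−x) to high accuracy from only nine equations; the same economy is what makes collocation attractive for approximating tray-to-tray column profiles.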