Sample records for NASA hazard analysis

  1. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.

  2. NASA Hydrogen Peroxide Propellant Hazards Technical Manual

    NASA Technical Reports Server (NTRS)

    Baker, David L.; Greene, Ben; Frazier, Wayne

    2005-01-01

    The Fire, Explosion, Compatibility and Safety Hazards of Hydrogen Peroxide NASA technical manual was developed at the NASA Johnson Space Center White Sands Test Facility. NASA Technical Memorandum TM-2004-213151 covers topics concerning high-concentration hydrogen peroxide, including fire and explosion hazards, material and fluid reactivity, materials selection information, personnel and environmental hazards, physical and chemical properties, analytical spectroscopy, specifications, analytical methods, and material compatibility data. A summary of hydrogen peroxide-related accidents, incidents, close calls, mishaps and lessons learned is included. The manual draws from an extensive literature base and includes recent applicable regulatory compliance documentation. The manual may be obtained by United States government agencies from NASA Johnson Space Center and used as a reference source for hazards and safe handling of hydrogen peroxide.

  3. Hazard detection and avoidance sensor for NASA's planetary landers

    NASA Technical Reports Server (NTRS)

    Lau, Brian; Chao, Tien-Hsin

    1992-01-01

    An optical terrain-analysis-based sensor system specifically designed for landing hazard detection, as required for NASA's autonomous planetary landers, is introduced. This optical hazard detection and avoidance (HDA) sensor utilizes an optoelectronic wedge-and-ring (WRD) filter for Fourier-transformed feature extraction and an electronic neural network processor for pattern classification. A fully implemented optical HDA sensor would assure safe landing of the planetary landers. Computer simulation results of a successful feasibility study are reported. Directions for future research on hardware system implementation are also provided.
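
    The wedge-and-ring idea can be emulated digitally: ring-shaped masks over the Fourier power spectrum capture scale and roughness content, while wedge-shaped masks capture orientation content, giving a compact feature vector for a classifier. The Python sketch below illustrates the general technique; the ring and wedge counts, image sizes, and normalization are illustrative assumptions, not parameters from the paper.

      import numpy as np

      def wedge_ring_features(image, n_rings=8, n_wedges=8):
          """Ring (radial) and wedge (angular) energy features of a 2-D image."""
          spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
          h, w = spectrum.shape
          y, x = np.indices((h, w))
          radius = np.hypot(y - h / 2.0, x - w / 2.0)
          angle = np.arctan2(y - h / 2.0, x - w / 2.0) % np.pi  # spectrum is symmetric
          r_edges = np.linspace(0.0, radius.max() + 1e-9, n_rings + 1)
          a_edges = np.linspace(0.0, np.pi + 1e-9, n_wedges + 1)
          rings = [spectrum[(radius >= r_edges[i]) & (radius < r_edges[i + 1])].sum()
                   for i in range(n_rings)]
          wedges = [spectrum[(angle >= a_edges[i]) & (angle < a_edges[i + 1])].sum()
                    for i in range(n_wedges)]
          feats = np.array(rings + wedges)
          return feats / feats.sum()  # normalize so features compare across scenes

      # Rough terrain pushes more spectral energy into the outer rings.
      rng = np.random.default_rng(0)
      smooth = np.outer(np.hanning(64), np.hanning(64))
      rough = smooth + 0.5 * rng.random((64, 64))
      print(wedge_ring_features(smooth)[:3])
      print(wedge_ring_features(rough)[:3])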

  4. GHRC: NASA's Hazardous Weather Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Bugbee, Kaylin

    2016-01-01

    The Global Hydrology Resource Center (GHRC; ghrc.nsstc.nasa.gov) is one of NASA's twelve Distributed Active Archive Centers responsible for providing access to NASA's Earth science data to users worldwide. Each of NASA's twelve DAACs focuses on a specific science discipline within Earth science, provides data stewardship services and supports its research community's needs. Established in 1991 as the Marshall Space Flight Center DAAC and renamed GHRC in 1997, the data center's original mission focused on the global hydrologic cycle. However, over the years, data holdings, tools and expertise of GHRC have gradually shifted. In 2014, a User Working Group (UWG) was established to review GHRC capabilities and provide recommendations to make GHRC more responsive to the research community's evolving needs. The UWG recommended an update to the GHRC mission, as well as a strategic plan to move in the new direction. After a careful and detailed analysis of GHRC's capabilities, research community needs and the existing data landscape, a new mission statement for GHRC has been crafted: to provide a comprehensive active archive of both data and knowledge augmentation services with a focus on hazardous weather, its governing dynamical and physical processes, and associated applications. Within this broad mandate, GHRC will focus on lightning, tropical cyclones and storm-induced hazards through integrated collections of satellite, airborne, and in-situ data sets. The new mission was adopted at the recent 2015 UWG meeting. GHRC will retain its current name until such time as it has built substantial data holdings aligned with the new mission.

  5. NASA LaRC Hazardous Material Pharmacy

    NASA Technical Reports Server (NTRS)

    Esquenet, Remy

    1995-01-01

    In 1993-1994 the Office of Environmental Engineering contracted SAIC to develop NASA Langley's Pollution Prevention (P2) Program. One of the priority projects identified in this contract was the development of a hazardous waste minimization (HAZMIN)/hazardous materials reutilization (HAZMART) program in the form of a Hazardous Materials Pharmacy. A hazardous materials pharmacy is designed to reduce hazardous material procurement costs and hazardous waste disposal costs. This is accomplished through the collection and reissue of excess hazardous material. Currently, a rarely used hazardous material may be stored in a shop area, unused, until it passes its expiration date. The material is then usually disposed of as a hazardous waste, often at a greater expense than the original cost of the material. While this material was on the shelf expiring, other shop areas may have ordered new supplies of the same material. The hazardous material pharmacy would act as a clearinghouse for such materials. Material that is not going to be used would be turned in to the pharmacy. Other users could then be issued this material free of charge, thereby reducing procurement costs. The use of this material by another shop prevents it from expiring, thereby reducing hazardous waste disposal costs.
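
    In software terms, the pharmacy is a simple clearinghouse: excess stock is checked in, and requests draw down that stock before any new purchase. A toy Python sketch of that logic follows; the material name, quantities, and unit cost are invented for illustration.

      from collections import defaultdict

      stock = defaultdict(int)  # excess hazardous material held by the pharmacy

      def turn_in(material, qty):
          """A shop turns in unneeded material instead of letting it expire."""
          stock[material] += qty

      def request(material, qty, unit_cost):
          """Fill a request from pharmacy stock first; return the cost avoided."""
          reissued = min(stock[material], qty)
          stock[material] -= reissued
          # qty - reissued units would still be procured new
          return reissued * unit_cost

      turn_in("MEK solvent (L)", 10)
      print(request("MEK solvent (L)", 4, unit_cost=12.0))  # 48.0 saved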

  6. Stennis all-hazards network adopted throughout NASA

    NASA Image and Video Library

    2009-10-13

    Stennis Space Center employees Mike McKinion (left), with Erica Lane Enterprises, and Luke Scianna, with the Jacobs Facility Operating Services Contract, monitor the facility and surrounding area on the 'all-hazards network' known as HazNet. The HazNet system at Stennis was developed by a local Mississippi company to help facilitate coordinated response during emergency and disaster situations. The system was installed in the new Emergency Operations Center built at Stennis before being implemented throughout NASA. HazNet was designed by NVision Solutions Inc. of Bay St. Louis, Miss., through NASA's Innovative Partnerships Program. In addition to being used at Stennis, it has been installed at other Gulf Coast communities to help coordinate response during emergencies and such natural disasters as hurricanes.

  7. Establishing a Disruptive New Capability for NASA to Fly UAV's into Hazardous Conditions

    NASA Technical Reports Server (NTRS)

    Ely, Jay; Nguyen, Truong; Wilson, Jennifer; Brown, Robert; Laughter, Sean; Teets, Ed; Parker, Allen; Chan, Patrick Hon Man; Richards, Lance

    2015-01-01

    A 2015 NASA Aeronautics Mission "Seedling" Proposal is described for a Severe-Environment UAV (SE-UAV) that can perform in-situ measurements in hazardous atmospheric conditions like lightning, volcanic ash and radiation. Specifically, this paper describes the design of a proof-of-concept vehicle and measurement system that can survive lightning attachment during flight operations into thunderstorms. Elements from three NASA centers draw together for the SE-UAV concept. 1) The NASA KSC Genesis UAV was developed in collaboration with the DARPA Nimbus program to measure electric field and X-rays present within thunderstorms. 2) A novel NASA LaRC fiber-optic sensor uses Faraday-effect polarization rotation to measure total lightning electric current on an air vehicle fuselage. 3) NASA AFRC's state-of-the-art Fiber Optics and Systems Integration Laboratory is envisioned to transition the Faraday system to a compact, light-weight, all-fiber design. The SE-UAV will provide in-flight lightning electric-current return stroke and recoil leader data, and serve as a platform for development of emerging sensors and new missions into hazardous environments. NASA's Aeronautics and Science Missions are interested in a capability to perform in-situ volcanic plume measurements and long-endurance UAV operations in various weather conditions. (Figure 1 shows an artist concept of a SE-UAV flying near a volcano.) This paper concludes with an overview of the NASA Aeronautics Strategic Vision, Programs, and how a SE-UAV is envisioned to impact them. The SE-UAV concept leverages high-value legacy research products into a new capability for NASA to fly a pathfinder UAV into hazardous conditions, and is presented in the SPIE DSS venue to explore teaming, collaboration and advocacy opportunities outside NASA.
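
    The physics behind the LaRC fiber sensor is compact enough for a worked example: for a sensing fiber wound N turns around a conductor, Ampere's law gives a Faraday polarization rotation theta = V * mu0 * N * I. The sketch below uses a placeholder Verdet constant and turn count (the abstract gives neither), so the numbers are illustrative only.

      import math

      MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A
      V_VERDET = 1.0        # placeholder Verdet constant, rad/(T*m)

      def faraday_rotation_rad(current_a, n_turns=10):
          """Polarization rotation for current enclosed by n_turns fiber loops."""
          return V_VERDET * MU0 * n_turns * current_a

      # A nominal 20 kA return stroke through the fuselage:
      theta = faraday_rotation_rad(20e3)
      print(f"{theta:.3f} rad ({math.degrees(theta):.1f} deg)")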

  8. Establishing a disruptive new capability for NASA to fly UAV's into hazardous conditions

    NASA Astrophysics Data System (ADS)

    Ely, Jay; Nguyen, Truong; Wilson, Jennifer; Brown, Robert; Laughter, Sean; Teets, Ed; Parker, Allen; Chan, Hon M.; Richards, Lance

    2015-05-01

    A 2015 NASA Aeronautics Mission "Seedling" Proposal is described for a Severe-Environment UAV (SE-UAV) that can perform in-situ measurements in hazardous atmospheric conditions like lightning, volcanic ash and radiation. Specifically, this paper describes the design of a proof-of-concept vehicle and measurement system that can survive lightning attachment during flight operations into thunderstorms. Elements from three NASA centers draw together for the SE-UAV concept. 1) The NASA KSC Genesis UAV was developed in collaboration with the DARPA Nimbus program to measure electric field and X-rays present within thunderstorms. 2) A novel NASA LaRC fiber-optic sensor uses Faraday-effect polarization rotation to measure total lightning electric current on an air vehicle fuselage. 3) NASA AFRC's state-of-the-art Fiber Optics and Systems Integration Laboratory is envisioned to transition the Faraday system to a compact, light-weight, all-fiber design. The SE-UAV will provide in-flight lightning electric-current return stroke and recoil leader data, and serve as a platform for development of emerging sensors and new missions into hazardous environments. NASA's Aeronautics and Science Missions are interested in a capability to perform in-situ volcanic plume measurements and long-endurance UAV operations in various weather conditions. (Figure 1 shows an artist concept of a SE-UAV flying near a volcano.) This paper concludes with an overview of the NASA Aeronautics Strategic Vision, Programs, and how a SE-UAV is envisioned to impact them. The SE-UAV concept leverages high-value legacy research products into a new capability for NASA to fly a pathfinder UAV into hazardous conditions, and is presented in the SPIE DSS venue to explore teaming, collaboration and advocacy opportunities outside NASA.

  9. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based systems engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis
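
    The mechanical core of STPA is easy to sketch: each control action is crossed with four standard guide words to produce candidate unsafe control actions, which analysts then screen against the system hazards. The Python below is a minimal illustration; the control actions are stand-ins loosely based on the descent-engine example in the abstract, not items from the actual HTV analysis.

      from itertools import product

      GUIDE_WORDS = [
          "not provided when required",
          "provided when it causes a hazard",
          "provided too early, too late, or out of order",
          "stopped too soon or applied too long",
      ]

      control_actions = ["fire descent engines", "cut off descent engines"]

      def candidate_unsafe_control_actions(actions):
          """Cross actions with guide words; analysts mark hazardous rows."""
          return [f"'{a}' -- {w}" for a, w in product(actions, GUIDE_WORDS)]

      for row in candidate_unsafe_control_actions(control_actions):
          print(row)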

  10. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. Annual average loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, a landslide buried an entire village on the Philippine island of Leyte on February 17, 2006, with at least 1800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides depend on rainfall attributes (e.g., rainfall climatology, antecedent rainfall accumulation, and intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful to assess the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while receiving heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
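
    Empirical landslide-triggering thresholds of this kind are usually written as a power law, I = a * D**(-b), where I is mean rainfall intensity and D is storm duration; a storm plotting above the curve is flagged. A minimal Python check follows; the coefficients are placeholders, since the abstract truncates before giving the paper's global threshold.

      def exceeds_landslide_threshold(intensity_mm_hr, duration_hr, a=12.0, b=0.4):
          """True if a storm's mean intensity exceeds the I-D threshold."""
          return intensity_mm_hr > a * duration_hr ** (-b)

      # The same intensity can be sub-threshold for a short burst but
      # above threshold when sustained for a day.
      print(exceeds_landslide_threshold(8.0, duration_hr=2))   # False
      print(exceeds_landslide_threshold(8.0, duration_hr=24))  # True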

  11. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  12. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall guidance, navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.
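
    The link between hazard tolerance and sensor design can be made concrete with a back-of-the-envelope sizing rule: if the smallest intolerable hazard must span several ground samples to be detected reliably, the hazard tolerance fixes the ground sample distance and hence the detector size for a given mapped region. The numbers and the pixels-per-hazard rule below are illustrative assumptions, not values from the paper.

      import math

      def required_detector_width(region_width_m, hazard_tolerance_m,
                                  pixels_per_hazard=5):
          """Pixels needed so the smallest intolerable hazard spans
          pixels_per_hazard ground samples across the imaged region."""
          gsd = hazard_tolerance_m / pixels_per_hazard  # ground sample distance, m
          return math.ceil(region_width_m / gsd)

      # Tolerating 0.3 m rocks while mapping a 100 m landing region:
      print(required_detector_width(100.0, 0.3))  # 1667 pixels across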

  13. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  14. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  15. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  16. The NASA F-106B Storm Hazards Program

    NASA Technical Reports Server (NTRS)

    Neely, W. R., Jr.; Fisher, B. D.

    1983-01-01

    During the NASA LaRC Storm Hazards Program, 698 thunderstorm penetrations were made from 1980 to 1983 with an F-106B aircraft in order to record direct lightning strike data and the associated flight conditions. It was found that each of the three composite fin caps tested experienced multiple lightning attachments with only minor cosmetic damage. The maximum current level recorded was only 20 kA, which is well below the design standard of 200 kA; however, indications are that the current rate-of-rise standard has been approached and may be exceeded in a major strike. The peak lightning strike rate occurred at ambient temperatures between -40 and -45 C, while most previously reported strikes have occurred at or near the freezing level. No significant operational difficulties or major aircraft damage resulting from the thunderstorm penetrations have been found.

  17. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine if their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO), or if not, to use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA in the development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial and propellant use of oxygen, hydrogen, and hypergols. This paper reports on

  18. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  19. Benefit from NASA

    NASA Image and Video Library

    1997-01-01

    When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To address these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. They developed the hazard analysis and critical control point (HACCP) concept to ensure against bacterial contamination. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP in the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.
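
    The CCP idea maps naturally onto a limit-checking routine: each control point carries a critical limit, and monitored readings are compared against it to trigger corrective action. The toy Python below illustrates this; the CCP names and limit values are invented, not taken from the FDA regulations.

      CCP_LIMITS = {
          "retort_temp_c": {"min": 121.0},    # sterilization step must stay hot
          "can_seam_gap_mm": {"max": 0.5},    # container integrity step
      }

      def ccp_in_control(name, reading):
          """True if the monitored reading satisfies the CCP's critical limit."""
          limit = CCP_LIMITS[name]
          return (limit.get("min", float("-inf")) <= reading
                  <= limit.get("max", float("inf")))

      print(ccp_in_control("retort_temp_c", 122.3))   # True: limit met
      print(ccp_in_control("can_seam_gap_mm", 0.7))   # False: corrective action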

  20. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  21. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...

  22. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  23. Characterizing the Preturbulence Environment for Sensor Development, New Hazard Algorithms and NASA Experimental Flight Planning

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Lin, Yuh-Lang

    2004-01-01

    During the grant period, several tasks were performed in support of the NASA Turbulence Prediction and Warning Systems (TPAWS) program. The primary focus of the research was on characterizing the preturbulence environment by developing predictive tools and simulating atmospheric conditions that preceded severe turbulence. The goal of the research was to provide both a dynamical understanding of conditions that preceded turbulence and predictive tools in support of operational NASA B-757 turbulence research flights. The advancements in characterizing the preturbulence environment will be applied by NASA to sensor development for predicting turbulence onboard commercial aircraft. Numerical simulations with atmospheric models as well as multi-scale observational analyses provided insights into the environment organizing turbulence in a total of forty-eight specific case studies of severe, accident-producing turbulence; these accidents exclusively affected commercial aircraft. A paradigm was developed which diagnosed specific atmospheric circulation systems, from the synoptic scale down to the meso-γ scale, that preceded turbulence in both clear air and in proximity to convection. The emphasis was primarily on convective turbulence, as that is what the TPAWS program is most focused on in terms of developing improved sensors for turbulence warning and avoidance. However, the dynamical paradigm also has applicability to clear-air and mountain turbulence. This dynamical sequence of events was then employed to formulate and test new hazard prediction indices that were first tested in research simulation studies and then further tested in support of the NASA B-757 turbulence research flights. The new hazard characterization algorithms were utilized in a Real Time Turbulence Model (RTTM) that was operationally employed to support the NASA B-757 turbulence research flights. Improvements in the RTTM were implemented in an

  24. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  25. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  26. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  27. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...

  28. K Basin Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PECH, S.H.

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  29. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  30. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  31. Hazard Analysis Guidelines for Transit Projects

    DOT National Transportation Integrated Search

    2000-01-01

    These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...

  32. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention: for scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release; for scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of “critical controls” was identified for these scenarios (see Section 4) which prevent the occurrence or mitigate the effects of events with significant consequences.

  33. National Emission Standards for Hazardous Air Pollutants (NESHAP) Memorandum of Agreement (MOA) Between NASA Headquarters and MSFC (Marshall Space Flight Center) for NASA Principal Center for Review of Clean Air Regulations

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.; Clark-Ingram, Marceia A.

    2000-01-01

    This paper presents a memorandum of agreement on Clean Air regulations. NASA Headquarters (Code JE and Code M) has asked MSFC to serve as Principal Center for review of Clean Air Act (CAA) regulations. The purpose of the Principal Center is to provide centralized support to NASA Headquarters for the management and leadership of NASA's CAA regulation review process and to identify the potential impact of proposed CAA regulations on NASA program hardware and supporting facilities. The materials and processes utilized in the manufacture of NASA's programmatic hardware contain HAPs (Hazardous Air Pollutants), VOCs (Volatile Organic Compounds), and ODCs (Ozone Depleting Chemicals). This paper is presented in viewgraph form.

  34. Canister Storage Building (CSB) Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: a complete list of the team members involved over the two-year process, and a subset consisting of those members who reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of a nearly two-year process involving formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and

  35. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  36. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  37. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  38. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
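
    For the empirical, site-specific side described here, a runup catalog translates directly into a hazard curve: count exceedances of each height, divide by catalog length for an annual rate, and apply a Poisson model for exceedance probability over an exposure time. A minimal Python sketch follows; the catalog values and record length are invented for illustration.

      import numpy as np

      runups_m = np.array([0.3, 0.5, 0.8, 1.2, 2.0, 3.5])  # observed runups
      catalog_years = 100.0

      def annual_rate(h):
          """Mean annual rate of tsunamis with runup >= h metres."""
          return np.sum(runups_m >= h) / catalog_years

      def exceedance_prob(h, exposure_years=50.0):
          """Poisson probability of at least one exceedance in the exposure."""
          return 1.0 - np.exp(-annual_rate(h) * exposure_years)

      for h in (0.5, 1.0, 2.0):
          print(f"P(runup >= {h} m in 50 yr) = {exceedance_prob(h):.2f}")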

  39. Preliminary hazards analysis -- vitrification process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coordes, D.; Ruggieri, M.; Russell, J.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  40. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
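
    The fitting techniques the standard names are one-liners in modern tooling: least-squares fits of linear and quadratic polynomials, and an exponential model fit in log space. The sketch below uses synthetic data to compare residuals; it illustrates the technique generically rather than reproducing any example from the standard.

      import numpy as np

      t = np.arange(10, dtype=float)
      rng = np.random.default_rng(1)
      y = 5.0 * np.exp(0.25 * t) + rng.normal(0.0, 1.0, t.size)

      linear = np.polyfit(t, y, 1)      # y ~ a*t + b
      quadratic = np.polyfit(t, y, 2)   # y ~ a*t^2 + b*t + c
      slope, intercept = np.polyfit(t, np.log(y), 1)
      exp_pred = np.exp(intercept) * np.exp(slope * t)  # y ~ c*exp(a*t)

      for name, pred in [("linear", np.polyval(linear, t)),
                         ("quadratic", np.polyval(quadratic, t)),
                         ("exponential", exp_pred)]:
          rms = np.sqrt(np.mean((y - pred) ** 2))
          print(f"{name:12s} RMS residual = {rms:.2f}")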

  41. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  42. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as the basis for formalising a multi-hazard methodology and identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  43. Atmospheric Electrical Modeling in Support of the NASA F-106 Storm Hazards Project

    NASA Technical Reports Server (NTRS)

    Helsdon, John H., Jr.

    1988-01-01

    A recently developed storm electrification model (SEM) is used to investigate the operating environment of the F-106 airplane during the NASA Storm Hazards Project. The model is 2-D, time-dependent and uses a bulk-water microphysical parameterization scheme. Electric charges and fields are included, and the model is fully coupled dynamically, microphysically and electrically. One flight case showed that a high electric field was developed at the aircraft's operating altitude (28 kft) and that a strong electric field would also be found below 20 kft; however, this low-altitude, high-field region was associated with the presence of small hail, posing a hazard to the aircraft. An operational procedure to increase the frequency of low-altitude lightning strikes was suggested. To further the understanding of lightning within the cloud environment, a parameterization of the lightning process was included in the SEM. It accounted for the initiation, propagation, termination, and charge redistribution associated with an intracloud discharge. Finally, a randomized lightning propagation scheme was developed, and the effects of cloud particles on the initiation of lightning were investigated.

  44. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the aeroacoustic effects produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace, who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the ignition source and plume can be severe; acoustic hazards in the far field are significantly lower.

  45. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  46. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings and also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
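
    The Green's-function summation described here is a one-line superposition once the unit-slip waveform library exists. A synthetic Python sketch follows; the array shapes, random "precomputed" waveforms, and slip values are placeholders standing in for the output of a real tsunami propagation model.

      import numpy as np

      n_subfaults, n_samples = 4, 200
      rng = np.random.default_rng(42)

      # Stand-in library: waveform at one coastal point for 1 m of slip on
      # each subfault (in practice, from a numerical propagation model).
      unit_waveforms = rng.normal(0.0, 0.05, (n_subfaults, n_samples)).cumsum(axis=1)

      def synthesize(slip_m):
          """Waveform for an arbitrary slip distribution, by superposition."""
          return np.asarray(slip_m) @ unit_waveforms

      scenario = synthesize([2.0, 3.5, 1.0, 0.0])  # one rupture scenario
      print("peak amplitude (m):", np.abs(scenario).max().round(3))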

  47. Atmospheric electrical modeling in support of the NASA F106 Storm Hazards Project

    NASA Technical Reports Server (NTRS)

    Helsdon, J. H.

    1986-01-01

    With the use of composite (non-metallic) materials and microelectronics becoming more prevalent in the construction of both military and commercial aircraft, control systems have become more susceptible to damage or failure from electromagnetic transients. One source of such transients is the lightning discharge. In order to study the effects of the lightning discharge on the vital components of an aircraft, NASA Langley Research Center has undertaken a Storm Hazards Program in which a specially instrumented F106B jet aircraft is flown into active thunderstorms with the intention of being struck by lightning. One of the specific purposes of the program is to quantify the environmental conditions which are conducive to aircraft lightning strikes.

  8. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  9. Hazards on Hazards, Ensuring Spacecraft Safety While Sampling Asteroid Surface Materials

    NASA Astrophysics Data System (ADS)

    Johnson, C. A.; DellaGiustina, D. N.

    2016-12-01

    The near-Earth object Bennu is a carbonaceous asteroid that is a remnant from the earliest stages of solar system formation. It is also a potentially hazardous asteroid with a relatively high probability of impacting Earth late in the 22nd century. While the primary focus of the NASA-funded OSIRIS-REx mission is the return of pristine organic material from the asteroid's surface, information about Bennu's physical and chemical properties gleaned throughout operations will be critical for a possible future impact mitigation mission. In order to ensure a regolith sample can be successfully acquired, the sample site and surrounding area must be thoroughly assessed for any potential hazards to the spacecraft. The OSIRIS-REx Image Processing Working Group has been tasked with generating global and site-specific hazard maps using mosaics and a trio of feature identification techniques. These techniques include expert-led manual classification, internet-based amateur classification using the citizen science platform CosmoQuest, and automated classification using machine learning and computer vision tools. Because proximity operations around Bennu do not begin until the end of 2018, we have an opportunity to test the performance of our software on analogue surfaces of other asteroids from previous missions by NASA and other space agencies. The entire pipeline, from image processing and mosaicking to hazard identification, analysis and mapping, will be performed on asteroids of varying size, shape and surface morphology. As a result, upon arrival at Bennu, we will have the software and processes in place to quickly and confidently produce the hazard maps needed to ensure the success of our mission.

  10. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
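
    As a rough illustration of the simple ranking method described above, a composite score can be formed from expert ratings. The scores below are hypothetical stand-ins, not the study's data:

        # Hypothetical 1-5 expert scores for likelihood and potential impact.
        hazards = {
            "Epidemics of infectious diseases": {"likelihood": 5, "impact": 5},
            "Drought/famine":                   {"likelihood": 4, "impact": 5},
            "Conflict":                         {"likelihood": 4, "impact": 4},
            "Environmental degradation":        {"likelihood": 4, "impact": 3},
        }
        ranked = sorted(hazards.items(),
                        key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                        reverse=True)
        for name, s in ranked:
            print(f"{name}: score = {s['likelihood'] * s['impact']}")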

  11. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate

  12. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  13. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  14. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.; Winner, David R.

    2010-01-01

    This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI): specific components, sub-assemblies, and/or piece parts of the NASA Ares I Upper Stage (US) project. SCIs will be designed, developed and/or procured by Boeing as end items or off-the-shelf items. Objectives include explaining the methodology, tools, stakeholders and products involved in the development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the Ares I Upper Stage project.

  15. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to a review of the methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures, and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. These deficiencies result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unresolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design

  16. NASA/MSFC multilayer diffusion models and computer program for operational prediction of toxic fuel hazards

    NASA Technical Reports Server (NTRS)

    Dumbauld, R. K.; Bjorklund, J. R.; Bowers, J. F.

    1973-01-01

    The NASA/MSFC multilayer diffusion models are described, which are used in applying meteorological information to the estimation of toxic fuel hazards resulting from the launch of rocket vehicles and from accidental cold spills and leaks of toxic fuels. Background information, definitions of terms, and a description of the multilayer concept are presented, along with formulas for determining the buoyant rise of hot exhaust clouds or plumes from conflagrations, and descriptions of the multilayer diffusion models. A brief description of the computer program is given, and sample problems and their solutions are included. Derivations of the cloud rise formulas, user instructions, and computer program output lists are also included.
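
    The multilayer models in the report are more elaborate than can be shown here, but a single-layer Gaussian plume conveys how meteorological inputs map to a toxic-hazard concentration estimate. This is a generic textbook formula with illustrative numbers, not the NASA/MSFC code:

        import numpy as np

        def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
            # Q: source strength (kg/s); u: mean wind speed (m/s);
            # H: effective release height (m); sigma_y, sigma_z: dispersion
            # lengths (m) evaluated at the downwind distance of interest.
            lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
            vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                        + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # ground reflection
            return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Illustrative only: 1 kg/s release, 5 m/s wind, centerline receptor at 1.5 m.
        c = gaussian_plume(Q=1.0, u=5.0, y=0.0, z=1.5, H=50.0, sigma_y=80.0, sigma_z=40.0)
        print(f"estimated concentration: {c:.2e} kg/m^3")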

  17. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...

  18. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  19. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  20. Analysis of wavelet technology for NASA applications

    NASA Technical Reports Server (NTRS)

    Wells, R. O., Jr.

    1994-01-01

    The purpose of this grant was to introduce a broad group of NASA researchers and administrators to wavelet technology and to determine its future role in research and development at NASA JSC. The activities of several briefings held between NASA JSC scientists and Rice University researchers are discussed. An attached paper, 'Recent Advances in Wavelet Technology', summarizes some aspects of these briefings. Two proposals submitted to NASA reflect the primary areas of common interest. They are image analysis and numerical solutions of partial differential equations arising in computational fluid dynamics and structural mechanics.

  1. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.

  2. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
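
    One way the three metrics might be combined into a priority score is sketched below; the combination rule and the scenario names and scores are assumptions for illustration, not the paper's actual scheme:

        # Hypothetical 1-5 scores: (name, severity, likelihood, modeling difficulty).
        scenarios = [
            ("wake encounter on parallel approach", 5, 3, 2),
            ("missed breakout maneuver",            4, 2, 4),
            ("simultaneous go-arounds",             5, 1, 5),
        ]

        def priority(severity, likelihood, difficulty):
            # Favor severe, likely scenarios that are tractable to model.
            return severity * likelihood / difficulty

        for name, s, l, d in sorted(scenarios, key=lambda r: -priority(*r[1:])):
            print(f"{name}: priority = {priority(s, l, d):.2f}")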

  3. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
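
    The conventional (empirical) hazard curve in this setting is simply the rate at which simulated intensity measures exceed each level. A minimal sketch follows, with synthetic inundation depths and an assumed event rate standing in for the simulation output:

        import numpy as np

        rng = np.random.default_rng(0)
        depths = rng.lognormal(mean=-1.0, sigma=1.0, size=5000)  # synthetic depths (m)
        event_rate = 0.01   # assumed mean annual rate of tsunamigenic events

        levels = np.linspace(0.1, 5.0, 50)
        # Mean annual rate of exceeding each inundation-depth level.
        exceed_rate = event_rate * np.array([(depths > im).mean() for im in levels])

        for im, lam in zip(levels[::10], exceed_rate[::10]):
            print(f"depth > {im:.1f} m: mean annual rate = {lam:.2e}")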

  4. Combining System Safety and Reliability to Ensure NASA CoNNeCT's Success

    NASA Technical Reports Server (NTRS)

    Havenhill, Maria; Fernandez, Rene; Zampino, Edward

    2012-01-01

    Hazard Analysis, Failure Modes and Effects Analysis (FMEA), the Limited-Life Items List (LLIL), and the Single Point Failure (SPF) List were applied by System Safety and Reliability engineers on NASA's Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT) Project. The integrated approach, involving cross reviews of these reports by System Safety, Reliability, and Design engineers, resulted in the mitigation of all identified hazards. The outcome was that the system met all of its safety requirements.

  5. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... determine whether there are food hazards that are reasonably likely to occur for each type of juice...

  6. 1991 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1991 Life Support Systems Analysis Workshop was sponsored by NASA Headquarters' Office of Aeronautics and Space Technology (OAST) to foster communication among NASA, industrial, and academic specialists, and to integrate their inputs and disseminate information to them. The overall objective of systems analysis within the Life Support Technology Program of OAST is to identify, guide the development of, and verify designs which will increase the performance of the life support systems on component, subsystem, and system levels for future human space missions. The specific goals of this workshop were to report on the status of systems analysis capabilities, to integrate the chemical processing industry technologies, and to integrate recommendations for future technology developments related to systems analysis for life support systems. The workshop included technical presentations, discussions, and interactive planning, with time allocated for discussion of both technology status and time-phased technology development recommendations. Key personnel from NASA, industry, and academia delivered inputs and presentations on the status and priorities of current and future systems analysis methods and requirements.

  7. Use of the X-Band Radar to Support the Detection of In-Flight Icing Hazards by the NASA Icing Remote Sensing System

    NASA Technical Reports Server (NTRS)

    Serke, David J.; Politovich, Marcia K.; Reehorst, Andrew L.; Gaydos, Andrew

    2009-01-01

    The Alliance Icing Research Study-II (AIRS-II) field program was conducted near Montreal, Canada during the winter of 2003. The NASA Icing Remote Detection System (NIRSS) was deployed to detect in-flight icing hazards and consisted of a vertically pointing multichannel radiometer, a ceilometer and an X-band cloud radar. The radiometer was used to derive atmospheric temperature soundings and integrated liquid water, while the ceilometer and radar were used only to define cloud boundaries. The purpose of this study is to show that the radar reflectivity profiles from AIRS-II case studies could be used to provide a qualitative icing hazard assessment.

  8. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because once a design-basis tsunami height is set, there remains a possibility that tsunami heights will exceed the determined design value due to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
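
    The logic-tree mechanics can be sketched compactly: each branch contributes a hazard curve with a weight, and fractile curves come from the weighted distribution of branch values at each tsunami height. The branch curves and weights below are invented for illustration:

        import numpy as np

        heights = np.linspace(0.5, 10.0, 20)
        # Three hypothetical branch hazard curves (annual exceedance probability).
        curves = np.array([1e-2 * np.exp(-heights / s) for s in (1.0, 1.5, 2.5)])
        weights = np.array([0.5, 0.3, 0.2])

        mean_curve = weights @ curves   # weighted mean hazard curve

        def weighted_fractile(values, weights, q):
            # q-fractile of the weighted branch values at one tsunami height.
            order = np.argsort(values)
            cdf = np.cumsum(weights[order])
            return values[order][np.searchsorted(cdf, q)]

        p16 = [weighted_fractile(curves[:, i], weights, 0.16) for i in range(len(heights))]
        p84 = [weighted_fractile(curves[:, i], weights, 0.84) for i in range(len(heights))]
        print(f"mean exceedance probability at {heights[0]:.1f} m: {mean_curve[0]:.2e}")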

  9. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  10. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  11. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  12. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  14. A Hybrid-Cloud Science Data System Enabling Advanced Rapid Imaging & Analysis for Monitoring Hazards

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Moore, A. W.; Fielding, E. J.; Radulescu, C.; Sacco, G.; Stough, T. M.; Mattmann, C. A.; Cervelli, P. F.; Poland, M. P.; Cruz, J.

    2012-12-01

    Volcanic eruptions, landslides, and levee failures are some examples of hazards that can be more accurately forecasted with sufficient monitoring of precursory ground deformation, such as the high-resolution measurements from GPS and InSAR. In addition, coherence and reflectivity change maps can be used to detect surface change due to lava flows, mudslides, tornadoes, floods, and other natural and man-made disasters. However, it is difficult for many volcano observatories and other monitoring agencies to process GPS and InSAR products in the automated scenario needed for continual monitoring of events. Additionally, numerous interoperability barriers exist in multi-sensor observation data access, preparation, and fusion to create actionable products. Combining high spatial resolution InSAR products with high temporal resolution GPS products, and automating this data preparation and processing across global-scale areas of interest, presents an untapped science and monitoring opportunity. The global coverage offered by satellite-based SAR observations, and the rapidly expanding GPS networks, can provide orders of magnitude more data on these hazardous events if we have a data system that can efficiently and effectively analyze the voluminous raw data, and provide users the tools to access data from their regions of interest. Currently, combined GPS and InSAR time series are primarily generated for specific research applications, and are not implemented to run on large-scale continuous data sets and delivered to decision-making communities. We are developing an advanced service-oriented architecture for hazard monitoring, leveraging NASA-funded algorithms and data management, to enable both science and decision-making communities to monitor areas of interest via seamless data preparation, processing, and distribution. Our objectives: * Enable high-volume and low-latency automatic generation of NASA Solid Earth science data products (InSAR and GPS) to support hazards

  15. Plasma Hazards and Acceptance for International Space Station Extravehicular Activities

    NASA Astrophysics Data System (ADS)

    Patton, Thomas

    2010-09-01

    Extravehicular activity (EVA) is accepted by NASA and other spacefaring agencies as a necessary risk in order to build and maintain a safe and efficient laboratory in space. EVAs are used for standard construction and as contingency operations to repair critical equipment for vehicle sustainability and the safety of the entire crew in the habitable volume. There are many hazards that are assessed for even the most mundane EVA for astronauts, and the vast majority of these are adequately controlled per the rules of the International Space Station Program. The need for EVA repair and construction has driven acceptance of a possible catastrophic hazard to the EVA crewmember which cannot currently be controlled adequately. That hazard is electrical shock from the very environment in which they work. This paper describes the environment, causes and contributors to the shock of EVA crewmembers attributed to the ionospheric plasma environment in low Earth orbit. It will detail the hazard history and the acceptance process for the risk associated with these hazards that gives assurance of a safe EVA. In addition to the hazard acceptance process, this paper will explore other factors that go into the decision to accept a risk, including criticality of task, hardware design and capability, and the probability of hazard occurrence. Also included will be the required interaction between organizations at NASA (EVA Office, Environments, Engineering, Mission Operations, Safety) in order to build and eventually gain adequate acceptance rationale for a hazard of this kind. During the course of the discussion, all current methods of mitigating the hazard will be identified. This paper will capture the history of the plasma hazard analysis and processes used by the International Space Station Program to formally assess and qualify the risk. The paper will discuss steps that have been taken to identify and perform required analysis of the floating potential shock hazard from the ISS environment

  16. NASA Aeronautics and Space Database for bibliometric analysis

    NASA Technical Reports Server (NTRS)

    Powers, R.; Rudman, R.

    2004-01-01

    The authors use the NASA Aeronautics and Space Database to perform bibliometric analysis of citations. This paper explains their research methodology and gives some sample results showing collaboration trends between NASA Centers and other institutions.

  17. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    One of the natural disasters with the most significant impact in terms of risk and damage is the earthquake. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since it requires only the limited time available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods in the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  18. NASA Enterprise Visual Analysis

    NASA Technical Reports Server (NTRS)

    Lopez-Tellado, Maria; DiSanto, Brenda; Humeniuk, Robert; Bard, Richard, Jr.; Little, Mia; Edwards, Robert; Ma, Tien-Chi; Hollifield, Kenneith; White, Chuck

    2007-01-01

    NASA Enterprise Visual Analysis (NEVA) is a computer program undergoing development as a successor to Launch Services Analysis Tool (LSAT), formerly known as Payload Carrier Analysis Tool (PCAT). NEVA facilitates analyses of proposed configurations of payloads and packing fixtures (e.g. pallets) in a space shuttle payload bay for transport to the International Space Station. NEVA reduces the need to use physical models, mockups, and full-scale ground support equipment in performing such analyses. Using NEVA, one can take account of such diverse considerations as those of weight distribution, geometry, collision avoidance, power requirements, thermal loads, and mechanical loads.

  19. An Overview of NASA Engine Ice-Crystal Icing Research

    NASA Technical Reports Server (NTRS)

    Addy, Harold E., Jr.; Veres, Joseph P.

    2011-01-01

    Ice accretions that have formed inside gas turbine engines as a result of flight in clouds of high concentrations of ice crystals in the atmosphere have recently been identified as an aviation safety hazard. NASA's Aviation Safety Program (AvSP) has made plans to conduct research in this area to address the hazard. This paper gives an overview of NASA's engine ice-crystal icing research project plans. Included are the rationale, approach, and details of various aspects of NASA's research.

  20. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.
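
    The counting behind such percentages is straightforward; a sketch over a hypothetical hazard-report structure follows (the schema and field names are invented for illustration, not the study's data format):

        # Each report lists causes; each cause flags software involvement and
        # whether any of its controls are software-based (hypothetical schema).
        reports = [
            {"causes": [{"software_cause": True,  "software_control": True},
                        {"software_cause": False, "software_control": True}]},
            {"causes": [{"software_cause": False, "software_control": False}]},
        ]

        causes = [c for r in reports for c in r["causes"]]
        pct_sw_cause = 100.0 * sum(c["software_cause"] for c in causes) / len(causes)
        pct_sw_ctrl = 100.0 * sum(c["software_control"] for c in causes) / len(causes)
        print(f"causes involving software: {pct_sw_cause:.0f}%")
        print(f"causes with a software control: {pct_sw_ctrl:.0f}%")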

  1. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.

  2. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g., in Switzerland or South Tyrol-Bolzano) and software packages (e.g., ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this is only a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  3. NASA Armstrong's Approach to Store Separation Analysis

    NASA Technical Reports Server (NTRS)

    Acuff, Chris; Bui, Trong

    2015-01-01

    This presentation will give an overview of NASA Armstrong's store separation capabilities and how they have been applied recently. The objective of the presentation is to brief Generation Orbit and other potential partners on NASA Armstrong's store separation capabilities. It will include discussions on the use of NAVSEP and Cart3D, as well as some Python scripting work to perform the analysis, and a short overview of this methodology as applied to the Towed Glider Air Launch System. Collaboration with potential customers in this area could lead to funding for the further development of a store separation capability at NASA Armstrong, which would boost the portfolio of engineering expertise at the center.

  4. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e., building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g., flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage. • Hazard resistance and mitigation do not figure prominently in the intent of SAFs. • Approximately 75% of SAFs analyzed address three or fewer hazards. • Lack of economic measures within SAFs could impact resilience and sustainability. • Resilience measures for flood hazards are not consistently included in SAFs.

  5. NASA Operational Environment Team (NOET): NASA's key to environmental technology

    NASA Technical Reports Server (NTRS)

    Cook, Beth

    1993-01-01

    NASA has stepped forward to face the environmental challenge to eliminate the use of Ozone-Layer Depleting Substances (OLDS) and to reduce our Hazardous Air Pollutants (HAP) by 50 percent in 1995. These requirements have been issued by the Clean Air Act, the Montreal Protocol, and various other legislative acts. A proactive group, the NASA Operational Environment Team or NOET, received its charter in April 1992 and was tasked with providing a network through which replacement activities and development experiences can be shared. This is a NASA-wide team which supports the research and development community by sharing information both in person and via a computerized network, assisting in specification and standard revisions, developing cleaner propulsion systems, and exploring environmentally-compliant alternatives to current processes.

  6. Guide for Oxygen Hazards Analyses on Components and Systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel M.; Dees, Jesse; Poe, Robert F.

    1996-01-01

    Because most materials, including metals, will burn in an oxygen-enriched environment, hazards are always present when using oxygen. Most materials will ignite at lower temperatures in an oxygen-enriched environment than in air, and once ignited, combustion rates are greater in the oxygen-enriched environment. Many metals burn violently in an oxygen-enriched environment when ignited. Lubricants, tapes, gaskets, fuels, and solvents can increase the possibility of ignition in oxygen systems. However, these hazards do not preclude the use of oxygen. Oxygen may be safely used if all the materials in a system are not flammable in the end-use environment or if ignition sources are identified and controlled. These ignition and combustion hazards necessitate a proper oxygen hazards analysis before introducing a material or component into oxygen service. The objective of this test plan is to describe the White Sands Test Facility oxygen hazards analysis to be performed on components and systems before oxygen is introduced and is recommended before implementing the oxygen component qualification procedure. The plan describes the NASA Johnson Space Center White Sands Test Facility method consistent with the ASTM documents for analyzing the hazards of components and systems exposed to an oxygen-enriched environment. The oxygen hazards analysis is a useful tool for oxygen-system designers, system engineers, and facility managers. Problem areas can be pinpointed before oxygen is introduced into the system, preventing damage to hardware and possible injury or loss of life.

  7. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk- and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
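
    As a flavor of the kind of inference such a guide covers, here is a minimal conjugate Bayesian update for a failure-on-demand probability (a Beta prior with binomial data); the prior choice and counts below are illustrative assumptions, not values from the document:

        from scipy import stats

        a0, b0 = 0.5, 0.5           # Jeffreys prior on p (failure per demand)
        failures, demands = 2, 150  # hypothetical observed data

        # Conjugate update: posterior is Beta(a0 + failures, b0 + successes).
        a, b = a0 + failures, b0 + (demands - failures)
        posterior = stats.beta(a, b)
        print(f"posterior mean p = {posterior.mean():.4f}")
        print(f"90% credible interval: ({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")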

  8. [Hazard function and life table: an introduction to the failure time analysis].

    PubMed

    Matsushita, K; Inaba, H

    1987-04-01

    Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables, as well as a special case of event history analysis and multistate demography. The ideas of the hazard function and failure time analysis, however, have not been properly introduced to, nor commonly discussed by, demographers in Japan. The concept of the hazard function is briefly described in comparison with life tables, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of the exponential distribution, the normal distribution, and proportional hazards models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
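
    The relationships summarized above are easy to state in code. For the exponential model, the hazard (force of mortality) is constant, and the survival and density functions follow directly; the rate below is an arbitrary illustrative value:

        import numpy as np

        lam = 0.02                      # constant hazard rate per unit time
        t = np.arange(0.0, 100.0, 10.0)

        hazard = np.full_like(t, lam)   # h(t) = lambda
        survival = np.exp(-lam * t)     # S(t) = exp(-cumulative hazard)
        density = hazard * survival     # f(t) = h(t) * S(t)

        for ti, si in zip(t, survival):
            print(f"t = {ti:5.1f}   S(t) = {si:.3f}")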

  9. Hazardous-waste analysis plan for LLNL operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, R.S.

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  10. The Economics of NASA Mission Cost Reserves

    NASA Technical Reports Server (NTRS)

    Whitley, Sally; Shinn, Stephen

    2012-01-01

    Increases in NASA mission costs have led to analysis of the causes and magnitude of historical mission overruns as well as mitigation and prevention attempts. This paper hypothesizes that one cause is that the availability of reserves may reduce incentives to control costs. We draw a comparison to the insurance concept of moral hazard, and we use actuarial techniques to better understand the increase in mission costs due to the availability of reserves. NASA's CADRe database provided the data against which we tested our hypothesis and discovered that there is correlation between the amount of available reserves and project overruns, particularly for mission hardware cost increases. We address the question of how to prevent reserves from increasing mission spending without increasing cost risk to projects.

  11. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  12. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  13. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  14. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  15. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...

  16. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the commercial provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the programmatic mission assurance requirements; the current safety review process; a description of the RBA process and its products; and a description of the Shared Assurance Model.

  17. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  18. Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project Status as of May 2010

    NASA Technical Reports Server (NTRS)

    Striepe, Scott A.; Epp, Chirold D.; Robertson, Edward A.

    2010-01-01

    This paper includes the current status of NASA's Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) Project. The ALHAT team has completed several flight tests and two major design analysis cycles. These tests and analyses examine terrain relative navigation sensors, hazard detection and avoidance sensors and algorithms, hazard relative navigation algorithms, and the guidance and navigation system using these ALHAT functions. The next flight test is scheduled for July 2010. The paper contains results from completed flight tests and analysis cycles. ALHAT system status and upcoming tests and analyses are also addressed. The current ALHAT plans as of May 2010 are discussed. Application of the ALHAT system to landing on bodies other than the Moon is included.

  19. NASA gateway requirements analysis

    NASA Technical Reports Server (NTRS)

    Duncan, Denise R.; Doby, John S.; Shockley, Cynthia W.

    1991-01-01

    NASA devotes approximately 40 percent of its budget to R&D. Twelve NASA Research Centers and their contractors conduct this R&D, which ranges across many disciplines and is fueled by information about previous endeavors. Locating the right information is crucial. While NASA researchers use peer contacts as their primary source of scientific and technical information (STI), on-line bibliographic data bases - both Government-owned and commercial - are also frequently consulted. Once identified, the STI must be delivered in a usable format. This report assesses the appropriateness of developing an intelligent gateway interface for the NASA R&D community as a means of obtaining improved access to relevant STI resources outside of NASA's Remote Console (RECON) on-line bibliographic database. A study was conducted to determine (1) the information requirements of the R&D community, (2) the information sources to meet those requirements, and (3) ways of facilitating access to those information sources. Findings indicate that NASA researchers need more comprehensive STI coverage of disciplines not now represented in the RECON database. This augmented subject coverage should preferably be provided by both domestic and foreign STI sources. It was also found that NASA researchers frequently request rapid delivery of STI, in its original format. Finally, it was found that researchers need a better system for alerting them to recent developments in their areas of interest. A gateway that provides access to domestic and international information sources can also solve several shortcomings in the present STI delivery system. NASA should further test the practicality of a gateway as a mechanism for improved STI access.

  20. Agent-based simulation for human-induced hazard analysis.

    PubMed

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article, using ABM to model human-induced hazards, has been preliminarily validated in the sense that the attack magnitudes appear to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
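
    As a sketch of what such an agent-based formulation can look like, the following minimal Python example places random-walking attacker agents on a grid with a few high-wealth corridors and records attack magnitudes proportional to the wealth encountered. Everything here (grid size, wealth field, attack rule, parameters) is an illustrative assumption, not the authors' model.

```python
# Minimal agent-based sketch of a human-induced hazard model, loosely in the
# spirit of the abstract above. All parameters are illustrative assumptions.
import random

random.seed(1)

SIZE = 50  # cells along one edge of a square region
# Wealth field: a few high-wealth "transit/market" corridors on a low base.
wealth = [[1.0 for _ in range(SIZE)] for _ in range(SIZE)]
for r in range(SIZE):
    wealth[r][SIZE // 2] = 20.0   # vertical corridor
    wealth[SIZE // 2][r] = 20.0   # horizontal corridor

def simulate_attacks(n_agents=200, steps=1000, threshold=15.0):
    """Random-walking attacker agents strike where local wealth is high;
    attack magnitude scales with the wealth encountered."""
    magnitudes = []
    agents = [(random.randrange(SIZE), random.randrange(SIZE))
              for _ in range(n_agents)]
    for _ in range(steps):
        for i, (r, c) in enumerate(agents):
            r = (r + random.choice((-1, 0, 1))) % SIZE
            c = (c + random.choice((-1, 0, 1))) % SIZE
            agents[i] = (r, c)
            w = wealth[r][c]
            if w >= threshold and random.random() < 0.01:
                # Heavy-tailed draw, consistent with power-law-like magnitudes.
                magnitudes.append(w * random.paretovariate(2.0))
    return magnitudes

mags = simulate_attacks()
print(len(mags), "attacks; largest magnitude =", round(max(mags), 1))
```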

  1. NASA Laboratory Analysis for Manned Exploration Missions

    NASA Technical Reports Server (NTRS)

    Krihak, Michael (Editor); Shaw, Tianna

    2014-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability Element under the NASA Human Research Program. ELA instrumentation is identified as an essential capability for future exploration missions to diagnose and treat evidence-based medical conditions. However, mission architecture limits the medical equipment, consumables, and procedures that will be available to treat medical conditions during human exploration missions. Allocated resources such as mass, power, volume, and crew time must be used efficiently to optimize the delivery of in-flight medical care. Although commercial instruments can provide the blood and urine based measurements required for exploration missions, these commercial-off-the-shelf devices are prohibitive for deployment in the space environment. The objective of the ELA project is to close the gap between current minimally invasive laboratory capabilities and the analytical measurements needed under the constraints that the mission architecture imposes on exploration missions. Besides microgravity and radiation tolerance, other principal issues that generally fail to meet NASA requirements include excessive mass, volume, power and consumables, and nominal reagent shelf life. Though manned exploration missions will not occur for nearly a decade, NASA has already taken strides toward the development of ELA medical diagnostics by developing mission requirements and concepts of operations that are coupled with strategic investments and partnerships for meeting these challenges. This paper focuses on the remote environment, its challenges, biomedical diagnostics requirements and candidate technologies that may lead to successful blood/urine chemistry and biomolecular measurements in future space exploration missions. SUMMARY The NASA Exploration Laboratory Analysis project seeks to develop the capability to diagnose anticipated space exploration medical conditions on future manned missions. To achieve

  2. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon; Duda, Kristen

    2008-01-01

    This viewgraph presentation gives an overview of NASA's risk analysis communication programs associated with changing environmental policies. The topics include: 1) NASA Program Transition; 2) Principal Center for Regulatory Risk Analysis and Communication (RRAC PC); and 3) Regulatory Tracking and Communication Process.

  3. A Windshear Hazard Index

    NASA Technical Reports Server (NTRS)

    Proctor, Fred H.; Hinton, David A.; Bowles, Roland L.

    2000-01-01

    An aircraft exposed to hazardous low-level windshear may suffer a critical loss of airspeed and altitude, thus endangering its ability to remain airborne. In order to characterize this hazard, a nondimensional index was developed based on aerodynamic principles and understanding of windshear phenomena. This paper reviews the development and application of the Bowles F-factor, which is now used by onboard sensors for the detection of hazardous windshear. It was developed and tested during NASA/FAA's airborne windshear program and is now required for FAA certification of onboard radar windshear detection systems. Reviewed in this paper are: 1) definition of windshear and description of atmospheric phenomena that may cause hazardous windshear, 2) derivation and discussion of the F-factor, 3) development of the F-factor hazard threshold, 4) its testing during field deployments, and 5) its use in accident reconstructions.
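
    For reference, the commonly published form of the F-factor combines the along-track wind rate of change and the vertical wind; the sketch below encodes that textbook form. Both the formula as written and the 0.13 averaged-threshold figure in the comments come from the general windshear literature and are assumptions here, not quotations from this paper.

```python
# Hedged sketch of the commonly published F-factor form:
#   F = (dWx/dt) / g - wh / V
# where dWx/dt is the rate of change of the horizontal wind along the flight
# path, wh the vertical wind (positive upward, so a downdraft increases F),
# and V the airspeed. A 1-km averaged F of roughly 0.13 is often cited as an
# alerting threshold; treat these specifics as assumptions.
G = 9.81  # gravitational acceleration, m/s^2

def f_factor(dwx_dt, w_h, airspeed):
    """Instantaneous, dimensionless windshear hazard index."""
    return dwx_dt / G - w_h / airspeed

# Example: a 2 m/s^2 increasing tailwind with a 4 m/s downdraft at 75 m/s.
print(round(f_factor(2.0, -4.0, 75.0), 3))  # ~0.257, performance-decreasing shear
```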

  4. Performance Analysis: Control of Hazardous Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Grange, Connie E.; Freeman, Jeff W.; Kerr, Christine E.

    2010-10-06

    LLNL experienced 26 occurrences related to the control of hazardous energy from January 1, 2008 through August 2010. These occurrences were 17% of the total number of reported occurrences during this 32-month period. The Performance Analysis and Reporting Section of the Contractor Assurance Office (CAO) routinely analyzes reported occurrences and issues looking for patterns that may indicate changes in LLNL’s performance and early indications of performance trends. It became apparent through these analyses that LLNL might have experienced a change in the control of hazardous energy and that these occurrences should be analyzed in more detail to determine if the perceived change in performance was real, whether that change is significant and if the causes of the occurrences are similar. This report documents the results of this more detailed analysis.

  5. Bit of History and Some Lessons Learned in Using NASA Remote Sensing Data in Public Health Applications

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Estes, Sue

    2011-01-01

    The NASA Applied Sciences Program's public health initiative began in 2004 to illustrate the potential benefits of using remote sensing in public health applications. Objectives/Purpose: The CDC initiated a study with NASA through the National Center for Environmental Health (NCEH) to establish a pilot effort to use remote sensing data as part of its Environmental Public Health Tracking Network (EPHTN). As a consequence, the NCEH and NASA developed a project called HELIX-Atlanta (Health and Environment Linkage for Information Exchange) to demonstrate a process for developing a local environmental public health tracking and surveillance network that integrates non-infectious health and environment systems for the Atlanta metropolitan area. Methods: As an ongoing, systematic integration, analysis and interpretation of data, an EPHTN focuses on: 1 -- environmental hazards; 2 -- human exposure to environmental hazards; and 3 -- health effects potentially related to exposure to environmental hazards. To satisfy the definition of a surveillance system the data must be disseminated to plan, implement, and evaluate environmental public health action. Results: A close working relationship developed with NCEH where information was exchanged to assist in the development of an EPHTN that incorporated NASA remote sensing data into a surveillance network for disseminating public health tracking information to users. This project's success provided NASA with the opportunity to work with other public health entities such as the University of Mississippi Medical Center, the University of New Mexico and the University of Arizona. Conclusions: HELIX-Atlanta became a functioning part of the national EPHTN for tracking environmental hazards and exposure, particularly as related to air quality over Atlanta. Learning Objectives: 1 -- remote sensing data can be integral to an EPHTN; 2 -- public tracking objectives can be enhanced through remote sensing data; 3 -- NASA's involvement in

  6. Overview of NASA Langley's Systems Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    Cavanaugh, Stephen; Kumar, Ajay; Brewer, Laura; Kimmel, Bill; Korte, John; Moul, Tom

    2006-01-01

    The Systems Analysis and Concepts Directorate (SACD) has been in the systems analysis business line supporting National Aeronautics and Space Administration (NASA) aeronautics, exploration, space operations and science since the 1960s. Our current organization structure is shown in Figure 1. SACD's mission can be summed up in the following statements: 1. We conduct advanced concepts studies for Agency decision makers and programs. 2. We provide aerospace systems analysis products such as mission architectures, advanced system concepts, system and technology trades, life cycle cost and risk analysis, system integration and pre-decisional sensitive information. 3. Our work enables informed technical, programmatic and budgetary decisions. SACD has a complement of 114 government employees and approximately 50 on-site contractors, which is equally split between supporting aeronautics and exploration. SACD strives for technical excellence and credibility of the systems analysis products delivered to its customers. The Directorate office is continuously building market intelligence and working with other NASA centers and external partners to expand our business base. The Branches strive for technical excellence and credibility of our systems analysis products by seeking out existing and new partnerships that are critical for successful systems analysis. The Directorate's long-term goal is to grow the science systems analysis business base.

  7. NASA/ASEE Summer Faculty Fellowship Program. 1991 Research Reports

    NASA Technical Reports Server (NTRS)

    Hosler, E. Ramon (Editor); Beymer, Mark A. (Editor); Armstrong, Dennis W. (Editor)

    1991-01-01

    Reports from the NASA/ASEE Summer Faculty Fellowship Program are presented. The editors are responsible for selecting appropriately qualified faculty to address some of the many problems of current interest to NASA Kennedy. Some representative titles are as follows: Development of an Accelerated Test Method for the Determination of Susceptibility to Atmospheric Corrosion; Hazardous Gas Leak Analysis in the Space Shuttle; Modeling and Control of the Automated Radiator Inspection Device; Study of the Finite Element Software Packages at KSC; Multispectral Image Processing for Plants; Algorithms for Contours Depicting Static Electric Fields during Adverse Weather Conditions; Transient Study of a Cryogenic Hydrogen Filling System; and Precision Cleaning Verification of Nonvolatile Residues by using Water, Ultrasonics, and Turbidity Analyses.

  8. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However
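
    The 11% figure in the dice illustration is easy to verify; the one-liner below computes the probability of a fully quiet year when a fair die is rolled once a month.

```python
# Probability of a fully quiet year in the dice paradigm above:
# twelve independent rolls, each avoiding a six with probability 5/6.
p_quiet_year = (5 / 6) ** 12
print(round(p_quiet_year, 3))  # 0.112, i.e., the ~11% quoted in the abstract
```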

  9. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-16

    Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.

  10. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
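
    A minimal sketch of the mediation-formula idea on the survival-probability scale is given below. A Weibull cumulative baseline hazard stands in for the paper's fractional-polynomial / restricted-cubic-spline approximation, and every coefficient is an assumed placeholder, not an estimate from the Jackson Heart Study.

```python
# Monte Carlo sketch of the mediation formula for a Cox outcome model.
# Assumed models: mediator M = a0 + a1*A + N(0, sd_m);
# hazard h(t|A,M) = h0(t) * exp(b_a*A + b_m*M), with Weibull H0(t).
import numpy as np

rng = np.random.default_rng(0)

a0, a1, sd_m = 0.0, 0.5, 1.0   # mediator model parameters (assumed)
b_a, b_m = 0.4, 0.3            # Cox log-hazard ratios (assumed)
shape, scale = 1.5, 10.0       # Weibull baseline (stand-in for a spline fit)

def cum_base_hazard(t):
    return (t / scale) ** shape

def survival(t, a_outcome, a_mediator, n=100_000):
    """S(t; a, a*) = E over M|A=a* of exp(-H0(t) * exp(b_a*a + b_m*M))."""
    m = a0 + a1 * a_mediator + rng.normal(0.0, sd_m, size=n)
    return np.mean(np.exp(-cum_base_hazard(t) * np.exp(b_a * a_outcome + b_m * m)))

t = 5.0
s00, s10, s11 = survival(t, 0, 0), survival(t, 1, 0), survival(t, 1, 1)
print("total effect:    ", s11 - s00)
print("natural direct:  ", s10 - s00)
print("natural indirect:", s11 - s10)
```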

  11. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  12. NASA Safety Manual. Volume 3: System Safety

    NASA Technical Reports Server (NTRS)

    1970-01-01

    This Volume 3 of the NASA Safety Manual sets forth the basic elements and techniques for managing a system safety program and the technical methods recommended for use in developing a risk evaluation program that is oriented to the identification of hazards in aerospace hardware systems and the development of residual risk management information for the program manager that is based on the hazards identified. The methods and techniques described in this volume are in consonance with the requirements set forth in NHB 1700.1 (VI), Chapter 3. This volume and future volumes of the NASA Safety Manual shall not be rewritten, reprinted, or reproduced in any manner. Installation implementing procedures, if necessary, shall be inserted as page supplements in accordance with the provisions of Appendix A. No portion of this volume or future volumes of the NASA Safety Manual shall be invoked in contracts.

  13. Systems Analysis of NASA Aviation Safety Program: Final Report

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen

    2013-01-01

    A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.

  14. NASA Applied Sciences Program

    NASA Technical Reports Server (NTRS)

    Frederick, Martin

    2006-01-01

    This presentation highlights the NASA Applied Sciences Program. The goal of the program is to extend the results of scientific research and knowledge beyond the science community to contribute to NASA's partners' applications of national priority, such as agricultural efficiency, energy management and Homeland Security. Another purpose of the program's scientific research is to increase knowledge of the Earth-Sun system to enable improved predictions of climate, weather, and natural hazards. The program primarily optimizes benefits for citizens by contributing to partnering on applications that are used by state, local and tribal governments.

  15. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...

  16. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...

  17. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...

  18. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...

  19. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...

  20. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis to be carried out; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
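
    To make the threshold-subjectivity point concrete, the short sketch below perturbs a modelled flood depth with an assumed error term and propagates it through user-defined depth thresholds; both the thresholds and the error model are illustrative assumptions, not the paper's calibration.

```python
# Threshold-based hazard classification under assumed modeling uncertainty.
import numpy as np

rng = np.random.default_rng(42)

THRESHOLDS = [0.1, 0.5, 1.0]  # depth (m) cut-offs -> hazard levels 0..3

def hazard_level(depth):
    """Map a flood depth to a discrete hazard level via the thresholds."""
    return sum(depth > t for t in THRESHOLDS)

modelled_depth = 0.45                   # depth (m) from a hydraulic model
errors = rng.normal(0.0, 0.15, 10_000)  # assumed model error (m)
levels = [hazard_level(max(modelled_depth + e, 0.0)) for e in errors]

for lvl in range(4):
    print(f"P(hazard level {lvl}) = {levels.count(lvl) / len(levels):.2f}")
```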

  1. NASA program decisions using reliability analysis.

    NASA Technical Reports Server (NTRS)

    Steinberg, A.

    1972-01-01

    NASA made use of the analytical outputs of reliability people to make management decisions on the Apollo program. Such decisions affected the amount of the incentive fees, how much acceptance testing was necessary, how to optimize development testing, whether to approve engineering changes, and certification of flight readiness. Examples of such analysis are discussed and related to programmatic decisions.

  2. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    PubMed

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to result in many of the advantages previously identified for microbiological HACCP procedures: more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  3. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes then gives way to quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random occurrence of earthquakes in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider a series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
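
    As an illustration of such a three-part hazard function, the sketch below adds a decaying term for the last cluster, a rising term for the next one, and a constant background rate; the exponential forms and parameter values are assumptions for demonstration, not the authors' calibrated model.

```python
# Illustrative composition of a three-part time-dependent hazard rate.
import math

def hazard(t, h_bg=0.01, h_last=0.05, decay=0.2, h_next=0.08, rise=0.05):
    decreasing = h_last * math.exp(-decay * t)          # fading last cluster
    increasing = h_next * (1.0 - math.exp(-rise * t))   # approaching next cluster
    return decreasing + increasing + h_bg               # + constant background

for t in (0, 10, 50, 100):  # years since the last large-earthquake cluster
    print(t, round(hazard(t), 4))
```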

  4. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
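
    The logic-tree combination itself reduces to a weighted average of branch hazard curves. The sketch below combines 3 source zonings × 3 ground-motion models into a mean curve; the curves and branch weights are made-up placeholders, not the study's inputs.

```python
# Weighted logic-tree combination of hazard curves over 9 branches.
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4])  # PGA levels (g)
# Annual exceedance rates for each of the 9 branches (rows), assumed values:
branch_curves = np.abs(np.random.default_rng(7).normal(0.01, 0.004, (9, 4)))

zoning_w = np.repeat([0.4, 0.3, 0.3], 3)  # 3 zonings, each paired with 3 GMPEs
gmpe_w = np.tile([0.5, 0.3, 0.2], 3)
weights = zoning_w * gmpe_w               # 9 branch weights, summing to 1

mean_curve = weights @ branch_curves
cov = np.std(branch_curves, axis=0) / mean_curve  # rough spread across branches
print(dict(zip(pga, np.round(mean_curve, 4))))
print("COV:", np.round(cov, 2))
```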

  5. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
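
    A first-order reading of that notional product is easy to state in code: expected loss for an asset sums consequence × vulnerability × threat likelihood over its hazard scenarios. The figures below are illustrative only, not CAPRA's actual parameterization.

```python
# First-order all-hazards risk: sum of consequence x vulnerability x threat.
def asset_risk(scenarios):
    """scenarios: iterable of (consequence, vulnerability, threat_likelihood)."""
    return sum(c * v * t for c, v, t in scenarios)

portfolio = {
    "asset_A": [(10e6, 0.3, 0.02), (4e6, 0.6, 0.05)],  # e.g., storm, attack
    "asset_B": [(2e6, 0.8, 0.01)],
}
for name, scenarios in portfolio.items():
    print(name, f"expected loss ~ ${asset_risk(scenarios):,.0f}")
```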

  6. An integrated knowledge system for the Space Shuttle hazardous gas detection system

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Bangasser, Carl; Fensky, Connie; Cegielski, Eric; Overbey, Glenn

    1993-01-01

    A computer-based integrated Knowledge-Based System, the Intelligent Hypertext Manual (IHM), was developed for the Space Shuttle Hazardous Gas Detection System (HGDS) at NASA Marshall Space Flight Center (MSFC). The IHM stores HGDS related knowledge and presents it in an interactive and intuitive manner. This manual is a combination of hypertext and an expert system, which stores experts' knowledge and experience in hazardous gas detection and analysis. The IHM's purpose is to provide HGDS personnel with the capabilities of: locating applicable documentation related to procedures, constraints, and previous fault histories; assisting in the training of personnel; enhancing the interpretation of real time data; and recognizing and identifying possible faults in the Space Shuttle sub-systems related to hazardous gas detection.

  7. NASA's Space Environments and Effects (SEE) Program

    NASA Technical Reports Server (NTRS)

    Kauffman, Billy; Hardage, Donna; Minor, Jody; Barth, Janet; LaBel, Ken

    2003-01-01

    This viewgraph presentation gives a broad overview of NASA's Space Enivronments and Effects (SEE) Program. The purpose of the program is to protect spacecraft and their systems from damage by radiation, spacecraft charging, micrometeoroids, contamination, and other hazards posed by aerospace environments. The presentation profiles SEE activities to address each of these hazards. SEE is responsible for overseeing research and product development with a variety of partners.

  8. Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices

    DTIC Science & Technology

    1988-10-01

    Approved for public release. Contents include: Volume I, Chapter 2 - Requirements and the Hazards Analysis Process; Volume I, Chapter 3 - Accident Scenarios; a list of the hazardous materials that are discussed; a description of the failure scenarios; and the type of post-accident environment that is discussed.

  9. 75 FR 5629 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-03

    ... NATIONAL AERONAUTICS AND SPACE ADMINISTRATION [Notice (10-018)] NASA Advisory Council; Audit... Analysis Committee of the NASA Advisory Council. DATES: Wednesday, February 17, 2010, 9 a.m.-4 p.m. EST. ADDRESSES: NASA Headquarters, 300 E Street, SW., Washington, DC 20456, Conference Room 8D48. FOR FURTHER...

  10. NASA's Planetary Defense Coordination Office at NASA HQ

    NASA Astrophysics Data System (ADS)

    Daou, D.; Johnson, L.; Fast, K. E.; Landis, R.; Friedensen, V. P.; Kelley, M.

    2017-09-01

    NASA and its partners maintain a watch for near-Earth objects (NEOs), asteroids and comets that pass close to the Earth, as part of an ongoing effort to discover, catalog, and characterize these bodies. The PDCO is responsible for: • Ensuring the early detection of potentially hazardous objects (PHOs) - asteroids and comets whose orbit are predicted to bring them within 0.05 Astronomical Units of Earth; and of a size large enough to reach Earth's surface - that is, greater than perhaps 30 to 50 meters; • Tracking and characterizing PHOs and issuing warnings about potential impacts; • Providing timely and accurate communications about PHOs; and • Performing as a lead coordination node in U.S. Government planning for response to an actual impact threat. The PDCO collaborates with other U.S. Government agencies, other national and international agencies, and professional and amateur astronomers around the world. The PDCO also is responsible for facilitating communications between the science community and the public should any potentially hazardous NEO be discovered. In addition, the PDCO works closely with the United Nations Office of Outer Space Affairs, its Committee on the Peaceful Uses of Outer Space, and its Action Team on Near Earth Objects (also known as Action Team 14). The PDCO is a leading member of the International Asteroid Warning Network (IAWN) and the Space Missions Planning Advisory Group (SMPAG), multinational endeavors recommended by the United Nations for an international response to the NEO impact hazard and established and operated by the space-capable nations. The PDCO also communicates with the scientific community through channels such as NASA's Small Bodies Assessment Group (SBAG). In this talk, we will provide an update to the office's various efforts and new opportunities for partnerships in the continuous international effort for Planetary Defense.

  11. NASA's Planetary Defense Coordination Office at NASA HQ

    NASA Astrophysics Data System (ADS)

    Daou, D.; Johnson, L.; Fast, K. E.; Landis, R.; Friedensen, V. P.; Kelley, M.

    2017-12-01

    NASA and its partners maintain a watch for near-Earth objects (NEOs), asteroids and comets that pass close to the Earth, as part of an ongoing effort to discover, catalog, and characterize these bodies. The PDCO is responsible for: Ensuring the early detection of potentially hazardous objects (PHOs) - asteroids and comets whose orbit are predicted to bring them within 0.05 Astronomical Units of Earth; and of a size large enough to reach Earth's surface - that is, greater than perhaps 30 to 50 meters; Tracking and characterizing PHOs and issuing warnings about potential impacts; Providing timely and accurate communications about PHOs; and Performing as a lead coordination node in U.S. Government planning for response to an actual impact threat. The PDCO collaborates with other U.S. Government agencies, other national and international agencies, and professional and amateur astronomers around the world. The PDCO also is responsible for facilitating communications between the science community and the public should any potentially hazardous NEO be discovered. In addition, the PDCO works closely with the United Nations Office of Outer Space Affairs, its Committee on the Peaceful Uses of Outer Space, and its Action Team on Near Earth Objects (also known as Action Team 14). The PDCO is a leading member of the International Asteroid Warning Network (IAWN) and the Space Missions Planning Advisory Group (SMPAG), multinational endeavors recommended by the United Nations for an international response to the NEO impact hazard and established and operated by the space-capable nations. The PDCO also communicates with the scientific community through channels such as NASA's Small Bodies Assessment Group (SBAG). In this talk, we will provide an update to the office's various efforts and new opportunities for partnerships in the continuous international effort for Planetary Defense.

  12. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  13. 78 FR 21631 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

    ...This is an amended version of NASA's earlier Federal Register Notice [13-043] published on April 5, 2013 [page 20696]. The dates and agenda for the meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council have been revised. The revised date and agenda are provided below. In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  14. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors; therefore, it is necessary to carry out hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, however, the calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel calculation algorithm for the CD is worked out for the case when hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of the ED. In this paper, six biomass gasification stations are introduced for hazard assessment using ESPA and general set pair analysis (GSPA), respectively. By comparison of the hazard assessment results obtained from ESPA and GSPA, the availability and validity of ESPA are demonstrated for the hazard assessment of biomass gasification stations. Meanwhile, the reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
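
    For orientation, the schematic sketch below shows the general SPA idea of ranking by connection degree: each sample's CD is expressed as identity/discrepancy/contrary degrees (a, b, c) with a + b + c = 1, then converted to a Euclidean distance from an ideal pair. The numbers, the ideal point, and the hazard interpretation are assumptions; the paper's ESPA differs precisely in how it computes the CD when grades and thresholds are unknown.

```python
# Schematic CD-to-distance ranking in the spirit of set pair analysis.
import math

samples = {  # (identity, discrepancy, contrary) degrees, each summing to 1
    "station1": (0.62, 0.25, 0.13),
    "station2": (0.48, 0.30, 0.22),
    "station3": (0.71, 0.20, 0.09),
}

def distance(cd):
    """Euclidean distance from the assumed ideal pair (1, 0, 0)."""
    a, b, c = cd
    return math.sqrt((a - 1.0) ** 2 + b ** 2 + c ** 2)

for name, cd in sorted(samples.items(), key=lambda kv: distance(kv[1])):
    print(name, round(distance(cd), 3))  # smaller distance = closer to ideal
```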

  15. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  17. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  18. Flood Hazard and Risk Analysis in Urban Area

    NASA Astrophysics Data System (ADS)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, which results in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life, but also about the safety of life and property. The impact of disasters on life and property is therefore the most serious concern of residents. For the mitigation of disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung city was evaluated from statistics of social development factors. The hazard factors of Kaohsiung city were calculated from simulated flood depths for six different return periods and four typhoon events which resulted in serious flooding in Kaohsiung city. The flood risk can be obtained by combining the flood hazard and social vulnerability. The analysis results provide the authorities with a basis to strengthen disaster preparedness and to deploy more resources in high-risk areas.
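
    Combining the two components into a risk class is often done with a simple lookup matrix; the 3 × 3 matrix below is an assumed example of such a hazard × vulnerability combination, not the study's actual classification.

```python
# Assumed hazard x vulnerability risk-classification matrix.
RISK_MATRIX = [  # rows: hazard level, cols: vulnerability level (0=low..2=high)
    ["low",    "low",    "medium"],
    ["low",    "medium", "high"],
    ["medium", "high",   "high"],
]

def risk_class(hazard_level, vulnerability_level):
    return RISK_MATRIX[hazard_level][vulnerability_level]

print(risk_class(2, 1))  # high hazard, medium vulnerability -> "high"
```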

  19. Tracking Hazard Analysis Data in a Jungle of Changing Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as the past archived information, as the design progresses and to be able to show how the project is designing in safety through modifications based on hazard analysis results is imperative. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of removing and dispositioning sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper will identify the most important elements of the tracking system and how it was used to assist the project in ensuring that current design data was reflected in a specific version of the hazard analysis and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project will be described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  20. Managing Space System Faults: Coalescing NASA's Views

    NASA Technical Reports Server (NTRS)

    Muirhead, Brian; Fesq, Lorraine

    2012-01-01

    Managing faults and their resultant failures is a fundamental and critical part of developing and operating aerospace systems. Yet, recent studies have shown that the engineering "discipline" required to manage faults is neither widely recognized nor evenly practiced within the NASA community. Attempts simply to name this discipline in recent years have been fraught with controversy among members of the Integrated Systems Health Management (ISHM), Fault Management (FM), Fault Protection (FP), Hazard Analysis (HA), and Aborts communities. Approaches to managing space system faults typically are unique to each organization, with little commonality in the architectures, processes and practices across the industry.

  1. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. Minimum

  2. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politicians about potential consequences, and are the basis for appropriate risk management strategies. Thereby, results (i) with an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, a portfolio analysis of a regional insurance company was additionally carried out. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  3. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  4. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt and the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
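
    The return periods quoted above map directly to exceedance probabilities under the usual Poisson assumption, T = -t / ln(1 - p); the snippet below reproduces the familiar correspondence between a 475-year return period and a 10%-in-50-years design level.

```python
# Return period <-> exceedance probability under a Poisson assumption.
import math

def return_period(p, t_years):
    """Return period T for exceedance probability p over t_years."""
    return -t_years / math.log(1.0 - p)

def exceedance_prob(T, t_years):
    """Probability of at least one exceedance in t_years for return period T."""
    return 1.0 - math.exp(-t_years / T)

print(round(return_period(0.10, 50)))       # ~475 years (10% in 50 years)
print(round(exceedance_prob(2475, 50), 3))  # ~0.02 (2% in 50 years)
```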

  5. Exploring NASA and ESA Atmospheric Data Using GIOVANNI, the Online Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory

    2007-01-01

    Giovanni, the NASA Goddard online visualization and analysis tool (http://giovanni.gsfc.nasa.gov), allows users to explore various atmospheric phenomena without learning remote sensing data formats or downloading voluminous data. Using NASA MODIS (Terra and Aqua) and ESA MERIS (ENVISAT) aerosol data as an example, we demonstrate Giovanni usage for online multi-sensor remote sensing data comparison and analysis.

  6. NASA Multidisciplinary Design and Analysis Fellowship Program

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This report is a Year 1 interim report on the progress of the NASA Multidisciplinary Design and Analysis Fellowship Program, covering the period January 1, 1995, through September 30, 1995. It summarizes progress in establishing the MDA Fellowship Program at Georgia Tech during the initial year. Progress in the advertisement of the program, recruiting results for the 1995-96 academic year, placement of the Fellows in industry during Summer 1995, program development at the M.S. and Ph.D. levels, and collaboration and dissemination of results are summarized in this report. Further details of the first year's progress will be included in the report from the Year 1 Workshop to be held at NASA Langley on December 7-8, 1995.

  7. Characterizing the Hazard of a Wake Vortex Encounter

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.; Brandon, Jay; Greene, George; Rivers, Robert; Shah, Gautam; Stewart, Eric; Stuever, Robert

    1998-01-01

    The National Aeronautics and Space Administration (NASA) is conducting research with the goal of enabling safe improvements in the capacity of the nation's air transportation system. The wake vortex upset hazard is an important factor in establishing the minimum safe spacing between aircraft during landing and take-off operations, thus impacting airport capacity. Static and free-flight wind tunnel tests and flight tests have provided an extensive data set for improved understanding of vortex encounter dynamics and simulation. Piloted and batch simulation studies are also ongoing to establish a first-order hazard metric and determine the limits of an operationally acceptable wake induced upset. This paper outlines NASA's research in these areas.

  8. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design, and macroscopically it is requisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data, and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
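
    A minimal sketch of the fuzzy comprehensive evaluation step such a method rests on, B = W · R; the three indices, three hazard classes, membership degrees, and weights below are invented placeholders standing in for the paper's 11 GIS-derived indices:

      import numpy as np

      # Membership matrix R: one row per index, one column per hazard class
      # (low / medium / high). Values are illustrative, not SAMSHI's.
      R = np.array([[0.1, 0.3, 0.6],    # fault density
                    [0.2, 0.5, 0.3],    # historical seismicity
                    [0.4, 0.4, 0.2]])   # gravity-anomaly gradient
      W = np.array([0.5, 0.3, 0.2])     # index weights, summing to 1

      B = W @ R                         # composite membership per class
      print(B, ["low", "medium", "high"][int(np.argmax(B))])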

  9. NASA International Environmental Partnerships

    NASA Technical Reports Server (NTRS)

    Lewis, Pattie; Valek, Susan

    2010-01-01

    For nearly five decades, the National Aeronautics and Space Administration (NASA) has been preeminent in space exploration. NASA has landed Americans on the moon, robotic rovers on Mars, and led cooperative scientific endeavors among nations aboard the International Space Station. But as Earth's population increases, the environment is subject to increasing challenges and requires more efficient use of resources. International partnerships give NASA the opportunity to share its scientific and engineering expertise. They also enable NASA to stay aware of continually changing international environmental regulations and global markets for materials that NASA uses to accomplish its mission. Through international partnerships, NASA and this nation have taken the opportunity to look globally for solutions to challenges we face here on Earth. Working with other nations provides NASA with collaborative opportunities with the global science/engineering community to explore ways in which to protect our natural resources, conserve energy, reduce the use of hazardous materials in space and earthly applications, and reduce greenhouse gases that potentially affect all of Earth's inhabitants. NASA is working with an ever-expanding list of international partners including the European Union, the European Space Agency and, especially, the nation of Portugal. Our common goal is to foster a sustainable future in which partners continue to explore the universe while protecting our home planet's resources for future generations. This brochure highlights past, current, and future initiatives in several important areas of international collaboration that can bring environmental, economic, and other benefits to NASA and the wider international space community.

  10. The Autonomous Precision Landing and Hazard Detection and Avoidance Technology (ALHAT)

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Smith, Thomas B.

    2007-01-01

    As NASA plans to send humans back to the Moon and develop a lunar outpost, technologies must be developed to place humans and cargo safely, precisely, and repeatedly on the lunar surface, with the capability to avoid surface hazards. Exploration Space Architecture Study requirements include the need for global lunar surface access with safe, precise landing, without lighting constraints, on terrain that may have landing hazards for human-scale landing vehicles. Landing accuracies of perhaps 1,000 meters for sortie crew missions down to tens of meters for Outpost-class missions are required. The Autonomous precision Landing Hazard Avoidance Technology (ALHAT) project will develop the new and unique descent and landing Guidance, Navigation and Control (GNC) hardware and software technologies necessary for these capabilities. The ALHAT project will qualify a lunar descent and landing GNC system to a Technology Readiness Level (TRL) of 6 capable of supporting lunar crewed, cargo, and robotic missions. The ALHAT development project was chartered by NASA Headquarters in October 2006. The initial effort to write a project plan and define an ALHAT team was followed by a fairly aggressive research and analysis effort to determine what technologies existed that could be developed and applied to the lunar landing problems indicated above. This paper describes the project development, research, analysis and concept evolution that has occurred since the assignment of the project. This includes the areas of systems engineering, GNC, sensors, sensor algorithms, simulations, field testing, laboratory testing, Hardware-In-The-Loop testing, system avionics and system certification concepts.

  11. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is then no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that they cannot be described as independent. From the Amatrice earthquake of 24 August 2016 onwards, the damaging earthquakes that followed in Central Italy over the subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on the seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
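
    A minimal sketch of the core smoothing step, turning an epicentre catalogue into a gridded annual activity rate with a fixed-bandwidth Gaussian kernel; the synthetic catalogue and the single 25 km bandwidth are simplifications (the method described here uses adaptive, magnitude-dependent kernels):

      import numpy as np

      rng = np.random.default_rng(1)
      epicentres = rng.uniform(0, 200, size=(150, 2))  # x, y in km (synthetic)
      years, h = 100.0, 25.0                           # catalogue span, bandwidth

      xg, yg = np.meshgrid(np.linspace(0, 200, 81), np.linspace(0, 200, 81))
      rate = np.zeros_like(xg)
      for ex, ey in epicentres:                        # one kernel per event
          d2 = (xg - ex) ** 2 + (yg - ey) ** 2
          rate += np.exp(-d2 / (2 * h * h)) / (2 * np.pi * h * h)
      rate /= years                                    # events / km^2 / year

      cell = 2.5 ** 2                                  # grid cell area, km^2
      print(rate.sum() * cell * years)                 # ~150, up to edge loss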

  12. Application of systems and control theory-based hazard analysis to radiation oncology.

    PubMed

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, with a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve
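
    A minimal sketch of the enumeration at the heart of STPA: each control action is examined under the four standard guide-words in each system context. The control action and contexts below are invented illustrations, not items from this study:

      from itertools import product

      # STPA guide-words for identifying unsafe control actions (UCAs)
      GUIDE_WORDS = ["not provided", "provided incorrectly",
                     "provided too early or too late",
                     "stopped too soon or applied too long"]

      actions = ["start beam delivery"]                  # hypothetical
      contexts = ["patient not aligned to reference image",
                  "surface imaging signal lost"]         # hypothetical

      for a, g, c in product(actions, GUIDE_WORDS, contexts):
          print(f"candidate UCA: '{a}' {g} when {c}")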

  13. NASA Reactor Facility Hazards Summary. Volume 1

    NASA Technical Reports Server (NTRS)

    1959-01-01

    The Lewis Research Center of the National Aeronautics and Space Administration proposes to build a nuclear research reactor which will be located in the Plum Brook Ordnance Works near Sandusky, Ohio. The purpose of this report is to inform the Advisory Committee on Reactor Safeguards of the U.S. Atomic Energy Commission in regard to the design of the reactor facility, the characteristics of the site, and the hazards of operation at this location. The purpose of this research reactor is to make pumped-loop studies of aircraft reactor fuel elements and other reactor components, radiation effects studies on aircraft reactor materials and equipment, shielding studies, and nuclear and solid state physics experiments. The reactor is a light-water cooled and moderated reactor of the MTR type, with a primary beryllium reflector and a secondary water reflector. The core initially will be a 3 by 9 array of MTR-type fuel elements and is designed for operation up to a power of 60 megawatts. The reactor facility is described in general terms. This is followed by a discussion of the nuclear characteristics and performance of the reactor. Then details of the reactor control system are discussed. A summary of the site characteristics is then presented, followed by a discussion of the larger type of experiments which may eventually be operated in this facility. The considerations for normal operation are concluded with a proposed method of handling fuel elements and radioactive wastes. The potential hazards involved with failures or malfunctions of this facility are considered in some detail. These are examined first from the standpoint of preventing them or minimizing their effects, and second from the standpoint of what effect they might have on the reactor facility staff and the surrounding population. The most essential feature of the design for location at the proposed site is containment of the maximum credible accident.

  14. A Hybrid FPGA/Tilera Compute Element for Autonomous Hazard Detection and Navigation

    NASA Technical Reports Server (NTRS)

    Villalpando, Carlos Y.; Werner, Robert A.; Carson, John M., III; Khanoyan, Garen; Stern, Ryan A.; Trawny, Nikolas

    2013-01-01

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  15. A hybrid FPGA/Tilera compute element for autonomous hazard detection and navigation

    NASA Astrophysics Data System (ADS)

    Villalpando, C. Y.; Werner, R. A.; Carson, J. M.; Khanoyan, G.; Stern, R. A.; Trawny, N.

    To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.

  16. NASA Manufacturing and Test Requirements for Normally Closed Pyrovalves for Hazardous Flight Systems Applications

    NASA Technical Reports Server (NTRS)

    McDougle, Stephen H.

    2015-01-01

    Pyrovalves (Figure 1, Basic Pyrovalve Design and Features) are typically lighter, more reliable, and in most cases less expensive than other types of valves. They also consume less electrical power. They are single-use devices that are used in propulsion systems to isolate propellants or pressurant gases. These fluids may be hazardous because of their toxicity, reactivity, temperature, or high pressure. Note that in the simplified block diagram below not all detail features are shown, so that those of major interest are more prominent. The diagram is provided to point out the various features that are discussed in this Specification. Features of some NC parent metal valve designs may differ. In 2013, the NESC concluded an extensive study of the reliability and safety of NC parent metal valves used in payloads carried aboard ELVs. The assessment successfully evaluated technical data to determine the risk of NC parent metal valve leakage or inadvertent activation in ELV payloads. The study resulted in numerous recommendations to ensure personnel and hardware/facility safety during ground processing of ELV payloads. One of those recommendations was to establish a NASA specification for NC parent metal valves. This Specification is a result of that recommendation, which is documented in NESC-RP-10-00614.

  17. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is understanding the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debt. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  18. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully Integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Epp, Chirold D.; Robertson, Edward A.; Ruthishauser, David K.

    2013-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered to develop an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles, and to mature it to a Technology Readiness Level (TRL) of six. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed-loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second

  19. Helicopter Field Testing of NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System fully integrated with the Morpheus Vertical Test Bed Avionics

    NASA Technical Reports Server (NTRS)

    Rutishauser, David; Epp, Chirold; Robertson, Edward

    2013-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project was chartered to develop an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles, and to mature it to a Technology Readiness Level (TRL) of six. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed-loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second

  20. Radiation Hazard Detector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    NASA technology has made commercially available a new, inexpensive, conveniently carried device for protection of people exposed to potentially dangerous levels of microwave radiation. Microwaves are radio emissions of extremely high frequency. They can be hazardous, but the degree of hazard is not yet well understood. Generally, it is believed that low intensity radiation of short duration is not harmful but that exposure to high levels can induce deep internal burns, affecting the circulatory and nervous systems, and particularly the eyes. The Department of Labor's Occupational Safety and Health Administration (OSHA) has established an allowable safe threshold of exposure. However, people working near high intensity sources of microwave energy (for example, radar antennas and television transmitters) may be unknowingly exposed to radiation levels beyond the safe limit. This poses not only a personal safety problem but also a problem for employers in terms of productivity loss, workman's compensation claims and possible liability litigation. Earlier-developed monitoring devices which warn personnel of dangerous radiation levels have their shortcomings. They can be cumbersome and awkward to use while working. They also require continual visual monitoring to determine if a person is in a dangerous area of radiation, and they are relatively expensive, another deterrent to their widespread adoption. In response to the need for a cheaper and more effective warning system, the Jet Propulsion Laboratory developed, under NASA auspices, a new, battery-powered Microwave Radiation Hazard Detector. To bring the product to the commercial market, the California Institute Research Foundation, the patent holder, granted an exclusive license to Cicoil Corporation, Chatsworth, California, an electronic components manufacturer.

  1. NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis

    NASA Technical Reports Server (NTRS)

    Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.

    2006-01-01

    Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities, and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task, as well as potential risk mitigation plans. Risks are assessed and tracked in terms of the likelihood of the risk occurring and the consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing technology assessment and portfolio analysis, as this has been a business line since 1978.

  2. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District

    DOT National Transportation Integrated Search

    1985-10-01

    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  3. 77 FR 38679 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration (NASA) announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council (NAC).

  4. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  5. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
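
    For reference, the standard hazard-function relations this line of work builds on, with T the failure time (time to the first event exceeding a design threshold); these are textbook reliability-theory identities, written here in LaTeX, not results specific to this paper:

      h(t) = \frac{f_T(t)}{1 - F_T(t)}, \qquad
      S(t) = \Pr(T > t) = \exp\!\Big(-\int_0^t h(u)\,du\Big), \qquad
      \mathbb{E}[T] = \int_0^\infty S(t)\,dt .

    In the stationary case with constant annual exceedance probability p, T is geometric and E[T] = 1/p, the familiar average return period; nonstationarity makes h(t) vary with time, which is precisely what HFA is designed to capture.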

  6. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  7. The NASA Applied Science Program Disasters Area: Disaster Applications Research and Response

    NASA Astrophysics Data System (ADS)

    Murray, J. J.; Lindsay, F. E.; Stough, T.; Jones, C. E.

    2014-12-01

    The goal of the Natural Disaster Application Area is to use NASA's capabilities in spaceborne, airborne, and surface observations, higher-level derived data products, and modeling and data analysis to improve natural disaster forecasting, mitigation, and response. The Natural Disaster Application Area applies its remote sensing observations, modeling and analysis capabilities to provide hazard and disaster information where and when it is needed. Our application research activities specifically contribute to 1) understanding the natural processes that produce hazards, 2) developing hazard mitigation technologies, and 3) recognizing the vulnerability of interdependent critical infrastructure. The Natural Disasters Application area selects, through a rigorous and impartial peer-review process, research projects that address a broad spectrum of disasters afflicting populations within the United States, regionally and globally. Currently there are 19 active projects in the research portfolio which address detection, characterization, forecasting and response for a broad range of natural disasters including earthquakes, tsunamis, volcanic eruptions and ash dispersion, wildfires, hurricanes, floods, tornado damage assessment, oil spills and disaster data mining. The Disasters team works with federal agencies to aid the government in meeting the challenges associated with natural disaster response and to transfer technologies to agencies as they become operational. Internationally, the Disasters Area also supports the Committee on Earth Observation Satellites (CEOS) Working Group on Disasters and the International Charter on Space and Major Disasters to increase, strengthen, and coordinate contributions of NASA Earth-observing satellites and applications products to disaster risk management. The CEOS group will lead pilot efforts focused on identifying key systems to support flooding, earthquake, and volcanic events.

  8. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city has grown rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have also recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, in a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition, the available gauge data around Can Tho
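
    A minimal sketch of one way to carry out such a bivariate Q-V frequency analysis, pairing GEV marginals with a Gaussian copula; the synthetic series, the GEV choice, and the copula family are assumptions for illustration, not the study's actual model:

      import numpy as np
      from scipy import stats

      # Synthetic stand-ins for annual max discharge Q [m3/s] and flood
      # volume V [km3] at Kratie (the study uses observed series).
      rng = np.random.default_rng(0)
      z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], 60)
      Q = stats.genextreme(-0.1, loc=40000, scale=8000).ppf(stats.norm.cdf(z[:, 0]))
      V = stats.genextreme(-0.1, loc=300, scale=60).ppf(stats.norm.cdf(z[:, 1]))

      gev_q = stats.genextreme(*stats.genextreme.fit(Q))   # fit marginals
      gev_v = stats.genextreme(*stats.genextreme.fit(V))

      # Gaussian copula: dependence from normal scores of the ranks
      u = (stats.rankdata(Q) - 0.5) / len(Q)
      v = (stats.rankdata(V) - 0.5) / len(V)
      rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]

      def joint_exceedance(q, vol, n=200_000):
          """P(Q > q and V > vol) under the fitted model, by simulation."""
          zz = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], n)
          uu, vv = stats.norm.cdf(zz[:, 0]), stats.norm.cdf(zz[:, 1])
          return np.mean((gev_q.ppf(uu) > q) & (gev_v.ppf(vv) > vol))

      print(joint_exceedance(gev_q.ppf(0.99), gev_v.ppf(0.99)))  # a rare Q-V pair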

  9. Environmental Public Health Surveillance for Exposure to Respiratory Health Hazards: A Joint NASA/CDC Project to Use Remote Sensing Data for Estimating Airborne Particulate Matter Over the Atlanta, Georgia Metropolitan Area

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Al-Hamdan, Mohammad; Estes, Maurice; Crosson, William

    2007-01-01

    As part of the National Environmental Public Health Tracking Network (EPHTN), the National Center for Environmental Health (NCEH) at the Centers for Disease Control and Prevention (CDC) is leading a project called Health and Environment Linked for Information Exchange (HELIX-Atlanta). The goal of developing the National Environmental Public Health Tracking Network is to improve the health of communities. Currently, few systems exist at the state or national level to concurrently track many of the exposures and health effects that might be associated with environmental hazards. An additional challenge is estimating exposure to environmental hazards such as particulate matter whose aerodynamic diameter is less than or equal to 2.5 micrometers (PM2.5). HELIX-Atlanta's goal is to examine the feasibility of building an integrated electronic health and environmental data network in five counties of metropolitan Atlanta, GA. NASA Marshall Space Flight Center (NASA/MSFC) is collaborating with CDC to combine NASA Earth science satellite observations with air quality and environmental monitoring data to model surface estimates of PM2.5 concentrations that can be linked with clinic visits for asthma. While use of the Air Quality System (AQS) PM2.5 data alone could meet HELIX-Atlanta specifications, there are only five AQS sites in the Atlanta area, so the spatial coverage is not ideal. We are using NASA Moderate Resolution Imaging Spectroradiometer (MODIS) satellite Aerosol Optical Depth (AOD) data to estimate daily ground-level PM2.5 at 10 km resolution over the metropolitan Atlanta area, supplementing the AQS ground observations and filling their spatial and temporal gaps.
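
    A minimal sketch of the kind of AOD-to-PM2.5 relationship such work estimates: an ordinary least-squares fit on collocated satellite and monitor values, then applied where no monitor exists. The numbers are invented placeholders, not project data, and operational models typically add meteorological covariates:

      import numpy as np

      # Hypothetical collocated daily values: MODIS AOD (unitless) and AQS
      # PM2.5 (ug/m^3) at ground monitors.
      aod  = np.array([0.12, 0.25, 0.40, 0.08, 0.33, 0.19, 0.51, 0.27])
      pm25 = np.array([8.1, 14.9, 22.5, 6.0, 19.2, 11.7, 28.3, 15.8])

      b, a = np.polyfit(aod, pm25, 1)          # PM2.5 ~ a + b * AOD
      print(f"PM2.5 = {a:.1f} + {b:.1f} * AOD")

      # Estimate PM2.5 for a 10 km AOD grid cell lacking a monitor
      print(f"{a + b * 0.30:.1f} ug/m^3 at AOD = 0.30")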

  10. OSI in the NASA science internet: An analysis

    NASA Technical Reports Server (NTRS)

    Nitzan, Rebecca

    1990-01-01

    The Open Systems Interconnection (OSI) protocol suite is the result of a worldwide effort to develop international standards for networking. OSI is formalized through the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). The goal of OSI is to provide interoperability between network products without relying on one particular vendor, and to do so on a multinational basis. The National Institute of Standards and Technology (NIST) has developed a Government OSI Profile (GOSIP) that specifies a subset of the OSI protocols as a Federal Information Processing Standard (FIPS 146). GOSIP compatibility has been adopted as the direction for all U.S. government networks. OSI is extremely diverse, and therefore adherence to a profile will facilitate interoperability within OSI networks. All major computer vendors have indicated current or future support of GOSIP-compliant OSI protocols in their products. The NASA Science Internet (NSI) is an operational network, serving user requirements under NASA's Office of Space Science and Applications. NSI consists of the Space Physics Analysis Network (SPAN), which uses the DECnet protocols, and the NASA Science Network (NSN), which uses TCP/IP protocols. The NSI Project Office is currently working on an OSI integration analysis and strategy. A long-term goal is to integrate SPAN and NSN into one unified network service, using a full OSI protocol suite, which will support the OSSA user community.

  11. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  12. Hazard function theory for nonstationary natural hazards

    DOE PAGES

    Read, Laura K.; Vogel, Richard M.

    2016-04-11

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  13. Hazard function theory for nonstationary natural hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Read, Laura K.; Vogel, Richard M.

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.

  14. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L. K.; Vogel, R. M.

    2015-11-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.

  15. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk-Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...

  16. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos,Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; hide

    2011-01-01

    , however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

  17. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. Recently, new technologies and improved data availability have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc. Scientists have also begun to understand the role of uncertainty in seismic hazard analysis. However, there is still the significant problem of how to handle existing uncertainty; the same lack of information makes it difficult to quantify uncertainty accurately. Usually attenuation curves are obtained in a statistical way, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping takes place not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts from classifying sites using geological terms, these site coefficients are not classified at all. In the present study, this problem is solved using fuzzy set theory: using membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California in the conventional way. In this study, the standard deviations that show variation between site classes obtained by fuzzy set theory and by the classical way are compared. The results of this analysis show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields standard deviations smaller than those obtained the classical way, which is direct proof of less uncertainty.
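
    A minimal sketch of how membership functions blur a crisp boundary between neighboring site classes; the Vs30 breakpoints below are invented for illustration, not the study's calibration:

      import numpy as np

      # Two overlapping memberships on Vs30 (m/s). A crisp scheme would
      # place a hard boundary at 360 m/s; here a site near the border
      # belongs partly to both classes.
      def mu_soft(vs30):    # softer-soil class
          return np.clip((480.0 - vs30) / (480.0 - 240.0), 0.0, 1.0)

      def mu_stiff(vs30):   # stiffer-soil class
          return np.clip((vs30 - 240.0) / (480.0 - 240.0), 0.0, 1.0)

      for v in (250.0, 360.0, 470.0):
          print(v, round(float(mu_soft(v)), 2), round(float(mu_stiff(v)), 2))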

  18. NECAP - NASA's Energy Cost Analysis Program. Operations manual

    NASA Technical Reports Server (NTRS)

    Miner, D. L.

    1982-01-01

    The use of the NASA'S ENERGY COST ANALYSIS PROGRAM (NECAP) is described. Supplementary information on new capabilities and program options is also provided. The Control Data Corporation (CDC) NETWORK OPERATING SYSTEM (NOS) is discussed. The basic CDC NOS instructions which are required to successfully operate NECAP are provided.

  19. NASA GISS Surface Temperature (GISTEMP) Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, G.; Ruedy, R.; Persin, A

    The NASA GISS Surface Temperature (GISTEMP) analysis provides a measure of the changing global surface temperature with monthly resolution for the period since 1880, when a reasonably global distribution of meteorological stations was established. The input data that the GISTEMP Team use for the analysis, collected by many national meteorological services around the world, are the adjusted data of the Global Historical Climatology Network (GHCN) Vs. 3 (this represents a change from prior use of unadjusted Vs. 2 data) (Peterson and Vose, 1997 and 1998), United States Historical Climatology Network (USHCN) data, and SCAR (Scientific Committee on Antarctic Research) data from Antarctic stations. Documentation of the basic analysis method is provided by Hansen et al. (1999), with several modifications described by Hansen et al. (2001). The GISS analysis is updated monthly; however, CDIAC's presentation of the data here is updated annually.

  20. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, damage to vehicles, and potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, travel-time differences, and tolls. The types of slope instability that may affect a roadway vary, as do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at small (for example, national) scales, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for assessing the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and also considering the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods exists for assessing the probability of occurrence. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two
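
    A minimal sketch of the common-term combination step the abstract describes, expressing every hazard as an annual probability of occurrence and aggregating per roadway section; the values and the independence assumption are placeholders for illustration, not the paper's figures:

      # Per-hazard annual occurrence probabilities for one roadway section
      # (invented values).
      p = {"rockfall": 0.05, "debris_flow": 0.02, "wall_failure": 0.004}

      # Assuming independence, P(at least one) = 1 - prod(1 - p_i)
      p_none = 1.0
      for v in p.values():
          p_none *= (1.0 - v)
      print(f"annual probability of at least one hazard: {1.0 - p_none:.4f}")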

  1. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
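    The variability analysis described is, in essence, repeated sampling of one branch per logic-tree node followed by a hazard evaluation, with the COV computed across samples. The toy sketch below reproduces that structure with a made-up three-node tree and a stand-in hazard function; it is not a real attenuation relation or the paper's actual tree.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Made-up logic-tree nodes: alternative parameter values with branch weights.
    magnitude = ([7.4, 7.7, 8.0], [0.3, 0.4, 0.3])        # mainshock size
    recurrence_yr = ([350, 500, 750], [0.25, 0.5, 0.25])   # M>6.5 recurrence interval
    gm_scale = ([0.8, 1.0, 1.2], [0.2, 0.6, 0.2])          # attenuation-relation proxy

    def sample(branches):
        values, weights = branches
        return rng.choice(values, p=weights)

    def hazard_pga(m, t_rec, scale):
        # Stand-in hazard measure, monotone in its inputs (arbitrary units).
        return scale * (10 ** (0.3 * m - 2.0)) / t_rec * 1000

    samples = np.array([
        hazard_pga(sample(magnitude), sample(recurrence_yr), sample(gm_scale))
        for _ in range(10_000)
    ])
    cov = samples.std() / samples.mean()  # knowledge-based uncertainty, as a COV
    print(f"best estimate = {samples.mean():.3f}, COV = {cov:.2f}")
    ```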

  2. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-26

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...

  3. Historical Evolution of NASA Standard Materials Testing with Hypergolic Propellants and Ammonia (NASA Standard 6001 Test 15)

    NASA Technical Reports Server (NTRS)

    Greene, Benjamin; McClure, Mark B.

    2012-01-01

    The NASA Johnson Space Center White Sands Test Facility (WSTF) has performed testing of hazardous and reactive aerospace fluids, including hypergolic propellants, with materials since the 1960s, beginning with the Apollo program. Amongst other test activities, Test 15 is a NASA standard test for evaluating the reactivity of materials with selected aerospace fluids, in particular hydrazine, monomethylhydrazine, uns-dimethylhydrazine, Aerozine 50, dinitrogen tetroxide oxidizers, and ammonia. This manuscript provides an overview of the history of Test 15 over a timeline ranging from before its development and first implementation as a NASA standard test in 1974 to its current refinement. Precursor documents to NASA standard tests, as they are currently known, are reviewed. A related supplementary test, international standardization, and enhancements to Test 15 are also discussed. Because WSTF was instrumental in the development and implementation of Test 15, WSTF experience and practices are referred to throughout this manuscript.

  4. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  5. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  6. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  7. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land use planning. The main handicap for these studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with a large amount of data that sometimes, as in the case of the Canary Islands, is not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database, the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long-term volcanic hazard analysis. HADA will permit

  8. NASA Systems Analysis and Concepts Directorate Mission and Trade Study Analysis

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell; Guynn, Mark; Hahn, Andrew; Lepsch, Roger; Mazanek, Dan; Dollyhigh, Sam

    2006-01-01

    Mission analysis, as practiced by the NASA Langley Research Center's Systems Analysis and Concepts Directorate (SACD), consists of activities used to define, assess, and evaluate a wide spectrum of aerospace systems for given requirements. The missions for these systems encompass a broad range from aviation to space exploration. The customer, who is usually another NASA organization or another government agency, often predefines the mission. Once a mission is defined, the goals and objectives that the system will need to meet are delineated and quantified. A number of alternative systems are then typically developed and assessed relative to these goals and objectives. This is done in order to determine the most favorable design approaches for further refinement. Trade studies are performed in order to understand the impact of a requirement on each system and to select among competing design options. Items varied in trade studies typically include: design variables or design constraints; technology and subsystem options; and operational approaches. The results of trade studies are often used to refine the mission and system requirements. SACD studies have been integral to the decision processes of many organizations for decades. Many recent examples of SACD mission and trade study analyses illustrate their excellence and influence. The SACD-led, Agency-wide effort to analyze a broad range of future human lunar exploration scenarios for NASA's Exploration Systems Mission Directorate (ESMD) and the Mars airplane design study in support of the Aerial Regional-scale Environment Survey of Mars (ARES) mission are two such examples. This paper describes SACD's mission and trade study analysis activities in general and presents the lunar exploration and Mars airplane studies as examples of the type of work performed by SACD.

  9. Recent Progress in Understanding Natural-Hazards-Generated TEC Perturbations: Measurements and Modeling Results

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.

    2015-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazards warning systems.
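    The TEC observable mentioned above comes from the standard first-order dual-frequency relation: ionospheric group delay scales as 1/f², so the difference of the two pseudoranges is proportional to the slant electron content. A minimal sketch using the GPS L1/L2 frequencies; the pseudorange values and the 3 m differential delay are synthetic, for illustration only:

    ```python
    # Slant TEC from dual-frequency GPS pseudoranges (standard first-order relation).
    # P1, P2 in meters; result in TEC units (1 TECU = 1e16 electrons/m^2).
    F1 = 1575.42e6  # GPS L1 frequency, Hz
    F2 = 1227.60e6  # GPS L2 frequency, Hz

    def slant_tec(p1_m, p2_m):
        k = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))  # electrons/m^2 per meter
        return k * (p2_m - p1_m) / 1e16                  # convert to TECU

    # Synthetic example: 3 m of extra L2 group delay relative to L1.
    print(f"{slant_tec(20_000_000.0, 20_000_003.0):.1f} TECU")
    ```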

  10. Freezing Rain as an In-Flight Icing Hazard

    NASA Technical Reports Server (NTRS)

    Bernstein, Ben C.; Ratvasky, Thomas P.; Miller, Dean R.; McDonough, Frank

    2000-01-01

    Exposure to supercooled large drops (SLD, subfreezing water droplets with diameters greater than approximately 50 microns) can pose a significant threat to the safety of some aircraft. Although SLD includes both freezing drizzle (FZDZ) and freezing rain (FZRA), much of the SLD research and development of operational SLD forecast tools has focused on FZDZ and ignored FZRA, regarding it as less of a hazard to aviation. This paper provides a counterpoint case study that demonstrates FZRA as a significant in-flight icing hazard. The case study is based on flight and meteorological data from a joint NASA/FAA/NCAR SLD icing research project collected on February 4, 1998. The NASA Twin Otter Icing Research Aircraft experienced a prolonged exposure to "classical" FZRA that formed extensive ice formations, including ridges and nodules on the wing and tail, and resulted in a substantial performance penalty. Although the case study provides only a singular FZRA event with one aircraft type, it is clear that classical FZRA can pose a significant in-flight icing hazard and should not be ignored when considering SLD issues.

  11. 76 FR 65540 - NASA Advisory Council; Audit, Finance, and Analysis Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-21

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration (NASA) announces that the meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council scheduled to be held at NASA Goddard Space Flight Center in Greenbelt, Maryland, on November 1-2, 2011, has been moved to a new location. It will now be held as follows: NASA Headquarters, Room 8D48, 300 E Street, SW., Washington, DC 20546, Tuesday, November 1, 2011, 2:00-5:15 p.m. and Wednesday, November 2, 2011, 9:00-9:55 a.m., Local Time.

  12. GRC Payload Hazard Assessment: Supporting the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Schoren, William R.; Zampino, Edward J.

    2004-01-01

    A hazard assessment was conducted on the GRC managed payloads in support of a NASA Headquarters Code Q request to examine STS-107 payloads and determine if they were credible contributors to the Columbia accident. This assessment utilized each payload's Final Flight Safety Data Package for hazard identification. An applicability assessment was performed and most of the hazards were eliminated because they dealt with payload operations or crew interactions. A Fault Tree was developed for all the hazards deemed applicable and the safety verification documentation was reviewed for these applicable hazards. At the completion of this hazard assessment, it was concluded that none of the GRC managed payloads were credible contributors to the Columbia accident.

  13. GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling

    DTIC Science & Technology

    2015-10-18

    GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling. Jeremy Murray Krezan(1), Samantha Howard(1), Phan D. Dao(1), Derek... Surka(2); (1) AFRL Space Vehicles Directorate, (2) Applied Technology Associates Incorporated. From December 2009 through 2011 the NASA Wide-Field Infrared... of known debris. The NASA-WISE GEO belt debris population adds potentially thousands of previously uncataloged objects. This paper describes

  14. NASA Tests New Robotic Refueling Technologies

    NASA Image and Video Library

    2014-03-05

    RROxiTT lead roboticist Alex Janas stands with the Oxidizer Nozzle Tool as he examines the work site. Credit: NASA/Goddard/Chris Gunn NASA has successfully concluded a remotely controlled test of new technologies that would empower future space robots to transfer hazardous oxidizer – a type of propellant – into the tanks of satellites in space today. Concurrently on the ground, NASA is incorporating results from this test and the Robotic Refueling Mission on the International Space Station to prepare for an upcoming ground-based test of a full-sized robotic servicer system that will perform tasks on a mock satellite client. Collectively, these efforts are part of an ongoing and aggressive technology development campaign to equip robots and humans with the tools and capabilities needed for spacecraft maintenance and repair, the assembly of large space telescopes, and extended human exploration. Read more here: www.nasa.gov/content/goddard/nasa-tests-new-robotic-refue...

  15. NASA technical advances in aircraft occupant safety. [clear air turbulence detectors, fire resistant materials, and crashworthiness

    NASA Technical Reports Server (NTRS)

    Enders, J. H.

    1978-01-01

    NASA's aviation safety technology program examines specific safety problems associated with atmospheric hazards, crash-fire survival, control of aircraft on runways, human factors, terminal area operations hazards, and accident factors simulation. While aircraft occupants are ultimately affected by any of these hazards, their well-being is immediately impacted by three specific events: unexpected turbulence encounters, fire and its effects, and crash impact. NASA research on the application of laser technology to the problem of clear air turbulence detection, the development of fire-resistant materials for aircraft construction, and the improvement of seats and restraint systems to reduce crash injuries is reviewed.

  16. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  17. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events and conduct statistical analysis. The estimation of hurricane landfall probability and hazards is combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is
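    The surge risk methodology sketched above amounts to simulating a large synthetic event set, estimating the per-storm exceedance fraction at a site, and converting it to an annual probability through a storm arrival rate. A toy version under assumed inputs; the lognormal surge distribution and the 0.35/yr arrival rate are invented stand-ins for the SLOSH-based event set:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic peak-surge heights (m) at one coastal site from N simulated storms.
    n_storms = 50_000
    surges = rng.lognormal(mean=0.3, sigma=0.6, size=n_storms)

    annual_rate = 0.35  # hypothetical landfall rate near the site, storms/year

    def annual_exceedance(h):
        p_event = (surges > h).mean()                # P(surge > h | storm)
        return 1.0 - np.exp(-annual_rate * p_event)  # Poisson storm arrivals

    for h in (1.0, 2.0, 3.0, 4.0):
        p = annual_exceedance(h)
        rp = np.inf if p == 0 else 1.0 / p
        print(f"surge > {h:.0f} m: annual prob {p:.4f} (return period ~{rp:,.0f} yr)")
    ```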

  18. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from a comparison study of two computer codes for crack growth analysis: NASCRAC and NASA/FLAGRO. The two codes gave compatibly conservative results when the part-through-crack analysis solutions were compared against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the more conservative results.
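    Codes of this kind grow an initial flaw by cycle-by-cycle integration of a crack-growth law against a stress-intensity solution; the simplest such law is the Paris relation da/dN = C(ΔK)^m with ΔK = Δσ√(πa) for a through crack in an infinite plate. The sketch below shows that integration with illustrative constants; it is not the growth law or geometry library of either NASCRAC or NASA/FLAGRO:

    ```python
    import math

    # Paris-law crack growth: da/dN = C * (dK)**m, with dK = dsigma * sqrt(pi * a).
    # Illustrative aluminum-like constants; NOT values from NASCRAC or NASA/FLAGRO.
    C, m = 1e-11, 3.0           # da/dN in m/cycle for dK in MPa*sqrt(m)
    dsigma = 100.0              # applied stress range, MPa
    a, a_final = 0.001, 0.010   # grow the crack from 1 mm to 10 mm

    cycles = 0
    while a < a_final:
        dK = dsigma * math.sqrt(math.pi * a)  # through crack in an infinite plate
        a += C * dK**m                        # one load cycle of growth
        cycles += 1

    print(f"cycles to grow 1 mm -> 10 mm: {cycles:,}")
    ```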

  19. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety-critical systems, facilities, and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety-critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety-critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  20. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct-use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). Bounding box: xMin, yMin = -83.1407, 36.7461; xMax, yMax = -71.5175, 45.1729.

  1. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.

    2017-11-01

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Thus, determining natural-hazard-susceptible areas and incorporating them in the initial planning process may reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.

  2. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-11-01

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Thus, determining natural-hazard-susceptible areas and incorporating them in the initial planning process may reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
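    The analytic hierarchy process step used in both versions of this study reduces to extracting criterion weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency check. A minimal sketch with a hypothetical 3-by-3 matrix for hydrological, geological, and land use factors; the judgments below are invented, not the experts' values:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons (Saaty 1-9 scale) for three factors:
    # hydrology, geology, land use. A[i, j] = importance of i relative to j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)           # principal eigenpair
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # normalized criterion weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
    ri = 0.58                             # Saaty's random index for n = 3
    print("weights:", np.round(w, 3), " CR =", round(ci / ri, 3))
    ```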

  3. The NASA NASTRAN structural analysis computer program - New content

    NASA Technical Reports Server (NTRS)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  4. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  5. NASA GES DISC Level 2 Aerosol Analysis and Visualization Services

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer; Petrenko, Maksym; Ichoku, Charles; Yang, Wenli; Johnson, James; Zhao, Peisheng; Kempler, Steve

    2015-01-01

    Overview of NASA GES DISC Level 2 aerosol analysis and visualization services: DQViz (Data Quality Visualization), MAPSS (Multi-sensor Aerosol Products Sampling System), and MAPSS_Explorer (Multi-sensor Aerosol Products Sampling System Explorer).

  6. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    EPA Pesticide Factsheets

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  7. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  8. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  9. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  10. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  11. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...

  12. Contribution of Asteroid Generated Tsunami to the Impact Hazard

    NASA Technical Reports Server (NTRS)

    Morrison, David; Venkatapathy, Ethiraj

    2017-01-01

    The long-standing uncertainty about the importance of asteroid-generated tsunami was addressed at a workshop in August 2016, co-sponsored by NASA and NOAA. Experts from NASA, NOAA, the DoE tri-labs (LLNL, SNL, and LANL), DHS, FEMA, and academia addressed the hazard of tsunami created by asteroid impacts, focusing primarily on NEAs with diameters less than 250 m. Participants jointly identified key issues and shared information for nearly a year to coordinate their results for discussion at the workshop. They used modern computational tools to examine 1) Near-field wave generation by the impact; 2) Long-distance wave propagation; 3) Damage from coastal run-up and inundation, and associated hazard. The workshop resulted in broad consensus that the asteroid impact tsunami threat is not as great as previously thought.

  13. Rapid SAR and GPS Measurements and Models for Hazard Science and Situational Awareness

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Moore, A. W.; Rosen, P. A.; Simons, M.; Webb, F.; Linick, J.; Fielding, E. J.; Lundgren, P.; Sacco, G. F.; Polet, J.; Manipon, G.

    2016-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating higher-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR), Differential Global Positioning System (DGPS), SAR-based change detection, and image pixel tracking have recently become critical additions to our toolset for understanding and mapping the damage caused by earthquakes, volcanic eruptions, landslides, and floods. Analyses of these data sets are still largely handcrafted following each event and are not generated rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real time that can improve situational awareness for disaster response. In addition, the ARIA project is developing the capabilities to provide the automated imaging and analysis necessary to keep up with the imminent increase in raw data from geodetic imaging missions planned for launch by NASA as well as international space agencies. We will present the progress we have made on automating the analysis of SAR data for hazard monitoring and response using data from Sentinel-1a/b as well as continuous GPS stations. Since the beginning of our project, our team has imaged events and generated response products for events around the world. These response products have enabled many conversations with those in the disaster response community.

  14. JANNAF 18th Propulsion Systems Hazards Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Cocchiaro, James E. (Editor); Gannaway, Mary T. (Editor)

    1999-01-01

    This volume, the first of two, is a compilation of 18 unclassified/unlimited-distribution technical papers presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 18th Propulsion Systems Hazards Subcommittee (PSHS) meeting held jointly with the 36th Combustion Subcommittee (CS) and 24th Airbreathing Propulsion Subcommittee (APS) meetings. The meeting was held 18-21 October 1999 at NASA Kennedy Space Center and The DoubleTree Oceanfront Hotel, Cocoa Beach, Florida. Topics covered at the PSHS meeting include: shaped charge jet and kinetic energy penetrator impact vulnerability of gun propellants; thermal decomposition and cookoff behavior of energetic materials; violent reaction; detonation phenomena of solid energetic materials subjected to shock and impact stimuli; and hazard classification, insensitive munitions, and propulsion systems safety.

  15. Hazardous Materials Hazard Analysis, Portland, Oregon.

    DTIC Science & Technology

    1981-06-01

    combustible liquids, primarily petroleum products such as gasoline and fuel oil. Although less prevalent, compressed flammable gases (such as liquid... magnitude when hazardous materials are involved. -- Texas City, Texas, 1947--A ship being loaded with ammonium nitrate exploded, killing 468 people... An overturned gasoline or home heating fuel oil tanker or natural gas leak which does not ignite would be a Level 1 emergency. Level 2: A spill or

  16. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  17. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  18. NASA-Produced Maps Help Gauge Italy Earthquake Damage

    NASA Image and Video Library

    2016-10-05

    A NASA-funded program provided valuable information for responders and groups supporting the recovery efforts for the Aug. 24, 2016, magnitude 6.2 earthquake that struck central Italy. The earthquake caused significant loss of life and property damage in the town of Amatrice. To assist in the disaster response efforts, scientists at NASA's Jet Propulsion Laboratory and Caltech, both in Pasadena, California, obtained and used radar imagery of the earthquake's hardest-hit region to discriminate areas of damage from that event. The views indicate the extent of damage caused by the earthquake and subsequent aftershocks in and around Amatrice, based on changes to the ground surface detected by radar. The color variations from yellow to red indicate increasingly more significant ground surface change. The damage maps were created from data obtained before and after the earthquake by satellites belonging to the Italian Space Agency (ASI) and the Japan Aerospace Exploration Agency (JAXA). The radar-derived damage maps compare well with a damage map produced by the European Commission Copernicus Emergency Management Service based upon visual inspection of high-resolution pre-earthquake aerial photographs and post-earthquake satellite optical imagery, and provide broader geographic coverage of the earthquake's impact in the region. The X-band COSMO-SkyMed (CSK) data were provided through a research collaboration with ASI and were acquired on July 3, August 20, and August 28, 2016. The L-band ALOS/PALSAR-2 data were provided by JAXA through its science research program and were acquired on September 9, 2015, January 27, 2016, and August 24, 2016. The radar data were processed by the Advanced Rapid Imaging and Analysis (ARIA) team at JPL and Caltech. ARIA is a NASA-funded project that is building an automated system for demonstrating the ability to rapidly and reliably provide GPS and satellite data to support the local, national and international hazard monitoring and

  19. Image Analysis via Fuzzy-Reasoning Approach: Prototype Applications at NASA

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steven J.

    2004-01-01

    A set of imaging techniques based on a Fuzzy Reasoning (FR) approach was built for NASA at Kennedy Space Center (KSC) to perform complex real-time visual safety prototype tasks, such as detection and tracking of moving Foreign Object Debris (FOD) during Space Shuttle liftoff and visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad. The system has also shown promise in enhancing X-ray images used to screen hard-covered items, leading to better visualization. The system's capability was used as well during the imaging analysis of the Space Shuttle Columbia accident. These FR-based imaging techniques include novel proprietary adaptive image segmentation, image edge extraction, and image enhancement. A Probabilistic Neural Network (PNN) scheme available from the NeuroShell(TM) Classifier and optimized via a Genetic Algorithm (GA) was also used along with this set of novel imaging techniques to add powerful learning and image classification capabilities. Prototype applications built using these techniques have received NASA Space Awards, including a Board Action Award, and are currently being filed for patents by NASA; they are being offered for commercialization through the Research Triangle Institute (RTI), an internationally recognized corporation in scientific research and technology development. Companies from different fields, including security, medical, text digitalization, and aerospace, are currently in the process of licensing these technologies from NASA.

  20. Evaluating the Emergency Notification Systems of the NASA White Sands Test

    NASA Technical Reports Server (NTRS)

    Chavez, Alfred Paul

    2004-01-01

    The problem was that the NASA Fire and Emergency Services did not know if the current emergency notification systems at the NASA White Sands Test Facility were appropriate for alerting the employees of an emergency. The purpose of this Applied Research Project was to determine if the current emergency notification systems of the White Sands Test Facility are appropriate for alerting the employees of an emergency. This was a descriptive research project. The research questions were: 1) What are similar facilities using to alert their employees of an emergency? 2) Are the current emergency notification systems suitable for the community hazards at the NASA White Sands Test Facility? 3) What is the NASA Fire and Emergency Services currently using to measure the effectiveness of the emergency notification systems? and 4) What are the current training methods used to train personnel on the emergency notification systems at the NASA White Sands Test Facility? The procedures were to research other established facilities, review published material from credible sources, survey the facility to determine the facility's perception of the emergency notification systems, and evaluate the operating elements of the established emergency notification systems for the facility. The results were that the current systems are suitable for the types of hazards the facility may endure. The emergency notification systems are tested frequently to ensure effectiveness in the event of an emergency. Personnel are trained and participate in a yearly drill to make certain personnel are educated on the established systems. The recommendations based on the results were to improve the existing systems operationally by developing and implementing a single system that can notify the entire facility of a hazard. Existing procedures and training should also be improved to ensure that all personnel are educated on what to do when the emergency notification systems are activated.

  1. NASA Aeronautics Multidisciplinary Analysis and Design Fellowship Program

    NASA Technical Reports Server (NTRS)

    Grossman, B.; Gurdal, Z.; Kapania, R. K.; Mason, W. H.; Schetz, J. A.

    1999-01-01

    This program began as a grant from NASA Headquarters, NGT-10025, which was in effect from 10/1/93 until 10/31/96. The remaining funding for this effort was transferred from NASA Headquarters to NASA Langley and a new grant, NGT-1-52155, was issued covering the period 11/1/96 to 5/15/99. This report serves as the final report of NGT-1-52155. For a number of years, Virginia Tech had been at the forefront of research in the area of multidisciplinary analysis and design. In June of 1994, faculty members from aerospace and ocean engineering, engineering science and mechanics, mechanical engineering, industrial engineering, and mathematics and computer sciences at Virginia Tech joined together to form the Multidisciplinary Analysis and Design (MAD) Center for Advanced Vehicles. The center was established with a single goal: to perform research that is relevant to the needs of US industry and to foster collaboration between the university, government, and industry. In October of 1994, the center was chosen by NASA Headquarters as one of five university centers to establish a fellowship program to develop a graduate program in multidisciplinary analysis and design. The fellowship program provides full stipend and tuition support for seven U.S. students per year during their graduate studies. The grant is currently being administered by the NMO Branch of NASA Langley. To advise us regarding the problems faced by the industry, an industrial advisory board has been formed consisting of representatives from industry as well as government laboratories. The present membership includes major aerospace companies: Aurora Flight Sciences, Boeing: Philadelphia, Boeing: Long Beach, Boeing: Seattle, Boeing: St. Louis, Cessna, Ford, General Electric, Hughes, Lockheed-Martin: Palo Alto, Northrop-Grumman, Sikorsky; smaller aerospace software companies: Aerosoft, Phoenix Integration, and Proteus Engineering; along with representatives from government agencies, including: NASA Ames

  2. PO*WW*ER mobile treatment unit process hazards analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous constituents into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  3. Online Analysis Enhances Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Leptoukh, Gregory

    2007-01-01

    Giovanni, the Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization and Analysis Infrastructure, has provided researchers with advanced capabilities to perform data exploration and analysis with observational data from NASA Earth observation satellites. In the past 5-10 years, examining geophysical events and processes with remote-sensing data required a multistep process of data discovery, data acquisition, data management, and ultimately data analysis. Giovanni accelerates this process by enabling basic visualization and analysis directly on the World Wide Web. In the last two years, Giovanni has added new data acquisition functions and expanded analysis options to increase its usefulness to the Earth science research community.

  4. Safety and business benefit analysis of NASA's aviation safety program

    DOT National Transportation Integrated Search

    2004-09-20

    NASA Aviation Safety Program elements encompass a wide range of products that require both public and private investment. Therefore, two methods of analysis, one relating to the public and the other to the private industry, must be combined to unders...

  5. Techno-Political Space Cooperation: A Longitudinal Analysis of NASA's Bilateral and Multilateral Agreements

    NASA Technical Reports Server (NTRS)

    Hudiburg, John J.

    2004-01-01

    NASA's international programs are both numerous and successful, with over two thousand international agreements forming a foundation of U.S. government cooperation that involves over half the United Nations' membership. Previous research by the author into these agreements has identified five variables underlying NASA's international cooperation efforts, and these variables form a framework for explaining international cooperation behavior on a macro level. This paper builds upon that research to explain lower-level patterns of cooperation in NASA's experience. Two approaches for analyzing the space agency's history are used: aggregation of all agreements and a cluster (disaggregated) analysis of four key segments. Researchers of NASA's international cooperation have often considered individual cases first and then generalized to macro-level explanations. This study, in contrast, begins by considering all agreements together in order to explain as much as possible at the macro level before proceeding to lower-tier explanations. These lower-tier assessments are important to understanding regional and political influences on bilateral and multilateral cooperation. In order to accomplish this lower-tier analysis, the 2000 agreements are disaggregated into logical groupings, enabling an analysis of important questions and a clearer focus on key patterns concerning developing states, such as the role of international institutions or privatization in international cooperation in space technology.

  6. Supercooled Liquid Water Content Instrument Analysis and Winter 2014 Data with Comparisons to the NASA Icing Remote Sensing System and Pilot Reports

    NASA Technical Reports Server (NTRS)

    King, Michael C.

    2016-01-01

    The National Aeronautics and Space Administration (NASA) has developed a system for remotely detecting the hazardous conditions leading to aircraft icing in flight, the NASA Icing Remote Sensing System (NIRSS). Newly developed, weather balloon-borne instruments have been used to obtain in-situ measurements of supercooled liquid water during March 2014 to validate the algorithms used in the NIRSS. A mathematical model and a processing method were developed to analyze the data obtained from the weather balloon soundings. The data from soundings obtained in March 2014 were analyzed and compared to the output from the NIRSS and pilot reports.

  7. NASA Laboratory Analysis for Manned Exploration Missions

    NASA Technical Reports Server (NTRS)

    Krihak, Michael K.; Shaw, Tianna E.

    2014-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability Element under the NASA Human Research Program. ELA instrumentation is identified as an essential capability for future exploration missions to diagnose and treat evidence-based medical conditions. However, mission architecture limits the medical equipment, consumables, and procedures that will be available to treat medical conditions during human exploration missions. Allocated resources such as mass, power, volume, and crew time must be used efficiently to optimize the delivery of in-flight medical care. Although commercial instruments can provide the blood- and urine-based measurements required for exploration missions, these commercial off-the-shelf devices are prohibitive for deployment in the space environment. The objective of the ELA project is to close the technology gap between current minimally invasive laboratory capabilities and the analytical measurements required under the constraints that the mission architecture imposes on exploration missions. Besides microgravity and radiation tolerance, other principal issues that generally fail to meet NASA requirements include excessive mass, volume, power and consumables, and nominal reagent shelf life. Though manned exploration missions will not occur for nearly a decade, NASA has already taken strides towards the development of ELA medical diagnostics by developing mission requirements and concepts of operations that are coupled with strategic investments and partnerships towards meeting these challenges. This paper focuses on the remote environment, its challenges, biomedical diagnostics requirements, and candidate technologies that may lead to successful blood-urine chemistry and biomolecular measurements in future space exploration missions.

  8. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...

  9. Farfield Plume Measurement and Analysis on the NASA-300M

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Soulas, George C.; Kamhawi, Hani

    2013-01-01

    NASA is developing a 15-kW Hall thruster to support future NASA missions. This activity is funded under the Space Technology Mission Directorate Solar Electric Propulsion Technology Demonstration project. As a part of the development process, the far-field plume characteristics of the NASA-300M, a 20-kW Hall thruster, were studied. The results will be used to study how various aspects of the operation of this thruster affect the overall performance. These data will be used to guide future design work and serve as a baseline for comparison to a magnetically shielded version of the NASA-300M that will be tested in the future. For this study, a far-field Faraday probe was swept in a polar fashion to map the ion current density. An ExB probe (Wien filter), two retarding potential analyzers, and a Langmuir probe were mounted at a fixed location on the thruster axis in the far-field plume. The data reduction method followed recommendations in recent studies by Brown, Reid, and Shastry with modifications that are tailored to the plasma plume environment of high-power Hall thrusters. Results from this and prior testing show that the plume is richer in doubly-charged ions, larger in spatial extent, and capable of greater probe heating than lower power thrusters. These characteristics require special treatment in experimental setup and data analysis, which will be the main focus of this paper. In particular, covered topics will include a new, more accurate method of integration for analysis of the ExB probe data and the effect of secondary electron emission on the Faraday probe data.
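    The Faraday probe sweep described yields a current density profile j(θ) along a constant-radius arc; for an axisymmetric plume, the total beam current follows from I_b = 2πR² ∫ j(θ) sin θ dθ, which is the basic integration underlying the cited data-reduction recommendations. A sketch with synthetic plume data; the Gaussian profile and sweep radius are assumptions, not NASA-300M measurements:

    ```python
    import numpy as np

    # Synthetic far-field Faraday probe sweep: ion current density vs. polar angle.
    R = 1.0                                         # sweep radius, m
    theta = np.radians(np.linspace(0.0, 90.0, 91))  # 0 = thruster centerline
    j = 20.0 * np.exp(-(np.degrees(theta) / 25.0) ** 2)  # A/m^2, made-up plume

    # Beam current for an axisymmetric plume:
    #   I_b = 2*pi*R^2 * integral_0^{pi/2} j(theta) * sin(theta) dtheta
    f = j * np.sin(theta)
    i_beam = 2.0 * np.pi * R**2 * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(theta))
    print(f"integrated beam current ~ {i_beam:.1f} A")
    ```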

  10. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Duda, Kristen; Scroggins, Sharon

    2008-01-01

    NASA's mission is to pioneer the future in space exploration, scientific discovery, and aeronautics research. To help enable existing and future programs to pursue this mission, NASA has established the Principal Center for Regulatory Risk Analysis and Communication (RRAC PC) to proactively identify, analyze, and communicate environmental regulatory risks to the NASA community. The RRAC PC is chartered to evaluate the risks posed to NASA Programs and facilities by environmentally related drivers. The RRAC PC focuses on emerging environmental regulations, as well as risks related to operational changes that can trigger existing environmental requirements. Changing regulations have the potential to directly affect program activities. For example, regulatory changes can restrict certain activities or operations by mandating changes in how operations may be done or limiting where or how certain operations can take place. Regulatory changes also can directly affect the ability to use certain materials by mandating a production phase-out or restricting usage applications of certain materials. Such changes can result in NASA undertaking material replacement efforts. Even if a regulation does not directly affect NASA operations, U.S. and international regulations can pose program risks indirectly through requirements levied on manufacturers and vendors of components and materials. For example, manufacturers can change their formulations to comply with new regulatory requirements. Such changes can require time-consuming and costly requalification certification for use in human spaceflight programs. The RRAC PC has implemented several strategies for proactively managing regulatory change to minimize potential adverse impacts to NASA Programs and facilities. This presentation highlights the lessons learned through establishing the RRAC PC, the process by which the RRAC PC monitors and distributes information about emerging regulatory requirements, and the cross

  11. NASA's Agency-wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

Duda, Kristen; Scroggins, Sharon

    2008-01-01

NASA's mission is to pioneer the future in space exploration, scientific discovery, and aeronautics research. To help enable existing and future programs to pursue this mission, NASA has established the Principal Center for Regulatory Risk Analysis and Communication (RRAC PC) to proactively identify, analyze, and communicate environmental regulatory risks to the NASA community. The RRAC PC is chartered to evaluate the risks posed to NASA Programs and facilities by environmentally related drivers. The RRAC PC focuses on emerging environmental regulations, as well as risks related to operational changes that can trigger existing environmental requirements. Changing regulations have the potential to directly affect program activities. For example, regulatory changes can restrict certain activities or operations by mandating changes in how operations may be done or limiting where or how certain operations can take place. Regulatory changes also can directly affect the ability to use certain materials by mandating a production phase-out or restricting usage applications of certain materials. Such changes can result in NASA undertaking material replacement efforts. Even if a regulation does not directly affect NASA operations, U.S. and international regulations can pose program risks indirectly through requirements levied on manufacturers and vendors of components and materials. For example, manufacturers can change their formulations to comply with new regulatory requirements. Such changes can require time-consuming and costly requalification certification for use in human spaceflight programs. The RRAC PC has implemented several strategies for proactively managing regulatory change to minimize potential adverse impacts to NASA Programs and facilities. This presentation highlights the lessons learned through establishing the RRAC PC, the process by which the RRAC PC monitors and distributes information about emerging regulatory requirements, and the cross

  12. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
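
    As a rough illustration of how the measured generation times feed the hazard analysis, the sketch below projects bacterial growth during a warming hold against the 100/g Skylab limit. The starting load, hold time, and generation time are hypothetical, not values reported in the study.

```python
# Exponential growth from a measured generation (doubling) time:
#   N(t) = N0 * 2**(t / g)
def population(n0_cfu_per_g, hours, generation_time_h):
    return n0_cfu_per_g * 2.0 ** (hours / generation_time_h)

# Hypothetical numbers: 10 CFU/g starting load, 2 h warming hold,
# 20 min generation time at the abuse temperature.
n = population(10.0, hours=2.0, generation_time_h=20.0 / 60.0)
print(f"Projected load after 2 h: {n:.0f} CFU/g (Skylab limit: 100/g)")
```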

  13. Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements

    NASA Astrophysics Data System (ADS)

    Krause, Marcin

    2017-11-01

    This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.

  14. COLD-SAT feasibility study safety analysis

    NASA Technical Reports Server (NTRS)

    Mchenry, Steven T.; Yost, James M.

    1991-01-01

The Cryogenic On-orbit Liquid Depot-Storage, Acquisition, and Transfer (COLD-SAT) satellite presents some unique safety issues. The feasibility study conducted at NASA-Lewis called for a system safety program that would be involved from the initial design in order to eliminate and/or control the inherent hazards. Because of this, a hazards analysis method was needed that: (1) identified issues that needed to be addressed for a feasibility assessment; and (2) identified all potential hazards that would need to be controlled and/or eliminated during the detailed design phases. The developed analysis method is presented, as well as the results generated for the COLD-SAT system.

  15. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas, as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
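
    The abstract's definition of risk as the product of hazard, vulnerability, and exposure can be made concrete with a minimal annualized-loss sketch; all probabilities, damage ratios, and exposed values below are hypothetical.

```python
# Risk (expected loss) = hazard x vulnerability x exposure,
# summed over shaking levels. All numbers are hypothetical.
scenarios = [
    # (annual probability of shaking level, damage ratio, exposed value $M)
    (0.010, 0.05, 500.0),   # moderate shaking
    (0.002, 0.30, 500.0),   # severe shaking
]
annual_expected_loss = sum(p * dmg * value for p, dmg, value in scenarios)
print(f"Annualized expected loss: ${annual_expected_loss:.2f}M")
```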

  16. NASA-Langley Research Center's Aircraft Condition Analysis and Management System Implementation

    NASA Technical Reports Server (NTRS)

    Frye, Mark W.; Bailey, Roger M.; Jessup, Artie D.

    2004-01-01

    This document describes the hardware implementation design and architecture of Aeronautical Radio Incorporated (ARINC)'s Aircraft Condition Analysis and Management System (ACAMS), which was developed at NASA-Langley Research Center (LaRC) for use in its Airborne Research Integrated Experiments System (ARIES) Laboratory. This activity is part of NASA's Aviation Safety Program (AvSP), the Single Aircraft Accident Prevention (SAAP) project to develop safety-enabling technologies for aircraft and airborne systems. The fundamental intent of these technologies is to allow timely intervention or remediation to improve unsafe conditions before they become life threatening.

  17. Electrical Systems Analysis at NASA Glenn Research Center: Status and Prospects

    NASA Technical Reports Server (NTRS)

    Freeh, Joshua E.; Liang, Anita D.; Berton, Jeffrey J.; Wickenheiser, Timothy J.

    2003-01-01

    An analysis of an electrical power and propulsion system for a 2-place general aviation aircraft is presented to provide a status of such modeling at NASA Glenn Research Center. The thermodynamic/ electrical model and mass prediction tools are described and the resulting system power and mass are shown. Three technology levels are used to predict the effect of advancements in component technology. Methods of fuel storage are compared by mass and volume. Prospects for future model development and validation at NASA as well as possible applications are also summarized.

  18. Quantification of hazard prediction ability at hazard prediction training (Kiken-Yochi Training: KYT) by free-response receiver-operating characteristic (FROC) analysis.

    PubMed

    Hashida, Masahiro; Kamezaki, Ryousuke; Goto, Makoto; Shiraishi, Junji

    2017-03-01

    The ability to predict hazards in possible situations in a general X-ray examination room created for Kiken-Yochi training (KYT) is quantified by use of free-response receiver-operating characteristics (FROC) analysis for determining whether the total number of years of clinical experience, involvement in general X-ray examinations, occupation, and training each have an impact on the hazard prediction ability. Twenty-three radiological technologists (RTs) (years of experience: 2-28), four nurses (years of experience: 15-19), and six RT students observed 53 scenes of KYT: 26 scenes with hazardous points (hazardous points are those that might cause injury to patients) and 27 scenes without points. Based on the results of these observations, we calculated the alternative free-response receiver-operating characteristic (AFROC) curve and the figure of merit (FOM) to quantify the hazard prediction ability. The results showed that the total number of years of clinical experience did not have any impact on hazard prediction ability, whereas recent experience with general X-ray examinations greatly influenced this ability. In addition, the hazard prediction ability varied depending on the occupations of the observers while they were observing the same scenes in KYT. The hazard prediction ability of the radiologic technology students was improved after they had undergone patient safety training. This proposed method with FROC observer study enabled the quantification and evaluation of the hazard prediction capability, and the application of this approach to clinical practice may help to ensure the safety of examinations and treatment in the radiology department.

  19. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points which are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, the example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as the mean annual frequency of unacceptable performance at 1×10⁻⁴, 4×10⁻⁵, and 1×10⁻⁵.
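
    The target performance goals quoted above are mean annual frequencies of unacceptable performance. One common way to compute such a frequency, shown below as an illustration rather than as the SHPRM procedure itself, is to convolve a site hazard curve with a component fragility curve; the hazard curve shape and fragility parameters are hypothetical.

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative seismic risk integral: convolve a hazard curve H(a)
# (annual frequency of exceeding ground acceleration a) with a lognormal
# fragility F(a):  P_F = -integral F(a) dH(a).
a = np.linspace(0.01, 3.0, 300)              # PGA grid, g
H = 1e-3 * (a / 0.1) ** -2.0                 # hypothetical hazard curve
F = lognorm(s=0.4, scale=1.0).cdf(a)         # fragility: median 1.0 g
P_F = -np.trapz(F, H)                        # integrate F dH (H decreasing)
print(f"Mean annual frequency of unacceptable performance: {P_F:.1e}")
print("meets 1e-4 goal" if P_F < 1e-4 else "exceeds 1e-4 goal")
```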

  20. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan

A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points which are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided from the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, the example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as the mean annual frequency of unacceptable performance at 1×10⁻⁴, 4×10⁻⁵, and 1×10⁻⁵.

  1. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  2. Electric Propulsion Requirements and Mission Analysis Under NASA's In-Space Propulsion Technology Project

    NASA Technical Reports Server (NTRS)

Dudzinski, Leonard A.; Pencil, Eric J.; Dankanich, John W.

    2007-01-01

The In-Space Propulsion Technology Project (ISPT) is currently NASA's sole investment in electric propulsion technologies. This project is managed at NASA Glenn Research Center (GRC) for the NASA Headquarters Science Mission Directorate (SMD). The objective of the electric propulsion project area is to develop near-term and midterm electric propulsion technologies to enhance or enable future NASA science missions while minimizing risk and cost to the end user. Systems analysis activities sponsored by ISPT seek to identify future mission applications in order to quantify mission requirements, as well as to develop analytical capability in order to facilitate greater understanding and application of electric propulsion and other propulsion technologies in the ISPT portfolio. These analyses guide technology investments by informing decisions and defining metrics for technology development to meet identified mission requirements. This paper discusses the missions currently being studied for electric propulsion by the ISPT project, and presents the results of recent electric propulsion (EP) mission trades. Recent ISPT systems analysis activities include: an initiative to standardize life qualification methods for various electric propulsion systems in order to retire perceived risk to proposed EP missions; mission analysis to identify EP requirements from Discovery, New Frontiers, and Flagship classes of missions; and an evaluation of system requirements for radioisotope-powered electric propulsion. Progress and early results of these activities are discussed where available.

  3. Flutter Analysis of the Thermal Protection Layer on the NASA HIAD

    NASA Technical Reports Server (NTRS)

    Goldman, Benjamin D.; Dowell, Earl H.; Scott, Robert C.

    2013-01-01

A combination of classical plate theory and a supersonic aerodynamic model is used to study the aeroelastic flutter behavior of a proposed thermal protection system (TPS) for the NASA HIAD. The analysis pertains to the rectangular configurations currently being tested in a NASA wind-tunnel facility, and may explain why oscillations of the test articles could be observed. An analysis using a linear flat plate model indicated that flutter was possible well within the supersonic flow regime of the wind tunnel tests. A more complex nonlinear analysis of the TPS, taking into account any material curvature present due to the restraint system or substructure, indicated that significantly greater aerodynamic forcing is required for the onset of flutter. Chaotic and periodic limit cycle oscillations (LCOs) of the TPS are possible depending on how the curvature is imposed. When the pressure from the base substructure on the bottom of the TPS is used as the source of curvature, the flutter boundary increases rapidly and chaotic behavior is eliminated.

  4. Assessment and control of electrostatic charges. [hazards to space missions

    NASA Technical Reports Server (NTRS)

    Barrett, M.

    1974-01-01

The experience of NASA and DOD with electrostatic problems, generation mechanisms, and types of electrostatic hazards is described. Guidelines for judging possible effects of electrostatic charges on space missions are presented, along with mathematical formulas and definitions.

  5. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can help assess the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  6. Analysis and improved design considerations for airborne pulse Doppler radar signal processing in the detection of hazardous windshear

    NASA Technical Reports Server (NTRS)

    Lee, Jonggil

    1990-01-01

    High resolution windspeed profile measurements are needed to provide reliable detection of hazardous low altitude windshear with an airborne pulse Doppler radar. The system phase noise in a Doppler weather radar may degrade the spectrum moment estimation quality and the clutter cancellation capability which are important in windshear detection. Also the bias due to weather return Doppler spectrum skewness may cause large errors in pulse pair spectral parameter estimates. These effects are analyzed for the improvement of an airborne Doppler weather radar signal processing design. A method is presented for the direct measurement of windspeed gradient using low pulse repetition frequency (PRF) radar. This spatial gradient is essential in obtaining the windshear hazard index. As an alternative, the modified Prony method is suggested as a spectrum mode estimator for both the clutter and weather signal. Estimation of Doppler spectrum modes may provide the desired windshear hazard information without the need of any preliminary processing requirement such as clutter filtering. The results obtained by processing a NASA simulation model output support consideration of mode identification as one component of a windshear detection algorithm.
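
    The pulse-pair estimates mentioned above recover the mean Doppler velocity from the lag-1 autocorrelation of the radar I/Q returns. The sketch below is the generic textbook estimator, not the paper's processor; the PRF, wavelength, and synthetic echo are hypothetical.

```python
import numpy as np

# Generic pulse-pair mean-velocity estimator (illustrative). With I/Q
# samples z[n] at pulse repetition frequency PRF and radar wavelength
# lam, the lag-1 autocorrelation R1 gives:
#   v_mean = -lam * PRF / (4*pi) * angle(R1)
# (meteorological convention: positive v = motion away from the radar).
def pulse_pair_velocity(z, prf_hz, wavelength_m):
    r1 = np.mean(z[1:] * np.conj(z[:-1]))          # lag-1 autocorrelation
    return -wavelength_m * prf_hz / (4.0 * np.pi) * np.angle(r1)

# Hypothetical test: synthesize echoes from a target receding at 8 m/s.
prf, lam, v_true = 4000.0, 0.032, 8.0
fd = -2.0 * v_true / lam                           # Doppler shift, Hz
n = np.arange(256)
z = np.exp(1j * 2.0 * np.pi * fd * n / prf)        # noise-free I/Q series
print(f"Estimated mean velocity: {pulse_pair_velocity(z, prf, lam):.2f} m/s")
```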

  7. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes.

    PubMed

    Oyarzabal, Omar A; Rowe, Ellen

    2017-04-01

The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear for some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module is comprised of a questionnaire; group playing of a dice game that we have previously introduced in the teaching of HACCP; the discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. Of the 71 adult participants who completed this module, 40 (56%) provided the most appropriate definition of hazard, 19 (27%) provided the most appropriate definition of risk, 14 (20%) provided the most appropriate definitions of both hazard and risk, and 23 (32%) did not provide an appropriate definition for hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they had not attended similar presentations in the past. The fact that less than one third of the participants defined hazard and risk appropriately at baseline is not surprising. However, these results highlight the need for the incorporation of modules to discuss these important food safety terms and to include more active learning modules to teach food safety classes. This study suggests that active learning helps food personnel better understand important food safety

  8. Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Fletcher, Rose

    2010-01-01

The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA's mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated in the Coastal Risk Atlas decision support tool.

  9. Overview of NASA's Integrated Design and Engineering Analysis (IDEA)Environment

    NASA Technical Reports Server (NTRS)

Robinson, Jeffrey S.; Martin, John G.

    2008-01-01

Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures) each of which performs their design and analysis in relative isolation from others. This is possible in most cases either because the amount of interdisciplinary coupling is minimal or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design & Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary launch vehicle configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, configuration, propulsion, aerodynamics, aerothermodynamics, trajectory, closure and structural analysis into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate. The environment is currently focused around a two-stage-to-orbit configuration

  10. Systems Analysis Approach for the NASA Environmentally Responsible Aviation Project

    NASA Technical Reports Server (NTRS)

    Kimmel, William M.

    2011-01-01

This conference paper describes the current systems analysis approach being implemented for the Environmentally Responsible Aviation Project within the Integrated Systems Research Program under the NASA Aeronautics Research Mission Directorate. The scope and purpose of these systems studies are introduced, followed by a methodology overview. The approach involves both top-down and bottom-up components to provide NASA's stakeholders with a rationale for the prioritization and tracking of a portfolio of technologies which enable the future fleet of aircraft to operate with a simultaneous reduction of aviation noise, emissions and fuel-burn impacts to our environment. Examples of key current results and relevant decision support conclusions are presented along with a forecast of the planned analyses to follow.

  11. Analysis of Auroral Data from Nasa's 1968 and 1969 Airborne Auroral Expedition

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of a methodical compilation, reduction, and correlated analysis of spectrophotometric data obtained by various scientific groups during NASA's 1968 and 1969 Airborne Auroral Expedition are presented.

  12. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area-source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all the grid points using a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
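
    The 10% and 2% in 50 years exceedance levels quoted above map to the 475- and 2475-year return periods through the Poisson assumption; a quick check of that conversion:

```python
import math

# Poisson relation between exceedance probability p over t years and
# return period T:  p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)
def return_period(p, t_years=50.0):
    return -t_years / math.log(1.0 - p)

for p in (0.10, 0.02):
    print(f"{p:.0%} in 50 years -> T = {return_period(p):.0f} years")
# 10% -> ~475 years, 2% -> ~2475 years
```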

  13. LSENS, The NASA Lewis Kinetics and Sensitivity Analysis Code

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.

    2000-01-01

    A general chemical kinetics and sensitivity analysis code for complex, homogeneous, gas-phase reactions is described. The main features of the code, LSENS (the NASA Lewis kinetics and sensitivity analysis code), are its flexibility, efficiency and convenience in treating many different chemical reaction models. The models include: static system; steady, one-dimensional, inviscid flow; incident-shock initiated reaction in a shock tube; and a perfectly stirred reactor. In addition, equilibrium computations can be performed for several assigned states. An implicit numerical integration method (LSODE, the Livermore Solver for Ordinary Differential Equations), which works efficiently for the extremes of very fast and very slow reactions, is used to solve the "stiff" ordinary differential equation systems that arise in chemical kinetics. For static reactions, the code uses the decoupled direct method to calculate sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of dependent variables and/or the rate coefficient parameters. Solution methods for the equilibrium and post-shock conditions and for perfectly stirred reactor problems are either adapted from or based on the procedures built into the NASA code CEA (Chemical Equilibrium and Applications).
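
    LSENS itself is a Fortran code built around LSODE; as a rough modern analogue of integrating a stiff kinetics system with an implicit method, the sketch below applies SciPy's BDF integrator to the classic Robertson problem. This illustrates the class of problem, not the LSENS code.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Robertson's stiff chemical kinetics problem, a standard test case:
# rate constants span many orders of magnitude, so an implicit (BDF)
# method is needed, much as LSENS relies on LSODE.
def robertson(t, y):
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
            0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2**2,
            3e7 * y2**2]

sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-8, atol=1e-10)
print("Final mole fractions:", sol.y[:, -1])  # components sum to 1
```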

  14. Autonomous Landing and Hazard Avoidance Technology (ALHAT)

    NASA Technical Reports Server (NTRS)

    Epp, Chirold

    2007-01-01

This viewgraph presentation reviews the work towards technology that will result in an autonomous landing on the lunar surface that avoids the hazards of lunar landing. In October 2005, the Exploration Systems Mission Directorate at NASA Headquarters assigned the development of new technologies to support the return to the moon. One of these was Autonomous Precision Landing and Hazard Detection and Avoidance Technology, now known as ALHAT. ALHAT is a lunar descent and landing GNC technology development project led by Johnson Space Center (JSC) with team members from Langley Research Center (LaRC), the Jet Propulsion Laboratory (JPL), Draper Laboratory (CSDL), and the Applied Physics Laboratory (APL).

  15. NASA's Current Earth Science Program

    NASA Technical Reports Server (NTRS)

    Charles, Leslie Bermann

    1998-01-01

    NASA's Earth science program is a scientific endeavor whose goal is to provide long-term understanding of the Earth as an integrated system of land, water, air and life. A highly developed scientific knowledge of the Earth system is necessary to understand how the environment affects humanity, and how humanity may be affecting the environment. The remote sensing technologies used to gather the global environmental data used in such research also have numerous practical applications. Current applications of remote sensing data demonstrate their practical benefits in areas such as the monitoring of crop conditions and yields, natural disasters and forest fires; hazardous waste clean up; and tracking of vector-borne diseases. The long-term availability of environmental data is essential for the continuity of important research and applications efforts. NASA's Earth observation program has undergone many changes in the recent past.

  16. Job Hazard Analysis

    DTIC Science & Technology

    1998-01-01

OSHA 3071, U.S. Department of Labor, Occupational Safety & Health Administration, 200 Constitution Avenue, Washington, DC 20210. Job hazard analysis involves identifying existing or potential job hazards (both safety and health) and determining the best way to perform the job or to reduce or eliminate these hazards.

  17. Artificial recharge for subsidence abatement at the NASA-Johnson Space Center, Phase I

    USGS Publications Warehouse

    Garza, Sergio

    1977-01-01

Regional decline of aquifer head due to ground-water withdrawal in the Houston area has caused extensive land-surface subsidence. The NASA-Johnson Space Center (NASA-JSC) in southeastern Harris County, Texas, was about 13 to 19 feet above mean sea level in 1974 and sinking at a rate of more than 0.2 foot per year. NASA-JSC officials, concerned about the hurricane flooding hazard, requested the U.S. Geological Survey to study the feasibility of artificially recharging the aquifers for subsidence abatement. Hydrologic digital models were developed for theoretical determinations of quantities of water needed, under various well-array plans, for artificial recharge of the Chicot and Evangeline aquifers in order to halt the local subsidence at NASA-JSC. The programs for the models were developed for analysis of three-dimensional ground-water flow. Total injection rates of between 2,000 and 14,000 gallons per minute under three general well-array plans were determined for a range of residual clay pore pressures of 10 to 70 feet of hydraulic head. The space distributions of the resultant hydraulic heads, illustrated for injection rates of 3,600 and 8,400 gallons per minute, indicated that, for the same rate, increasing the number and spread of the injection locations reduces the head gradients within NASA-JSC. (Woodard-USGS)

  18. Multi-Disciplinary Analysis for Future Launch Systems Using NASA's Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Monell, D.; Mathias, D.; Reuther, J.; Garn, M.

    2003-01-01

    A new engineering environment constructed for the purposes of analyzing and designing Reusable Launch Vehicles (RLVs) is presented. The new environment has been developed to allow NASA to perform independent analysis and design of emerging RLV architectures and technologies. The new Advanced Engineering Environment (AEE) is both collaborative and distributed. It facilitates integration of the analyses by both vehicle performance disciplines and life-cycle disciplines. Current performance disciplines supported include: weights and sizing, aerodynamics, trajectories, propulsion, structural loads, and CAD-based geometries. Current life-cycle disciplines supported include: DDT&E cost, production costs, operations costs, flight rates, safety and reliability, and system economics. Involving six NASA centers (ARC, LaRC, MSFC, KSC, GRC and JSC), AEE has been tailored to serve as a web-accessed agency-wide source for all of NASA's future launch vehicle systems engineering functions. Thus, it is configured to facilitate (a) data management, (b) automated tool/process integration and execution, and (c) data visualization and presentation. The core components of the integrated framework are a customized PTC Windchill product data management server, a set of RLV analysis and design tools integrated using Phoenix Integration's Model Center, and an XML-based data capture and transfer protocol. The AEE system has seen production use during the Initial Architecture and Technology Review for the NASA 2nd Generation RLV program, and it continues to undergo development and enhancements in support of its current main customer, the NASA Next Generation Launch Technology (NGLT) program.

  19. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    PubMed

    Yan, Fang; Xu, Kaili

    2017-01-01

Because a biomass gasification station includes various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA makes the assessment results more reasonable and scientific.

  20. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    PubMed Central

    Yan, Fang; Xu, Kaili

    2017-01-01

Because a biomass gasification station includes various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA makes the assessment results more reasonable and scientific. PMID:28076440

  1. Fire hazard analysis for Plutonium Finishing Plant complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCKINNIS, D.L.

    1999-02-23

A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, "Fire Protection" [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  2. A hazard control system for robot manipulators

    NASA Technical Reports Server (NTRS)

    Carter, Ruth Chiang; Rad, Adrian

    1991-01-01

A robot for space applications will be required to complete a variety of tasks in an uncertain, harsh environment. This fact presents unusual and highly difficult challenges to ensuring the safety of astronauts and keeping the equipment they depend on from becoming damaged. The systematic approach being taken to control hazards that could result from introducing robotics technology in the space environment is described. First, system safety management and engineering principles, techniques, and requirements are discussed as they relate to Shuttle payload design and operation in general. The concepts of hazard, hazard category, and hazard control, as defined by the Shuttle payload safety requirements, are explained. Next, it is shown how these general safety management and engineering principles are being implemented on an actual project. An example is presented of a hazard control system for controlling one of the hazards identified for the Development Test Flight (DTF-1) of NASA's Flight Telerobotic Servicer, a teleoperated space robot. How these schemes can be applied to terrestrial robots is discussed as well. The same software monitoring and control approach will ensure the safe operation of a slave manipulator under teleoperated or autonomous control in undersea, nuclear, or manufacturing applications where the manipulator is working in the vicinity of humans or critical hardware.

  3. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious (such as dams and chemical plants), it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
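
    The truncated exponential Gutenberg-Richter distribution named at the end of the abstract fixes the annual rate of events above each magnitude for a source; a minimal sketch with an entirely hypothetical source follows.

```python
import numpy as np

# Truncated exponential Gutenberg-Richter MFD: annual rate of events
# with magnitude >= m between m_min and m_max, with beta = b*ln(10):
#   N(m) = N(m_min) * (exp(-beta*(m-m_min)) - exp(-beta*(m_max-m_min)))
#                   / (1 - exp(-beta*(m_max-m_min)))
def gr_rate(m, rate_mmin, b, m_min, m_max):
    beta = b * np.log(10.0)
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return rate_mmin * num / den

# Hypothetical source: 0.5 events/yr with M >= 4.5, b = 1.0, Mmax = 7.5.
for m in (5.0, 6.0, 7.0):
    print(f"Annual rate of M >= {m}: {gr_rate(m, 0.5, 1.0, 4.5, 7.5):.4f}")
```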

  4. Development of the NASA/FLAGRO computer program for analysis of airframe structures

    NASA Technical Reports Server (NTRS)

    Forman, R. G.; Shivakumar, V.; Newman, J. C., Jr.

    1994-01-01

The NASA/FLAGRO (NASGRO) computer program was developed for fracture control analysis of space hardware and is currently the standard computer code in NASA, the U.S. Air Force, and the European Space Agency (ESA) for this purpose. The significant attributes of the NASGRO program are the numerous crack case solutions, the large materials file, the improved growth rate equation based on crack closure theory, and the user-friendly promptive input features. In support of the National Aging Aircraft Research Program (NAARP), NASGRO is being further developed to provide advanced state-of-the-art capability for damage tolerance and crack growth analysis of aircraft structural problems, including mechanical systems and engines. The project currently involves a cooperative development effort by NASA, the FAA, and ESA. The primary tasks underway are the incorporation of advanced methodology for crack growth rate retardation resulting from spectrum loading and improved analysis for determining crack instability. Also, the current weight function solutions in NASGRO for nonlinear stress gradient problems are being extended to more crack cases, and the 2-d boundary integral routine for stress analysis and stress-intensity factor solutions is being extended to 3-d problems. Lastly, effort is underway to enhance the program to operate on personal computers and workstations in a Windows environment. Because of the increasing and already wide usage of NASGRO, the code offers an excellent mechanism for technology transfer for new fatigue and fracture mechanics capabilities developed within NAARP.
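
    As a simplified illustration of the crack growth integrations such codes perform, the sketch below integrates the basic Paris law for the cycles needed to grow a crack between two sizes. The actual NASGRO growth-rate equation adds crack-closure, threshold, and instability terms; the geometry factor and material constants here are hypothetical.

```python
import numpy as np

# Basic Paris-law life integration (illustrative only):
#   da/dN = C * (dK)^m,  with dK = Y * dS * sqrt(pi * a)
def cycles_to_grow(a0_m, af_m, C, m, dS_mpa, Y=1.0, n_pts=100_000):
    a = np.linspace(a0_m, af_m, n_pts)
    dK = Y * dS_mpa * np.sqrt(np.pi * a)     # MPa*sqrt(m)
    dN_da = 1.0 / (C * dK**m)                # cycles per meter of growth
    return np.trapz(dN_da, a)

# Hypothetical aluminum-like constants (da/dN in m/cycle).
N = cycles_to_grow(a0_m=0.001, af_m=0.020, C=1e-11, m=3.0, dS_mpa=100.0)
print(f"Cycles to grow from 1 mm to 20 mm: {N:.3e}")
```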

  5. Hazard analysis of Arid and semi-Arid (ASAL) regions of Kenya.

    PubMed

    Tabu, J S; Otwelo, J A; Koskei, P; Makokha, P

    2013-06-01

water scarcity, poverty and low educational levels. The region suffers from a variety of hazards, in particular drought and famine, epidemics (including HIV/AIDS), and war and conflict. Environmental degradation, though given a low score, may be more of a perception. There is need to undertake a comprehensive hazard and vulnerability analysis at the regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership level, and efforts to empower them should be stepped up.

  6. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making dried anchovy from the receipt of raw materials to the packaging of the final product. The data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Prerequisite Programs (PRP) and a preparation stage consisting of the 5 initial stages and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100-105°C for 3-5 minutes and training the sorting process employees.

  7. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

Flooding, a common environmental hazard worldwide, has in recent times increased as a result of climate change and urbanization, with the effects felt more in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because many existing substations are located in flood-prone areas. To understand the impact of flooding on its substations, TNB has adopted non-structural mitigation by integrating the Flood Hazard Map with its substation data. Hydrological analysis is an essential part of this work, providing the runoff input for the hydraulic modelling.
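
    As a minimal example of the hydrology-to-hydraulics handoff described above, the sketch below estimates peak runoff with the rational method. This is a generic first-cut formula, not necessarily the method used for the Klang River basin, and the catchment values are hypothetical.

```python
# Rational-method peak runoff, SI form:
#   Q [m^3/s] = 0.278 * C * i [mm/h] * A [km^2]
def peak_runoff_m3s(c_runoff, intensity_mm_per_h, area_km2):
    return 0.278 * c_runoff * intensity_mm_per_h * area_km2

# Hypothetical urban subcatchment: C = 0.7, 60 mm/h design storm, 12 km^2.
print(f"Peak discharge: {peak_runoff_m3s(0.7, 60.0, 12.0):.1f} m^3/s")
```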

  8. Farfield Plume Measurement and Analysis on the NASA-300M and NASA-300MS

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Soulas, George C.; Kamhawi, Hani

    2013-01-01

NASA is developing a 10- to 15-kW Hall thruster system to support future NASA missions. This activity is funded under the Space Technology Mission Directorate Solar Electric Propulsion Technology Demonstration Mission project. As a part of the development process, the NASA-300M, a 20-kW Hall thruster, was modified to incorporate the magnetic shielding concept and named the NASA-300MS. This activity was undertaken to assess the viability of using the magnetic shielding concept on a high-power Hall thruster to greatly reduce discharge channel erosion. This paper reports on the study to characterize the far-field plumes of the NASA-300M and NASA-300MS. Diagnostics deployed included a polarly swept Faraday probe, a Wien filter (ExB probe), a retarding potential analyzer, and a Langmuir probe. During the study, a new, more accurate integration method for analyzing Wien filter probe data was implemented, and the effect of secondary electron emission on the Faraday probe data was treated. Comparison of the diagnostic results from the two thrusters showed that the magnetically shielded version performed with 2 percent higher voltage utilization efficiency, 2 percent lower plume divergence efficiency, and 2 percent lower mass utilization efficiency compared to the baseline version. The net change in efficiency is within the aggregate measurement uncertainty, so the overall performance is roughly equal for the two versions of the thruster. Anode efficiency calculated from thrust stand measurements corroborates this finding.

  9. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

The estimation of extreme return levels (down to an annual exceedance probability of 10⁻⁴) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. This technique was introduced in the 1960s and is widely used in various domains, including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the methodological guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority. The aim of this presentation is to introduce the main concepts of RFA and illustrate the latest innovations in its application delivered by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. [1] ASN, Guide pour la Protection des installations nucléaires de base contre les inondations externes. ASN, 2013, p. 44.

  10. The October 1973 NASA mission model analysis and economic assessment

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Results are presented of the 1973 NASA Mission Model Analysis. The purpose was to obtain an economic assessment of using the Shuttle to accommodate the payloads and requirements as identified by the NASA Program Offices and the DoD. The 1973 Payload Model represents a baseline candidate set of future payloads which can be used as a reference base for planning purposes. The cost of implementing these payload programs utilizing the capabilities of the shuttle system is analyzed and compared with the cost of conducting the same payload effort using expendable launch vehicles. There is a net benefit of 14.1 billion dollars as a result of using the shuttle during the 12-year period as compared to using an expendable launch vehicle fleet.

  11. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevougian, S. David

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report is concentrated more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  12. NASA Systems Engineering Handbook

    NASA Technical Reports Server (NTRS)

    Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; Mcduffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou

    1995-01-01

    This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive. Superseded by: NASA/SP-2007-6105 Rev 1 (20080008301).

  13. NASA systems engineering handbook

    NASA Astrophysics Data System (ADS)

    Shishko, Robert; Aster, Robert; Chamberlain, Robert G.; McDuffee, Patrick; Pieniazek, Les; Rowell, Tom; Bain, Beth; Cox, Renee I.; Mooz, Harold; Polaski, Lou

    1995-06-01

    This handbook brings the fundamental concepts and techniques of systems engineering to NASA personnel in a way that recognizes the nature of NASA systems and environment. It is intended to accompany formal NASA training courses on systems engineering and project management when appropriate, and is designed to be a top-level overview. The concepts were drawn from NASA field center handbooks, NMI's/NHB's, the work of the NASA-wide Systems Engineering Working Group and the Systems Engineering Process Improvement Task team, several non-NASA textbooks and guides, and material from independent systems engineering courses taught to NASA personnel. Five core chapters cover systems engineering fundamentals, the NASA Project Cycle, management issues in systems engineering, systems analysis and modeling, and specialty engineering integration. It is not intended as a directive.

  14. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation (NGA) model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the NGA model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard calculation.
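
    A minimal Monte Carlo PSHA sketch in the spirit of the approach described, where many 50-year catalogs are simulated, the peak ground acceleration at the site is recorded for each, and the level exceeded with 10% probability is read off; every parameter and the attenuation form are toy assumptions, not the paper's models.

      # Simulate many 50-year catalogs from a truncated Gutenberg-Richter law,
      # attenuate each event to the site, and keep the peak PGA per catalog.
      import numpy as np

      rng = np.random.default_rng(0)

      def max_pga_in_catalog(years=50, rate=5.0, b=1.0, mmin=4.5, mmax=7.5):
          n = rng.poisson(rate * years)          # events above mmin
          if n == 0:
              return 0.0
          beta = b * np.log(10.0)                # truncated G-R sampling
          u = rng.random(n)
          m = mmin - np.log(1.0 - u * (1.0 - np.exp(-beta * (mmax - mmin)))) / beta
          r = rng.uniform(5.0, 100.0, n)         # source-site distance, km
          ln_pga = -4.0 + 1.0 * m - 1.3 * np.log(r) + rng.normal(0.0, 0.6, n)
          return float(np.exp(ln_pga).max())     # PGA in g (toy attenuation)

      sims = np.array([max_pga_in_catalog() for _ in range(5000)])
      print("PGA exceeded with 10 percent chance in 50 yr:",
            round(float(np.quantile(sims, 0.90)), 3), "g")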

  15. Final results of the NASA storm hazards program

    NASA Technical Reports Server (NTRS)

    Fisher, Bruce D.; Brown, Philip W.; Plumer, J. Anderson; Wunschel, Alfred J., Jr.

    1988-01-01

    Lightning swept-flash attachment patterns and the associated flight conditions were recorded from 1980-1986 during 1496 thunderstorm penetrations and 714 direct strikes with a NASA F-106B research airplane. These data were studied with an emphasis on lightning avoidance by aircraft and on aircraft protection design. The individual lightning attachment spots, along with crew comments and on-board photographic data were used to identify lightning swept-flash attachment patterns and the orientations of the lightning channels with respect to the airplane. The full-scale in-flight data were compared to results from scale-model arc-attachment tests. The airborne and scale-model data showed that any exterior surface of this airplane may be susceptible to direct lightning attachment. In addition, the altitudes, ambient temperatures, and the relative turbulence and precipitation levels at which the strikes occurred in thunderstorms are summarized and discussed. It was found that the peak strike rate occurred at pressure altitudes between 38,000 ft and 40,000 ft, corresponding to ambient temperatures colder than -40 C.

  16. An Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.

    2011-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs design and analysis in relative isolation from others. This is possible, in most cases, either because the amount of interdisciplinary coupling is minimal, or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable, as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective, can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design and Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary designs for launch vehicle and high speed atmospheric flight configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, packaging, propulsion, trajectory, aerodynamics, aerothermodynamics, engine and airframe subsystem design, thermal and structural analysis, and vehicle closure into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate.

  17. Complexity analysis of the cost effectiveness of PI-led NASA science missions

    NASA Astrophysics Data System (ADS)

    Yoshida, J.; Cowdin, M.; Mize, T.; Kellogg, R.; Bearden, D.

    For the last 20 years, NASA has allowed Principal Investigators (PIs) to manage the development of many unmanned space projects. Advocates of PI-led projects believe that a PI-led implementation can result in a project being developed at lower cost and shorter schedule than other implementation modes. This paper seeks to test this hypothesis by comparing the actual costs of NASA and other comparable projects developed under different implementation modes. The Aerospace Corporation's Complexity-Based Risk Assessment (CoBRA) analysis tool is used to normalize the projects such that the cost can be compared for equivalent project complexities. The data is examined both by complexity and by launch year. Cost growth will also be examined for any correlation with implementation mode. Defined in many NASA Announcements of Opportunity (AOs), a PI-led project is characterized by a central, single person with full responsibility for assembling a team and for the project's scientific integrity and the implementation and integrity of all other aspects of the mission, while operating under a cost cap. PIs have larger degrees of freedom to achieve the stated goals within NASA guidelines and oversight. This study leverages the definitions and results of previous National Research Council studies of PI-led projects. Aerospace has defined a complexity index, derived from mission performance, mass, power, and technology choices, to arrive at a broad representation of missions for purposes of comparison. Over a decade of research has established a correlation between mission complexity and spacecraft development cost and schedule. This complexity analysis, CoBRA, is applied to compare a PI-led set of New Frontiers, Discovery, Explorers, and Earth System Science Pathfinder missions to the overall NASA mission dataset. This reveals the complexity trends against development costs, cost growth, and development era.

  18. NASA Life Sciences Data Repositories: Tools for Retrospective Analysis and Future Planning

    NASA Technical Reports Server (NTRS)

    Thomas, D.; Wear, M.; VanBaalen, M.; Lee, L.; Fitts, M.

    2011-01-01

    As NASA transitions from the Space Shuttle era into the next phase of space exploration, the need to ensure the capture, analysis, and application of its research and medical data is of greater urgency than at any other previous time. In this era of limited resources and challenging schedules, the Human Research Program (HRP) based at NASA's Johnson Space Center (JSC) recognizes the need to extract the greatest possible amount of information from the data already captured, as well as focus current and future research funding on addressing the HRP goal to provide human health and performance countermeasures, knowledge, technologies, and tools to enable safe, reliable, and productive human space exploration. To this end, the Science Management Office and the Medical Informatics and Health Care Systems Branch within the HRP and the Space Medicine Division have been working to make both research data and clinical data more accessible to the user community. The Life Sciences Data Archive (LSDA), the research repository housing data and information regarding the physiologic effects of microgravity, and the Lifetime Surveillance of Astronaut Health (LSAH-R), the clinical repository housing astronaut data, have joined forces to achieve this goal. The task of both repositories is to acquire, preserve, and distribute data and information both within the NASA community and to the science community at large. This is accomplished via the LSDA's public website (http://lsda.jsc.nasa.gov), which allows access to experiment descriptions including hardware, datasets, key personnel, and mission descriptions, together with a mechanism for researchers to request additional data, both research and clinical, that is not accessible from the public website. This will result in making the work of NASA and its partners available to the wider sciences community, both domestic and international. The desired outcome is the use of these data for knowledge discovery, retrospective analysis, and planning of future research.

  19. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    NASA Astrophysics Data System (ADS)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  20. Computer Model Locates Environmental Hazards

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.
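
    A minimal sketch of the "retain only the darkest parts" step, assuming a single-band image array and an illustrative percentile threshold; 2E's calibrated thresholds are not given in this record.

      # Keep only the darkest pixels of a single-band satellite image.
      import numpy as np

      def dark_pixel_mask(image, percentile=2.0):
          """Boolean mask marking the darkest 'percentile' of pixels."""
          return image <= np.percentile(image, percentile)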

  1. NASA Aviation Safety Reporting System (ASRS)

    NASA Technical Reports Server (NTRS)

    Connell, Linda

    2011-01-01

    The NASA Aviation Safety Reporting System (ASRS) collects, analyzes, and distributes de-identified safety information provided through confidentially submitted reports from frontline aviation personnel. Since its inception in 1976, the ASRS has collected over 900,000 reports and has never breached the identity of the people sharing their information about events or safety issues. From this volume of data, the ASRS has released over 5,500 aviation safety alerts concerning potential hazards and safety concerns. The ASRS processes these reports, evaluates the information, and provides de-identified report information through the online ASRS Database at http://asrs.arc.nasa.gov. The NASA ASRS is also a founding member of the International Confidential Aviation Safety Systems (ICASS) group which is a collection of other national aviation reporting systems throughout the world. The ASRS model has also been replicated for application to improving safety in railroad, medical, fire fighting, and other domains. This presentation will discuss confidential, voluntary, and non-punitive reporting systems and their advantages in providing information for safety improvements.

  2. NASA Aviation Safety Reporting System (ASRS)

    NASA Technical Reports Server (NTRS)

    Connell, Linda J.

    2017-01-01

    The NASA Aviation Safety Reporting System (ASRS) collects, analyzes, and distributes de-identified safety information provided through confidentially submitted reports from frontline aviation personnel. Since its inception in 1976, the ASRS has collected over 1.4 million reports and has never breached the identity of the people sharing their information about events or safety issues. From this volume of data, the ASRS has released over 6,000 aviation safety alerts concerning potential hazards and safety concerns. The ASRS processes these reports, evaluates the information, and provides selected de-identified report information through the online ASRS Database at http://asrs.arc.nasa.gov. The NASA ASRS is also a founding member of the International Confidential Aviation Safety Systems (ICASS) group which is a collection of other national aviation reporting systems throughout the world. The ASRS model has also been replicated for application to improving safety in railroad, medical, fire fighting, and other domains. This presentation will discuss confidential, voluntary, and non-punitive reporting systems and their advantages in providing information for safety improvements.

  3. Grand Junction projects office mixed-waste treatment program, VAC*TRAX mobile treatment unit process hazards analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloom, R.R.

    1996-04-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  4. First NASA Aviation Safety Program Weather Accident Prevention Project Annual Review

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2000-01-01

    The goal of this Annual Review was to present NASA plans and accomplishments that will impact the national aviation safety goal. NASA's WxAP Project focuses on developing the following products: (1) Aviation Weather Information (AWIN) technologies (displays, sensors, pilot decision tools, communication links, etc.); (2) Electronic Pilot Reporting (E-PIREPS) technologies; (3) Enhanced weather products with associated hazard metrics; (4) Forward looking turbulence sensor technologies (radar, lidar, etc.); (5) Turbulence mitigation control system designs; Attendees included personnel from various NASA Centers, FAA, National Weather Service, DoD, airlines, aircraft and pilot associations, industry, aircraft manufacturers and academia. Attendees participated in discussion sessions aimed at collecting aviation user community feedback on NASA plans and R&D activities. This CD is a compilation of most of the presentations presented at this Review.

  5. Hydrogen and Storage Initiatives at the NASA JSC White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Maes, Miguel; Woods, Stephen S.

    2006-01-01

    NASA WSTF Hydrogen Activities: a) Aerospace Test; b) System Certification & Verification; c) Component, System, & Facility Hazard Assessment; d) Safety Training. Technical Transfer: a) Development of Voluntary Consensus Standards and Practices; b) Support of National Hydrogen Infrastructure Development.

  6. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...

  7. Probabilistic and Scenario Seismic and Liquefaction Hazard Analysis of the Mississippi Embayment Incorporating Nonlinear Site Effects

    NASA Astrophysics Data System (ADS)

    Cramer, C. H.; Dhar, M. S.

    2017-12-01

    The influence of deep sediment deposits of the Mississippi Embayment (ME) on the propagation of seismic waves is poorly understood and remains a major source of uncertainty for site response analysis. Many researchers have studied the effects of these deposits on seismic hazard of the area using available information at the time. In this study, we have used updated and newly available resources for seismic and liquefaction hazard analyses of the ME. We have developed an improved 3D geological model. Additionally, we used surface geological maps from Cupples and Van Arsdale (2013) to prepare liquefaction hazard maps. Both equivalent linear and nonlinear site response codes were used to develop site amplification distributions for use in generating hazard maps. The site amplification distributions are created using the Monte Carlo approach of Cramer et al. (2004, 2006) on a 0.1-degree grid. The 2014 National Seismic Hazard model and attenuation relations (Petersen et al., 2014) are used to prepare seismic hazard maps. Then liquefaction hazard maps are generated using liquefaction probability curves from Holzer (2011) and Cramer et al. (2015). Equivalent linear response (with increased precision and nonlinear behavior restricted with depth) shows similar hazard for the ME compared to nonlinear analysis (without pore pressure) results. At short periods nonlinear deamplification dominates the hazard, but at long periods resonance amplification dominates. The liquefaction hazard tends to be high in Holocene and late Pleistocene lowland sediments, even with lowered ground water levels, and low in Pleistocene loess of the uplands. Considering pore pressure effects in nonlinear site response analysis at a test site on the lowlands shows amplification of ground motion at short periods. PGA estimates from ME liquefaction and MMI observations are in the 0.25 to 0.4 g range. Our estimated M7.5 PGA hazard within 10 km of the fault can exceed this. Ground motion observations from
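
    As an illustration of how liquefaction probability curves are applied to a ground-motion level, the sketch below uses a logistic curve in PGA; the functional form and coefficients are placeholders, not the published curves of Holzer (2011) or Cramer et al. (2015).

      # Logistic liquefaction-probability curve as a function of PGA (in g).
      import numpy as np

      def liquefaction_probability(pga_g, a=3.0, b=2.5):
          """P(liquefaction) rising with PGA; a, b are placeholder fits."""
          return 1.0 / (1.0 + np.exp(-(a * np.log(pga_g) + b)))

      for pga in (0.1, 0.25, 0.4):
          print(pga, round(float(liquefaction_probability(pga)), 2))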

  8. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the surface assigned to use limitations beyond what is necessary. The availability of a high-resolution topographic survey now makes it possible to face this task with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore is not intended in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This would unrealistically treat the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed
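
    A minimal sketch of assembling the reference elevation model described above (DTM+DBM), assuming two co-registered grids where the building model is NaN wherever no building exists; names and the NaN convention are illustrative.

      # Reference surface = terrain plus buildings, vegetation excluded.
      import numpy as np

      def reference_surface(dtm, dbm):
          """dtm, dbm: 2-D arrays on one grid; dbm is NaN where no building."""
          return np.where(np.isnan(dbm), dtm, dbm)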

  9. 78 FR 20696 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  10. 75 FR 41240 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  11. 76 FR 20717 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  12. 76 FR 64112 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-17

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  13. 77 FR 67677 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-13

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  14. 78 FR 72718 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  15. 77 FR 9997 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council.

  16. CFD Analysis in Advance of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Lee, H. C.; Pulliam, T. H.; Neuhart, D. H.; Kegerise, M. A.

    2017-01-01

    NASA through its Transformational Tools and Technologies Project (TTT) under the Advanced Air Vehicle Program, is supporting a substantial effort to investigate the formation and origin of separation bubbles found on wing-body juncture zones. The flow behavior in these regions is highly complex, difficult to measure experimentally, and challenging to model numerically. Multiple wing configurations were designed and evaluated using Computational Fluid Dynamics (CFD), and a series of wind tunnel risk reduction tests were performed to further down-select the candidates for the final experiment. This paper documents the CFD analysis done in conjunction with the 6 percent scale risk reduction experiment performed in NASA Langley's 14- by 22-Foot Subsonic Tunnel. The combined CFD and wind tunnel results ultimately helped the Juncture Flow committee select the wing configurations for the final experiment.

  17. Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes

    NASA Astrophysics Data System (ADS)

    Wdowinski, S.; Peng, Z.; Ferrier, K.; Lin, C. H.; Hsu, Y. J.; Shyu, J. B. H.

    2017-12-01

    Earthquakes, landslides, and tropical cyclones are extreme hazards that pose significant threats to human life and property. Some of the couplings between these hazards are well known. For example, sudden, widespread landsliding can be triggered by large earthquakes and by extreme rainfall events like tropical cyclones. Recent studies have also shown that earthquakes can be triggered by erosional unloading over 100-year timescales. In a NASA supported project, titled "Cascading hazards: Understanding triggering relations between wet tropical cyclones, landslides, and earthquakes", we study triggering relations between these hazard types. The project focuses on such triggering relations in Taiwan, which is subjected to very wet tropical storms, landslides, and earthquakes. One example for such triggering relations is the 2009 Morakot typhoon, which was the wettest recorded typhoon in Taiwan (2850 mm of rain in 100 hours). The typhoon caused widespread flooding and triggered more than 20,000 landslides, including the devastating Hsiaolin landslide. Six months later, the same area was hit by the 2010 M=6.4 Jiashian earthquake near Kaohsiung city, which added to the infrastructure damage induced by the typhoon and the landslides. Preliminary analysis of temporal relations between main-shock earthquakes and the six wettest typhoons in Taiwan's past 50 years reveals similar temporal relations between M≥5 events and wet typhoons. Future work in the project will include remote sensing analysis of landsliding, seismic and geodetic monitoring of landslides, detection of microseismicity and tremor activities, and mechanical modeling of crustal stress changes due to surface unloading.

  18. Reduced hazard chemicals for solid rocket motor production

    NASA Technical Reports Server (NTRS)

    Caddy, Larry A.; Bowman, Ross; Richards, Rex A.

    1995-01-01

    During the last three years, the NASA/Thiokol/industry team has developed and started implementation of an environmentally sound manufacturing plan for the continued production of solid rocket motors. NASA Marshall Space Flight Center (MSFC) and Thiokol Corporation have worked with other industry representatives and the U.S. Environmental Protection Agency (EPA) to prepare a comprehensive plan to eliminate all ozone depleting chemicals from manufacturing processes and reduce the use of other hazardous materials used to produce the space shuttle reusable solid rocket motors. The team used a classical approach for problem-solving combined with a creative synthesis of new approaches to attack this challenge.

  19. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify the execution of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.

  20. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify the execution of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney ESDIS Deputy Project Manager.

  1. WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlicki, T; Samost, A; Leveson, N

    Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High reliability industries such as aviation have approached this problem through using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems Theoretic Accident Model Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT Simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and hazards leading to those accidents. From there, hierarchical control structures were created followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were created. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty three control actions were identified that contributed to over 80 unsafe states of those control actions resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software are highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety so that a process design requirement can be traced back to the hazard and accident that it is intended to mitigate.
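
    As a sketch of the STPA step that generates unsafe control action candidates, the fragment below crosses control actions with the four standard STPA guidewords; the example control actions are hypothetical, not drawn from the paper's set of 23.

      # Cross each control action with the four STPA guidewords to produce
      # candidate unsafe control actions (UCAs) for expert review.
      GUIDEWORDS = (
          "not provided when needed",
          "provided when unsafe",
          "provided too early or too late",
          "stopped too soon or applied too long",
      )

      def candidate_ucas(control_actions):
          return [(action, guideword)
                  for action in control_actions
                  for guideword in GUIDEWORDS]

      for uca in candidate_ucas(["approve treatment plan",
                                 "start beam delivery"]):
          print(uca)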

  2. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  3. Back analysis of Swiss flood danger map to define local flood hazards

    NASA Astrophysics Data System (ADS)

    Choffet, Marc; Derron, Marc-Henri; Jaboyedoff, Michel; Leroi, Eric; Mayis, Arnaud

    2010-05-01

    Flood hazard maps for the whole of Switzerland will be available at the end of 2011. Furthermore, the Swiss territory has been covered by aerial laser scanning (ALS), providing a high-resolution digital elevation model (DEM). This paper describes the development of a method for analyzing the local flood hazard based on Swiss hazard maps and the HR-DEM. In their original state, Swiss hazard maps are constructed by aggregating information in a matrix of intensity and frequency. The degree of danger represented by the yellow, blue and red zones gives no information on the water level at each point of the territory. The developed method is based on a superposition of the danger map with the HR-DEM to determine the water level in a hazard area. In this method, (1) a triangulation is built from the intersection of the hazard map with the HR-DEM, using the limits of the areas where information is constrained. The hazard map perimeter and the boundaries of hazard areas give information on the widest possible overflow in case of flooding, and it is also possible to associate it with a return period. (2) Based on these areas and the difference with the DEM, it is possible to calibrate the highest flood level and to extract water levels for the entire area. This analysis of existing documents opens up interesting perspectives for understanding how infrastructure is threatened by flood hazard, by predicting water levels and potential damage to buildings while proposing remedial measures. Indeed, this method allows estimating the water level at each point of a building in case of flooding. It is designed to provide spatial information on water height levels; this offers a different view of buildings in danger zones. Indeed, it is possible to discern several elements, such as areas of water accumulation involving longer flood duration, possible structural damages to buildings due to high hydrostatic pressure, determination of a local hazard, or the display of water
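
    A minimal sketch of the back-analysis idea, assuming a flat water surface set by the highest DEM elevation along the mapped hazard boundary; this is a simplification of the triangulation actually described, and all names are illustrative.

      # Take the DEM elevation along the mapped flood boundary as a flat
      # water surface and derive depth inside the zone by subtraction.
      import numpy as np

      def water_depth(dem, zone_mask, boundary_mask):
          """dem: 2-D elevations; masks: boolean arrays on the same grid."""
          water_level = dem[boundary_mask].max()
          depth = np.where(zone_mask, water_level - dem, 0.0)
          return np.clip(depth, 0.0, None)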

  4. ANALYSIS OF GEOTHERMAL WASTES FOR HAZARDOUS COMPONENTS

    EPA Science Inventory

    Regulations governing the disposal of hazardous wastes led to an assessment for geothermal solid wastes for potentially hazardous properties. Samples were collected from three active geothermal sites in the western United States: The Geysers, Imperial Valley, and northwestern Nev...

  5. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.

  6. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  7. Failure Modes and Effects Analysis (FMEA): A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index of the major subject terms.

  8. Compiling a Comprehensive EVA Training Dataset for NASA Astronauts

    NASA Technical Reports Server (NTRS)

    Laughlin, M. S.; Murry, J. D.; Lee, L. R.; Wear, M. L.; Van Baalen, M.

    2016-01-01

    Training for a spacewalk or extravehicular activity (EVA) is considered hazardous duty for NASA astronauts. This activity places astronauts at risk for decompression sickness as well as various musculoskeletal disorders from working in the spacesuit. As a result, the operational and research communities over the years have requested access to EVA training data to supplement their studies.

  9. NASA Aeronautics Multidisciplinary Analysis and Design Fellowship Program

    NASA Technical Reports Server (NTRS)

    Grossman, B.; Guerdal, Z.; Haftka, R. T.; Kapania, R. K.; Mason, W. H.; Mook, D. T.

    1998-01-01

    For a number of years, Virginia Tech had been on the forefront of research in the area of multidisciplinary analysis and design. In June of 1994, faculty members from aerospace and ocean engineering, engineering science and mechanics, mechanical engineering, industrial engineering, mathematics and computer sciences, at Virginia Tech joined together to form the Multidisciplinary Analysis and Design (MAD) Center for Advanced Vehicles. The center was established with the single goal: to perform research that is relevant to the needs of the US industry and to foster collaboration between the university, government and industry. In October of 1994, the center was chosen by NASA headquarters as one of the five university centers to establish a fellowship program to develop a graduate program in multidisciplinary analysis and design. The fellowship program provides full stipend and tuition support for seven U. S. students per year during their graduate studies. To advise us regarding the problems faced by the industry, an industrial advisory board has been formed consisting of representatives from industry as well as government laboratories. The function of the advisory board is to channel information from its member companies to faculty members concerning problems that need research attention in the general area of multidisciplinary design optimization (MDO). The faculty and their graduate students make proposals to the board on how to address these problems. At the annual board meeting in Blacksburg, the board discusses the proposals and suggests which students get funded under the NASA fellowship program. All students participating in the program are required to spend 3-6 months in industry working on their research projects. We are completing the third year of the fellowship program and have had three advisory board meetings in Blacksburg.

  10. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA which can be summarized in the following steps: i) to perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) to apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) to perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
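
    As an illustration of filtering step ii), the sketch below clusters scenario source parameters and keeps the member nearest each centroid as the representative to simulate at high resolution; the feature matrix, the value of k, and the k-means choice are assumptions, not the paper's exact procedure.

      # Cluster scenario parameters and keep the member nearest each centroid
      # as the representative to simulate at high resolution.
      import numpy as np
      from scipy.cluster.vq import kmeans2

      def representative_scenarios(features, k=50, seed=0):
          """features: (n_scenarios, n_features) array of source parameters."""
          centroids, labels = kmeans2(features.astype(float), k,
                                      minit="++", seed=seed)
          reps = []
          for c in range(k):
              members = np.flatnonzero(labels == c)
              if members.size:
                  d = np.linalg.norm(features[members] - centroids[c], axis=1)
                  reps.append(int(members[np.argmin(d)]))
          return reps  # indices of one scenario per non-empty cluster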

  11. An Analysis of NASA Technology Transfer. Degree awarded by Pennsylvania State Univ.

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.

    1996-01-01

    A review of previous technology transfer metrics, recommendations, and measurements is presented within the paper. A quantitative and qualitative analysis of NASA's technology transfer efforts is performed. As a relative indicator, NASA's intellectual property performance is benchmarked against a database of over 100 universities. Successful technology transfer (commercial sales, production savings, etc.) cases were tracked backwards through their history to identify the key critical elements that lead to success. Results of this research indicate that although NASA's performance is not measured well by quantitative values (intellectual property stream data), it has a net positive impact on the private sector economy. Policy recommendations are made regarding technology transfer within the context of the documented technology transfer policies since the framing of the Constitution. In the second thrust of this study, researchers at NASA Langley Research Center were surveyed to determine their awareness of, attitude toward, and perception about technology transfer. Results indicate that although researchers believe technology transfer to be a mission of the Agency, they should not be held accountable or responsible for its performance. In addition, the researchers are not well educated about the mechanisms to perform, or policies regarding, technology transfer.

  12. Performing a preliminary hazard analysis applied to administration of injectable drugs to infants.

    PubMed

    Hfaiedh, Nadia; Kabiche, Sofiane; Delescluse, Catherine; Balde, Issa-Bella; Merlin, Sophie; Carret, Sandra; de Pontual, Loïc; Fontan, Jean-Eudes; Schlatter, Joël

    2017-08-01

    Errors in hospitals during the preparation and administration of intravenous drugs to infants and children have been reported at rates of 13% to 84%. This study aimed to investigate the potential for hazardous events that may lead to an accident during the preparation and administration of injectable drugs in a pediatric department, and to describe a risk reduction plan. The preliminary hazard analysis (PHA) method was implemented by a multidisciplinary working group over a period of 5 months (April-August 2014) for infants aged from 28 days to 2 years. The group identified required hazard controls and follow-up actions to reduce the error risk. To analyze the results, the STATCART APR software was used. During the analysis, 34 hazardous situations were identified, of which 17 were rated very critical, and 69 risk scenarios were derived from them. After follow-up actions, the scenarios with unacceptable risk declined from 17.4% to 0%, and those with risk acceptable under control from 46.4% to 43.5%. The PHA can be used as an aid in the prioritization of corrective actions and the implementation of control measures to reduce risk. The PHA complements the a posteriori risk management that already exists. © 2017 John Wiley & Sons, Ltd.
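
    As a sketch of PHA-style criticality ranking, the fragment below maps severity and likelihood to a risk class using the vocabulary quoted above; the 1-5 scales and score bands are illustrative assumptions, not those of the STATCART APR software.

      # Map severity x likelihood to a risk class for scenario triage.
      def risk_class(severity, likelihood):
          """severity, likelihood on 1 (low) .. 5 (high) scales (assumed)."""
          score = severity * likelihood
          if score >= 15:
              return "unacceptable"
          if score >= 6:
              return "acceptable under control"
          return "acceptable"

      print([risk_class(s, l) for s, l in [(5, 4), (3, 2), (2, 1)]])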

  13. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.

  14. Transportation systems safety hazard analysis tool (SafetyHAT) user guide (version 1.0)

    DOT National Transportation Integrated Search

    2014-03-24

    This is a user guide for the transportation system Safety Hazard Analysis Tool (SafetyHAT) Version 1.0. SafetyHAT is a software tool that facilitates System Theoretic Process Analysis (STPA). This user guide provides instructions on how to download, ...

  15. NASA/MSFC FY-82 atmospheric processes research review

    NASA Technical Reports Server (NTRS)

    Turner, R. E. (Compiler)

    1982-01-01

    The NASA/MSFC FY-82 Atmospheric Processes Research Program was reviewed. The review covered research tasks in the areas of upper atmosphere, global weather, and severe storms and local weather. Also included was research on aviation safety environmental hazards. The research project summaries, in narrative outline form, supplied by the individual investigators together with the agenda and other information about the review are presented.

  16. NASA's future Earth observation plans

    NASA Astrophysics Data System (ADS)

    Neeck, Steven P.; Paules, Granville E.; McCuistion Ramesh, J. D.

    2004-11-01

    NASA's Science Mission Directorate, working with its domestic and international partners, provides accurate, objective scientific data and analysis to advance our understanding of Earth system processes. Learning more about these processes will enable improved prediction capability for climate, weather, and natural hazards. Earth interactions occur on a continuum of spatial and temporal scales ranging from short-term weather to long-term climate, and from local and regional to global. Quantitatively describing these changes means precisely measuring from space scores of biological and geophysical parameters globally. New missions that SMD will launch in the coming decade will complement the first series of the Earth Observing System. These next generation systematic measurement missions are being planned to extend or enhance the record of science-quality data necessary for understanding and predicting global change. These missions include the NPOESS Preparatory Project, Ocean Surface Topography Mission, Global Precipitation Measurement, Landsat Data Continuity Mission, and an aerosol polarimetry mission called Glory. New small explorer missions will make first of a kind Earth observations. The Orbiting Carbon Observatory will measure sources and sinks of carbon to help the Nation and the world formulate effective strategies to constrain the amount of this greenhouse gas in the atmosphere. Aquarius will measure ocean surface salinity which is key to ocean circulation in the North Atlantic that produces the current era's mild climate in northern Europe. HYDROS will measure soil moisture globally. Soil moisture is critical to agriculture and to managing fresh water resources. NASA continues to design, develop and launch the Nation's civilian operational environmental satellites, in both polar and geostationary orbits, by agreement with the National Oceanic and Atmospheric Administration (NOAA). NASA plans to develop an advanced atmospheric sounder, GIFTS, for

  17. NASA and COTS Electronics: Past Approach and Successes - Future Considerations

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.

    2018-01-01

    NASA has a long history of using commercial grade electronics in space. In this talk, a brief history of NASA's trends and approaches to commercial grade electronics focusing on processing and memory systems will be presented. This will include providing summary information on the space hazards to electronics as well as NASA mission trade space. We will also discuss developing recommendations for risk management approaches to Electrical, Electronic and Electromechanical (EEE) parts and reliability in space. The final portion of the talk will discuss emerging aerospace trends and the future for Commercial Off The Shelf (COTS) usage.

  18. FAA/NASA Joint University Program for Air Transportation Research 1994-1995

    NASA Technical Reports Server (NTRS)

    Remer, J. H.

    1998-01-01

    The Joint University Program for Air Transportation Research (JUP) is a coordinated set of three grants co-sponsored by the Federal Aviation Administration (FAA) and the National Aeronautics and Space Administration (NASA). Under JUP, three institutions: the Massachusetts Institute of Technology, Princeton, and Ohio Universities receive research grants and collaborate with FAA and NASA in defining and performing civil aeronautics research in a multitude of areas. Some of these disciplines are artificial intelligence, control theory, atmospheric hazards, navigation, avionics, human factors, flight dynamics, air traffic management, and electronic communications.

  19. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology, respectively. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  20. Resource analysis applications in Michigan. [NASA remote sensing

    NASA Technical Reports Server (NTRS)

    Schar, S. W.; Enslin, W. R.; Sattinger, I. J.; Robinson, J. G.; Hosford, K. R.; Fellows, R. S.; Raad, J. H.

    1974-01-01

    During the past two years, available NASA imagery has been applied to a broad spectrum of problems of concern to Michigan-based agencies. These demonstrations include the testing of remote sensing for the purposes of (1) highway corridor planning and impact assessments, (2) game management-area information bases, (3) multi-agency river basin planning, (4) timber resource management information systems, (5) agricultural land reservation policies, and (6) shoreline flooding damage assessment. In addition, cost accounting procedures have been developed for evaluating the relative costs of utilizing remote sensing in land cover and land use analysis data collection procedures.

  1. Analysis of Near-field of Circular Aperture Antennas with Application to Study of High Intensity Radio Frequency (HIRF) Hazards to Aviation from JPL/NASA Deep Space Network Antennas

    NASA Technical Reports Server (NTRS)

    Jamnejad, Vahraz; Statman, Joseph

    2013-01-01

    This work includes a simplified analysis of the radiated near- to mid-field from JPL/NASA Deep Space Network (DSN) reflector antennas and uses an averaging technique over the main beam region and beyond to comply with FAA regulations in specific aviation environments. The work identifies areas that require special attention, including the implications of the very narrow beam of the DSN transmitters. The paper derives the maximum averaged power densities allowed and identifies zones where mitigation measures are required.
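
    For orientation, the familiar far-field, on-axis relation S = P*G/(4*pi*R^2) sets the baseline that the paper's near- to mid-field averaging refines. A minimal sketch with hypothetical transmitter parameters (not actual DSN values):

      import math

      def power_density(p_tx_w: float, gain_dbi: float, r_m: float) -> float:
          """Far-field on-axis power density S = P*G/(4*pi*R^2), in W/m^2."""
          g = 10.0 ** (gain_dbi / 10.0)
          return p_tx_w * g / (4.0 * math.pi * r_m**2)

      # Hypothetical values for illustration only.
      print(f"{power_density(20e3, 60.0, 5000.0):.2f} W/m^2 at 5 km")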

  2. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
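
    The peaks-over-threshold model described above has a standard return-level form: for Poisson rate lambda and GPD shape xi and scale sigma over threshold u, the T-year level is u + (sigma/xi)*((lambda*T)**xi - 1). A minimal maximum-likelihood sketch on synthetic data (the paper's Bayesian estimation with BGPE is not reproduced here):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      heights = rng.gumbel(3.0, 0.8, size=400)   # synthetic wave heights (m)

      u = np.quantile(heights, 0.90)             # empirical threshold
      excesses = heights[heights > u] - u
      lam = len(excesses) / 40.0                 # events/yr over a 40-yr record

      xi, _, sigma = stats.genpareto.fit(excesses, floc=0.0)

      def return_level(T: float) -> float:
          """Wave height exceeded on average once every T years."""
          return u + sigma / xi * ((lam * T) ** xi - 1.0)

      print(f"100-yr wave height: {return_level(100.0):.2f} m")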

  3. Environmental Risk Assessment: Spatial Analysis of Chemical Hazards and Risks in South Korea

    NASA Astrophysics Data System (ADS)

    Yu, H.; Heo, S.; Kim, M.; Lee, W. K.; Jong-Ryeul, S.

    2017-12-01

    This study identified chemical hazard and risk levels in Korea by analyzing the spatial distribution of chemical factories and accidents. The numbers of chemical factories and accidents within 5-km² grid cells were used as the attribute values for spatial analysis. First, semi-variograms were computed to examine spatial distribution patterns and to identify spatial autocorrelation of chemical factories and accidents; they showed that the spatial distributions of both were spatially autocorrelated. Second, the results of the semi-variograms were used in Ordinary Kriging to estimate chemical hazard and risk levels. The level values were extracted from the Ordinary Kriging results, and their spatial similarity was examined by juxtaposing the two values with respect to their location. Six peaks were identified in both the hazard and risk estimation results, and the peaks correlated with major cities in Korea. Third, the estimated hazard and risk levels were classified with geometrical intervals into four quadrants: Low Hazard and Low Risk (LHLR), Low Hazard and High Risk (LHHR), High Hazard and Low Risk (HHLR), and High Hazard and High Risk (HHHR). The four groups revealed different chemical safety management issues in Korea: the LHLR group was relatively safe; many chemical reseller factories were found in the HHLR group; chemical transportation accidents dominated the LHHR group; and an abundance of both factories and accidents characterized the HHHR group. Each quadrant represented different safety management obstacles in Korea, and studying such spatial differences can support the establishment of an efficient risk management plan.
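
    The semi-variogram step above can be reproduced in a few lines: the empirical semi-variogram is gamma(h) = mean of 0.5*(z_i - z_j)^2 over point pairs separated by roughly distance h. A minimal sketch on synthetic counts (the kriging step, which fits a model to these values, is omitted):

      import numpy as np

      rng = np.random.default_rng(1)
      xy = rng.uniform(0, 100, size=(200, 2))     # cell centroids (km)
      z = rng.poisson(3, size=200).astype(float)  # synthetic factory counts

      def empirical_semivariogram(xy, z, bins):
          """gamma(h) per lag bin: mean of 0.5*(z_i - z_j)**2 over pairs."""
          d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
          i, j = np.triu_indices(len(z), k=1)
          lags, sq = d[i, j], 0.5 * (z[i] - z[j]) ** 2
          idx = np.digitize(lags, bins)
          return [sq[idx == k].mean() for k in range(1, len(bins))]

      print(empirical_semivariogram(xy, z, bins=np.arange(0.0, 60.0, 10.0)))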

  4. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems are posed to public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System coupled to a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  5. How to Interactively Operate the Global Hawk UAS NOAA/NASA ENSO Payload, from Your Armchair, Five Thousand Kilometers Away

    NASA Technical Reports Server (NTRS)

    Sullivan, Don

    2016-01-01

    This paper will describe the information technologies developed by NASA and NOAA for the February 2016 Sensing Hazards with Operational Unmanned Technology (SHOUT) El Niño Southern Oscillation (ENSO) campaign. The air vehicle is a NASA Global Hawk UAS with a primary payload of four instruments, two developed by NASA and two developed by NOAA. The aircraft is based at the NASA Armstrong Flight Research Center, Edwards Air Force Base, California. The payload components are remotely operated by scientists at various facilities, and the data collected are downloaded over satellite links in real time for analysis and collaboration. NOAA: Advanced Vertical Atmospheric Profiling System (AVAPS), developed by NCAR, which deploys dozens of dropsondes from altitudes up to 65,000 ft to collect high-vertical-resolution measurements of temperature, pressure, relative humidity, and wind speed and direction. NASA: High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP), a radar designed to examine the factors of storm intensity: formation, structure and intensification. NOAA: O3 Photometer (UAS-O3), designed specifically for autonomous, precise, and accurate O3 measurements in the upper troposphere and lower stratosphere (UT/LS). NASA JPL: High Altitude MMIC Sounding Radiometer (HAMSR), an atmospheric microwave temperature and humidity sounder instrument that looks at the microwave spectrum.

  6. 78 FR 41804 - NASA Advisory Council; Audit, Finance and Analysis Committee; Meeting.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-11

    ...In accordance with the Federal Advisory Committee Act, Public Law 92-463, as amended, the National Aeronautics and Space Administration announces a meeting of the Audit, Finance and Analysis Committee of the NASA Advisory Council (NAC). This Committee reports to the NAC.

  7. The Advanced Rapid Imaging and Analysis (ARIA) Project: Providing Standard and On-Demand SAR products for Hazard Science and Hazard Response

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Hua, H.; Rosen, P. A.; Agram, P. S.; Webb, F.; Simons, M.; Yun, S. H.; Sacco, G. F.; Liu, Z.; Fielding, E. J.; Lundgren, P.; Moore, A. W.

    2017-12-01

    A new era of geodetic imaging arrived with the launch of the ESA Sentinel-1A/B satellites in 2014 and 2016, and with the 2016 confirmation of the NISAR mission, planned for launch in 2021. These missions assure high-quality, freely and openly distributed, regularly sampled SAR data into the indefinite future. These unprecedented data sets are a watershed for the solid earth sciences as we progress towards the goal of ubiquitous InSAR measurements. We now face the challenge of how best to address the massive volumes of data and intensive processing requirements. Should scientists individually process the same data independently themselves? Should a centralized service provider create standard products that all can use? Are there other approaches to accelerate science that are cost effective and efficient? The Advanced Rapid Imaging and Analysis (ARIA) project, a joint venture co-sponsored by the California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), is focused on rapidly generating higher-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. However, there are challenges in defining the optimal InSAR data products for the solid earth science community. In this presentation, we will present our experience with InSAR users, our lessons learned regarding the advantages of on-demand and standard products, and our proposal for the most effective path forward.

  8. Analysis of Potential Alternatives to Reduce NASA's Cost of Human Access to Space

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The purpose of this report is to analyze NASA's potential options for significantly reducing the cost of human access to space. The opinions expressed in this report are based on Hawthorne, Krauss & Associates' ("HKA") interaction with NASA and several of its key contractors over the past nine months. This report is not intended to be an exhaustive quantitative analysis of the various options available to NASA. Instead, its purpose is to outline key decision-related issues that the agency should consider prior to making a decision as to which option to pursue. This report attempts to bring a private-sector perspective to bear on the issue of reducing the cost of human access to space. HKA believes that the involvement of private-sector incentives and disciplines is the key to NASA's success in reducing those costs over the long term; such involvement is achieved only through the assumption of risk by the private sector, not through a traditional contractor relationship, and is essential to achieving significant long-term cost reductions.

  9. Operations planning and analysis handbook for NASA/MSFC phase B development projects

    NASA Technical Reports Server (NTRS)

    Batson, Robert C.

    1986-01-01

    Current operations planning and analysis practices on NASA/MSFC Phase B projects were investigated with the objectives of (1) formalizing these practices into a handbook and (2) suggesting improvements. The study focused on how Science and Engineering (S&E) operations personnel support Program Development (PD) Task Teams. The intimate relationship between systems engineering and operations analysis was examined. Methods identified for use by operations analysts during Phase B include functional analysis, interface analysis, and methods to calculate/allocate such criteria as reliability, maintainability, and operations and support cost.

  10. Acoustic prediction methods for the NASA generalized advanced propeller analysis system (GAPAS)

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Block, P. J. W.

    1984-01-01

    Classical methods of propeller performance analysis are coupled with state-of-the-art Aircraft Noise Prediction Program (ANOPP) techniques to yield a versatile design tool, the NASA Generalized Advanced Propeller Analysis System (GAPAS), for novel quiet and efficient propellers. ANOPP is a collection of modular specialized programs. GAPAS as a whole addresses blade geometry and aerodynamics, rotor performance and loading, and subsonic propeller noise.

  11. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
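
    For reference, the PSHA formulation under debate here is the standard Cornell-McGuire rate integral, in which the annual rate of exceeding a ground-motion level x is aggregated over sources, magnitudes, and distances:

      \lambda(IM > x) = \sum_{i=1}^{N} \nu_i \iint P(IM > x \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, dm\, dr

    where \nu_i is the activity rate of source i and f_{M_i}, f_{R_i} are its magnitude and distance densities; DSHA instead evaluates P(IM > x | m, r) for a single controlling scenario.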

  12. NASA Johnson Space Center Usability Testing and Analysis Facility (UTAF) Overview

    NASA Technical Reports Server (NTRS)

    Whitmore, M.

    2004-01-01

    The Usability Testing and Analysis Facility (UTAF) is part of the Space Human Factors Laboratory at the NASA Johnson Space Center in Houston, Texas. The facility provides support to the Office of Biological and Physical Research, the Space Shuttle Program, the International Space Station Program, and other NASA organizations. In addition, there are ongoing collaborative research efforts with external businesses and universities. The UTAF provides human factors analysis, evaluation, and usability testing of crew interfaces for space applications. This includes computer displays and controls, workstation systems, and work environments. The UTAF has a unique mix of capabilities, with a staff experienced in both cognitive human factors and ergonomics. The current areas of focus are: human factors applications in emergency medical care and informatics; control and display technologies for electronic procedures and instructions; voice recognition in noisy environments; crew restraint design for unique microgravity workstations; and refinement of human factors processes. This presentation will provide an overview of ongoing activities, and will address how the projects will evolve to meet new space initiatives.

  13. Final Hazard Search

    NASA Image and Video Library

    2015-07-08

    This single frame from a four-frame movie shows New Horizons' final deep search for hazardous material around Pluto, obtained on July 1, 2015. These data allow a highly sensitive search for any new moons. The images were taken with the spacecraft's Long Range Reconnaissance Imager (LORRI) over a 100-minute period, and were the final observations in the series of dedicated searches for hazards in the Pluto system which began on May 11. The images show all five known satellites of Pluto moving in their orbits around the dwarf planet, but analysis of these data has so far not revealed the existence of any additional moons. This means that any undiscovered Plutonian moons further than a few thousand miles from Pluto must be smaller than about 1 mile (1.6 kilometers) in diameter, if their surfaces have similar brightness to Pluto's big moon Charon. For comparison, Pluto's faintest known moon, Styx, which is conspicuous in the lower left quadrant of these images, is about 4 miles (7 kilometers) across, assuming the same surface brightness. The absence of additional moons, and also the absence of detectable rings in the hazard search data, imply that the spacecraft is very unlikely to be damaged by collisions with rings, or dust particles ejected from moons, during its high-speed passage through the Pluto system. The four movie frames were taken at 16:28, 16:38, 17:52, and 18:04 UTC on July 1, from a range of 9.4 million miles (15.2 million kilometers). Each frame is a mosaic of four sets of overlapping images, with a total exposure time of 120 seconds. The images have been heavily processed to remove the glare of Pluto and Charon, and the dense background of stars, though blemishes remain at the locations of many of the brighter stars. The "tails" extending to the right or downward from Pluto and Charon are camera artifacts caused by the extreme overexposure of both objects. Pluto and its five moons Charon, Styx, Nix, Kerberos and Hydra are identified by their initials

  14. NASA Hypersonics Overview

    NASA Technical Reports Server (NTRS)

    Dryer, Jay

    2017-01-01

    This briefing is an overview of NASA's hypersonic portfolio and core capabilities. The scope of work is fundamental research spanning technology readiness and system complexity levels; critical technologies enabling re-usable hypersonic systems; system-level research, design, analysis, validation; and, engage, invigorate and train the next generation of engineers. This briefing was requested by the Aeronautics Subcommittee of the NASA Advisory Council.

  15. Understanding the Role of Biology in the Global Environment: NASA's Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Townsend, William F.

    1996-01-01

    NASA has long used the unique perspective of space as a means of expanding our understanding of how the Earth's environment functions. In particular, the linkages between land, air, water, and life (the elements of the Earth system) are a focus for NASA's Mission to Planet Earth. This approach, called Earth system science, blends together fields like meteorology, biology, oceanography, and atmospheric science. Mission to Planet Earth uses observations from satellites, aircraft, balloons, and ground researchers as the basis for analysis of the elements of the Earth system, the interactions between those elements, and possible changes over the coming years and decades. This information is helping scientists improve our understanding of how natural processes affect us and how we might be affecting them. Such studies will yield improved weather forecasts, tools for managing agriculture and forests, information for fishermen and local planners, and, eventually, an enhanced ability to predict how the climate will change in the future. NASA has designed Mission to Planet Earth to focus on five primary themes: Land Cover and Land Use Change; Seasonal to Interannual Climate Prediction; Natural Hazards; Long-Term Climate Variability; and Atmospheric Ozone.

  16. ALHAT: Autonomous Landing and Hazard Avoidance Technology

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.; Carson, John M., III

    2015-01-01

    The ALHAT project was chartered by NASA HQ in 2006 to develop and mature to TRL 6 an autonomous lunar landing GN&C and sensing system for crewed, cargo, and robotic planetary landing vehicles. The multi-center ALHAT team was tasked with providing a system capable of identifying and avoiding surface hazards in real time to enable safe precision landing to within tens of meters of a designated planetary landing site under any lighting conditions.

  17. Seismic Hazard Analysis for Armenia and its Surrounding Areas

    NASA Astrophysics Data System (ADS)

    Klein, E.; Shen-Tu, B.; Mahdyiar, M.; Karakhanyan, A.; Pagani, M.; Weatherill, G.; Gee, R. C.

    2017-12-01

    The Republic of Armenia is located within the central part of a large, 800 km wide, intracontinental collision zone between the Arabian and Eurasian plates. Active deformation occurs along numerous structures in the form of faulting, folding, and volcanism distributed throughout the entire zone from the Bitlis-Zagros suture belt to the Greater Caucasus Mountains, and between the relatively rigid Black Sea and Caspian Sea blocks, without any single structure that can be claimed as predominant. In recent years, significant work has been done on mapping active faults and compiling and reviewing historic and paleoseismological studies in the region, especially in Armenia; these recent research contributions have greatly improved our understanding of the seismogenic sources and their characteristics. In this study we performed a seismic hazard analysis for Armenia and its surrounding areas using the latest detailed geological and paleoseismological information on active faults, strain rates estimated from kinematic modeling of GPS data, and all available historic earthquake data. The seismic source model uses a combination of characteristic earthquake and gridded seismicity models to take advantage of the detailed knowledge of the known faults while acknowledging the distributed deformation and regional tectonic environment of the collision zone. In addition, the fault model considers earthquake ruptures that include single and multi-segment rupture scenarios, with earthquakes that can rupture any part of a multiple-segment fault zone. The ground motion model uses a set of ground motion prediction equations (GMPEs) selected from a pool of GMPEs based on the assessment of each GMPE against the available strong motion data in the region. The hazard is computed in GEM's OpenQuake engine. We will present final hazard results and discuss the uncertainties associated with various input data and their impact on the hazard at various locations.

  18. NASA Strategy to Safely Live and Work in the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis; Wu, Honglu; Corbin, Barbara; Sulzman, Frank; Kreneck, Sam

    2007-01-01

    This viewgraph document reviews the space radiation environment, a significant potential hazard to NASA's goals for space exploration and for living and working in space. NASA has initiated a peer-reviewed research program charged with arriving at an understanding of the space radiation problem. To this end, the NASA Space Radiation Laboratory (NSRL) was constructed to simulate the harsh cosmic and solar radiation found in space. Another piece of the work was to develop a risk modeling tool that integrates the results from research efforts into models of human risk, to reduce uncertainties in predicting the risk of carcinogenesis, central nervous system damage, degenerative tissue disease, and acute radiation effects.

  19. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  1. Detection of Natural Hazards Generated TEC Perturbations and Related New Applications

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Yang, Y.; Langley, R. B.

    2013-12-01

    Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. This continuing research is expected to provide early warning for tsunamis, earthquakes, volcanic eruptions, and meteor impacts, for example, using GPS and other global navigation satellite systems. We will demonstrate new and upcoming applications including recent natural hazards and artificial explosions that generated TEC perturbations to perform state-of-the-art imaging and modeling of earthquakes, tsunamis and meteor impacts. By studying the propagation properties of ionospheric perturbations generated by natural hazards along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage.
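
    The TEC observable mentioned above comes from the dispersive ionospheric delay between the two GPS frequencies. A minimal sketch of the textbook relation (real processing must also handle inter-frequency biases, cycle slips, and slant-to-vertical mapping):

      # Slant TEC from dual-frequency pseudoranges P1, P2 (metres).
      F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies (Hz)

      def slant_tec(p1_m: float, p2_m: float) -> float:
          """TEC in TECU (1 TECU = 1e16 electrons/m^2)."""
          k = (F1**2 * F2**2) / (F1**2 - F2**2) / 40.3
          return k * (p2_m - p1_m) / 1e16

      print(f"{slant_tec(20_200_000.0, 20_200_004.5):.1f} TECU")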

  2. Economic Analysis on the Space Transportation Architecture Study (STAS) NASA Team

    NASA Technical Reports Server (NTRS)

    Shaw, Eric J.

    1999-01-01

    The National Aeronautics and Space Administration (NASA) performed the Space Transportation Architecture Study (STAS) to provide information to support end-of-the-decade decisions on possible near-term US Government (USG) investments in space transportation. To gain a clearer understanding of the costs and benefits of the broadest range of possible space transportation options, six teams, five from aerospace industry companies and one internal to NASA, were tasked to answer three primary questions: a) If the Space Shuttle system should be replaced; b) If so, when the replacement should take place and how the transition should be implemented; and c) If not, what is the upgrade strategy to continue safe and affordable flight of the Space Shuttle beyond 2010. The overall goal of the Study was "to develop investment options to be considered by the Administration for the President's FY2001 budget to meet NASA's future human space flight requirements with significant reductions in costs." This emphasis on government investment, coupled with the participation by commercial firms, required an unprecedented level of economic analysis of costs and benefits from both industry and government viewpoints. This paper will discuss the economic and market models developed by the in-house NASA Team to analyze space transportation architectures, the results of those analyses, and how those results were reflected in the conclusions and recommendations of the STAS NASA Team. Copyright 1999 by the American Institute of Aeronautics and Astronautics, Inc. No copyright is asserted in the United States under Title 17, U.S. Code. The U.S. Government has a royalty-free license to exercise all rights under the copyright claimed herein for Governmental purposes. All other rights are reserved by the copyright owner.

  3. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large-scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large-scale distributed systems, the paper will also describe how to analyze the interactions between secondary functions through the use of channelization.

  4. DOD/NASA system impact analysis (study 2.1). Volume 2: Study results

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Results of the tug turnaround cost study and the space transportation system (STS) abort modes and effects study are presented for the DOD/NASA system impact analysis. Cost estimates are given for tug turnaround; vehicle description, abort assessment, and abort performance capability are given for the STS.

  5. NASA Weather Support 2017

    NASA Technical Reports Server (NTRS)

    Carroll, Matt

    2017-01-01

    In the mid-to-late 1980s, as NASA was studying ways to improve weather forecasting capabilities to reduce excessive weather launch delays and excessive weather Launch Commit Criteria (LCC) waivers, the Challenger accident and the AC-67 mishap occurred.[1] NASA and USAF weather personnel had advance knowledge of the extremely high levels of weather hazards that ultimately caused or contributed to both of these accidents. In both cases, key knowledge of the risks posed by violations of weather LCC was not in the possession of the final decision makers on the launch teams. In addition to convening the mishap boards for these two lost missions, NASA convened expert meteorological boards focusing on weather support. These meteorological boards recommended the development of a dedicated organization with the highest levels of weather expertise and influence to support all of American spaceflight. NASA immediately established the Weather Support Office (WSO) in the Office of Space Flight (OSF) and, in coordination with the United States Air Force (USAF), initiated an overhaul of the organization and an improvement in the technology used for weather support, as recommended. Soon after, the USAF established a senior civilian Launch Weather Officer (LWO) position to provide meteorological support and continuity of weather expertise and knowledge over time. The Applied Meteorology Unit (AMU) was established by NASA, the USAF, and the National Weather Service to support initiatives to place new tools and methods into operational status. At the end of the Shuttle Program, after several weather office reorganizations, the WSO function had been assigned to a weather branch at Kennedy Space Center (KSC). This branch was dismantled in steps due to further reorganization, loss of key personnel, and loss of budget line authority. NASA is facing the loss of sufficient expertise and leadership required to provide current levels of weather support. The recommendation proposed

  6. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    NASA Astrophysics Data System (ADS)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area was formed. The Group has so far determined the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The results obtained must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). Nowadays the south-eastern area of Sicily, called the "Iblea" seismic area of Sicily, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. Thus the correct evaluation of seismic hazard is highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  7. The Third Air Force/NASA Symposium on Recent Advances in Multidisciplinary Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The third Air Force/NASA Symposium on Recent Advances in Multidisciplinary Analysis and Optimization was held on 24-26 Sept. 1990. Sessions were on the following topics: dynamics and controls; multilevel optimization; sensitivity analysis; aerodynamic design software systems; optimization theory; analysis and design; shape optimization; vehicle components; structural optimization; aeroelasticity; artificial intelligence; multidisciplinary optimization; and composites.

  8. NASA's Planetary Geologic Mapping Program: Overview

    NASA Astrophysics Data System (ADS)

    Williams, D. A.

    2016-06-01

    NASA's Planetary Science Division supports the geologic mapping of planetary surfaces through a distinct organizational structure and a series of research and analysis (R&A) funding programs. Cartography and geologic mapping issues for NASA's planetary science programs are overseen by the Mapping and Planetary Spatial Infrastructure Team (MAPSIT), which is an assessment group for cartography similar to the Mars Exploration Program Assessment Group (MEPAG) for Mars exploration. MAPSIT's Steering Committee includes specialists in geological mapping, who make up the Geologic Mapping Subcommittee (GEMS). I am the GEMS Chair, and with a group of 3-4 community mappers we advise the U.S. Geological Survey Planetary Geologic Mapping Coordinator (Dr. James Skinner) and develop policy and procedures to aid the planetary geologic mapping community. GEMS meets twice a year, at the Annual Lunar and Planetary Science Conference in March, and at the Annual Planetary Mappers' Meeting in June (attendance is required by all NASA-funded geologic mappers). Funding programs under NASA's current R&A structure to propose geological mapping projects include Mars Data Analysis (Mars), Lunar Data Analysis (Moon), Discovery Data Analysis (Mercury, Vesta, Ceres), Cassini Data Analysis (Saturn moons), Solar System Workings (Venus or Jupiter moons), and the Planetary Data Archiving, Restoration, and Tools (PDART) program. Current NASA policy requires all funded geologic mapping projects to be done digitally using Geographic Information Systems (GIS) software. In this presentation we will discuss details on how geologic mapping is done consistent with current NASA policy and USGS guidelines.

  9. Optimal Fragmentation and Dispersion of Hazardous Near-Earth Objects

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    2012-01-01

    The complex problem of protecting the Earth from the possibility of a catastrophic impact by a hazardous near-Earth object (NEO) has been recently reassessed in [1]. In a letter on NEOs from the White House Office of Science and Technology Policy (OSTP) to the U.S. Senate and Congress in 2010, the White House OSTP strongly recommended that NASA take the lead in conducting research activities for NEO detection, characterization, and deflection technologies. Furthermore, President Obama's new National Space Policy specifically directs NASA to "pursue capabilities, in cooperation with other departments, agencies, and commercial partners, to detect, track, catalog, and characterize NEOs to reduce the risk of harm to humans from an unexpected impact on our planet." The Planetary Defense Task Force of the NASA Advisory Council also recommended that the NASA Office of the Chief Technologist (OCT) begin efforts to investigate asteroid deflection techniques. With national interest growing in the United States, the NEO threat detection and mitigation problem was recently identified as one of NASA's Space Technology Grand Challenges. An innovative solution to NASA's NEO Impact Threat Mitigation Grand Challenge problem was developed through a NIAC Phase I study (9/16/11 - 9/15/12), and it will be further investigated for a NIAC Phase II study (9/10/12 - 9/9/14). Various NEO deflection technologies, including nuclear explosions, kinetic impactors, and slow-pull gravity tractors, have been proposed and examined during the past two decades. Still, there is no consensus on how to reliably deflect or disrupt hazardous NEOs in a timely manner. It is expected that the most probable mission scenarios will have a mission lead time much shorter than 10 years, so the use of nuclear explosives becomes the most feasible method for planetary defense. Direct intercept missions with a short warning time will result in arrival closing velocities of 10-30 kilometers per second with respect to

  10. Possible safety hazards associated with the operation of the 0.3-m transonic cryogenic tunnel at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Webster, T. J.

    1982-01-01

    The 0.3-m Transonic Cryogenic Tunnel (TCT) at the NASA Langley Research Center was built in 1973 as a facility intended to be used for no more than 60 hours in order to verify the validity of the cryogenic wind tunnel concept at transonic speeds. The role of the 0.3-m TCT has gradually changed; now, after over 3000 hours of operation, it is classified as a major NASA research facility and, under the administration of the Experimental Techniques Branch, it is used extensively for the testing of airfoils at high Reynolds numbers and for the development of various technologies related to the efficient operation and use of cryogenic wind tunnels. The purpose of this report is to document the results of a recent safety analysis of the 0.3-m TCT facility. This analysis was made as part of an ongoing program within the Experimental Techniques Branch designed to ensure that the existing equipment and current operating procedures of the 0.3-m TCT facility are acceptable in terms of today's standards of safety for cryogenic systems.

  11. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustoni, Arnold L.

    A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser, the operating parameters of the laser had changed, requiring a hazard analysis based on the new operating conditions. The ARES laser system is a van/truck-based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  12. NASA Lunar Base Wireless System Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques in recent years. However, most of these studies have been in support of commercial cellular phone wireless applications. The signal frequencies are mostly at the commercial cellular and Personal Communications Service (PCS) bands. The antenna configurations are mostly one on a high tower and one near the ground, to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and the RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. There are various communication and sensor configurations required to support all elements of a lunar base: for example, communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit. There are also various wireless sensor systems among scientific and experimental sensors and data collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems, taking into account three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The obtained results indicate that the lunar surface material, terrain geometry and antenna location are the important factors affecting the propagation characteristics of the lunar wireless systems. The path loss can be much more severe than in free-space propagation and is greatly affected by the antenna height, surface material and operating frequency. The
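
    As a baseline for the terrain effects described above, free-space path loss at distance d and frequency f is 20*log10(4*pi*d*f/c) dB; the study's point is that lunar terrain multipath can add substantial excess loss on top of this figure. A minimal sketch (the frequency and range values are illustrative only):

      import math

      def fspl_db(d_m: float, f_hz: float) -> float:
          """Free-space path loss, 20*log10(4*pi*d*f/c), in dB."""
          c = 299_792_458.0
          return 20.0 * math.log10(4.0 * math.pi * d_m * f_hz / c)

      print(f"{fspl_db(1_000.0, 2.4e9):.1f} dB at 1 km and 2.4 GHz")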

  13. NASA Tech Briefs, November/December 1986, Special Edition

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.

  14. Precise and Scalable Static Program Analysis of NASA Flight Software

    NASA Technical Reports Server (NTRS)

    Brat, G.; Venet, A.

    2005-01-01

    Recent NASA mission failures (e.g., Mars Polar Lander and Mars Climate Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds or pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars PathFinder to Mars Exploration Rover) and on the International Space Station.

  15. Finite Element Analysis of a NASA National Transonic Facility Wind Tunnel Balance

    NASA Technical Reports Server (NTRS)

    Lindell, Michael C.

    1996-01-01

    This paper presents the results of finite element analyses and correlation studies performed on a NASA National Transonic Facility (NTF) Wind Tunnel balance. In the past NASA has relied primarily on classical hand analyses, coupled with relatively large safety factors, for predicting maximum stresses in wind tunnel balances. Now, with the significant advancements in computer technology and sophistication of general purpose analysis codes, it is more reasonable to pursue finite element analyses of these balances. The correlation studies of the present analyses show very good agreement between the analyses and data measured with strain gages and therefore the studies give higher confidence for using finite element analyses to analyze and optimize balance designs in the future.

  16. Safeguarding the Health of the NASA Astronaut Community: the Need for Expanded Medical Monitoring for Former NASA Astronauts Under the Astronaut Occupational Health Program

    NASA Technical Reports Server (NTRS)

    Rossi, Meredith; Lee, Lesley; Wear, Mary; Van Baalen, Mary; Rhodes, Bradley

    2016-01-01

    The astronaut community is unique, and may be disproportionately exposed to occupational hazards not commonly seen in other communities. The extent to which the demands of the astronaut occupation and exposure to spaceflight-related hazards affect the health of the astronaut population over the life course is not completely known. Provision of health screening services to active and former astronauts ensures individual, mission, and community health and safety. Currently, the NASA Johnson Space Center (JSC) Flight Medicine Clinic (FMC) provides extensive medical monitoring to active astronauts throughout their careers. Upon retirement, astronauts may voluntarily return to the JSC FMC for an annual preventive exam. However, current retiree monitoring includes only selected screening tests, representing an opportunity for augmentation. The potential latent health effects of spaceflight demand an expanded framework of testing for former astronauts. The need is two-fold: screening tests widely recommended for other aging communities are necessary for astronauts to rule out conditions resulting from the natural aging process (e.g., colonoscopy, mammography), as opposed to conditions resulting directly from the astronaut occupation; and increased breadth of monitoring services will improve the understanding of occupational health risks and longitudinal health of the astronaut community, past, present, and future. To meet this need, NASA has begun an extensive exploration of the overall approach, cost, and policy implications of expanding existing medical monitoring under the Astronaut Occupational Health program for former NASA astronauts.

  18. A self-analysis of the NASA-TLX workload measure.

    PubMed

    Noyes, Jan M; Bruneau, Daniel P J

    2007-04-01

    Computer use and, more specifically, the administration of tests and materials online continue to proliferate. A number of subjective, self-report workload measures exist, but the National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is probably the best known and most widely used. The aim of this paper is to consider the workload costs associated with the computer-based and paper versions of the NASA-TLX measure. It was found that there is a significant difference between the workload scores for the two media, with the computer version of the NASA-TLX incurring more workload. This has implications for the practical use of the NASA-TLX as well as for other computer-based workload measures.
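
    For readers unfamiliar with how a TLX score is formed, the sketch below implements the standard weighted scoring scheme (six 0-100 subscale ratings weighted by tallies from the 15 pairwise comparisons); the participant values are hypothetical.

    ```python
    # Minimal sketch of the standard NASA-TLX weighted scoring scheme: six
    # subscale ratings (0-100) are weighted by tallies from the 15 pairwise
    # comparisons (weights sum to 15), giving an overall workload of 0-100.

    SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

    def tlx_overall(ratings: dict[str, float], weights: dict[str, int]) -> float:
        assert set(ratings) == set(SUBSCALES) == set(weights)
        assert sum(weights.values()) == 15, "pairwise-comparison tallies must total 15"
        return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

    # Hypothetical participant data (illustrative values only):
    ratings = {"mental": 70, "physical": 20, "temporal": 55,
               "performance": 40, "effort": 65, "frustration": 35}
    weights = {"mental": 5, "physical": 0, "temporal": 3,
               "performance": 2, "effort": 4, "frustration": 1}
    print(f"Overall TLX workload: {tlx_overall(ratings, weights):.1f}")  # 59.3
    ```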

  19. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them, as well as local crustal faults in the UAE. The PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) is 0.17 g for the 475-year return period spectrum and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
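
    The 475- and 2,475-year return periods quoted above are the conventional 10%- and 2%-in-50-years design levels. A minimal sketch of the underlying Poisson relation, assuming nothing beyond the standard formula:

    ```python
    import math

    # Poisson relation used in PSHA between return period T and the probability
    # of exceedance (POE) p over an exposure time t: p = 1 - exp(-t/T).

    def return_period(poe: float, t_years: float) -> float:
        return -t_years / math.log(1.0 - poe)

    def poe(return_period_years: float, t_years: float) -> float:
        return 1.0 - math.exp(-t_years / return_period_years)

    # The conventional design levels quoted in the abstract:
    print(f"10% in 50 yr -> T = {return_period(0.10, 50):.0f} yr")  # ~475 yr
    print(f" 2% in 50 yr -> T = {return_period(0.02, 50):.0f} yr")  # ~2475 yr
    ```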

  20. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, which is causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilimek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics
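
    The abstract does not reproduce the weighting equation, so the following is only a generic sketch of how normalized hazard parameters might be combined into a preliminary score; the parameter names and weights are hypothetical, not the study's values.

    ```python
    # Generic weighted combination of per-lake hazard parameters, each assumed
    # to be pre-normalized to [0, 1]. Parameters and weights are illustrative.

    PARAMS_WEIGHTS = {
        "lake_area_growth": 0.25,
        "glacier_proximity": 0.25,
        "moraine_dam_steepness": 0.30,
        "avalanche_trajectory": 0.20,
    }

    def hazard_score(lake: dict[str, float]) -> float:
        return sum(w * lake[p] for p, w in PARAMS_WEIGHTS.items())

    lake = {"lake_area_growth": 0.8, "glacier_proximity": 0.9,
            "moraine_dam_steepness": 0.4, "avalanche_trajectory": 0.6}
    print(f"Preliminary hazard score: {hazard_score(lake):.2f}")  # 0-1 scale
    ```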

  1. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, western Thailand was found to face ground shaking of 0.1-0.4 g at 2% POE and 0.1-0.2 g at 10% POE in the next 50 years, defining this area as the most earthquake-prone region evaluated in Thailand. Among the ten selected provinces, Kanchanaburi and Tak had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as having a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
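
    As a reminder of how the seismicity parameters enter the hazard integration, the sketch below evaluates the Gutenberg-Richter relation for a hypothetical source zone; the a and b values shown are illustrative, not those derived for Thailand.

    ```python
    # Gutenberg-Richter relation, log10 N(>=M) = a - b*M: the annual rate of
    # earthquakes at or above magnitude M for a source zone. Values illustrative.

    def annual_rate(a: float, b: float, magnitude: float) -> float:
        return 10.0 ** (a - b * magnitude)

    a, b = 4.5, 1.0  # hypothetical source-zone parameters
    for m in (5.0, 6.0, 7.0):
        rate = annual_rate(a, b, m)
        print(f"M>={m}: {rate:.4f}/yr (mean recurrence ~{1 / rate:.0f} yr)")
    ```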

  2. Hazardous Waste Site Analysis (Small Site Technology)

    DTIC Science & Technology

    1990-08-01

    Act HSRT - Hazardous Substance Response Trust (Superfund Trust) HSWA - Hazardous and Solid Waste Amendments (to RCRA) NAAQS - National Ambient Air...impoundments (basically, any area where hazardous substances are located). * Under CERCLA, "Environment" includes surface and groundwater, ambient air, land... provisions with permit requirements for new source construction). * Ambient Air Quality Standards (NAAQS) have been issued for six "criteria" pollutants

  3. NASA Space Environments Technical Discipline Team Space Weather Activities

    NASA Astrophysics Data System (ADS)

    Minow, J. I.; Nicholas, A. C.; Parker, L. N.; Xapsos, M.; Walker, P. W.; Stauffer, C.

    2017-12-01

    The Space Environment Technical Discipline Team (TDT) is a technical organization led by NASA's Technical Fellow for Space Environments that supports NASA's Office of the Chief Engineer through the NASA Engineering and Safety Center. The Space Environments TDT conducts independent technical assessments related to the space environment and space weather impacts on spacecraft for NASA programs and provides technical expertise to NASA management and programs where required. This presentation will highlight the status of applied space weather activities within the Space Environment TDT that support development of operational space weather applications and a better understanding of the impacts of space weather on space systems. We will first discuss a tool that has been developed for evaluating space weather launch constraints that are used to protect launch vehicles from hazardous space weather. We then describe an effort to better characterize three-dimensional radiation transport for CubeSat spacecraft and processing of micro-dosimeter data from the International Space Station which the team plans to make available to the space science community. Finally, we will conclude with a quick description of an effort to maintain access to the real-time solar wind data provided by the Advanced Composition Explorer satellite at the Sun-Earth L1 point.

  4. A Comparison of Results from NASA's Meteoroid Engineering Model to the LDEF Cratering Record

    NASA Technical Reports Server (NTRS)

    Ehlert, S.; Moorhead, A.; Cooke, W. J.

    2017-01-01

    NASA's Long Duration Exposure Facility (LDEF) has provided an extensive record of the meteoroid environment in low Earth orbit. LDEF's combination of fixed orientation, large collecting area, and long lifetime imposes constraints on the absolute flux of potentially hazardous meteoroids. The relative impact rate on each of LDEF's fourteen surfaces arises from the underlying velocity distribution and directionality of the meteoroid environment. For the first time, we model the meteoroid environment encountered by LDEF over its operational lifetime using NASA's Meteoroid Engineering Model Release 2 (MEMR2) and compare the model results with the observed craters of potentially hazardous meteoroids (i.e. crater diameters larger than approximately 0.75 mm). We discuss the extent to which the observations and model agree and how the impact rates across all of the LDEF surfaces may be utilized to help calibrate future versions of MEM.

  5. Coupled Loads Analysis of the Modified NASA Barge Pegasus and Space Launch System Hardware

    NASA Technical Reports Server (NTRS)

    Knight, J. Brent

    2015-01-01

    A Coupled Loads Analysis (CLA) has been performed for barge transport of Space Launch System hardware on the recently modified NASA barge Pegasus. The barge re-design was facilitated with detailed finite element analyses by the Army Corps of Engineers - Marine Design Center. The Finite Element Model (FEM) utilized in the design was also used in the subject CLA. The Pegasus FEM and CLA results are presented, as well as a comparison of the analysis process to that of a payload being transported to space via the Space Shuttle. Discussion of the dynamic forcing functions is included as well. The process of performing a dynamic CLA of NASA hardware during marine transport is thought to be a first and should help minimize undue conservatism.

  6. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  7. Thunderstorm hazards flight research: Storm hazards 1980 overview

    NASA Technical Reports Server (NTRS)

    Deal, P. L.; Keyser, G. L.; Fisher, B. D.; Crabill, N. L.

    1981-01-01

    A highly instrumented NASA F-106B aircraft, modified for the storm hazards mission and protected against direct lightning strikes, was used in conjunction with various ground based radar and lightning measurement systems to collect data during thunderstorm penetration flights. During 69 thunderstorm penetrations, there were 10 direct lightning strikes to the aircraft. No problems were encountered with any of the aircraft's systems as a result of the strikes and the research instrumentation performed as designed. Electromagnetic characteristics of nine strikes were recorded, and the results of other experiments confirm the theory that X-ray radiation and nitrous oxide gas are being produced by processes associated directly with thunderstorm electric fields and lightning discharges. A better understanding of aircraft lightning attachment mechanisms and strike zones is being accomplished by careful inspection, identification, and documentation of lightning attachment points and swept stroke paths following each strike to the aircraft.

  8. STEREO as a "Planetary Hazards" Mission

    NASA Technical Reports Server (NTRS)

    Guhathakurta, M.; Thompson, B. J.

    2014-01-01

    NASA's twin STEREO probes, launched in 2006, have advanced the art and science of space weather forecasting more than any other spacecraft or solar observatory. By surrounding the Sun, they provide previously impossible early warnings of threats approaching Earth as they develop on the solar far side. They have also revealed the 3D shape and inner structure of CMEs, massive solar storms that can trigger geomagnetic storms when they collide with Earth. This improves the ability of forecasters to anticipate the timing and severity of such events. Moreover, the unique capability of STEREO to track CMEs in three dimensions allows forecasters to make predictions for other planets, giving rise to the possibility of interplanetary space weather forecasting too. STEREO is one of those rare missions for which "planetary hazards" refers to more than one world. The STEREO probes also hold promise for the study of comets and potentially hazardous asteroids.

  9. Post Eruption Hazards at Mt. Pinatubo, Philippines

    NASA Technical Reports Server (NTRS)

    Mouginis-Mark, Peter J.

    2004-01-01

    Our project focused on the investigation of the post-eruption hazards at Mt. Pinatubo (Philippines) using remote sensing data and field observations of the 1991 eruption deposits. Through the use of multiple satellite images, field work, and the 1996/2000 PacRim data sets, we conducted studies of the co- and post-eruption hazards of the volcano due to erosion and re-deposition of the extensive pyroclastic flow deposits. A major part of this project was the assembly and analysis of a database of over 50 high resolution (1 - 50 m/pixel) images to facilitate this study. We collected Ikonos, SPOT, SIR-C/X-SAR, Landsat, ERS, RADARSAT, and ASTER images of the area around Mt. Pinatubo. An example of the changes that could be seen in these data is shown. Our investigation focused on a retrospective analysis of the erosion, redeposition, and re-vegetation of the 1991 pyroclastic flow deposits of Mt. Pinatubo. The primary geologic goal of our work was the analysis of the spatial distribution and volume change of the sources and sinks of materials associated with mudflow ('lahar') events. This included the measurement of river valley gradients and cross-sections using TOPSAR digital elevation data; we participated in the PacRim 2000 deployment to the Philippines specifically so that we could collect a second set of TOPSAR data that could then be used to create a topographic difference image of the volcano. The main results from this multi-sensor study have been published as Torres et al. A discussion of the methodology that we used to assemble an appropriate database was included in Mouginis-Mark and Domergue-Schmidt. As part of an educational outreach effort, we also helped the Philippine Institute of Volcanology and Seismology (PHIVOLCS) in the Philippines to use NASA data to study Mt. Pinatubo and other Filipino volcanoes.

  10. NASA Johnson Space Center Usability Testing and Analysis facility (UTAF) Overview

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Holden, Kritina L.

    2005-01-01

    The Usability Testing and Analysis Facility (UTAF) is part of the Space Human Factors Laboratory at the NASA Johnson Space Center in Houston, Texas. The facility performs research for NASA's Human Systems Integration Program, under the Human Systems Research and Technology Division. Specifically, the UTAF provides human factors support for space vehicles, including the International Space Station, the Space Shuttle, and the forthcoming Crew Exploration Vehicle. In addition, there are ongoing collaborative research efforts with external corporations and universities. The UTAF provides human factors analysis, evaluation, and usability testing of crew interfaces for space applications. This includes computer displays and controls, workstation systems, and work environments. The UTAF has a unique mix of capabilities, with a staff experienced in both cognitive human factors and ergonomics. The current areas of focus are: human factors applications in emergency medical care and informatics; control and display technologies for electronic procedures and instructions; voice recognition in noisy environments; crew restraint design for unique microgravity workstations; and refinement of human factors processes and requirements. This presentation will provide an overview of ongoing activities, and will address how the UTAF projects will evolve to meet new space initiatives.

  11. GO/NO-GO - When is medical hazard mitigation acceptable for launch?

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas R.; Polk, James D.

    2005-01-01

    Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft are among the most complex vehicles built by humans, and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly comprised of "medical prevention" strategies that mitigate risk by examining the behaviour of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide if hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution during EVA; 2. Risk of a cardiac event during long- and short-duration missions; 3. Degraded cabin environmental monitoring on the ISS. Learning Objectives 1.) The audience will understand the challenges of mitigating medical risk caused by nominal and off

  12. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

  13. NASA Accountability Report

    NASA Technical Reports Server (NTRS)

    1997-01-01

    NASA is piloting fiscal year (FY) 1997 Accountability Reports, which streamline and upgrade reporting to Congress and the public. The document presents statements by the NASA administrator, and the Chief Financial Officer, followed by an overview of NASA's organizational structure and the planning and budgeting process. The performance of NASA in four strategic enterprises is reviewed: (1) Space Science, (2) Mission to Planet Earth, (3) Human Exploration and Development of Space, and (4) Aeronautics and Space Transportation Technology. Those areas which support the strategic enterprises are also reviewed in a section called Crosscutting Processes. For each of the four enterprises, there is discussion about the long term goals, the short term objectives and the accomplishments during FY 1997. The Crosscutting Processes section reviews issues and accomplishments relating to human resources, procurement, information technology, physical resources, financial management, small and disadvantaged businesses, and policy and plans. Following the discussion about the individual areas is Management's Discussion and Analysis, about NASA's financial statements. This is followed by a report by an independent commercial auditor and the financial statements.

  14. NASA Technologists Embrace Laser Instrument Challenge

    NASA Image and Video Library

    2013-11-06

    Goddard scientist David Harding and Goddard technologist Tony Yu are developing a lidar system that could meet an ambitious requirement of the proposed LIST mission. ---------- In 2007, the National Research Council threw down a challenge: Design a space-based laser altimeter that could measure the height of Earth's surface everywhere to within a mere 10 centimeters — all at 5-meter resolution. To this day, some believe it can't be done. Goddard scientist Dave Harding begs to differ. He and his team have embraced the challenge and are developing a laser altimeter that could provide the data from a berth onboard the NRC-proposed Lidar Surface Topography, or LIST, mission. It would generate highly detailed maps of topography and vegetation that scientists could use to forecast and respond to natural hazards and study carbon storage in forests. Read more: 1.usa.gov/17N3Bql Credit: Bill Hrybck/NASA

  15. Overview of Engineering Design and Analysis at the NASA John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Congiardo, Jared; Junell, Justin; Kirkpatrick, Richard; Ryan, Harry

    2007-01-01

    This viewgraph presentation gives a general overview of the design and analysis division of NASA John C. Stennis Space Center. This division develops and maintains propulsion test systems and facilities for engineering competencies.

  16. Introduction: Hazard mapping

    USGS Publications Warehouse

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  17. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    PubMed

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462) (p < 0.00001). Three quarters (76%) of retail premises had hazard analysis systems in place compared with 59% of catering premises (p < 0.00001). In 87% of retail premises the manager had received some form of food hygiene training compared with 80% of catering premises (p < 0.00001). From premises where the manager had received no food hygiene training a greater proportion of samples were of unsatisfactory and unacceptable microbiological quality (20% retail, 27% catering) compared with premises where the manager had received food hygiene training (11% retail, 19% catering) (p < 0.00001). Where the manager of the premises had received food hygiene training, documented hazard analysis systems were more likely to be in place (p < 0.00001). Higher proportions of samples of unsatisfactory and unacceptable microbiological quality (17% retail, 22% catering) were from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering) (p < 0.00001). Our meta-analysis suggests that the lower microbiological quality of ready-to-eat foods from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The

  18. Urban Heat Wave Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Quattrochi, D. A.; Jedlovec, G.; Crane, D. L.; Meyer, P. J.; LaFontaine, F.

    2016-12-01

    Heat waves are one of the largest causes of environmentally-related deaths globally and are likely to become more numerous as a result of climate change. The intensification of heat waves by the urban heat island effect and elevated humidity, combined with urban demographics, are key elements leading to these disasters. Better warning of the potential hazards may help lower risks associated with heat waves. Moderate resolution thermal data from NASA satellites is used to derive high spatial resolution estimates of apparent temperature (heat index) over urban regions. These data, combined with demographic data, are used to produce a daily heat hazard/risk map for selected cities. MODIS data are used to derive daily composite maximum and minimum land surface temperature (LST) fields to represent the amplitude of the diurnal temperature cycle and identify extreme heat days. Compositing routines are used to generate representative daily maximum and minimum LSTs for the urban environment. The limited effect of relative humidity on the apparent temperature (typically 10-15%) allows for the use of modeled moisture fields to convert LST to apparent temperature without loss of spatial variability. The daily max/min apparent temperature fields are used to identify abnormally extreme heat days relative to climatological values in order to produce a heat wave hazard map. Reference to climatological values normalizes the hazard for a particular region (e.g., the impact of an extreme heat day). A heat wave hazard map has been produced for several case study periods and then computed on a quasi-operational basis during the summer of 2016 for Atlanta, GA, Chicago, IL, St. Louis, MO, and Huntsville, AL. A hazard does not become a risk until someone or something is exposed to that hazard at a level that might do harm. Demographic information is used to assess the urban risk associated with the heat wave hazard. Collectively, the heat wave hazard product can warn people in urban
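
    A minimal sketch of the flagging step described above, in which a day counts as extreme only relative to the local climatology; the synthetic data and the 95th-percentile threshold are assumptions for illustration, not the authors' exact method.

    ```python
    import numpy as np

    # Flag days whose maximum apparent temperature exceeds a climatological
    # threshold, so "extreme" is defined relative to the local normal.

    rng = np.random.default_rng(0)
    climatology = rng.normal(33.0, 2.5, size=30 * 20)  # 20 Julys of daily max values (deg C)
    this_summer = np.array([32.1, 35.0, 38.4, 39.2, 36.7, 33.3, 40.1])

    threshold = np.percentile(climatology, 95)
    extreme_days = this_summer > threshold
    print(f"threshold = {threshold:.1f} C, extreme days: {np.flatnonzero(extreme_days)}")
    ```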

  19. Preliminary Results Obtained in Integrated Safety Analysis of NASA Aviation Safety Program Technologies

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information from January 1, 2001 through March 31, 2001 available on the NASA Aeronautics and Space Database. Contents include 1) Cognitive Task Analysis; 2) RTO Educational Notes; 3) The Capability of Virtual Reality to Meet Military Requirements; 4) Aging Engines, Avionics, Subsystems and Helicopters; 5) RTO Meeting Proceedings; 6) RTO Technical Reports; 7) Low Grazing Angle Clutter...; 8) Verification and Validation Data for Computational Unsteady Aerodynamics; 9) Space Observation Technology; 10) The Human Factor in System Reliability...; 11) Flight Control Design...; 12) Commercial Off-the-Shelf Products in Defense Applications.

  20. NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert

    2011-01-01

    System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking. First, the handbook takes the position that it is important to not just focus on risk on an individual

  1. NASA TLA workload analysis support. Volume 3: FFD autopilot scenario validation data

    NASA Technical Reports Server (NTRS)

    Sundstrom, J. L.

    1980-01-01

    The data used to validate a seven-time-line analysis of the forward flight deck (FFD) autopilot mode for the pilot and copilot of the NASA B737 Terminal Configured Vehicle are presented. Demand workloads are given in two forms: workload histograms and workload summaries (bar graphs). A report showing task length and task interaction is also presented.

  2. NASA's Earth Science Research and Environmental Predictions

    NASA Technical Reports Server (NTRS)

    Hilsenrath, E.

    2004-01-01

    NASA's Earth Science program began in the 1960s with cloud-imaging satellites used for weather observations. A fleet of satellites is now in orbit to investigate the Earth science system and uncover the connections between land, oceans, and the atmosphere. Satellite systems using an array of active and passive remote sensors search for answers to how the Earth is changing and what the consequences are for life on Earth. The answers to these questions can be used for applications that serve societal needs and contribute to decision support systems for weather, hazard, and air quality predictions and mitigation of adverse effects. Partnerships with operational agencies using NASA's observational capabilities are now being explored. The system of the future will require new technology and data assimilation systems, including data and models, that will be used for forecasts that respond to user needs.

  3. A Comparison of Results From NASA's Meteoroid Engineering Model to the LDEF Cratering Record

    NASA Technical Reports Server (NTRS)

    Ehlert, S.; Moorhead, A.; Cooke, W. J.

    2017-01-01

    NASA's Long Duration Exposure Facility (LDEF) has provided an extensive record of the meteoroid environment in Low Earth Orbit. LDEF's combination of fixed orientation, large collecting area, and long lifetime imposes constraints on the absolute flux of potentially hazardous meteoroids. The relative impact rate on each of LDEF's fourteen surfaces arises from the underlying velocity distribution and directionality of the meteoroid environment. For the first time, we model the meteoroid environment encountered by LDEF over its operational lifetime using NASA's Meteoroid Engineering Model Release 2 (MEMR2) and compare the model results with the observed craters of potentially hazardous meteoroids (i.e. crater diameters larger than approximately 0.6 mm). We discuss the extent to which the observations and model agree and how the impact rates across all of the LDEF surfaces may suggest improvements to the underlying assumptions that go into future versions of MEM.

  4. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  5. NASA Safety Standard: Guidelines and Assessment Procedures for Limiting Orbital Debris

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Collision with orbital debris is a hazard of growing concern as historically accepted practices and procedures have allowed man-made objects to accumulate in orbit. To limit future debris generation, NASA Management Instruction (NMI) 1700.8, 'Policy to Limit Orbital Debris Generation,' was issued in April of 1993. The NMI requires each program to conduct a formal assessment of the potential to generate orbital debris. This document serves as a companion to NMI 1700.8 and provides each NASA program with specific guidelines and assessment methods to assure compliance with the NMI. Each main debris assessment issue (e.g., Post Mission Disposal) is developed in a separate chapter.

  6. NASA Earth Science Image Analysis for Climate Change Decisions

    NASA Technical Reports Server (NTRS)

    Hilderbrand, Peter H.

    2011-01-01

    This talk will briefly outline the ways in which NASA observes the Earth, describe the NASA satellite measurements, and then show how these measurements are used to understand the changes that are occurring as Earth's climate warms.

  7. Highlighting Your Science to NASA

    NASA Astrophysics Data System (ADS)

    Sharkey, C.

    2003-12-01

    An effort is underway to provide greater visibility within NASA headquarters, and to those who provide funding to NASA, of the outstanding work that is being performed by scientists involved in the Solar System Exploration Research and Analysis Programs, most of whom are DPS members. In support of this effort, a new feature has been developed for the NASA Headquarters Solar System Exploration Division web site whereby researchers can provide a synopsis of their current research results. The site (http://solarsystem.nasa.gov/spotlight/ - Username: your email address Password: sse) is an online submission area where NASA-funded scientists can upload the results of their research. There they provide their contact information, briefly describe their research, and upload any associated images or graphics. The information is available to a limited number of reviewers and writers at JPL. Each month, one researcher's work will be chosen as a science spotlight. After a writer interviews the scientist, a brief Power Point presentation that encapsulates their work will be given to Dr. Colleen Hartman at NASA headquarters. She will then present the exciting findings to Associate Administrator for Space Science, Dr. Ed Weiler. The information from some of these highlights can serve as a basis to bring Principal Investigators to NASA Headquarters for exposure to media through Space Science Updates on NASA television. In addition, the science results may also be incorporated into briefing material for the Office of Management and Budget and congressional staffers. Some spotlights will also be converted into feature stories for the Solar System Exploration website so the public, too, can learn about exciting new research. The site, http://solarsystem.nasa.gov/, is one of NASA's most visited. Over the past decade, there has been a trend of flat budgets for Research and Analysis activities. By giving more visibility to results of Solar System research, our goal is to encourage

  8. Update on NASA Microelectronics Activities

    NASA Technical Reports Server (NTRS)

    Label, Kenneth A.; Sampson, Michael J.; Casey, Megan; Lauenstein, Jean-Marie

    2017-01-01

    Mission Statement: The NASA Electronic Parts and Packaging (NEPP) Program provides NASA's leadership for developing and maintaining guidance for the screening, qualification, test, and usage of EEE parts by NASA as well as in collaboration with other government Agencies and industry. NASA Space Technology Mission Directorate (STMD): "STMD rapidly develops, demonstrates, and infuses revolutionary, high-payoff technologies through transparent, collaborative partnerships, expanding the boundaries of the aerospace enterprise." Mission Statement: The Space Environments Testing Management Office (SETMO) will identify, prioritize, and manage a select suite of Agency key capabilities/assets that are deemed to be essential to the future needs of NASA or the nation, including some capabilities that lack an adequate business base over the budget horizon. The NESC mission is to perform value-added independent testing, analysis, and assessments of NASA's high-risk projects to ensure safety and mission success. NASA Space Environments and Avionics Fellows as well as Radiation and EEE Parts Community of Practice (CoP) leads.

  9. Fault Tree Analysis: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Fault tree analysis is a top-down approach to the identification of process hazards. It is one of the best methods for systematically identifying and graphically displaying the many ways something can go wrong. This bibliography references 266 documents in the NASA STI Database that contain the major concepts, fault tree analysis and risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.
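
    As context for the bibliography's subject matter, the sketch below shows the basic quantitative evaluation of a small fault tree with independent basic events; the tree structure and event probabilities are illustrative, not drawn from any cited document.

    ```python
    # Quantitative fault tree evaluation with independent basic events: an AND
    # gate multiplies probabilities; an OR gate combines them as the complement
    # of no input occurring. Event values are illustrative.

    from math import prod

    def and_gate(probs):
        return prod(probs)

    def or_gate(probs):
        return 1.0 - prod(1.0 - p for p in probs)

    # Top event: "loss of cooling" = (pump A fails AND pump B fails) OR valve stuck
    p_pump_a, p_pump_b, p_valve = 1e-2, 1e-2, 1e-3
    p_top = or_gate([and_gate([p_pump_a, p_pump_b]), p_valve])
    print(f"P(top event) = {p_top:.2e}")  # ~1.10e-03
    ```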

  10. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, addressing what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
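
    A minimal sketch of a Monte Carlo propagation of the kind described, here applied to a Mach number computed from the isentropic pressure ratio; the pressure values, uncertainties, and sample size are assumptions for illustration, not the facility's actual data or equations.

    ```python
    import numpy as np

    # Monte Carlo uncertainty propagation: perturb the measured total and static
    # pressures within assumed 1-sigma uncertainties and observe the spread of
    # the Mach number from the isentropic relation.

    rng = np.random.default_rng(42)
    gamma = 1.4
    n = 100_000

    p_total = rng.normal(101.3, 0.20, n)   # kPa, hypothetical sensor uncertainty
    p_static = rng.normal(52.0, 0.15, n)   # kPa

    mach = np.sqrt(2.0 / (gamma - 1.0)
                   * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))
    print(f"Mach = {mach.mean():.4f} +/- {mach.std():.4f} (1-sigma)")
    ```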

  11. Data Analysis Measurement: Having a Solar Blast! NASA Connect: Program 7 in the 2001-2002 Video Series. [Videotape].

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Hampton, VA. Langley Research Center.

    NASA Connect is an interdisciplinary, instructional distance learning program targeting students in grades 6-8. This videotape explains how engineers and researchers at the National Aeronautics and Space Administration (NASA) use data analysis and measurement to predict solar storms, anticipate how they will affect the Earth, and improve…

  12. The ESA/NASA Multi-Aircraft ATV-1 Re-Entry Campaign: Analysis of Airborne Intensified Video Observations from the NASA/JSC Experiment

    NASA Technical Reports Server (NTRS)

    Barker, Ed; Maley, Paul; Mulrooney, Mark; Beaulieu, Kevin

    2009-01-01

    In September 2008, a joint ESA/NASA multi-instrument airborne observing campaign was conducted over the southern Pacific Ocean. The objective was the acquisition of data to support detailed atmospheric re-entry analysis for the first flight of the European Automated Transfer Vehicle (ATV)-1. Skilled observers were deployed aboard two aircraft which were flown at 12.8 km altitude within visible range of the ATV-1 re-entry zone. The observers operated a suite of instruments with low-light-level detection sensitivity including still cameras, high speed and 30 fps video cameras, and spectrographs. The collected data has provided valuable information regarding the dynamic time evolution of the ATV-1 re-entry fragmentation. Specifically, the data has satisfied the primary mission objective of recording the explosion of ATV-1's primary fuel tank and thereby validating predictions regarding the tank's demise and the altitude of its occurrence. Furthermore, the data contains the brightness and trajectories of several hundred ATV-1 fragments. It is the analysis of these properties, as recorded by the particular instrument set sponsored by NASA/Johnson Space Center, which we present here.

  13. Free-Flight Terrestrial Rocket Lander Demonstration for NASA's Autonomous Landing and Hazard Avoidance Technology (ALHAT) System

    NASA Technical Reports Server (NTRS)

    Rutishauser, David K.; Epp, Chirold; Robertson, Ed

    2012-01-01

    The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
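
    A minimal sketch of the hazard detection (HDA) function at the heart of ALHAT: estimate local slope and roughness from a digital elevation map and flag cells exceeding lander tolerances. The thresholds, 1 m grid spacing, and synthetic terrain below are illustrative assumptions, not ALHAT flight values.

    ```python
    import numpy as np

    # Mark DEM cells as hazardous where either the local slope or the local
    # roughness (deviation from a 3x3-mean smoothed surface) exceeds a limit.

    def hazard_map(dem: np.ndarray, cell_m: float = 1.0,
                   max_slope_deg: float = 10.0, max_rough_m: float = 0.3) -> np.ndarray:
        gy, gx = np.gradient(dem, cell_m)
        slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
        k = np.ones((3, 3)) / 9.0
        pad = np.pad(dem, 1, mode="edge")
        smooth = sum(pad[i:i + dem.shape[0], j:j + dem.shape[1]] * k[i, j]
                     for i in range(3) for j in range(3))
        roughness = np.abs(dem - smooth)
        return (slope_deg > max_slope_deg) | (roughness > max_rough_m)

    rng = np.random.default_rng(1)
    dem = rng.normal(0.0, 0.2, (64, 64))  # synthetic rough terrain patch (meters)
    print(f"hazardous fraction: {hazard_map(dem).mean():.2%}")
    ```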

  14. Continued Research into Characterizing the Preturbulence Environment for Sensor Development, New Hazard Algorithms and Experimental Flight Planning

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Lin, Yuh-Lang

    2005-01-01

    The purpose of the research was to develop and test improved hazard algorithms that could result in the development of sensors that are better able to anticipate potentially severe atmospheric turbulence, which affects aircraft safety. The research focused on employing numerical simulation models to develop improved algorithms for the prediction of aviation turbulence. This involved producing both research simulations and real-time simulations of environments predisposed to moderate and severe aviation turbulence. The research resulted in the following fundamental advancements toward the aforementioned goal: 1) very high resolution simulations of turbulent environments indicated how predictive hazard indices could be improved, resulting in a candidate hazard index that indicated the potential for improvement over existing operational indices, 2) a real-time turbulence hazard numerical modeling system was improved by correcting deficiencies in its simulation of moist convection, and 3) the same real-time predictive system was tested by running the code twice daily and the hazard prediction indices updated and improved. Additionally, a simple validation study was undertaken to determine how well a real-time hazard predictive index performed when compared to commercial pilot observations of aviation turbulence. Simple statistical analyses were performed in this validation study, indicating potential skill in employing the hazard prediction index to predict regions of varying intensities of aviation turbulence. Data sets from a research numerical model were provided to NASA for use in a large eddy simulation numerical model. A NASA contractor report and several refereed journal articles were prepared and submitted for publication during the course of this research.
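
    Validation of a hazard index against pilot reports is often summarized with a contingency table; the sketch below computes two standard skill measures, probability of detection and false alarm ratio, from purely illustrative counts (not the study's data).

    ```python
    # Contingency-table skill measures for a binary hazard forecast versus
    # observations: POD (fraction of observed events that were flagged) and
    # FAR (fraction of flags that were wrong). Counts are illustrative.

    def pod_far(hits: int, misses: int, false_alarms: int) -> tuple[float, float]:
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        return pod, far

    pod, far = pod_far(hits=42, misses=13, false_alarms=21)
    print(f"POD = {pod:.2f}, FAR = {far:.2f}")  # POD = 0.76, FAR = 0.33
    ```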

  15. NASA replanning efforts continue

    NASA Astrophysics Data System (ADS)

    Katzoff, Judith A.

    A task force of the National Aeronautics and Space Administration (NASA) is producing new launch schedules for NASA's three remaining space shuttle orbiters, possibly supplemented by expendable launch vehicles. In the wake of the explosion of the space shuttle Challenger on January 28, 1986, the task force is assuming a delay of 12-18 months before resumption of shuttle flights.NASA's Headquarters Replanning Task Force, which meets daily, is separate from the agency's Data and Design Analysis Task Force, which collects and analyzes information about the accident for the use of the investigative commission appointed by President Ronald Reagan.

  16. Hazardous waste crime: a contextual analysis of the offense and the offender

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rebovich, D.J.

    The goal of this study is to analyze hazardous waste offense and offender characteristics. Criminal case data were collected from four sample states (Maine, Maryland, New Jersey, and Pennsylvania). Files of disposed criminal cases charged between 1977 and 1984 were content-analyzed, and interviews were conducted with prominent hazardous waste crime enforcement personnel from the sample states. Areas of analysis include methods of crime commission, skills required for crime commission, patterns of criminal network relationships, and degree of syndicate crime influence. While there has been some previous speculation that hazardous waste criminal behavior is directed through centralized racketeering, the present study of known offenders found little evidence of syndicate crime family infiltration. Crimes occurred within small, informal networks of waste generators, waste transporters, the employees of treatment/storage/disposal (TSD) facilities and certain non-industry peripheral actors. The study concludes that, while attempts have been made by syndicate crime operatives to infiltrate, these attempts have failed largely due to features of criminal commission methods and to the inherent fragmentation of hauling and TSD firm interests.

  17. Overview of NASA Glenn Aero/Mobile Communications Demonstrations

    NASA Technical Reports Server (NTRS)

    Brooks, David; Hoder, Doug; Wilkins, Ryan

    2004-01-01

    The Glenn Research Center at Lewis Field (GRC) has been involved with several other NASA field centers on various networking and RF communications demonstrations and experiments since 1998. These collaborative experiments investigated communications technologies new to aviation, such as wideband Ku satcom, L-band narrowband satcom, and IP (Internet Protocol), using commercial off-the-shelf (COTS) components. These technologies can be used to distribute weather and hazard data, air traffic management and airline fleet management information, and passenger cabin Internet service.

  18. Overview of NASA Glenn Aero/Mobile Communication Demonstrations

    NASA Technical Reports Server (NTRS)

    Brooks, David; Hoder, Doug; Wilkins, Ryan

    2004-01-01

    The Glenn Research Center at Lewis Field (GRC) has been involved with several other NASA field centers on various networking and RF communications demonstrations and experiments since 1998. These collaborative experiments investigated communications technologies new to aviation, such as wideband Ku satcom, L-band narrowband satcom, and IP (Internet Protocol), using commercial off-the-shelf (COTS) components. These technologies can be used to distribute weather and hazard data, air traffic management and airline fleet management information, and passenger cabin Internet service.

  19. Hazard Analysis of Commercial Space Transportation: Volume 1: Operations. Volume 2: Hazards. Volume 3: Risk Analysis

    DOT National Transportation Integrated Search

    1988-05-01

    The report is devoted to the review and discussion of generic hazards associated with the ground, launch, orbital and re-entry phases of space operations. Since the DOT Office of Commercial Space Transportation (OCST) has been charged with protecting...

  20. NASA Technical Management Report (533Q)

    NASA Technical Reports Server (NTRS)

    Klosko, S. M.; Sanchez, B. (Technical Monitor)

    2001-01-01

    The objective of this task is analytical support of the NASA Satellite Laser Ranging (SLR) program in the areas of SLR data analysis, software development, assessment of SLR station performance, development of improved models for atmospheric propagation and interpretation of station calibration techniques, and science coordination and analysis functions for the NASA-led Central Bureau of the International Laser Ranging Service (ILRS). The contractor shall, in each year of the five-year contract: (1) Provide software development and analysis support to the NASA SLR program and the ILRS. Attend and make analysis reports at the monthly meetings of the Central Bureau of the ILRS covering data received during the previous period. Provide support to the Analysis Working Group of the ILRS, including special tiger teams that are established to handle unique analysis problems. Support the updating of the SLR Bibliography contained on the ILRS web site; (2) Perform special assessments of SLR station performance from available data to determine unique biases and technical problems at the station; (3) Develop improvements to models of atmospheric propagation and for handling pre- and post-pass calibration data provided by global network stations; (4) Provide review presentation of overall ILRS network data results at one major scientific meeting per year; (5) Contribute to and support the publication of NASA SLR and ILRS reports highlighting the results of SLR analysis activity.

  1. Overview of Engineering Design and Analysis at the NASA John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Ryan, Harry; Congiardo, Jared; Junell, Justin; Kirkpatrick, Richard

    2007-01-01

    A wide range of rocket propulsion test work occurs at the NASA John C. Stennis Space Center (SSC), including full-scale engine test activities at test facilities A-1, A-2, B-1 and B-2 as well as combustion device research and development activities at the E-Complex (E-1, E-2, E-3 and E-4) test facilities. The propulsion test engineer at NASA SSC faces many challenges associated with designing and operating a test facility due to the extreme operating conditions (e.g., cryogenic temperatures, high pressures) of the various system components and the uniqueness of many of the components and systems. The purpose of this paper is to briefly describe the NASA SSC Engineering Science Directorate's design and analysis processes, experience, and modeling techniques that are used to design and support the operation of unique rocket propulsion test facilities.

  2. NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2013-01-01

    The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" and subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.

  3. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357] Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food...

  4. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
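
    For orientation, the following is a minimal sketch of fitting a proportional hazards regression with the open-source lifelines package (an assumption; the paper's own estimators for interval-censored data are not part of this library, and lifelines' standard Cox fitter handles right-censored data). The Cox partial likelihood already sidesteps the baseline hazard for right censoring; the paper's contribution is analogous estimation for interval censoring. Data below are hypothetical.

        # Sketch: proportional hazards regression with lifelines (assumed package).
        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical right-censored data: T = observed time, E = event indicator,
        # x1/x2 = covariates whose regression coefficients we want.
        df = pd.DataFrame({
            "T":  [5.0, 8.2, 3.1, 9.4, 2.7, 7.5],
            "E":  [1,   0,   1,   1,   0,   1],
            "x1": [0.5, 1.2, 0.3, 2.0, 0.8, 1.5],
            "x2": [1,   0,   1,   0,   1,   0],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="T", event_col="E")  # partial likelihood: no baseline hazard estimate needed
        cph.print_summary()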

  5. NASA Satellite Image of Tropical Cyclone Ului

    NASA Image and Video Library

    2017-12-08

    NASA image acquired March 18, 2010. Tropical Cyclone Ului persisted south of the Solomon Islands on March 18, 2010. A bulletin from the U.S. Navy’s Joint Typhoon Warning Center (JTWC) issued the same day reported that the cyclone had maximum sustained winds of 80 knots (150 kilometers per hour) and gusts up to 100 knots (185 kilometers per hour). Although still strong, the wind speeds had significantly diminished over the previous few days. The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite captured this true-color image of the storm on March 18, 2010. North of the storm lie the Solomon Islands (shown in the high-resolution image). Southeast of the storm is New Caledonia. Ului’s eye appears to span 100 kilometers (60 miles) and the whole storm spans several hundred kilometers. As of 15:00 UTC on March 18 (2:00 a.m. on March 19 in Sydney, Australia), Ului was roughly 670 nautical miles (1,240 kilometers) east of Cairns, Australia. The JTWC reported that Ului had been moving southward and was expected to turn west and accelerate toward Australia. The JTWC forecast that Ului would make landfall over the northeastern Queensland coast and diminish over land. NASA image courtesy Jeff Schmaltz, MODIS Rapid Response Team at NASA GSFC. Caption by Michon Scott. Instrument: Terra - MODIS To learn more about this image go to: earthobservatory.nasa.gov/NaturalHazards/view.php?id=43180

  6. Development of Risk Uncertainty Factors from Historical NASA Projects

    NASA Technical Reports Server (NTRS)

    Amer, Tahani R.

    2011-01-01

    NASA is a good investment of federal funds and strives to provide the best value to the nation. Yet NASA has consistently budgeted against unrealistic cost estimates, as is evident in the cost growth of many of its programs. To date, NASA has used available uncertainty factors from the Aerospace Corporation, the Air Force, and Booz Allen Hamilton to develop project risk postures. NASA has no insight into the development of these factors and, as demonstrated here, this can lead to unrealistic risk estimates in many NASA programs and projects (P/p). The primary contribution of this project is the development of NASA mission uncertainty factors from actual historical NASA projects, to aid cost estimating as well as independent reviews that provide NASA senior management with the information and analysis needed to reach appropriate decisions regarding P/p. In general terms, this research project advances programmatic analysis for NASA projects.
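
    As an illustrative sketch (not NASA's actual method, and with hypothetical numbers), uncertainty factors of this kind can be derived as ratios of actual to estimated cost across historical projects, summarized at the percentiles used to set a risk posture:

        # Sketch: cost-growth uncertainty factors from historical projects.
        import numpy as np

        estimated = np.array([120., 450., 80., 300., 95., 210.])   # hypothetical estimates, $M
        actual    = np.array([150., 600., 95., 420., 110., 260.])  # hypothetical actuals, $M

        growth_factor = actual / estimated          # > 1 means cost growth
        p50, p70, p90 = np.percentile(growth_factor, [50, 70, 90])
        print(f"median factor {p50:.2f}, 70th pct {p70:.2f}, 90th pct {p90:.2f}")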

  7. Explore Earth Science Datasets for STEM with the NASA GES DISC Online Visualization and Analysis Tool, GIOVANNI

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Acker, J. G.; Kempler, S. J.

    2016-12-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of twelve NASA Science Mission Directorate (SMD) Data Centers that provide Earth science data, information, and services to research scientists, applications scientists, applications users, and students around the world. The GES DISC is the home (archive) of NASA Precipitation and Hydrology, as well as Atmospheric Composition and Dynamics remote sensing data and information. To facilitate Earth science data access, the GES DISC has been developing user-friendly data services for users at different levels. Among them, the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI, http://giovanni.gsfc.nasa.gov/) allows users to explore satellite-based data using sophisticated analyses and visualizations without downloading data and software, making it particularly suitable for novices using NASA datasets in STEM activities. In this presentation, we will briefly introduce GIOVANNI and recommend datasets for STEM. Examples of using these datasets in STEM activities will be presented as well.

  8. Explore Earth Science Datasets for STEM with the NASA GES DISC Online Visualization and Analysis Tool, Giovanni

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Acker, J.; Kempler, S.

    2016-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of twelve NASA Science Mission Directorate (SMD) Data Centers that provide Earth science data, information, and services to users around the world, including research and application scientists, students, citizen scientists, etc. The GES DISC is the home (archive) of remote sensing datasets for NASA Precipitation and Hydrology, Atmospheric Composition and Dynamics, etc. To facilitate Earth science data access, the GES DISC has been developing user-friendly data services for users at different levels in different countries. Among them, the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni, http://giovanni.gsfc.nasa.gov) allows users to explore satellite-based datasets using sophisticated analyses and visualization without downloading data and software, making it particularly suitable for novices (such as students) to use NASA datasets in STEM (science, technology, engineering and mathematics) activities. In this presentation, we will briefly introduce Giovanni along with examples for STEM activities.

  9. JANNAF 17th Propulsion Systems Hazards Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Cocchiaro, James E. (Editor); Gannaway, Mary T. (Editor); Rognan, Melanie (Editor)

    1998-01-01

    Volume 1, the first of two volumes is a compilation of 16 unclassified/unlimited technical papers presented at the 17th meeting of the Joint Army-Navy-NASA-Air Force (JANNAF) Propulsion Systems Hazards Subcommittee (PSHS) held jointly with the 35th Combustion Subcommittee (CS) and Airbreathing Propulsion Subcommittee (APS). The meeting was held on 7 - 11 December 1998 at Raytheon Systems Company and the Marriott Hotel, Tucson, AZ. Topics covered include projectile and shaped charge jet impact vulnerability of munitions; thermal decomposition and cookoff behavior of energetic materials; damage and hot spot initiation mechanisms with energetic materials; detonation phenomena of solid energetic materials; and hazard classification, insensitive munitions, and propulsion systems safety.

  10. Design and application analysis of prediction system of geo-hazards based on GIS in the Three Gorges Reservoir

    NASA Astrophysics Data System (ADS)

    Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun

    2009-10-01

    Although the Three Gorges Dam across the Yangtze River in China harnesses a huge source of hydroelectric power and reduces loss of life and damage from flooding, it also causes environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. To prevent and predict geo-hazards, establishing a geo-hazard prediction system is necessary. To implement functions for regional and urban geo-hazard prediction, single geo-hazard prediction, landslide-surge prediction, and risk evaluation, the logical layers of the system consist of a data capture layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because multi-source spatial data are involved, the transformation and fusion of multi-source data are also examined in the paper. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of the historical record of past landslides, terrain parameters, geology, rainfall, and anthropogenic activity. Detailed discussion addresses the spatial distribution characteristics of landslide hazard in the new town of Badong. These results can be used for risk evaluation. The system can serve as an early-warning and emergency management tool for the relevant authorities of the Three Gorges Reservoir.
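
    A minimal sketch of the information value method named above, assuming one rasterized factor layer and a landslide inventory mask (both hypothetical): the information value of a class is the log ratio of the landslide density within the class to the overall landslide density, and cell susceptibility is the sum of such values over all factor layers.

        # Sketch: information value method for landslide susceptibility.
        import numpy as np

        classes   = np.array([[0, 0, 1], [1, 2, 2], [2, 2, 0]])         # factor class per cell
        landslide = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]], bool)   # inventory mask

        total_cells  = classes.size
        total_slides = landslide.sum()
        iv = {}
        for c in np.unique(classes):
            in_class   = classes == c
            dens_class = landslide[in_class].sum() / in_class.sum()
            dens_total = total_slides / total_cells
            iv[c] = np.log(dens_class / dens_total) if dens_class > 0 else -np.inf
        print(iv)  # higher value = class more associated with past landslides
        # Susceptibility per cell = sum of IVs across all factor layers (one layer shown).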

  11. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of the particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP in the centroparietal region, as well as lower beta MF in the frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption.
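
    The three quantities analyzed in the study (AP, RP, MF) can be computed from a power spectral density. The following is a sketch under stated assumptions (a synthetic signal standing in for one EEG channel, an assumed 256 Hz sampling rate, and conventional 4-8 Hz theta and 13-30 Hz beta band edges):

        # Sketch: absolute power, relative power, and mean frequency per band
        # from a Welch power spectral density.
        import numpy as np
        from scipy.signal import welch

        fs = 256                                  # sampling rate, Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        eeg = np.random.randn(t.size)             # stand-in for one EEG channel, microvolts

        f, psd = welch(eeg, fs=fs, nperseg=fs * 2)
        df = f[1] - f[0]
        total_power = psd.sum() * df
        for name, (lo, hi) in {"theta": (4, 8), "beta": (13, 30)}.items():
            m = (f >= lo) & (f < hi)
            ap = psd[m].sum() * df                        # absolute power in band
            rp = ap / total_power                         # relative power
            mf = np.sum(f[m] * psd[m]) / np.sum(psd[m])   # mean frequency, Hz
            print(f"{name}: AP={ap:.3f}, RP={rp:.3f}, MF={mf:.2f} Hz")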

  12. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran.

    PubMed

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which makes GIS an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening process to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.
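
    A conceptual sketch of the two-stage GIS/MCDA screening described above: exclusionary criteria mask out land, then a weighted sum of suitability scores ranks what remains. The layers, scores, and weights here are hypothetical.

        # Sketch: exclusionary masking followed by weighted-overlay suitability.
        import numpy as np

        # Stage 1: exclusionary criteria (True = unsuitable, e.g. near water, steep slope)
        excluded = np.array([[1, 0, 0], [0, 0, 1], [0, 0, 0]], bool)

        # Stage 2: non-exclusionary criteria scored 0-1, with expert weights summing to 1
        dist_road = np.array([[0.2, 0.8, 0.6], [0.4, 0.9, 0.1], [0.5, 0.7, 0.3]])
        geology   = np.array([[0.9, 0.5, 0.4], [0.6, 0.8, 0.2], [0.3, 0.9, 0.6]])
        weights = {"road": 0.4, "geology": 0.6}

        suitability = weights["road"] * dist_road + weights["geology"] * geology
        suitability[excluded] = np.nan            # masked cells drop out of the ranking
        print(suitability)                        # higher = more suitable candidate site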

  13. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information, which makes GIS an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening process to eliminate unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.

  14. NECAP 4.1: NASA's Energy-Cost Analysis Program input manual

    NASA Technical Reports Server (NTRS)

    Jensen, R. N.

    1982-01-01

    The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool which has embodied within it state of the art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating the heat transfer through the building surfaces which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
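
    The response factor technique highlighted above computes conductive heat flux as a convolution of precomputed factors with the temperature-difference history, which is how the building's thermal mass enters the load calculation. The following is an illustrative sketch of that idea only (not NECAP's code; the factors and temperatures are hypothetical):

        # Sketch: wall heat flux as a discrete convolution of response factors
        # with the history of outdoor-indoor temperature differences.
        import numpy as np

        X  = np.array([3.2, -1.1, -0.6, -0.3, -0.1])    # hypothetical response factors, W/m^2-K
        dT = np.array([5.0, 6.0, 8.0, 7.0, 4.0, 3.0])   # hourly temperature differences, K

        # Flux at hour t sums X[j] * dT[t - j]; numpy's convolution does this directly.
        q = np.convolve(dT, X)[: dT.size]
        print(q)   # W/m^2 at each hour, reflecting the wall's stored heat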

  15. Progress Towards the Remote Sensing of Aircraft Icing Hazards

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew; Brinker, David; Politovich, Marcia; Serke, David; Ryerson, Charles; Pazmany, Andrew; Solheim, Fredrick

    2009-01-01

    NASA has teamed with the FAA, DoD, industry, and academia for research into the remote detection and measurement of atmospheric conditions leading to aircraft icing hazards. The ultimate goal of this effort is to provide pilots, controllers, and dispatchers sufficient information to allow aircraft to avoid or minimize their exposure to the hazards of in-flight icing. Since the hazard of in-flight icing is the outcome of aircraft flight through clouds containing supercooled liquid water and strongly influenced by the aircraft's speed and configuration and by the length of exposure, the hazard cannot be directly detected, but must be inferred based upon the measurement of conducive atmospheric conditions. Therefore, icing hazard detection is accomplished through the detection and measurement of liquid water in regions of measured sub-freezing air temperatures. The icing environment is currently remotely measured from the ground with a system fusing radar, lidar, and multifrequency microwave radiometer sensors. Based upon expected ice accretion severity for the measured environment, a resultant aircraft hazard is then calculated. Because of the power, size, weight, and view angle constraints of airborne platforms, the current ground-based solution is not applicable for flight. Two current airborne concepts are based upon the use of either multifrequency radiometers or multifrequency radar. Both ground-based and airborne solutions are required for the future since ground-based systems can provide hazard detection for all aircraft in airport terminal regions while airborne systems will be needed to provide equipped aircraft with flight path coverage between terminal regions.
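
    The inference stated above (liquid water coexisting with sub-freezing air) reduces to a simple rule, sketched here with hypothetical profiles and an assumed liquid water threshold; any operational system would instead map the measured environment to an accretion-severity estimate:

        # Sketch: flag an icing hazard where liquid water coexists with sub-freezing air.
        import numpy as np

        liquid_water = np.array([0.00, 0.15, 0.40, 0.05])   # g/m^3 per altitude bin (hypothetical)
        air_temp_c   = np.array([2.0, -3.0, -8.0, -20.0])   # deg C per altitude bin (hypothetical)

        icing_hazard = (liquid_water > 0.05) & (air_temp_c < 0.0)
        print(icing_hazard)   # True where conditions conducive to icing are inferred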

  16. Comparative analysis of hazardous household waste in two Mexican regions.

    PubMed

    Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana

    2007-01-01

    Household hazardous waste (HHW) generation in two Mexican regions was examined, a northern region (bordering with the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to be able to compare the results of both regions, regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively. In order to perform this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give origin to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste, the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%) and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were represented by cleaning products (39%), self care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of the income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables such as local climate, migration patterns and marketing coverage. Further research is needed in order to establish the effect of low quantities of HHW upon the environment and public health.

  17. A Subject Reference: Benefit-Cost Analysis of Toxic Substances, Hazardous Materials and Solid Waste Control (1977)

    EPA Pesticide Factsheets

    Discusses methodological issues in conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control.

  18. An inlet analysis for the NASA hypersonic research engine aerothermodynamic integration model

    NASA Technical Reports Server (NTRS)

    Andrews, E. H., Jr.; Russell, J. W.; Mackley, E. A.; Simmonds, A. L.

    1974-01-01

    A theoretical analysis for the inlet of the NASA Hypersonic Research Engine (HRE) Aerothermodynamic Integration Model (AIM) has been undertaken by use of a method-of-characteristics computer program. The purpose of the analysis was to obtain pretest information on the full-scale HRE inlet in support of the experimental AIM program (completed May 1974). Mass-flow-ratio and additive-drag-coefficient schedules were obtained that well defined the range achieved in the AIM tests. Mass-weighted average inlet total-pressure recovery, kinetic energy efficiency, and throat Mach numbers were obtained.
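
    For the quantities named above, the standard definitions are sketched below (these are the conventional forms; the report's exact formulations may differ). A mass-weighted average weights the local quantity by the mass flow through each streamtube, and the mass-flow ratio compares captured flow to freestream flow through the capture area:

        \[
        \bar{\eta}_{p_t} \;=\; \frac{\int \left( p_{t,2}/p_{t,\infty} \right)\, d\dot{m}}{\int d\dot{m}},
        \qquad
        \text{mass-flow ratio} \;=\; \frac{\dot{m}_{\text{captured}}}{\rho_\infty V_\infty A_c},
        \]

    where \(p_{t,2}\) is the local throat total pressure, \(p_{t,\infty}\) the freestream total pressure, and \(A_c\) the inlet capture area.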

  19. Advancing the citizen scientist's contributions to documenting and understanding natural hazards: a proof of concept for linking crowdsourced and remotely sensed data on landslide hazards in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Markert, K. N.

    2017-12-01

    Scientists, practitioners, policymakers, and citizen groups share a role in ensuring "that all sectors have access to, understand and can use scientific information for better informed decision-making" (Sendai Framework 2015-2030). When it comes to understanding hazards and exposure, inventories of disaster events are often limited. Thus, there are many opportunities for citizen scientists to engage in improving the collective understanding, and ultimately reduction, of disaster risk. Landslides are very difficult to forecast on spatial and temporal scales meaningful for early warning and evacuation. Heuristic hazard mapping methods are very common in regional hazard zonation and rely on expert knowledge of previous events and local conditions, but they often lack a temporal component. As new data analysis packages become more open and accessible, probabilistic approaches that consider high-resolution spatial and temporal dimensions are becoming more common, but this is only possible when rich inventories of landslide events exist. The work presented offers a proof of concept on incorporating crowd-sourced data to improve landslide hazard model performance. Starting with a national inventory of 90 catalogued landslides in El Salvador for a study period of 1998 to 2011, we simulate the addition of over 600 additional crowd-sourced landslide events that would have been identified through human interpretation of high resolution imagery in the Google Earth time slider feature. There is a noticeable improvement in performance statistics between static heuristic hazard models and probabilistic models that incorporate the events identified by the "crowd." Such a dynamic incorporation of crowd-sourced data on hazard events is not so far-fetched. Given the engagement of "local observers" in El Salvador who augment in situ hydro-meteorological measurements, the growing access to Earth observation data to the lay person, and immense interest behind connecting citizen
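
    A sketch of the kind of performance comparison described above: scoring a heuristic and a probabilistic susceptibility model against an event inventory with ROC AUC (scikit-learn assumed; labels and scores below are hypothetical, not the study's data):

        # Sketch: comparing two hazard models against observed events via ROC AUC.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        observed = np.array([0, 0, 1, 1, 0, 1, 0, 1])   # did a landslide occur at each site?
        heuristic_score     = np.array([.2, .4, .5, .6, .3, .5, .2, .7])
        probabilistic_score = np.array([.1, .3, .7, .8, .2, .9, .3, .8])

        print("heuristic AUC:    ", roc_auc_score(observed, heuristic_score))
        print("probabilistic AUC:", roc_auc_score(observed, probabilistic_score))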

  20. Using S’COOL and MY NASA DATA to Support Language Arts Instruction: Overview and Analysis

    NASA Astrophysics Data System (ADS)

    Moore, S. W.; Rogerson, T. M.; Chambers, L. H.; Fischer, J. D.; Oots, P. C.; Lewis, P. M.

    2009-12-01

    choose correct terminology and compose hypotheses about possible patterns and anomalies. This paper will provide an overview and analysis of the use of S’COOL and MY NASA DATA to support language arts instruction.

  1. Analysis of Operational Hazards and Safety Requirements for Traffic Aware Strategic Aircrew Requests (TASAR)

    NASA Technical Reports Server (NTRS)

    Koczo, Stefan, Jr.

    2013-01-01

    Safety analyses of the Traffic Aware Strategic Aircrew Requests (TASAR) Electronic Flight Bag (EFB) application are provided to establish its Failure Effects Classification which affects certification and operational approval requirements. TASAR was developed by NASA Langley Research Center to offer flight path improvement opportunities to the pilot during flight for operational benefits (e.g., reduced fuel, flight time). TASAR, using own-ship and network-enabled information concerning the flight and its environment, including weather and Air Traffic Control (ATC) system constraints, provides recommended improvements to the flight trajectory that the pilot can choose to request via Change Requests to ATC for revised clearance. This study reviews the Change Request process of requesting updates to the current clearance, examines the intended function of TASAR, and utilizes two safety assessment methods to establish the Failure Effects Classification of TASAR. Considerable attention has been given in this report to the identification of operational hazards potentially associated with TASAR.

  2. Radiation Hazards and Countermeasures for Human Space Flight

    NASA Technical Reports Server (NTRS)

    Adams, James

    2004-01-01

    The protection of astronauts from the hazards of ionizing radiation in space is a moral and legal obligation of NASA. If there are to be manned deep-space missions, means must be found to provide this protection. There are two parts to providing this protection: understanding the effects of space radiation on humans so that radiation exposure limits can be established; and developing countermeasures so that exposures can be kept below these limits. This talk will cover both parts of this problem.

  3. NASA Past, Present, and Future: The Use of Commercial Off The Shelf (COTS) Electronics in Space

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Guertin, Steven M.

    2017-01-01

    NASA has a long history of using commercial grade electronics in space. In this presentation we will provide a brief history of NASA's trends and approaches to commercial grade electronics focusing on processing and memory systems. This will include providing summary information on the space hazards to electronics as well as NASA mission trade space. We will also discuss developing recommendations for risk management approaches to Electrical, Electronic and Electromechanical (EEE) parts usage in space. Two examples will be provided focusing on a near-earth Polar-orbiting spacecraft as well as a mission to Mars. The final portion will discuss emerging trends impacting usage.

  4. The Spaceguard Survey: Report of the NASA International Near-Earth-Object Detection Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, David (Editor)

    1992-01-01

    Impacts by Earth-approaching asteroids and comets pose a significant hazard to life and property. Although the annual probability of the Earth being struck by a large asteroid or comet is extremely small, the consequences of such a collision are so catastrophic that it is prudent to assess the nature of the threat and to prepare to deal with it. The first step in any program for the prevention or mitigation of impact catastrophes must involve a comprehensive search for Earth-crossing asteroids and comets and a detailed analysis of their orbits. At the request of the U.S. Congress, NASA has carried out a preliminary study to define a program for dramatically increasing the detection rate of Earth-crossing objects, as documented in this workshop report.

  5. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
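
    The core computation described above, combining multiple sources in a consistent probabilistic framework, can be sketched under a standard Poissonian occurrence assumption: annual exceedance rates from independent sources add, and the exposure-time probability follows from the total rate. The sources and rates below are hypothetical.

        # Sketch: probability of exceeding a tsunami intensity level within an exposure time.
        import numpy as np

        # Hypothetical annual rates at which each source produces run-up > 2 m at a site
        rates = {"subduction_eq": 1e-3, "landslide": 2e-4, "volcanic": 5e-5}

        lam = sum(rates.values())           # total annual exceedance rate (independent sources)
        T = 50.0                            # exposure time, years
        p_exceed = 1.0 - np.exp(-lam * T)   # Poisson probability of at least one exceedance
        print(f"P(run-up > 2 m within {T:.0f} yr) = {p_exceed:.3f}")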

  6. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  7. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
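
    The peak-over-threshold estimation named above can be sketched as follows, assuming scipy and hypothetical rainfall data: fit a generalized Pareto distribution to the excesses over a threshold, then invert it for a return level (valid for a non-zero fitted shape parameter).

        # Sketch: peak-over-threshold frequency estimation with a generalized Pareto fit.
        import numpy as np
        from scipy.stats import genpareto

        rain = np.array([12., 95., 40., 130., 22., 160., 75., 18., 110., 55.])  # daily maxima, mm
        u = 50.0                                      # threshold, mm
        excess = rain[rain > u] - u
        xi, _, sigma = genpareto.fit(excess, floc=0)  # GPD shape and scale

        lam = excess.size / 10.0      # exceedances per year (assume 10 years of record)
        R = 100.0                     # target return period, years
        x_R = u + sigma / xi * ((lam * R) ** xi - 1.0)   # return level, for xi != 0
        print(f"{R:.0f}-yr daily rainfall ~ {x_R:.0f} mm")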

  8. 1995 NASA High-Speed Research Program Sonic Boom Workshop. Volume 2; Configuration Design, Analysis, and Testing

    NASA Technical Reports Server (NTRS)

    Baize, Daniel G. (Editor)

    1999-01-01

    The High-Speed Research Program and NASA Langley Research Center sponsored the NASA High-Speed Research Program Sonic Boom Workshop on September 12-13, 1995. The workshop was designed to bring together NASA's scientists and engineers and their counterparts in industry, other Government agencies, and academia working together in the sonic boom element of NASA's High-Speed Research Program. Specific objectives of this workshop were to: (1) report the progress and status of research in sonic boom propagation, acceptability, and design; (2) promote and disseminate this technology within the appropriate technical communities; (3) help promote synergy among the scientists working in the Program; and (4) identify technology pacing the development of viable reduced-boom High-Speed Civil Transport concepts. The Workshop was organized in four sessions: Session 1 - Sonic Boom Propagation (Theoretical); Session 2 - Sonic Boom Propagation (Experimental); Session 3 - Acceptability Studies (Human and Animal); and Session 4 - Configuration Design, Analysis, and Testing.

  9. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  10. An Improved Version of the NASA-Lockheed Multielement Airfoil Analysis Computer Program

    NASA Technical Reports Server (NTRS)

    Brune, G. W.; Manke, J. W.

    1978-01-01

    An improved version of the NASA-Lockheed computer program for the analysis of multielement airfoils is described. The predictions of the program are evaluated by comparison with recent experimental high lift data including lift, pitching moment, profile drag, and detailed distributions of surface pressures and boundary layer parameters. The results of the evaluation show that the contract objectives of improving program reliability and accuracy have been met.

  11. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Richey, Christina; Bernstein, Max; Rall, Jonathan

    2015-01-01

    Introduction: NASA's Planetary Science Division (PSD) solicits its Research and Analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD will be changing the structure of the program elements under which the majority of planetary science R&A is done. Major changes include the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2014 submission changes: All PSD programs will use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed by the proposal, a brief description of the methodology to be used to address the science goals and objectives, and the relevance of the proposed research to the call submitted to. Additional Information: Additional details will be provided on the Cassini Data Analysis Program, the

  12. NDARC NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne R.

    2009-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single main-rotor and

  13. NDARC - NASA Design and Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2015-01-01

    The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail
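
    The component-sum architecture both abstracts describe (aircraft attributes obtained from the sum of component attributes) can be illustrated with a conceptual sketch; this is not NDARC source code, and the component names and values are hypothetical:

        # Sketch: an aircraft as a set of components whose attributes sum to totals.
        from dataclasses import dataclass

        @dataclass
        class Component:
            name: str
            weight_kg: float
            drag_area_m2: float

        @dataclass
        class Aircraft:
            components: list

            def weight(self):      # aircraft attribute = sum of component attributes
                return sum(c.weight_kg for c in self.components)

            def drag_area(self):
                return sum(c.drag_area_m2 for c in self.components)

        acft = Aircraft([
            Component("fuselage",   900.0, 0.50),
            Component("main rotor", 450.0, 0.30),
            Component("propulsion", 600.0, 0.15),
        ])
        print(acft.weight(), acft.drag_area())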

  14. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  15. Severe Air Pollution in New Delhi Viewed by NASA MISR

    NASA Image and Video Library

    2016-11-16

    New Delhi, India's capital city, is currently suffering through a period of particularly poor air quality. In early November 2016, monitors at various locations in the area posted air quality index measurements as high as the 900s (the most severe ranking, "hazardous," is any air quality index measurement over 300). Thousands of schools have been closed, and a survey by the Associated Chambers of Commerce and Industry of India reports that 10 percent of the city's workers called in sick due to air-pollution-related health issues. According to several published news reports, the extreme air pollution may be due to a combination of nearby agricultural burning after harvest, urban construction and solid-waste burning, as well as remnants of firecracker smoke and additional car emissions after the celebration of Diwali, the Hindu festival of lights, on October 30. The Multi-angle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite passed over the region on Saturday, Nov. 5, 2016, at around 11:05 a.m. local time. At left is an image acquired from MISR's vertical viewing camera. The Himalayas stretch across the northern portion of the image. This towering mountain range tends to concentrate pollution in the region immediately to the south, including New Delhi, by preventing pollutants from blowing northwards. New Delhi, whose location is indicated on the image, is under a patch of especially thick haze. At 6:00 a.m. local time on that date, the U.S. Mission India NowCast Air Quality Index for New Delhi was reported at 751, more than twice the threshold for hazardous air quality. At right, a map of aerosol optical depth is superimposed on the image. Optical depth is a quantitative measure of the abundance of aerosols (tiny particles in the atmosphere). Optical depths for the area around New Delhi have not been calculated because the haze is so thick that the algorithm has classified the area as a cloud. In the region immediately surrounding the thick

  16. Joint NASA/EPA AVIRIS Analysis in the Chesapeake Bay Region: Plans and Initial Results

    NASA Technical Reports Server (NTRS)

    Johnson, Lee; Stokely, Peter; Lobitz, Brad; Shelton, Gary

    1998-01-01

    NASA's Ames Research Center is performing an AVIRIS demonstration project in conjunction with the U.S. Environmental Protection Agency (Region 3). NASA and EPA scientists have jointly defined a Study Area in eastern Virginia to include portions of the Chesapeake Bay, southern Delmarva Peninsula, and the mouths of the York and James Rivers. Several environmental issues have been identified for study. These include, by priority: 1) water constituent analysis in the Chesapeake Bay, 2) mapping of submerged aquatic vegetation in the Bay, 3) detection of vegetation stress related to Superfund sites at the Yorktown Naval Weapons Station, and 4) wetland species analysis in the York River vicinity. In support of this project, three lines of AVIRIS data were collected during the Wallops Island deployment on 17 August 1997. The remote sensing payload included AVIRIS, the MODIS Airborne Simulator, and an RC-10 color infrared film camera. The AVIRIS data were delivered to Ames from the JPL AVIRIS Data Facility on 29 September 1997. Quicklook images indicate nominal data acquisition, and at the current time an atmospheric correction is being applied. Water constituent analysis of the Bay is our highest priority based on EPA interest and available collateral data, both from the surface and from other remote sensing instruments. Constituents of interest include suspended sediments, chlorophyll-a, and accessory pigments. Analysis steps will include: verification of data quality, location of study sites in imagery, incorporation of relevant field data from EPA and other Chesapeake Bay cooperators, processing of imagery to show phenomena of interest, and verification of results with cooperators. By 1st quarter CY98 we plan to circulate initial results to NASA and EPA management for review. In the longer term we will finalize documentation, prepare results for publication, and complete any needed technology transfer to EPA remote sensing personnel.

  17. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon

    2008-01-01

    NASA's Agency-wide resource for identifying and managing risks associated with changing environmental regulations. Goals of the RRAC PC: 1) Proactively detect, analyze, and communicate environmental regulatory risks to NASA Programs and facilities; 2) Communicate with regulators and participate in the mitigation of such risks; and 3) Provide centralized support on emerging regulations to the NASA HQ Environmental Management Division. When significant regulatory changes are identified, timely communication is essential: communication of changing requirements to the regulatory stakeholders (NASA Programs and facilities), and communication of potential issues to management and, when appropriate, back to the regulating agency.

  18. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards remain considerable, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard-process side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments and bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined with empirical functions, i.e. relations between process intensities and the extent of losses gathered from the analysis of historic hazard events and object-specific restoration values. This approach is not a physics-based, integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research toward an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are modelled experimentally, with material intrusion processes explicitly considered. Dynamic impacts are measured quantitatively and spatially distributed by use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  19. [Re-analysis of occupational hazards in foundry].

    PubMed

    Zhang, Min; Qi, Cheng; Chen, Wei-Hong; Lu, Yang; Du, Xie-Yi; Li, Wen-Jie; Meng, Chuan-San

    2010-04-01

    To analyze systematically the characteristics of occupational hazards in the foundry and provide precise data for epidemiology studies and for the control of occupational hazards in the foundry. Data on airborne dust, chemical occupational hazards, and physical occupational agents in the foundry environment from 1978 to 2008 were collected dynamically. Mean concentrations and intensities (geometric means) of occupational hazards were calculated by job for different years. The main occupational hazards in the foundry were silica, metal fume, noise, and heat stress. Silica was present in all main jobs. The mean concentration of silica before 1986 was extremely high at 8.6 mg/m(3), and dropped markedly after 1986, to 2.4 mg/m(3) from 1986 to 1989, 2.7 mg/m(3) from 1990 to 2002, and 2.7 mg/m(3) from 2003 to 2008. The trend of silica concentrations by job was consistent with the overall trend. Silica concentrations differed significantly among jobs, with the highest level in melting (4.4 mg/m(3)), followed by cast shakeout and finishing (3.4 mg/m(3)), pouring (3.4 mg/m(3)), sand preparation (2.4 mg/m(3)), moulding (2.1 mg/m(3)), and core-making (1.7 mg/m(3)). The concentration of respirable dust was highest in pouring (2.76 mg/m(3)), followed by cast shakeout and finishing (1.14 mg/m(3)). The mean concentration of asbestos dust in melting was relatively high, at 2.0 mg/m(3). In core-making and sand preparation, adhesive emission products were present, with mean concentrations as follows: ammonia (5.84 mg/m(3)), formaldehyde (0.60 mg/m(3)), phenol (1.73 mg/m(3)), and phenol formaldehyde resin (1.3 mg/m(3)). Benzene and its homologues were present in cast shakeout and finishing, with benzene, toluene, and xylene at 0.2 mg/m(3), 0.1 mg/m(3), and 1.3 mg/m(3), respectively. In pouring and melting, there existed chemical occupational hazards, including benzo(a)pyrene, metal fume (lead, cadmium, manganese, nickel, chromium) and gas

  20. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    PubMed

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards (the pathogens previously found in Nham), sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause a chemical hazard to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
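
    The critical limits above map naturally onto simple automated checks. The sketch below is a hypothetical illustration (the batch record and function names are ours); the limits themselves -- nitrite at 100-200 ppm, pH below 4.6, and visual metal inspection at stuffing -- come from the abstract.

      # Check a Nham batch against the HACCP critical limits described above.
      from dataclasses import dataclass

      @dataclass
      class NhamBatch:
          nitrite_ppm: float          # measured at the weighing step
          ph: float                   # measured after fermentation
          metal_clip_inspected: bool  # visual inspection during stuffing

      def ccp_violations(batch):
          # Return a list of critical-control-point violations for a batch.
          issues = []
          if not 100.0 <= batch.nitrite_ppm <= 200.0:
              issues.append("nitrite outside the 100-200 ppm critical limit")
          if batch.ph >= 4.6:
              issues.append("pH not below 4.6 after fermentation")
          if not batch.metal_clip_inspected:
              issues.append("stuffing step missing visual metal inspection")
          return issues

      print(ccp_violations(NhamBatch(nitrite_ppm=150.0, ph=4.4, metal_clip_inspected=True)))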

  1. NASA's Myriad Uses of Digital Video

    NASA Technical Reports Server (NTRS)

    Grubbs, Rodney; Lindblom, Walt; George, Sandy

    1999-01-01

    Since its inception, NASA has created many of the most memorable images seen this century. From the fuzzy video of Neil Armstrong taking that first step on the moon, to images of the Mars surface available to all on the internet, NASA has provided images to inspire a generation, all because a scientist or researcher had a requirement to see something unusual. Digital Television technology will give NASA unprecedented new tools for acquiring, analyzing, and distributing video. This paper will explore NASA's DTV future. The agency has a requirement to move video from one NASA Center to another in real time. Specifics will be provided relating to the NASA video infrastructure, including video from the Space Shuttle and from the various Centers. A comparison of the pros and cons of interlaced and progressive scanned images will be presented. Film is a major component of NASA's image acquisition for analysis usage. The future of film within the context of DTV will be explored.

  2. How much do hazard mitigation plans cost? An analysis of federal grant data.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    Under the Disaster Mitigation Act of 2000 and the Federal Emergency Management Agency's subsequent Interim Final Rule, the requirement was placed on local governments to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. Low completion percentages for HMPs--less than one-third of eligible governments--were found by an analysis conducted 3 years after the legislation's final deadline. Follow-up studies showed little improvement at 5 and 8 years after the deadline. It was hypothesized that the cost of an HMP is a significant factor in determining whether or not a plan is completed. A study was conducted using Boolean Matrix Analysis methods to determine what, if any, characteristics of a community most influence the cost of an HMP. The frequency of natural hazards experienced by the planning area, the number of jurisdictions participating in the HMP, the population, and the population density were found to significantly affect cost. These variables were used in a regression analysis to determine their predictive power for cost. It was found that, along with two interaction terms, the variables explain approximately half the variation in HMP cost.
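
    A hedged sketch of this kind of regression follows, using ordinary least squares on synthetic data. The variable definitions and the two interaction terms are illustrative assumptions; the study's actual data and model specification are not reproduced here.

      # OLS regression of HMP cost on community characteristics plus two
      # hypothetical interaction terms, fitted on synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      freq = rng.poisson(3, n).astype(float)        # natural-hazard frequency
      juris = rng.integers(1, 20, n).astype(float)  # participating jurisdictions
      pop = rng.lognormal(10, 1, n)                 # population
      dens = rng.lognormal(5, 1, n)                 # population density

      X = np.column_stack([
          np.ones(n), freq, juris, pop, dens,
          freq * juris,  # hypothetical interaction 1
          pop * dens,    # hypothetical interaction 2
      ])
      true_beta = np.array([5e4, 2e3, 1e3, 0.05, 10.0, 500.0, 1e-6])
      cost = X @ true_beta + rng.normal(0, 2e4, n)  # synthetic plan costs

      beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
      r2 = 1 - (cost - X @ beta).var() / cost.var()
      print(f"R^2 = {r2:.2f}")  # compare: the paper reports ~half the variation explained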

  3. Human-like robots for space and hazardous environments

    NASA Technical Reports Server (NTRS)

    Cogley, Allen; Gustafson, David; White, Warren; Dyer, Ruth; Hampton, Tom (Editor); Freise, Jon (Editor)

    1990-01-01

    The three year goal for this NASA Senior Design team is to design and build a walking autonomous robotic rover. The rover should be capable of rough terrain crossing, traversing human made obstacles (such as stairs and doors), and moving through human and robot occupied spaces without collision. The rover is also to evidence considerable decision making ability, navigation and path planning skills. These goals came from the concept that the robot should have the abilities of both a planetary rover and a hazardous waste site scout.

  4. Human-like robots for space and hazardous environments

    NASA Astrophysics Data System (ADS)

    Cogley, Allen; Gustafson, David; White, Warren; Dyer, Ruth; Hampton, Tom; Freise, Jon

    The three year goal for this NASA Senior Design team is to design and build a walking autonomous robotic rover. The rover should be capable of rough terrain crossing, traversing human made obstacles (such as stairs and doors), and moving through human and robot occupied spaces without collision. The rover is also to evidence considerable decision making ability, navigation and path planning skills. These goals came from the concept that the robot should have the abilities of both a planetary rover and a hazardous waste site scout.

  5. NASA Update

    NASA Image and Video Library

    2010-04-08

    "NASA Update" program with NASA Administrator Charles Bolden, NASA Deputy Administrator Lori Garver and NASA Acting Asistant Administrator for Public Affairs Bob Jacobs as moderator, NASA Headquarters, Thursday, April 8, 2010 in Washington. Photo Credit: (NASA/Bill Ingalls)

  6. Demonstration of Hazardous Hypervelocity Test Capability

    NASA Technical Reports Server (NTRS)

    Rodriquez, Karen M.

    1991-01-01

    NASA Johnson Space Center (JSC) White Sands Test Facility (WSTF) participated in a joint test program with the NASA JSC Hypervelocity Impact Research Laboratory (HIRL) to determine if JSC was capable of performing hypervelocity impact tests on hazardous targets. Seven pressurized vessels were evaluated under hypervelocity impact conditions. The vessels were tested with various combinations of liquids and gases at various pressures. Results from the evaluation showed that vessels containing 100-percent pressurized gas sustained more severe damage, and had a higher potential for damaging nearby equipment, than vessels containing 75-percent liquid and 25-percent inert pressurized gas. Two water-filled test vessels, one of which was placed behind an aluminum shield, failed by bulging and splitting open at the impact point; pressure was relieved without the vessel fragmenting or sustaining internal damage. An additional water-filled test vessel, placed a greater distance behind an aluminum shield, sustained damage that resembled a shotgun blast, but did not bulge or split open; again, pressure was relieved without the vessel fragmenting. Two test vessels containing volatile liquids (nitromethane and hydrazine) also failed by bulging and splitting open; neither liquid detonated under hypervelocity test conditions. A test vessel containing nitrogen gas failed by relieving pressure through a circular entry hole; multiple small penetrations opposite the point of entry provided high velocity target debris to surrounding objects. A high-pressure oxygen test vessel fragmented upon impact; the ensuing fire and high velocity fragments caused secondary damage to surrounding objects. The results from the evaluation of the pressurized vessels indicated that JSC is capable of performing hypervelocity impact tests on hazardous targets.

  7. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze the environment threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  8. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  9. NASA #801 and NASA 7 on ramp

    NASA Technical Reports Server (NTRS)

    1997-01-01

    NASA N801NA and NASA 7 together on the NASA Dryden ramp. The Beechcraft Beech 200 Super King Air aircraft N7NA, known as NASA 7, has been a support aircraft for many years, flying 'shuttle' missions to Ames Research Center. It once flew from the Jet Propulsion Laboratory and back each day but now (2001) flies between the Dryden Flight Research Center and Ames. A second Beechcraft Beech 200 Super King Air, N701NA, redesignated N801NA, was transferred to Dryden on 3 Oct. 1997 and is used for research missions but substitutes for NASA 7 on shuttle missions when NASA 7 is not available.

  10. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parisi, Carlo; Prescott, Steve; Ma, Zhegang

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  11. Hazards and hazard combinations relevant for the safety of nuclear power plants

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    exclusive (e.g., extremely high air temperature and surface ice). Our dataset further provides information on hazard combinations which are more likely to occur than just by random coincidence. In total, 577 correlations between individual hazards are identified by expert opinion and shown in a cross-correlation chart. Combinations discriminate between: (1) causally connected hazards (cause-effect relation), where one hazard (e.g., coastal erosion) may be caused by another hazard (e.g., storm surge), or where one hazard (e.g., high wind) is a prerequisite for a correlated hazard (e.g., storm surge); the identified causal links are not commutative. (2) Associated hazards ("contemporary" events), which are likely to occur at the same time due to a common root cause (e.g., a cold front of a meteorological low pressure area which leads to a drop of air pressure, high wind, thunderstorm, lightning, heavy rain and hail); the root cause may not necessarily be regarded as a hazard by itself. The hazard list and the hazard correlation chart may serve as a starting point for the hazard analysis process for nuclear installations in Level 1 PSA as outlined by IAEA (2010), the definition of design basis for nuclear reactors, and the assessment of design extension conditions as required by WENRA-RHWG (2014). It may further be helpful for the identification of hazard combinations and hazard cascades which threaten other critical infrastructure. References: Decker, K. & Brinkman, H., 2017. List of external hazards to be considered in extended PSA. Report No. ASAMPSA_E/WP21/D21.2/2017-41 - IRSN/ PSN-RES/SAG/2017-00011 IAEA, 2010. Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants. Safety Guide No. SSG-3, Vienna. http://www-pub.iaea.org/books/ WENRA-RHWG, 2014. WENRA Safety Reference Levels for Existing Reactors. Update in Relation to Lessons Learned from TEPCO Fukushima Dai-Ichi Accident. http://www.wenra.org/publications/

  12. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  13. Proceedings of the NASA Conference on Materials for Improved Fire Safety

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The Manned Spacecraft Center was pleased to host the NASA Conference on Materials for Improved Fire Safety which was held on May 6 and 7, 1970. This document is a compilation of papers presented at the conference and represents the culmination of several years of effort by NASA and industry which was directed toward the common objective of minimizing the fire hazard in manned spacecraft and in some other related areas. One of the more serious problem areas in the Apollo program was the flammability of nonmetallic materials. The effective and timely solution of this problem area resulted from much of the effort reported herein and contributed greatly toward the successful achievement of landing men on the moon and returning them safely to earth.

  14. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the multi-factor analysis and calculation. Considering the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, while the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters.
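
    As an illustration of the copula machinery described above, the sketch below computes two-dimensional joint return periods from a Frank copula. The dependence parameter theta and the marginal non-exceedance probabilities are placeholders; in the study they would be fitted to the observed events.

      # Frank copula CDF and the standard "OR"/"AND" joint return periods.
      import math

      def frank_copula(u, v, theta):
          # C(u, v) = -(1/theta) ln(1 + (e^{-theta u}-1)(e^{-theta v}-1)/(e^{-theta}-1))
          num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
          den = math.exp(-theta) - 1.0
          return -math.log(1.0 + num / den) / theta

      def joint_return_periods(u, v, theta, mu_years=1.0):
          # mu_years is the mean interarrival time of events.
          c = frank_copula(u, v, theta)
          t_or = mu_years / (1.0 - c)           # either factor exceeds its level
          t_and = mu_years / (1.0 - u - v + c)  # both factors exceed their levels
          return t_or, t_and

      # Example: 0.9 non-exceedance in both factors, hypothetical theta = 5.
      print(joint_return_periods(0.9, 0.9, theta=5.0))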

  15. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
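
    A small sketch of the graph-analysis step (task 3) follows. The node names and edges are hypothetical; in the tool set they would be extracted from text and library information.

      # Enumerate possible paths from hazard sources to vulnerable entities.
      import networkx as nx

      g = nx.DiGraph()
      g.add_edges_from([
          ("hydrazine_leak", "valve_controller"),   # hazard source -> component
          ("valve_controller", "abort_sequencer"),  # system-software interaction
          ("abort_sequencer", "crew_module"),       # vulnerable entity
          ("sensor_fault", "abort_sequencer"),      # anomalous configuration path
      ])

      for src in ["hydrazine_leak", "sensor_fault"]:   # hazard sources
          for tgt in ["crew_module"]:                  # vulnerable entities
              for path in nx.all_simple_paths(g, src, tgt):
                  # Each path is a candidate scenario for integration testing.
                  print(" -> ".join(path))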

  16. Benefit from NASA

    NASA Image and Video Library

    2001-10-01

    Technology used to provide thermal protection for Apollo astronauts and spacecraft components provides firefighters with better protective clothing and equipment. Spinoffs include a portable firefighting module, protective clothing for workers in hazardous environments, fire-retardant paints and foams, fire-blocking coating for outdoor structures, and flame-resistant fabric. Perhaps the farthest reaching is the breathing apparatus worn by firefighters throughout the U.S. for protection against smoke inhalation injury. The breathing system weighs approximately 20 pounds, one-third less than past systems, and it enables the wearer to have improved mobility. It consists of a face mask, frame and harness, a warning device, and an air bottle. The basic air cylinder offers the same 30 minutes of operation time as its predecessor. The result is a drastic reduction in the number of inhalation injuries to firefighters. Though they have made many design modifications and refinements, manufacturers of breathing apparatus still incorporate the original NASA technology.

  17. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    NASA Astrophysics Data System (ADS)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes as they are not tied to large fault zones. While landslides have mostly produced local tsunamis historically, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than the one induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and the methodologies available to deal with this hazard, we here present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The present literature on LPTHA is sparse, and studies have so far been separated into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope versus volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn
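
    A hedged sketch of the MFD-based ingredient of such an analysis follows: landslide volumes drawn from a truncated power-law magnitude-frequency distribution, with event counts from a Poisson process. The exponent, volume bounds, and annual rate are illustrative placeholders, not values fitted to the Gulf of Cadiz data.

      # Sample a synthetic landslide catalog from a truncated power-law MFD.
      import numpy as np

      rng = np.random.default_rng(42)

      def sample_volumes(n, beta=1.5, v_min=1e6, v_max=1e10):
          # Inverse-CDF sampling of a truncated Pareto volume distribution (m^3).
          u = rng.uniform(size=n)
          a, b = v_min ** (1 - beta), v_max ** (1 - beta)
          return (a + u * (b - a)) ** (1.0 / (1.0 - beta))

      annual_rate = 0.02                 # hypothetical landslides per year
      years = 10_000                     # synthetic catalog length
      n_events = rng.poisson(annual_rate * years)
      volumes = sample_volumes(n_events)
      print(n_events, "events; largest volume %.2e m^3" % volumes.max())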

  18. Swedish Delegation Visits NASA Goddard

    NASA Image and Video Library

    2017-12-08

    Swedish Delegation Visits GSFC – May 3, 2017 – Members of the Royal Swedish Academy of Engineering Sciences listen to Dr. Melissa Trainer, Sample Analysis at Mars (SAM) team member, and Charles Malespin, SAM Deputy Principal Investigator and Operations Test Lead, discuss research being done in the SAM lab, which supports the instrument carried by the Curiosity Rover on the surface of Mars. Credit: NASA/Goddard/Bill Hrybyk Read more: go.nasa.gov/2p1rP0h NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.

  19. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  20. NASA Science Data Processing for SNPP

    NASA Astrophysics Data System (ADS)

    Hall, A.; Behnke, J.; Lowe, D. R.; Ho, E. L.

    2014-12-01

    NASA's ESDIS Project has been operating the Suomi National Polar-Orbiting Partnership (SNPP) Science Data Segment (SDS) since launch in October 2011. The science data processing system includes a Science Data Depository and Distribution Element (SD3E) and five Product Evaluation and Analysis Tool Elements (PEATEs): Land, Ocean, Atmosphere, Ozone, and Sounder. The SDS has been responsible for assessing Environmental Data Records (EDRs) for climate quality, providing and demonstrating algorithm improvements/enhancements, and supporting calibration/validation activities as well as instrument calibration and sensor table uploads for mission planning. The SNPP also flies two NASA instruments: OMPS Limb and CERES. The SNPP SDS has been responsible for producing, archiving, and distributing the standard products for those instruments in close association with their NASA science teams. The PEATEs leveraged existing science data processing techniques developed under the EOSDIS Program. This enabled the PEATEs to do an excellent job in supporting Science Team analysis for SNPP. The SDS acquires data from three sources: the NESDIS IDPS (Raw Data Records (RDRs)), GRAVITE (Retained Intermediate Products (RIPs)), and NOAA/CLASS (higher level products). The SD3E component aggregates the RDRs and distributes them to each of the PEATEs for further analysis and processing. It provides a ~32-day rolling store of data, available for pickup by the PEATEs. The current system used by NASA will be presented along with plans for streamlining the system in support of continuing NASA's EOS measurements.

  1. NASA Update

    NASA Image and Video Library

    2011-02-15

    NASA Administrator Charles F. Bolden Jr. answers questions during a NASA Update on Tuesday, Feb. 15, 2011, at NASA Headquarters in Washington. Bolden, NASA's 12th Administrator, and NASA Deputy Administrator Lori Garver took the time to discuss the agency's fiscal year 2012 budget request and to take questions from employees. Photo Credit: (NASA/Bill Ingalls)

  2. Supporting NASA Facilities Through GIS

    NASA Technical Reports Server (NTRS)

    Ingham, Mary E.

    2000-01-01

    The NASA GIS Team supports NASA facilities and partners in the analysis of spatial data. A Geographic Information System (GIS) is an integration of computer hardware, software, and personnel linking topographic, demographic, utility, facility, image, and other geo-referenced data. The system provides a graphic interface to relational databases and supports decision-making processes such as planning, design, maintenance and repair, and emergency response.

  3. The NASA Herschel Science Center

    NASA Astrophysics Data System (ADS)

    Helou, G.

    2005-12-01

    NASA has set aside resources in support of US-based scientists working on analysis and interpretation of data from Herschel, and has designated IPAC as the home of the NASA Herschel Science Center (NHSC). In supporting the US-based Herschel community, the NHSC will draw on its experience (ISO, Spitzer, and other missions) as well as a close working relationship with the Herschel mission and with instrument expertise both in Europe and in the U.S. The support covers technical and logistical aspects as well as data analysis funding, to be handled in large measure following the Spitzer funding model.

  4. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach

    PubMed Central

    Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark

    2016-01-01

    Objective: To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. Methods: A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. Findings: The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. Conclusion: The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease. PMID: 27274594

  5. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach.

    PubMed

    Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark; Hunter, Paul R

    2016-06-01

    To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease.

  6. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
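
    The ranking step relies on variance-based first-order sensitivity indices, Si = Var(E[Y|Xi]) / Var(Y). The sketch below estimates them with a crude binned Monte Carlo approach on a toy model; it illustrates the technique only and is not the GTM analysis itself.

      # Binned estimate of first-order variance-based sensitivity indices.
      import numpy as np

      rng = np.random.default_rng(1)

      def first_order_index(x, y, bins=20):
          # Si ~ Var(E[Y | Xi]) / Var(Y), via equal-probability bins of x.
          edges = np.quantile(x, np.linspace(0, 1, bins + 1))
          idx = np.clip(np.searchsorted(edges[1:-1], x, side="right"), 0, bins - 1)
          ybar, yvar = y.mean(), y.var()
          si = 0.0
          for k in range(bins):
              mask = idx == k
              if mask.any():
                  si += mask.mean() * (y[mask].mean() - ybar) ** 2
          return si / yvar

      n = 100_000
      x1, x2, x3 = rng.uniform(-1, 1, (3, n))
      y = 4 * x1 + x2 + 0.1 * x3 + rng.normal(0, 0.1, n)  # x1 dominates by design
      for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
          print(name, round(first_order_index(x, y), 3))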

  7. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  8. Development of an Outreach Program for NASA: "NASA Ambassadors"

    NASA Technical Reports Server (NTRS)

    Lebo, George R.

    1996-01-01

    It is widely known that the average American citizen has either no idea or the wrong impression of what NASA is doing. The most common impression is that NASA's sole mission is to build and launch spacecraft and that the everyday experience of the common citizen would be impacted very little if NASA failed to exist altogether. Some feel that most of NASA's efforts are much too expensive and that the money would be better used on other efforts. Others feel that most of NASA's efforts either fail altogether or fail to meet their original objectives. Yet others feel that NASA is so mired in bureaucracy that it is no longer able to function. The goal of the NASA Ambassadors Program (NAP) is to educate the general populace as to what NASA's mission and goals actually are, to re-excite the "man on the street" with NASA's discoveries and technologies, and to convince him that NASA really does impact his everyday experience and that the economy of the U.S. is very dependent on NASA-type research. Each of the NASA centers currently runs a speakers bureau through its Public Affairs Office (PAO). The speakers, NASA employees, are scheduled on an "as available" status and their travel is paid by NASA. However, there are only a limited number of them, and their message may be regarded as somewhat biased since they are paid by NASA. On the other hand, there are many members of NASA's summer programs who come from all areas of the country. Most of them not only believe that NASA's mission is important but are willing and able to articulate it to others. Furthermore, in the eyes of the public, they are probably more effective as ambassadors for NASA than are the NASA employees, as they do not derive their primary funding from it. Therefore, it was decided to organize materials for them to use in presentations to general audiences in their home areas. Each person who accepted these materials was to be called a "NASA Ambassador".

  9. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    NASA Astrophysics Data System (ADS)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.

  10. Analysis of data from NASA B-57B gust gradient program

    NASA Technical Reports Server (NTRS)

    Frost, W.; Lin, M. C.; Chang, H. P.; Ringnes, E.

    1985-01-01

    Statistical analysis of the turbulence measured in flight 6 of the NASA B-57B over Denver, Colorado, from July 7 to July 23, 1982, included the calculation of average turbulence parameters, integral length scales, probability density functions, single point autocorrelation coefficients, two point autocorrelation coefficients, normalized autospectra, normalized two point autospectra, and two point cross spectra for gust velocities. The single point autocorrelation coefficients were compared with the theoretical model developed by von Karman. Theoretical analyses were developed which address the effects of spanwise gust distributions, using two point spatial turbulence correlations.
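
    For reference, the longitudinal von Karman autospectrum against which such single-point statistics are commonly compared can be written Phi(Omega) = sigma^2 * (2L/pi) / [1 + (1.339 L Omega)^2]^(5/6). A minimal sketch follows; the sigma (gust intensity) and L (integral length scale) values are placeholders, not parameters estimated from the B-57B data.

      # Evaluate the longitudinal von Karman gust autospectrum.
      import numpy as np

      def von_karman_longitudinal(omega, sigma, L):
          # One-sided autospectrum; omega is spatial frequency in rad/m.
          return sigma**2 * (2.0 * L / np.pi) / (1.0 + (1.339 * L * omega) ** 2) ** (5.0 / 6.0)

      omega = np.logspace(-4, 0, 100)  # rad/m
      spectrum = von_karman_longitudinal(omega, sigma=2.0, L=500.0)  # placeholder values
      print(spectrum[:3])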

  11. NASA UTILIZATION OF THE INTERNATIONAL SPACE STATION AND THE VISION FOR SPACE EXPLORATION

    NASA Technical Reports Server (NTRS)

    Robinson, Julie A.; Thomas, Donald A.

    2006-01-01

    Under U.S. President Bush's Vision for Space Exploration (January 14, 2004), NASA has refocused its utilization plans for the International Space Station (ISS). This use will now focus on: (1) the development of countermeasures that will protect crews from the hazards of the space environment, and (2) testing and validating technologies that will meet information and systems needs for future exploration missions.

  12. A UAV System for Observing Volcanoes and Natural Hazards

    NASA Astrophysics Data System (ADS)

    Saggiani, G.; Persiani, F.; Ceruti, A.; Tortora, P.; Troiani, E.; Giuletti, F.; Amici, S.; Buongiorno, M.; Distefano, G.; Bentini, G.; Bianconi, M.; Cerutti, A.; Nubile, A.; Sugliani, S.; Chiarini, M.; Pennestri, G.; Petrini, S.; Pieri, D.

    2007-12-01

    Fixed or rotary wing manned aircraft are currently the most commonly used platforms for airborne reconnaissance in response to natural hazards, such as volcanic eruptions, oil spills, wildfires, and earthquakes. Such flights are very often undertaken in hazardous flying conditions (e.g., turbulence, downdrafts, reduced visibility, close proximity to dangerous terrain) and can be expensive. To mitigate these two fundamental issues--safety and cost--we are exploring the use of small (less than 100 kg), relatively inexpensive, but effective, unmanned aerial vehicles (UAVs) for this purpose. As an operational test, in 2004 we flew a small autonomous UAV in the airspace above and around Stromboli Volcano. Based in part on this experience, we are adapting the RAVEN UAV system for such natural hazard surveillance missions. RAVEN has a 50 km range, a 3.5 m wingspan, a main fuselage length of 4.60 m, and a maximum weight of 56 kg. It has autonomous flight capability and a ground control station for mission planning and control. It will carry a variety of imaging devices, including a visible camera and an IR camera. It will also carry an experimental Fourier micro-interferometer based on MOEMS technology (developed by the IMM Institute of CNR) to detect atmospheric trace gases. Such flexible, capable, and easy-to-deploy UAV systems may significantly shorten the time necessary to characterize the nature and scale of natural hazard threats if used from the outset of, and systematically during, natural hazard events. When appropriately utilized, such UAVs can provide a powerful new hazard mitigation and documentation tool for civil protection hazard responders. This research was carried out under the auspices of the Italian government and, in part, under contract to NASA at the Jet Propulsion Laboratory.

  13. NASA Update

    NASA Image and Video Library

    2009-07-20

    NASA Deputy Administrator Lori Garver, right, looks on as NASA Administrator Charles F. Bolden Jr. speaks during his first NASA Update, Tuesday, July 21, 2009, at NASA Headquarters in Washington. Bolden, NASA's 12th Administrator, and Garver took the time to introduce themselves and outline their vision for the agency going forward. No questions were taken during the session. Photo Credit: (NASA/Bill Ingalls)

  14. NASA Update

    NASA Image and Video Library

    2009-07-20

    NASA Deputy Administrator Lori Garver, second right on stage, speaks as NASA Administrator Charles F. Bolden Jr. looks on during a NASA Update, Tuesday, July 21, 2009, at NASA Headquarters in Washington. Bolden, NASA's 12th Administrator, and Garver took the time to introduce themselves and outline their vision for the agency going forward. No questions were taken during the session. Photo Credit: (NASA/Bill Ingalls)

  15. The NASA Icing Remote Sensing System

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Brinker, David J.; Ratvasky, Thomas P.; Ryerson, Charles C.; Koenig, George G.

    2005-01-01

    NASA and the U.S. Army Cold Regions Research and Engineering Laboratory (CRREL) have an ongoing activity to develop remote sensing technologies for the detection and measurement of icing conditions aloft. A multiple-instrument approach is the current emphasis of this activity. Utilizing radar, radiometry, and lidar, a region of supercooled liquid is identified. If the liquid water content (LWC) is sufficiently high, then the region of supercooled liquid cloud is flagged as an aviation hazard. The instruments utilized for the current effort are an X-band vertical staring radar, a radiometer that measures twelve frequencies between 22 and 59 GHz, and a lidar ceilometer. The radar data determine cloud boundaries, the radiometer determines the sub-freezing temperature heights and total liquid water content, and the ceilometer refines the lower cloud boundary. Data are post-processed with a LabVIEW program, producing a supercooled LWC profile and an aircraft hazard identification. Remotely sensed measurements gathered during the 2003-2004 Alliance Icing Research Study (AIRS II) were compared to aircraft in-situ measurements. Although the comparison data set is quite small, the cases examined indicate that the remote sensing technique appears to be an acceptable approach.
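
    A simplified sketch of the flagging logic described above follows: intersect the cloud layer (radar plus ceilometer), the sub-freezing region (radiometer), and the liquid water profile, then flag supercooled liquid above a threshold. The profile data and threshold are placeholders, not the LabVIEW program's actual parameters.

      # Flag altitude bins holding hazardous supercooled liquid water.
      import numpy as np

      heights = np.arange(0.0, 6000.0, 100.0)  # m
      lwc = np.where((heights > 1500) & (heights < 3000), 0.25, 0.0)  # g/m^3, synthetic
      cloud_base, cloud_top = 1400.0, 3100.0   # from radar and ceilometer
      freezing_height = 1800.0                 # from the radiometer temperature profile
      LWC_HAZARD_THRESHOLD = 0.1               # g/m^3, illustrative value

      supercooled = ((heights >= freezing_height) & (heights >= cloud_base)
                     & (heights <= cloud_top) & (lwc > LWC_HAZARD_THRESHOLD))
      if supercooled.any():
          print("icing hazard between %.0f and %.0f m"
                % (heights[supercooled].min(), heights[supercooled].max()))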

  16. Hypergol Maintenance Facility Hazardous Waste South Staging Areas, SWMU 070 Corrective Measures Implementation

    NASA Technical Reports Server (NTRS)

    Miller, Ralinda R.

    2016-01-01

    This document presents the Corrective Measures Implementation (CMI) Year 10 Annual Report for implementation of corrective measures at the Hypergol Maintenance Facility (HMF) Hazardous Waste South Staging Areas at Kennedy Space Center, Florida. The work is being performed by Tetra Tech, Inc., for the National Aeronautics and Space Administration (NASA) under Indefinite Delivery Indefinite Quantity (IDIQ) NNK12CA15B, Task Order (TO) 07. Mr. Harry Plaza, P.E., of NASA's Environmental Assurance Branch is the Remediation Project Manager for John F. Kennedy Space Center. The Tetra Tech Program Manager is Mr. Mark Speranza, P.E., and the Tetra Tech Project Manager is Robert Simcik, P.E.

  17. The Economics of NASA Mission Cost Reserves

    NASA Technical Reports Server (NTRS)

    Whitley, Sally; Shinn, Stephen

    2012-01-01

    Increases in NASA mission costs are well-noted but not well-understood, and there is little evidence that they are decreasing in frequency or amount over time. The need to control spending has led to analysis of the causes and magnitude of historical mission overruns, and many program control efforts are being implemented to attempt to prevent or mitigate the problem (NPR 7120). However, cost overruns have not abated, and while some direct causes of increased spending may be obvious (requirements creep, launch delays, directed changes, etc.), the underlying impetus to spend past the original budget may be more subtle. Gaining better insight into the causes of cost overruns will help NASA and its contracting organizations to avoid them. This paper hypothesizes that one cause of NASA mission cost overruns is that the availability of reserves gives project team members an incentive to make decisions and behave in ways that increase costs. We theorize that the presence of reserves is a contributing factor to cost overruns because it causes organizations to use their funds less efficiently or to control spending less effectively. We draw a comparison to the insurance industry concept of moral hazard, the phenomenon that the presence of insurance causes insureds to have more frequent and higher insurance losses, and we attempt to apply actuarial techniques to quantify the increase in the expected cost of a mission due to the availability of reserves. We create a theoretical model of reserve spending motivation by defining a variable ReserveSpending as a function of total reserves. This function has a positive slope; for every dollar of reserves available, there is a positive probability of spending it. Finally, the function should be concave down; the probability of spending each incremental dollar of reserves decreases progressively. We test the model against available NASA CADRe data by examining missions with reserve dollars initially available and testing whether
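
    One functional form consistent with the stated properties (positive slope, concave down, each incremental reserve dollar less likely to be spent) is a saturating exponential. The sketch below is our illustration under that assumption, not the paper's fitted model.

      # ReserveSpending(R) = (1 - exp(-k R)) / k: first derivative exp(-k R) > 0
      # (positive slope), second derivative -k exp(-k R) < 0 (concave down).
      import numpy as np

      def reserve_spending(reserves, k=0.15):
          # Expected reserve-induced spending, in the same units as the input.
          return (1.0 - np.exp(-k * reserves)) / k

      r = np.array([0.0, 5.0, 10.0, 20.0])  # available reserves, $M
      print(reserve_spending(r))            # increasing, with diminishing increments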

  18. 20th JANNAF Propulsion Systems Hazards Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Cocchiaro, James E. (Editor); Eggleston, Debra S. (Editor); Gannaway, Mary T. (Editor); Inzar, Jeanette M. (Editor)

    2002-01-01

    This volume, the first of two volumes, is a collection of 24 unclassified/unlimited-distribution papers which were presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 20th Propulsion Systems Hazards Subcommittee (PSHS), 38th Combustion Subcommittee (CS), 26th Airbreathing Propulsion Subcommittee (APS), and 21st Modeling and Simulation Subcommittee (MSS) meeting. The meeting was held 8-12 April 2002 at the Bayside Inn at The Sandestin Golf & Beach Resort and Eglin Air Force Base, Destin, Florida. Topics covered include: insensitive munitions and hazard classification testing of solid rocket motors and other munitions; vulnerability of gun propellants to impact stimuli; thermal decomposition and cookoff properties of energetic materials; burn-to-violent reaction phenomena in energetic materials; and shock-to-detonation properties of solid propellants and energetic materials.

  19. JANNAF 19th Propulsion Systems Hazards Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Cocchiaro, James E. (Editor); Kuckels, Melanie C. (Editor)

    2000-01-01

    This volume, the first of two volumes, is a compilation of 25 unclassified/unlimited-distribution technical papers presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 19th Propulsion Systems Hazards Subcommittee (PSHS) meeting, held jointly with the 37th Combustion Subcommittee (CS), 25th Airbreathing Propulsion Subcommittee (APS), and 1st Modeling and Simulation Subcommittee (MSS) meetings. The meeting was held 13-17 November 2000 at the Naval Postgraduate School and Hyatt Regency Hotel, Monterey, California. Topics covered at the PSHS meeting include: impact and thermal vulnerability of gun propellants; thermal decomposition and cookoff behavior of energetic materials; violent reaction and detonation phenomena of solid energetic materials subjected to shock and impact loading; and hazard classification and insensitive munitions testing of propellants and propulsion systems.

  20. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is a common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
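
    As a hedged illustration of the scenario analysis (contribution 2), the sketch below walks the iso-level curve of a Gumbel copula at a fixed joint CDF level and picks the most likely (u, v) combination, approximating the copula density with a finite-difference mixed derivative. The copula family and theta are placeholder assumptions, not MhAST's fitted model, which selects among 26 copulas.

      # Most-likely design scenario on a joint CDF iso-level of a Gumbel copula.
      import numpy as np

      def gumbel_copula(u, v, theta=2.0):
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def copula_density(u, v, theta=2.0, h=1e-5):
          # Numerical mixed partial derivative d2C / (du dv).
          return (gumbel_copula(u + h, v + h, theta) - gumbel_copula(u + h, v - h, theta)
                  - gumbel_copula(u - h, v + h, theta) + gumbel_copula(u - h, v - h, theta)) / (4 * h * h)

      target = 1.0 - 1.0 / 100.0  # joint CDF level of a 100-year event
      best = (None, None, -np.inf)
      for ui in np.linspace(target + 1e-3, 0.999, 400):
          lo, hi = target, 1.0 - 1e-4          # C(u, v) is increasing in v
          for _ in range(60):                  # bisection for C(ui, v) = target
              mid = 0.5 * (lo + hi)
              if gumbel_copula(ui, mid) < target:
                  lo = mid
              else:
                  hi = mid
          vi = 0.5 * (lo + hi)
          d = copula_density(ui, vi)
          if d > best[2]:
              best = (ui, vi, d)
      print("most likely 100-yr scenario: u=%.4f, v=%.4f" % best[:2])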

  1. Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem

    DOE PAGES

    Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...

    2015-01-01

    In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
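
    A compact sketch of the nested sampling idea follows: an outer loop over epistemic parameter samples and an inner aleatory Monte Carlo loop, yielding ranges (a p-box view) for the output statistics. The toy model and parameter interval are illustrative, not the challenge-problem model.

      # Double-loop (nested) propagation of aleatory and epistemic uncertainty.
      import numpy as np

      rng = np.random.default_rng(7)

      def model(aleatory, epistemic):
          # Toy output quantity standing in for the challenge-problem model.
          return epistemic * aleatory + aleatory ** 2

      outer_stats = []
      for _ in range(50):                    # epistemic loop
          eps = rng.uniform(0.5, 2.0)        # epistemic parameter in its interval
          a = rng.normal(0.0, 1.0, 10_000)   # aleatory loop (known distribution)
          y = model(a, eps)
          outer_stats.append((y.mean(), np.quantile(y, 0.99)))

      means, q99s = np.array(outer_stats).T
      print("mean range: [%.2f, %.2f]" % (means.min(), means.max()))
      print("99th-percentile range: [%.2f, %.2f]" % (q99s.min(), q99s.max()))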

  2. Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.

    In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.

  3. Natural Hazards, Second Edition

    NASA Astrophysics Data System (ADS)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia. Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, with aspects of hazard prediction, and with other related topics. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  4. A Guidance Manual: Waste Analysis at Facilities that Generate, Treat, Store, and Dispose of Hazardous Wastes

    EPA Pesticide Factsheets

    Discusses how a person can perform waste analyses and develop waste analysis plans (WAPs) in accordance with the federal hazardous waste regulations of the Resource Conservation and Recovery Act (RCRA).

  5. Advanced Methodologies for NASA Science Missions

    NASA Astrophysics Data System (ADS)

    Hurlburt, N. E.; Feigelson, E.; Mentzel, C.

    2017-12-01

    Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculation of advanced physical models based on these datasets. But considerable thought is also needed about which computations should be performed. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after the data are telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers; and science analysis that is performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of these efforts at NASA and present example applications using modern methodologies.

  6. Climate Change Adaptation Science Activities at NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Lulla, Kamlesh

    2012-01-01

    The Johnson Space Center (JSC), located in the southeast metropolitan region of Houston, TX, is the prime NASA center for human spaceflight operations and astronaut training, but it also houses the unique collection of returned extraterrestrial samples, including lunar samples from the Apollo missions. The Center's location adjacent to Clear Lake and the Clear Creek watershed, an estuary of Galveston Bay, puts it at direct annual risk from hurricanes, but also from a number of other climate-related hazards including drought, floods, sea level rise, heat waves, and high wind events, all assigned Threat Levels of 2 or 3 in the most recent NASA Center Disaster/Risk Matrix produced by the Climate Adaptation Science Investigator (CASI) Working Group. Based on prior CASI workshops at other NASA centers, it is recognized that JSC is highly vulnerable to climate-change-related hazards and has a need for adaptation strategies. We will present an overview of prior CASI-related work at JSC, including publication of a climate change and adaptation informational data brochure, and a Resilience and Adaptation to Climate Risks Workshop that was held at JSC in early March 2012. Major outcomes of that workshop that form a basis for work going forward are 1) a realization that JSC is embedded in a regional environmental and social context, and that potential climate change effects and adaptation strategies will not, and should not, be constrained by the Center fence line; 2) a desire to coordinate data collection and adaptation planning activities with interested stakeholders to form a regional climate change adaptation center that could facilitate interaction with CASI; and 3) recognition that there is a wide array of basic data (remotely sensed, in situ, GIS/mapping, and historical) available through JSC and other stakeholders, but these data are not yet centrally accessible for planning purposes.

  7. 14 CFR 417.409 - System hazard controls.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... hazard as identified by the ground safety analysis and satisfy the requirements of this section. A launch... electrical power and signal circuits that interface with hazardous subsystems. (e) Propulsion systems. A...

  8. 14 CFR 417.409 - System hazard controls.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... hazard as identified by the ground safety analysis and satisfy the requirements of this section. A launch... electrical power and signal circuits that interface with hazardous subsystems. (e) Propulsion systems. A...

  9. 14 CFR 417.409 - System hazard controls.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... hazard as identified by the ground safety analysis and satisfy the requirements of this section. A launch... electrical power and signal circuits that interface with hazardous subsystems. (e) Propulsion systems. A...

  10. 14 CFR 417.409 - System hazard controls.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... hazard as identified by the ground safety analysis and satisfy the requirements of this section. A launch... electrical power and signal circuits that interface with hazardous subsystems. (e) Propulsion systems. A...

  11. Expert Water Quality Panel Review of Responses to the NASA Request for Information for the International Space Station On-Board Environmental Monitoring System

    NASA Technical Reports Server (NTRS)

    Fishman, Julianna L.; Mudgett, Paul D.; Packham, Nigel J.; Schultz, John R.; Straub, John E., II

    2005-01-01

    On August 9, 2003, NASA, with the cooperative support of the Vehicle Office of the International Space Station Program, the Advanced Human Support Technology Program, and the Johnson Space Center Habitability and Environmental Factors Office, released a Request for Information, or RFI, to identify next-generation environmental monitoring systems that have a demonstrated ability, or the potential, to meet defined requirements for monitoring air and water quality onboard the International Space Station. This report summarizes the review and analysis of the proposed solutions submitted to meet the water quality monitoring requirements. Proposals were to improve upon the functionality of the existing Space Station Total Organic Carbon Analyzer (TOCA) and monitor additional contaminants in water samples. The TOCA is responsible for in-flight measurement of total organic carbon, total inorganic carbon, total carbon, pH, and conductivity in the Space Station potable water supplies. The current TOCA requires hazardous reagents to accomplish the carbon analyses. NASA is using the RFI process to investigate new technologies that may improve upon existing capabilities, as well as reduce or eliminate the need for hazardous reagents. Ideally, a replacement for the TOCA would be deployed in conjunction with the delivery of the Node 3 water recovery system currently scheduled for November 2007.

  12. NDARC - NASA Design and Analysis of Rotorcraft Validation and Demonstration

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2010-01-01

    Validation and demonstration results from the development of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are presented. The principal tasks of NDARC are to design a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft chosen as NDARC development test cases are the UH-60A single main-rotor and tail-rotor helicopter, the CH-47D tandem helicopter, the XH-59A coaxial lift-offset helicopter, and the XV-15 tiltrotor. These aircraft were selected because flight performance data, a weight statement, detailed geometry information, and a correlated comprehensive analysis model are available for each. Validation consists of developing the NDARC models for these aircraft by using geometry and weight information, airframe wind tunnel test data, engine decks, rotor performance tests, and comprehensive analysis results; and then comparing the NDARC results for aircraft and component performance with flight test data. Based on the calibrated models, the capability of the code to size rotorcraft is explored.
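
    The design-then-analyze split can be illustrated with the classic fixed-point sizing iteration common to conceptual design tools (a hedged sketch with notional constant weight fractions, not NDARC's actual models):

      def size_rotorcraft(payload_kg, fuel_fraction=0.18,
                          empty_weight_fraction=0.55, tol=1e-6):
          """Iterate gross weight until payload + empty + fuel balances."""
          gross = 2.0 * payload_kg  # initial guess
          for _ in range(200):
              empty = empty_weight_fraction * gross  # weight-fraction model
              fuel = fuel_fraction * gross           # from the design mission
              new_gross = payload_kg + empty + fuel
              if abs(new_gross - gross) < tol * gross:
                  return new_gross
              gross = new_gross
          raise RuntimeError("sizing iteration did not converge")

      print(f"Design gross weight: {size_rotorcraft(1200.0):.0f} kg")

    A real tool replaces the constant fractions with calibrated weight and performance models, which is precisely what validation against the UH-60A, CH-47D, XH-59A, and XV-15 data supports.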

  13. Technological Innovations from NASA

    NASA Technical Reports Server (NTRS)

    Pellis, Neal R.

    2006-01-01

    The challenge of human space exploration places demands on technology that push concepts and development to the leading edge. In biotechnology and biomedical equipment development, NASA science has been the seed for numerous innovations, many of which are in the commercial arena. The biotechnology effort has led to rational drug design, analytical equipment, and cell culture and tissue engineering strategies. Biomedical research and development has resulted in medical devices that enable diagnosis and treatment advances. NASA biomedical developments are exemplified in the new laser light-scattering analysis for cataracts, the axial-flow left ventricular assist device, non-contact electrocardiography, and the guidance system for LASIK surgery. Many more developments are in progress. NASA will continue to advance technologies, incorporating new approaches from basic and applied research, nanotechnology, computational modeling, and database analyses.

  14. NASA Update.

    NASA Image and Video Library

    2011-02-15

    NASA Deputy Administrator Lori Garver answers questions during a NASA Update on Tuesday, Feb. 15, 2011, at NASA Headquarters in Washington. Garver and NASA Administrator Charles Bolden took the time to discuss the agency’s fiscal year 2012 budget request and to take questions from employees. Photo Credit: (NASA/Bill Ingalls)

  15. NASA Update

    NASA Image and Video Library

    2009-07-20

    Alan Ladwig, senior advisor to the NASA Administrator, far left, makes a point as he introduces NASA Administrator Charles F. Bolden Jr. and Deputy Administrator Lori Garver at a NASA Update, Tuesday, July 21, 2009, at NASA Headquarters in Washington. Bolden, NASA's 12th Administrator, and Garver took the time to introduce themselves and outline their vision for the agency going forward. No questions were taken during the session. Photo Credit: (NASA/Bill Ingalls)

  16. Lessons Learned from Inlet Integration Analysis of NASA's Low Boom Flight Demonstrator

    NASA Technical Reports Server (NTRS)

    Friedlander, David; Heath, Christopher; Castner, Ray

    2017-01-01

    In 2016, NASA's Aeronautics Research Mission Directorate announced the New Aviation Horizons Initiative with a goal of designing and building several X-Planes, including a Low Boom Flight Demonstrator (LBFD). That same year, NASA awarded a contract to Lockheed Martin (LM) to advance the LBFD concept through preliminary design. Several configurations of the LBFD aircraft were analyzed by both LM engineers and NASA researchers. This presentation focuses on some of the CFD simulations that were run by NASA Glenn researchers. NASA's FUN3D V13.1 code was used for all adjoint-based grid refinement studies, and the Spalart-Allmaras turbulence model was used during adaptation. It was found that adjoint-based grid adaptation did not accurately capture inlet performance for high-speed, top-aft-mounted propulsion.

  17. NASA Aviation Safety Program Systems Analysis/Program Assessment Metrics Review

    NASA Technical Reports Server (NTRS)

    Louis, Garrick E.; Anderson, Katherine; Ahmad, Tisan; Bouabid, Ali; Siriwardana, Maya; Guilbaud, Patrick

    2003-01-01

    The goal of this project is to evaluate the metrics and processes used by NASA's Aviation Safety Program (AvSP) in assessing technologies that contribute to NASA's aviation safety goals. There were three objectives for reaching this goal. First, NASA's main objectives for aviation safety were documented and checked for consistency against the main objectives of the Aviation Safety Program. Next, the metrics used for technology investment by the Program Assessment function of AvSP were evaluated. Finally, other metrics that could be used by the Program Assessment Team (PAT) were identified and evaluated. This investigation revealed that the objectives are in fact consistent across organizational levels at NASA and with the FAA. Some of the major issues discussed in this study, which merit further investigation, are the removal of the Cost and Return-on-Investment metrics; the lack of metrics to measure the balance between investment and technology; the interdependencies among some of the metric risk-driver categories; and the conflict between 'fatal accident rate' and 'accident rate' in the language of the Aviation Safety goal as stated in different sources.

  18. NASA Quest.

    ERIC Educational Resources Information Center

    Ashby, Susanne

    2000-01-01

    Introduces NASA Quest as part of NASA's Learning Technologies Project, which connects students to the people of NASA through the various pages at the website where students can glimpse the various types of work performed at different NASA facilities and talk to NASA workers about the type of work they do. (ASK)

  19. NASA Update

    NASA Image and Video Library

    2011-02-15

    NASA Administrator Charles F. Bolden Jr. and Deputy Administrator Lori Garver deliver a NASA Update on Tuesday, Feb. 15, 2011, at NASA Headquarters in Washington. Bolden, NASA's 12th Administrator, and Garver took the time to discuss the agency’s fiscal year 2012 budget request and to take questions from employees. Photo Credit: (NASA/Bill Ingalls)

  20. NASA Update

    NASA Image and Video Library

    2011-02-15

    NASA Deputy Administrator Lori Garver listens as NASA Administrator Charles Bolden answers a question during a NASA Update on Tuesday, Feb. 15, 2011, at NASA Headquarters in Washington. Bolden and Garver took the time to discuss the agency’s fiscal year 2012 budget request and to take questions from employees. Photo Credit: (NASA/Bill Ingalls)