41 CFR 102-80.150 - What is meant by “reasonable worst case fire scenario”?
Code of Federal Regulations, 2011 CFR
2011-01-01
... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What is meant by “reasonable worst case fire scenario”? 102-80.150 Section 102-80.150 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80...
41 CFR 102-80.150 - What is meant by “reasonable worst case fire scenario”?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is meant by “reasonable worst case fire scenario”? 102-80.150 Section 102-80.150 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80...
NASA Technical Reports Server (NTRS)
Olson, S. L.
2004-01-01
NASA's current method of material screening determines fire resistance under conditions representing a worst case for normal-gravity flammability - the Upward Flame Propagation Test (Test 1). Its simple pass/fail criterion eliminates materials that burn for more than 12 inches from a standardized ignition source. In addition, if a material drips burning pieces that ignite a flammable fabric below, it fails. The applicability of Test 1 to fires in microgravity and extraterrestrial environments, however, is uncertain because the relationship between this buoyancy-dominated test and actual extraterrestrial fire hazards is not understood. There is compelling evidence that Test 1 may not be the worst case for spacecraft fires, and we do not have enough information to assess whether it is adequate at Lunar or Martian gravity levels.
Management adaptation to fires in the wildland-urban risk areas in Spain
Gema Herrero-Corral
2013-01-01
Forest fires not only cause damage to ecosystems but also result in major socio-economic losses and, in the worst cases, loss of human life. Specifically, the incidence of fires in the overlapping areas between building structures and forest vegetation (wildland-urban interface, WUI) generates highly complex emergencies due to the presence of people and goods....
Crisis management with applicability on fire fighting plants
NASA Astrophysics Data System (ADS)
Panaitescu, M.; Panaitescu, F. V.; Voicu, I.; Dumitrescu, L. G.
2017-08-01
The paper presents a case study of a crisis management analysis applied to fire-fighting plants. The procedures follow the steps of FTA (fault tree analysis). The purpose of the present paper is to describe this crisis management plan with FTA tools. The crisis management procedures apply to anticipated and emergency situations and help to describe and plan a worst-case scenario. For this, the probabilities of different situations in fire-fighting plants must be calculated. In the conclusions of the paper, the block diagram of the fire-fighting plant's components is analysed and solutions are presented for each possible risk situation.
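Read as a recipe, the FTA step above combines basic-event probabilities through AND and OR gates. A minimal sketch of that calculation, assuming statistically independent basic events; the component names and probabilities below are hypothetical, not taken from the paper:

def and_gate(*probs):
    # AND gate: the output event occurs only if ALL inputs occur.
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    # OR gate for independent inputs: P = 1 - prod(1 - P_i).
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical annual failure probabilities for basic events.
p_main_pump_fails = 0.02
p_backup_pump_fails = 0.05
p_valve_stuck = 0.01
p_power_loss = 0.005

# Top event: plant delivers no water. Occurs if both pumps fail,
# or the main valve sticks, or power is lost.
p_top = or_gate(and_gate(p_main_pump_fails, p_backup_pump_fails),
                p_valve_stuck, p_power_loss)
print(f"P(plant fails to deliver water) = {p_top:.5f}")

A real fault tree for a fire-fighting plant would also model common-cause failures, which break the independence assumption used here.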
Testing and Selection of Fire-Resistant Materials for Spacecraft Use
NASA Technical Reports Server (NTRS)
Friedman, Robert; Jackson, Brian; Olson, Sandra
2000-01-01
Spacecraft fire-safety strategy emphasizes prevention, mostly through the selection of onboard items classified according to their fire resistance. The principal NASA acceptance tests described in this paper assess the flammability of materials and components under "worst-case" normal-gravity conditions of upward flame spread in controlled-oxygen atmospheres. Tests conducted on the ground, however, cannot duplicate the unique fire characteristics in the nonbuoyant low-gravity environment of orbiting spacecraft. Research shows that flammability and fire-spread rates in low gravity are sensitive to forced convection (ventilation flows) and atmospheric-oxygen concentration. These research results are helping to define new material-screening test methods that will better evaluate material performance in spacecraft.
41 CFR 102-80.145 - What is meant by “flashover”?
Code of Federal Regulations, 2010 CFR
2010-07-01
...”? Flashover means fire conditions in a confined area where the upper gas layer temperature reaches 600 °C (1100 °F) and the heat flux at floor level exceeds 20 kW/m2 (1.8 Btu/ft2/sec). Reasonable Worst Case...
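The regulatory definition above is purely numeric, so checking a measured or simulated fire state against it is a two-threshold test. A minimal sketch; the function and argument names are ours, not the regulation's:

def is_flashover(upper_layer_temp_c, floor_heat_flux_kw_m2):
    # 41 CFR 102-80.145 thresholds: upper gas layer reaches 600 deg C
    # and floor-level heat flux exceeds 20 kW/m2.
    return upper_layer_temp_c >= 600.0 and floor_heat_flux_kw_m2 > 20.0

print(is_flashover(650.0, 25.0))   # True: both thresholds met
print(is_flashover(650.0, 15.0))   # False: floor heat flux too low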
41 CFR 102-80.145 - What is meant by “flashover”?
Code of Federal Regulations, 2011 CFR
2011-01-01
...”? Flashover means fire conditions in a confined area where the upper gas layer temperature reaches 600 °C (1100 °F) and the heat flux at floor level exceeds 20 kW/m2 (1.8 Btu/ft2/sec). Reasonable Worst Case...
2011-01-01
we propose that hot-spot mitigation using thermoelectric coolers can be used as a power management mechanism to allow global coolers to be provisioned for a better worst-case temperature, leading to substantial savings in cooling power. In order to quantify the potential power savings from using...energy density inside a processor to maximally tolerable levels, modern microprocessors make extensive use of hardware structures such as the load
2016-09-01
noise density and temperature sensitivity of these devices are all on the same order of magnitude. Even the worst-case noise density of the GCDC...accelerations from a handgun firing were distinct from other impulsive events on the wrist, such as using a hammer. Loeffler first identified potential shots by...spikes, taking various statistical parameters. He used a logistic regression model on these parameters and was able to classify 98.9% of shots
The KSC Simulation Team practices for contingencies in Firing Room 1
NASA Technical Reports Server (NTRS)
1998-01-01
In Firing Room 1 at KSC, Shuttle launch team members put the Shuttle system through an integrated simulation. The control room is set up with software used to simulate flight and ground systems in the launch configuration. A Simulation Team, comprising KSC engineers, introduces 12 or more major problems to prepare the launch team for worst-case scenarios. Such tests and simulations keep the Shuttle launch team sharp and ready for liftoff. The next liftoff is targeted for Oct. 29.
The +vbar breakout during approach to Space Station Freedom
NASA Technical Reports Server (NTRS)
Dunham, Scott D.
1993-01-01
A set of burn profiles was developed to provide bounding jet firing histories for a +vbar breakout during approaches to Space Station Freedom. The delta-v sequences were designed to place the Orbiter on a safe trajectory under worst case conditions and to try to minimize plume impingement on Space Station Freedom structure.
Isolator fragmentation and explosive initiation tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, Peter; Rae, Philip John; Foley, Timothy J.
2016-09-19
Three tests were conducted to evaluate the effects of firing an isolator in proximity to a barrier or explosive charge. The tests with explosive were conducted without a barrier, on the basis that since any barrier will reduce the shock transmitted to the explosive, bare explosive represents the worst case from an inadvertent initiation perspective. No reaction was observed. The shock caused by the impact of a representative plastic material on both bare and cased PBX 9501 is calculated in the worst-case, 1-D limit, and the known shock response of the HE is used to estimate minimum run-to-detonation lengths. The estimates demonstrate that even 1-D impacts would not be of concern and that, accordingly, the divergent shocks due to isolator fragment impact are of no concern as initiating stimuli.
Isolator fragmentation and explosive initiation tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, Peter; Rae, Philip John; Foley, Timothy J.
2015-09-30
Three tests were conducted to evaluate the effects of firing an isolator in proximity to a barrier or explosive charge. The tests with explosive were conducted without a barrier, on the basis that since any barrier will reduce the shock transmitted to the explosive, bare explosive represents the worst case from an inadvertent initiation perspective. No reaction was observed. The shock caused by the impact of a representative plastic material on both bare and cased PBX 9501 is calculated in the worst-case, 1-D limit, and the known shock response of the HE is used to estimate minimum run-to-detonation lengths. The estimates demonstrate that even 1-D impacts would not be of concern and that, accordingly, the divergent shocks due to isolator fragment impact are of no concern as initiating stimuli.
Changes in fire weather distributions: effects on predicted fire behavior
Lucy A. Salazar; Larry S. Bradshaw
1984-01-01
Data that represent average worst fire weather for a particular area are used to index daily fire danger; however, they do not account for different locations or diurnal weather changes that significantly affect fire behavior potential. To study the effects that selected changes in weather databases have on computed fire behavior parameters, weather data for the...
A bioinspired collision detection algorithm for VLSI implementation
NASA Astrophysics Data System (ADS)
Cuadri, J.; Linan, G.; Stafford, R.; Keil, M. S.; Roca, E.
2005-06-01
In this paper, a bioinspired algorithm for collision detection is proposed, based on previous models of the locust (Locusta migratoria) visual system reported by F.C. Rind and her group at the University of Newcastle upon Tyne. The algorithm is suitable for VLSI implementation in standard CMOS technologies as a system-on-chip for automotive applications. The working principle of the algorithm is to process a video stream that represents the current scenario, and to fire an alarm whenever an object approaches on a collision course. Moreover, it establishes a scale of warning states, from no danger to collision alarm, depending on the activity detected in the current scenario. In the worst case, the minimum time before collision at which the model fires the collision alarm is 40 msec (1 frame before, at 25 frames per second). Since the average time to successfully fire an airbag system is 2 msec, even in the worst case this algorithm would be very helpful to arm the airbag system more efficiently, or even to take collision avoidance countermeasures. Furthermore, two additional modules have been included: a "Topological Feature Estimator" and an "Attention Focusing Algorithm". The former takes into account the shape of the approaching object to decide whether it is a person, a road line or a car. This helps to take more adequate countermeasures and to filter false alarms. The latter concentrates the processing power on the most active zones of the input frame, thus saving memory and processing time resources.
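The timing claim above is plain frame arithmetic: at 25 frames per second one frame lasts 40 ms, comfortably above the ~2 ms quoted for firing an airbag. A minimal sketch of that margin check, with names of our own choosing:

FPS = 25.0
FRAME_PERIOD_MS = 1000.0 / FPS      # one frame = 40 ms at 25 fps

def alarm_lead_time_ms(frames_before_impact):
    # Time between the alarm being raised and the predicted impact.
    return frames_before_impact * FRAME_PERIOD_MS

AIRBAG_FIRE_TIME_MS = 2.0                 # average airbag firing time quoted above
worst_case_lead = alarm_lead_time_ms(1)   # worst case: alarm only one frame early
print(worst_case_lead, worst_case_lead >= AIRBAG_FIRE_TIME_MS)   # 40.0 True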
Tseng, Wei-Wen; Shih, Chung-Liang; Chien, Shen-Wen
2013-04-01
Taiwan's worst hospital fire in history on October 23rd, 2012 at Sinying Hospital's Bei-Men Branch resulted in 13 elderly patient deaths and over 70 injuries. The heavy casualties were due in part to the serious condition of patients. Some patients on life-support machines were unable to move or be moved. This disaster highlights the issue of fire safety in small-scale hospitals that have transformed existing hospital space into special care environments for elderly patients. Compared with medical centers and general hospitals, these small-scale health facilities are ill equipped to deal properly with fire safety management and emergency response issues due to inadequate fire protection facilities, fire safety equipment, and human resources. Small-scale facilities that offer health care and medical services to mostly immobile patients face fire risks that differ significantly from general health care facilities. This paper focuses on fire risks in small-scale facilities and suggests a strategy for fire prevention and emergency response procedures, including countermeasures for fire risk assessment, management, and emergency response, in order to improve fire safety at these institutions in Taiwan.
Workplace fire-not a misfortune, but an avoidable occupational hazard in Korea.
Park, Ji-Eun; Kim, Myoung-Hee
2015-02-01
In this article, we argue that workplace fire should be understood within an occupational safety and health context. We selected two cases of fire and explosion with the greatest numbers of fatalities from the annual lists of the "Worst Manslaughter Companies of the Year" in Korea. Through review of information from major media, government, courts, and workers' advocacy organizations, we found that these incidents resulted from violations of basic safety rules by the companies, and that the penalties imposed on them were light. In addition, precarious workers were more vulnerable to such risk, and self-regulation did not work even in large corporations. Like other types of occupational hazards, explosions and fires can be prevented, but prevention requires that occupational safety and health regulations be thoroughly enforced and that heavy penalties be imposed in order to eliminate any incentives for regulatory violations. © 2015 SAGE Publications.
The viability of prescribed fire for mitigating the soil degradational impacts of wildfire
NASA Astrophysics Data System (ADS)
Shakesby, R. A.; Bento, C. P. M.; Ferreira, C. S. S.; Ferreira, A. J. D.; Stoof, C. R.; Urbanek, E.; Walsh, R. P. D.
2012-04-01
Prescribed (controlled) fire has become an important strategy primarily to limit the likelihood of more devastating wildfire. The considerable increase in wildfire activity in recent decades throughout the Mediterranean, and in Portugal in particular, has meant that this strategy has become increasingly popular despite people's inherent fears of fire of any sort. Despite many studies of the impact of wildfire on soil erosion and degradation, relatively little research has assessed the impacts of prescribed fire on soil in Portugal or elsewhere in the Mediterranean. As part of the DESIRE research programme, this paper addresses this research gap by investigating hillslope-scale losses of soil, soil organic matter and selected nutrients before and after an experimental fire (representing a 'worst-case scenario' prescribed fire) in a shrub-vegetated catchment in central Portugal. Comparison is provided by post-fire monitoring of a nearby hillslope affected by a wildfire of moderate severity. Hillslope-scale measurements were carried out over c. 3 years using sediment fences with contributing areas of up to c. 0.5 ha. Eroded sediment was periodically removed from the fences both before and after the fire at intervals ranging from a few weeks to several months depending on rainfall characteristics and logistics. Erosion expressed as g/m2 and g/m2/mm of rainfall was determined. Figures for long-term (c. 10 years) erosion under unburnt conditions for this vegetation type were obtained from a small bounded plot and from sediment accumulating in a weir pool draining a sub-catchment within the prescribed-fire catchment. In addition, soil organic matter and selected nutrients, including K2O, P2O5 and Total N, were measured in the eroded sediment and in the pre-burn and post-burn in situ soil. The results indicate that both the wildfire and the prescribed fire caused erosion that was orders of magnitude higher than the long-term plot-scale and hillslope-scale erosion recorded under unburnt conditions. Total post-fire erosion measured over 2.5 years was relatively high for this worst-case scenario prescribed fire even when compared with published results from smaller-scale plots monitored after wildfire elsewhere in the Mediterranean, which would be expected to be higher. Nevertheless, the post-fire hillslope-scale losses appear to have had a relatively low impact on the thin, stony, degraded soils. This is thought also to be the case following the wildfire, even though it caused somewhat higher erosion. Its other serious effects (damage to habitat and property, loss of life), however, mean that wildfire can never be viewed as acceptable, particularly where people live in close proximity to highly fire-prone terrain. The results support the viability of prescribed fire as a strategy for combating wildfire on shrub-vegetated terrain in this wet Mediterranean environment. This view of a low impact of prescribed fire on the terrain may differ where the stability of the soil is reduced by disturbance through ploughing, where soils are very thin or contain relatively few stones, or where fire is applied too frequently.
NASA Astrophysics Data System (ADS)
Wegrzyński, Wojciech; Konecki, Marek
2018-01-01
This paper presents results of CFD and scale modelling of the flow of heat and smoke inside and outside of a compartment in case of fire. Estimation of the mass flow out of a compartment is critical, as it is the boundary condition in further considerations related to the exhaust of smoke from a building - also in analyses related to the performance of natural ventilation in wind conditions. Both the location of the fire and the size of the compartment were addressed as possible variables which influence the mass and the temperature of smoke that leaves the room engulfed in fire. Results of the study show little to no influence of either the size of the compartment or the location of the fire on the mass flow of smoke exiting the room. At the same time, both of these parameters influence the temperature of the smoke - in larger compartments lower average temperatures of the smoke layer, but higher maximum values, were observed. Results of this study may also be useful in the determination of worst-case scenarios for structural analysis, or in the investigation of the spread of fire through the compartment. Based on the results presented in this study, researchers can attribute an expert-judgement choice of fire location, as a single scenario that is representative of a larger number of probable scenarios.
Static electricity: A literature review
NASA Astrophysics Data System (ADS)
Crow, Rita M.
1991-11-01
The major concern with static electricity is its discharge in a flammable atmosphere, which can explode and cause a fire. Textile materials can have their electrical resistivity decreased by the addition of antistatic finishes, by embedding conductive particles into the fibres, or by adding metal fibres to the yarns. The test methods used in the studies of static electricity include measuring the static properties of materials, of clothed persons, and of the ignition energy of flammable gases. Surveys have shown that there is sparse evidence for fires definitively being caused by static electricity. However, the 'worst-case' philosophy has been adopted, and a static electricity safety code is described, including correct grounding procedures and the wearing of anti-static clothing and footwear.
The Oakland-Berkeley Hills fire of 1991
P. Lamont Ewell
1995-01-01
Sunday, October 20, 1991, will be remembered as the date of America's most costly urban-wildland fire (FEMA 1992) and one of the worst fires involving loss of life and property since the Great San Francisco Earthquake and Fire of 1906 (OFD 1992). The magnitude and range of what is simply referred to as the “Tunnel Fire” is far beyond the experience of any living...
1981-10-02
acceptable levels under the worst case condition of coal firing. The flue gas desulfurization system would be designed to reduce the sulfur dioxide content...approaches taken. There exists federal statutory authority to implement gas rationing under certain conditions. Thus, federal controls on oil production...such as air quality deterioration, water consumption, wastewater generation, disposal of flue gas scrubbing sludge, and ash. Another alternative is
Lundberg, Jonas; Törnqvist, Eva K; Nadjm-Tehrani, Simin
2014-10-01
In presenting examples from the most extensive and demanding fire in modern Swedish history, this paper describes challenges facing hastily formed networks in exceptional situations. Two concepts that have been used in the analysis of the socio-technical systems that make up a response are conversation space and sensemaking. This paper argues that a framework designed to promote understanding of the sensemaking process must take into consideration the time and the location at which an individual is engaged in an event. In hastily formed networks, location is partly mediated through physical systems that form conversation spaces of players and their interaction practices. This paper identifies and discusses four challenges to the formation of shared conversation spaces. It is based on the case study of the 2006 Bodträskfors forest fire in Sweden and draws on the experiences of organised volunteers and firefighters who participated in a hastily formed network created to combat the fire. © 2014 The Author(s). Disasters © Overseas Development Institute, 2014.
Policy Options to Address Crucial Communication Gaps in the Incident Command System
2012-09-01
California Department of Forestry and Fire Protection; COML Communications Unit Leader; COMT Communication Technician; EBRPD East Bay Regional Parks...“Laguna Fire 1970 - One of California’s Worst Wildfires.” Available at http://www.cccarto.com/cal_wildfire/laguna/fire.html, accessed August 10, 2012...“NIMS - The Evolution of the National Incident Management System.” Fire Rescue Magazine, August 2011...compatibility, and department emergency
Cost-effectively automating fire door release.
Pearce, Chris
2004-04-01
UK care home fire statistics are the worst for 40 years. After catastrophic fires in Scotland and Wales this year the death toll in care homes rose to 18 fatalities over a single month. No larger loss of life in a fire at a care home in Britain has been recorded since regulations covering these homes were introduced in the 1960s. It is against this background of heightened national vigilance that Fireco's Chris Pearce explains, in a timely overview, how the very latest "add-on" automatic fire door release technology can revolutionize integrated fire alarm systems by easing access in the care setting.
Worst case analysis: Earth sensor assembly for the tropical rainfall measuring mission observatory
NASA Technical Reports Server (NTRS)
Conley, Michael P.
1993-01-01
This worst-case analysis verifies that the TRMM ESA electronic design is capable of maintaining performance requirements when subjected to worst-case circuit conditions. The TRMM ESA design is a proven heritage design, capable of withstanding the most adverse worst-case circuit conditions. Changes made to the baseline DMSP design are relatively minor and do not adversely affect the worst-case analysis of the TRMM ESA electrical design.
Evaluation of Risk and Possible Mitigation Schemes for Previously Unidentified Hazards
NASA Technical Reports Server (NTRS)
Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III
2006-01-01
In April 2004, the NASA Engineering and Safety Center (NESC) was commissioned by NASA's Chief Safety and Mission Assurance (S&MA) Officer to review and render a technical opinion on the probability of a catastrophic failure related to this scenario: The Space Shuttle Program (SSP) recognized a zero-fault-tolerant design related to an inadvertent firing of the primary reaction control system (RCS) jets on the Orbiter during mated operations with the International Space Station (ISS). It was determined that an un-commanded firing of an RCS jet could cause serious damage or loss of both the SSP Orbiter and the ISS. Several scenarios were suggested in which an un-commanded firing of the RCS jet is possible. These scenarios include an arc track event in the 28-volt heater circuits that could result in a wire-to-wire short to the adjacent reaction control jet wire. In this worst-case scenario, enough current and power could be applied to activate the reaction control jet valves and fire a thruster. The following report summarizes the work that was sponsored by the NESC as part of their assessment of the Orbiter's inadvertent firing of an RCS thruster while attached to the ISS.
Vegetation fires, smoke emissions, and dispersion of radionuclides in the chernobyl exclusion zone
Wei Min Hao; Oleg O. Bondarenko; Sergiy Zibtsev; Diane Hutton
2009-01-01
The accident at the Chernobyl nuclear power plant (ChNPP) in 1986 was probably the worst environmental disaster of the past 30 years. The fallout and accumulation of radionuclides in the soil and vegetation could have long-term impacts on the environment. Radionuclides released during large, catastrophic vegetation fires could spread to continental Europe, Scandinavia...
2003-01-18
This dramatic image of the Australian bushfires was taken from orbit by one of the crew members aboard the International Space Station (ISS). Following the worst regional drought in 50 years, this summer's fire season has resulted in numerous large fires over much of the Great Dividing Range, as well as the enormous smoke pall over New South Wales, Victoria, and the adjacent South Pacific Ocean.
NASA Technical Reports Server (NTRS)
Keckler, C. R.
1980-01-01
A high fidelity digital computer simulation was used to establish the viability of the Annular Suspension and Pointing System (ASPS) for satisfying the pointing and stability requirements of facility class payloads, such as the Solar Optical Telescope, when subjected to the Orbiter disturbance environment. The ASPS and its payload were subjected to disturbances resulting from crew motions in the Orbiter aft flight deck and VRCS thruster firings. Worst case pointing errors of 0.005 arc seconds were experienced under the disturbance environment simulated; this is well within the 0.08 arc seconds requirement specified by the payload.
Electronic delay ignition module for single bridgewire Apollo standard initiator
NASA Technical Reports Server (NTRS)
Ward, R. D.
1975-01-01
An engineering model and a qualification model of the EDIM were constructed and tested to Scout flight qualification criteria. The qualification model incorporated design improvements resulting from the engineering model tests. Compatibility with the single bridgewire Apollo standard initiator (SBASI) was proven by test firing forty-five (45) SBASIs under worst-case voltage and temperature conditions. The EDIM was successfully qualified for Scout flight application with no failures during testing of the qualification unit. Included is a method of implementing the EDIM into Scout vehicle hardware and the ground support equipment necessary to check out the system.
Fire-related injuries with inpatient care in Finland: a 10-year nationwide study.
Haikonen, Kari; Lillsunde, Pirjo M; Lunetta, Philippe; Lounamaa, Anne; Vuola, Jyrki
2013-06-01
The aim of this study was to examine fire-related injuries leading to inpatient care in Finland. The Finnish National Hospital Discharge Register (2000-2009) and a sample of 222 patients from the Helsinki Burn Centre who sustained flame burns were used. During the 10-year study period, the incidence of fire-related injuries with inpatient care was approximately 5.6 per 100,000 person-years (n=295; males 74%, females 26%). Approximately three quarters involved burns and the remaining cases were mostly combustion gas poisonings. Burns declined from 5.4 in 2000 to 4.0 per 100,000 person-years in 2009. The decline was accounted for primarily by young people. Socio-economic features and smoking habits differ between the injured and the general population. House fire victims were mainly middle-aged and older, while injuries involving flammable substances, campfires, etc., were mostly associated with young people. House fires caused the worst damage in terms of Total Body Surface Area burned and inhalation burns. Significantly more people die at the scene of the incident than during hospital care. Targeting preventive measures in particular at older people and those with a tendency for alcohol abuse and smoking could potentially reduce the burden of the most severe flame burns. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.
Meehan, Bart
2008-01-01
On 18th January, 2003, one of the worst bushfires in the history of Australia hit the capital city, Canberra. By the time it was under control, four people were dead and more than 500 homes were destroyed. The fire also destroyed the Mount Stromlo campus of the Australian National University, the location of the Research School of Astronomy and Astrophysics. In response to the fires, the University initiated its emergency management strategy and business continuity plans. These allowed the School to recommence limited operations within two weeks of the disaster. This paper details a case study of the impact of the fire (in part using personal recollections of staff and students), and the emergency response implemented by the University. It describes the development of the University's emergency management strategy, with its emphasis on the key elements of clear chain of command and flexibility in developing an incident-specific response. The paper also provides an assessment of how the plan worked during an actual incident and some of the lessons learned, including the importance of the early response, managing the impact on people, media management, insurance and communications.
SU-E-T-551: PTV Is the Worst-Case of CTV in Photon Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, D; Liu, W; Park, P
2014-06-01
Purpose: To examine the supposition of the static dose cloud and adequacy of the planning target volume (PTV) dose distribution as the worst-case representation of clinical target volume (CTV) dose distribution for photon therapy in head and neck (H and N) plans. Methods: Five diverse H and N plans clinically delivered at our institution were selected. Isocenter for each plan was shifted positively and negatively in the three cardinal directions by a displacement equal to the PTV expansion on the CTV (3 mm) for a total of six shifted plans per original plan. The perturbed plan dose was recalculated in Eclipse (AAA v11.0.30) using the same, fixed fluence map as the original plan. The dose distributions for all plans were exported from the treatment planning system to determine the worst-case CTV dose distributions for each nominal plan. Two worst-case distributions, cold and hot, were defined by selecting the minimum or maximum dose per voxel from all the perturbed plans. The resulting dose volume histograms (DVH) were examined to evaluate the worst-case CTV and nominal PTV dose distributions. Results: Inspection demonstrates that the CTV DVH in the nominal dose distribution is indeed bounded by the CTV DVHs in the worst-case dose distributions. Furthermore, comparison of the D95% for the worst-case (cold) CTV and nominal PTV distributions by Pearson's chi-square test shows excellent agreement for all plans. Conclusion: The assumption that the nominal dose distribution for PTV represents the worst-case dose distribution for CTV appears valid for the five plans under examination. Although the worst-case dose distributions are unphysical since the dose per voxel is chosen independently, the cold worst-case distribution serves as a lower bound for the worst-case possible CTV coverage. Minor discrepancies between the nominal PTV dose distribution and worst-case CTV dose distribution are expected since the dose cloud is not strictly static. This research was supported by the NCI through grant K25CA168984, by The Lawrence W. and Marilyn W. Matteson Fund for Cancer Research, and by the Fraternal Order of Eagles Cancer Research Fund, the Career Development Award Program at Mayo Clinic.
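The worst-case construction described in the Methods is a per-voxel minimum (cold) or maximum (hot) over the nominal and six shifted dose grids. A minimal numpy sketch under that reading; the arrays here are random placeholders, not plan data:

import numpy as np

# Placeholder grids; in practice these are the exported dose distributions.
shape = (50, 50, 50)
rng = np.random.default_rng(0)
dose_nominal = rng.random(shape) * 74.0
dose_shifted = [rng.random(shape) * 74.0 for _ in range(6)]   # six 3 mm shifts

stack = np.stack([dose_nominal] + dose_shifted)   # shape (7, nx, ny, nz)
dose_cold = stack.min(axis=0)    # worst-case "cold": voxelwise minimum dose
dose_hot = stack.max(axis=0)     # worst-case "hot": voxelwise maximum dose

def d95(dose, structure_mask):
    # D95%: dose received by at least 95% of the structure's voxels,
    # i.e. the 5th percentile of the in-structure dose.
    return float(np.percentile(dose[structure_mask], 5.0))

ctv = np.zeros(shape, dtype=bool)
ctv[20:30, 20:30, 20:30] = True   # hypothetical CTV mask
print(d95(dose_cold, ctv), d95(dose_nominal, ctv), d95(dose_hot, ctv))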
Suppression and Structure of Low Strain Rate Nonpremixed Flames
NASA Technical Reports Server (NTRS)
Hamins, Anthony; Bundy, Matthew; Park, Woe Chul; Lee, Ki Yong; Logue, Jennifer
2003-01-01
The agent concentration required to achieve suppression of low strain rate nonpremixed flames is an important fire safety consideration. In a microgravity environment such as a space platform, unwanted fires will likely occur in near quiescent conditions where strain rates are very low. Diffusion flames typically become more robust as the strain rate is decreased. When designing a fire suppression system for worst-case conditions, low strain rates should be considered. The objective of this study is to investigate the impact of radiative emission, flame strain, agent addition, and buoyancy on the structure and extinction of low strain rate nonpremixed flames through measurements and comparison with flame simulations. The suppression effectiveness of a suppressant (N2) added to the fuel stream of low strain rate methane-air diffusion flames was measured. Flame temperature measurements were attained in the high temperature region of the flame (T greater than 1200 K) by measurement of thin filament emission intensity. The time varying temperature was measured and simulated as the flame made the transition from normal to microgravity conditions and as the flame extinguished.
Time Safety Margin: Theory and Practice
2016-09-01
Basic dive recovery terminology. The simplest definition of TSM: Time Safety Margin is the time to directly travel from the worst-case vector to an...Time Safety Margin (TSM) is defined as the time in seconds to directly travel from the worst-case vector (i.e. the worst-case combination of parameters)...as invoked by this AFI, base recovery planning and risk management upon the calculated TSM.
NASA Astrophysics Data System (ADS)
Deanes, L. N.; Ahmadov, R.; McKeen, S. A.; Manross, K.; Grell, G. A.; James, E.
2016-12-01
Wildfires are increasing in number and size in the western United States as climate change contributes to warmer and drier conditions in this region. These fires lead to poor air quality and diminished visibility. The High Resolution Rapid Refresh-Smoke modeling system (HRRR-Smoke) is designed to simulate fire emissions and smoke transport with high resolution. The model is based on the Weather Research and Forecasting model, coupled with chemistry (WRF-Chem) and uses fire detection data from the Visible Infrared and Imaging Radiometer Suite (VIIRS) satellite instrument to simulate wildfire emissions and their plume rise. HRRR-Smoke is used in both real-time applications and case studies. In this study, we evaluate the HRRR-Smoke for August 2015, during one of the worst wildfire seasons on record in the United States, by focusing on wildfires that occurred in the northwestern US. We compare HRRR-Smoke simulations with hourly fine particulate matter (PM2.5) observations from the Air Quality System (https://www.epa.gov/aqs) from multiple air quality monitoring sites in Washington state. PM2.5 data includes measurements from urban, suburban and remote sites in the state. We discuss the model performance in capturing large PM2.5 enhancements detected at surface sites due to wildfires. We present various statistical parameters to demonstrate HRRR-Smoke's performance in simulating surface PM2.5 levels.
Specifying design conservatism: Worst case versus probabilistic analysis
NASA Technical Reports Server (NTRS)
Miles, Ralph F., Jr.
1993-01-01
Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
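A toy example of the contrast the abstract draws, with entirely illustrative numbers: stacking two ±1 dB tolerances at their extremes yields a zero worst-case margin, while sampling the same tolerances shows the failure probability is negligible:

import random

LIMIT_DB = 10.0   # the design fails if total loss exceeds this limit

# Two loss terms, each nominally 4 dB with a +/-1 dB uniform tolerance.
# Worst-case analysis stacks both extremes:
worst_case_loss = (4.0 + 1.0) + (4.0 + 1.0)
print("worst-case margin:", LIMIT_DB - worst_case_loss, "dB")   # 0.0 dB

# Probabilistic analysis samples the same tolerances:
n, failures = 100_000, 0
for _ in range(n):
    loss = random.uniform(3.0, 5.0) + random.uniform(3.0, 5.0)
    if loss > LIMIT_DB:
        failures += 1
print("estimated P(failure):", failures / n)   # ~0: the extreme never occurs

This is the paper's point in miniature: the worst-case margin says the design is marginal, while the probabilistic view shows the stacked extreme is essentially never reached.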
30 CFR 553.14 - How do I determine the worst case oil-spill discharge volume?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 2 2012-07-01 2012-07-01 false How do I determine the worst case oil-spill... THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.14 How do I determine the worst case oil-spill discharge volume? (a) To calculate...
30 CFR 253.13 - How much OSFR must I demonstrate?
Code of Federal Regulations, 2010 CFR
2010-07-01
...: COF worst case oil-spill discharge volume Applicable amount of OSFR Over 1,000 bbls but not more than... must demonstrate OSFR in accordance with the following table: COF worst case oil-spill discharge volume... applicable table in paragraph (b)(1) or (b)(2) for a facility with a potential worst case oil-spill discharge...
30 CFR 553.14 - How do I determine the worst case oil-spill discharge volume?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 2 2013-07-01 2013-07-01 false How do I determine the worst case oil-spill... THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.14 How do I determine the worst case oil-spill discharge volume? (a) To calculate...
30 CFR 553.14 - How do I determine the worst case oil-spill discharge volume?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 2 2014-07-01 2014-07-01 false How do I determine the worst case oil-spill... THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.14 How do I determine the worst case oil-spill discharge volume? (a) To calculate...
The Impact of Fire on Active Layer Thickness
NASA Astrophysics Data System (ADS)
Schaefer, K. M.; Parsekian, A.; Natali, S.; Ludwig, S.; Michaelides, R. J.; Zebker, H. A.; Chen, J.
2016-12-01
Fire influences permafrost thermodynamics by darkening the surface to increase solar absorption and removing insulating moss and organic soil, resulting in an increase in Active Layer Thickness (ALT). The summer of 2015 was one of the worst fire years on record in Alaska with multiple fires in the Yukon-Kuskokwim (YK) Delta. To understand the impacts of fire on permafrost, we need large-scale, extensive measurements of ALT both within and outside the fire zones. In August 2016, we surveyed ALT across multiple fire zones in the YK Delta using Ground Penetrating Radar (GPR) and mechanical probing. GPR uses pulsed, radio-frequency electromagnetic waves to noninvasively image the subsurface and is an effective tool to quickly map ALT over large areas. We supplemented this ALT data with measurements of Volumetric Water Content (VWC), Organic Layer Thickness (OLT), and burn severity. We quantified the impacts of fire by statistically comparing the measurements inside and outside the fire zones and statistically regressing ALT against VWC, change in OLT, and burn severity.
2002-06-18
The Hayman forest fire, started on June 8, is continuing to burn in the Pike National Forest, 57 km (35 miles) south-southwest of Denver. According to the U.S. Forest Service, the fire has consumed more than 90,000 acres and has become Colorado's worst fire ever. In this ASTER image, acquired Sunday, June 16, 2002 at 10:30 am MST, the dark blue area is burned vegetation and the green areas are healthy vegetation. Red areas are active fires, and the blue cloud at the top center is smoke. Meteorological clouds are white. The image covers an area of 32.2 x 35.2 km (20.0 x 21.8 miles), and displays ASTER bands 8-3-2 in red, green and blue. http://photojournal.jpl.nasa.gov/catalog/PIA03499
30 CFR 253.14 - How do I determine the worst case oil-spill discharge volume?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 2 2011-07-01 2011-07-01 false How do I determine the worst case oil-spill... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 253.14 How do I determine the worst case oil-spill discharge volume? (a) To...
30 CFR 253.14 - How do I determine the worst case oil-spill discharge volume?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 30 Mineral Resources 2 2010-07-01 2010-07-01 false How do I determine the worst case oil-spill... INTERIOR OFFSHORE OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 253.14 How do I determine the worst case oil-spill discharge volume? (a) To calculate the amount...
Lower bound for LCD image quality
NASA Astrophysics Data System (ADS)
Olson, William P.; Balram, Nikhil
1996-03-01
The paper presents an objective lower bound for the discrimination of patterns and fine detail in images on a monochrome LCD. In applications such as medical imaging and military avionics the information of interest is often at the highest frequencies in the image. Since LCDs are sampled data systems, their output modulation is dependent on the phase between the input signal and the sampling points. This phase dependence becomes particularly significant at high spatial frequencies. In order to use an LCD for applications such as those mentioned above it is essential to have a lower (worst case) bound on the performance of the display. We address this problem by providing a mathematical model for the worst case output modulation of an LCD in response to a sine wave input. This function can be interpreted as a worst case modulation transfer function (MTF). The intersection of the worst case MTF with the contrast threshold function (CTF) of the human visual system defines the highest spatial frequency that will always be detectable. In addition to providing the worst case limiting resolution, this MTF is combined with the CTF to produce objective worst case image quality values using the modulation transfer function area (MTFA) metric.
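The limiting-resolution construction above is the crossing point of two curves. A sketch with illustrative functional forms (the paper derives the worst-case MTF from the LCD sampling model; the exponential shapes below are stand-ins, not its result):

import numpy as np

f = np.linspace(0.01, 30.0, 3000)      # spatial frequency, cycles/degree
mtf_worst = np.exp(-f / 12.0)          # stand-in worst-case display MTF
ctf = 0.01 * np.exp(0.1 * f)           # stand-in human contrast threshold function

# Highest frequency guaranteed detectable: last point where MTF >= CTF.
detectable = f[mtf_worst >= ctf]
print("worst-case limiting resolution ~ %.1f cyc/deg" % detectable.max())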
Probabilistic Solar Energetic Particle Models
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.
2011-01-01
To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provide the worst-case differential proton spectrum. This model is based on data from IMP-8 and GOES spacecraft that provide a data base extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.
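A worst-case environment "at a user-specified confidence level" amounts to an upper quantile of a fitted event-size distribution. A minimal sketch assuming a lognormal fit; the parameters are synthetic, not the IMP-8/GOES values:

import math
from statistics import NormalDist

MU, SIGMA = math.log(1e8), 1.5   # illustrative lognormal fit parameters only

def worst_case_fluence(confidence):
    # Event fluence not exceeded with the given confidence:
    # the upper quantile of the fitted lognormal distribution.
    z = NormalDist().inv_cdf(confidence)
    return math.exp(MU + SIGMA * z)

print("95%% worst case: %.2e protons/cm^2" % worst_case_fluence(0.95))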
Selected Parametric Effects on Materials Flammability Limits
NASA Technical Reports Server (NTRS)
Hirsch, David B.; Juarez, Alfredo; Peyton, Gary J.; Harper, Susana A.; Olson, Sandra L.
2011-01-01
NASA-STD-(I)-6001B Test 1 is currently used to evaluate the flammability of materials intended for use in habitable environments of U.S. spacecraft. The method is a pass/fail upward flame propagation test conducted in the worst-case configuration, which is defined as the combination of a material's thickness, test pressure, oxygen concentration, and temperature that makes the material most flammable. Although simple parametric effects may be intuitive (such as increasing oxygen concentrations resulting in increased flammability), combinations of multi-parameter effects could be more complex. In addition, there are a variety of material configurations used in spacecraft. Such configurations could include, for example, exposed free edges, where fire propagation may differ when compared to configurations commonly employed in standard testing. Studies involving combined oxygen concentration, pressure, and temperature effects on flammability limits have been conducted and are summarized in this paper. Additional effects on flammability limits of a material's thickness, mode of ignition, burn-length criteria, and exposed edges are presented. The information obtained will allow proper selection of ground flammability test conditions, support further studies comparing flammability in 1-g with microgravity and reduced-gravity environments, and contribute to persuasive scientific cases for rigorous space system fire risk assessments.
Zhu, Zhengfei; Liu, Wei; Gillin, Michael; Gomez, Daniel R; Komaki, Ritsuko; Cox, James D; Mohan, Radhe; Chang, Joe Y
2014-05-06
We assessed the robustness of passive scattering proton therapy (PSPT) plans for patients in a phase II trial of PSPT for stage III non-small cell lung cancer (NSCLC) by using the worst-case scenario method, and compared the worst-case dose distributions with the appearance of locally recurrent lesions. Worst-case dose distributions were generated for each of 9 patients who experienced recurrence after concurrent chemotherapy and PSPT to 74 Gy(RBE) for stage III NSCLC by simulating and incorporating uncertainties associated with set-up, respiration-induced organ motion, and proton range in the planning process. The worst-case CT scans were then fused with the positron emission tomography (PET) scans to locate the recurrence. Although the volumes enclosed by the prescription isodose lines in the worst-case dose distributions were consistently smaller than enclosed volumes in the nominal plans, the target dose coverage was not significantly affected: only one patient had a recurrence outside the prescription isodose lines in the worst-case plan. PSPT is a relatively robust technique. Local recurrence was not associated with target underdosage resulting from estimated uncertainties in 8 of 9 cases.
Extensive Fires in the Western U.S.
NASA Technical Reports Server (NTRS)
2002-01-01
The summer of 2000 is shaping up to be the worst U.S. fire season in four years. On July 27, 2000, fires were burning in Mesa Verde National Park (Colorado), Montana, Idaho, Utah, Washington, Nevada, Arizona, New Mexico, Texas, and California. The Mesa Verde fire has threatened some prehistoric archeological sites. Ironically, other sites have been unearthed as vegetation was burned away by the fire and as firefighters dug trenches to serve as firebreaks. In a bizarre coincidence, one of the fires came close to the Idaho National Engineering and Environmental Laboratory, the third nuclear site affected by fire this year. This image from GOES 11, the newest NOAA Geostationary Operational Environmental Satellite (GOES), shows smoke plumes and heat signatures (red) from many of the fires in the western United States on the evening of July 27. For current GOES images and more information, visit the GOES Project Science page. Marit Jentoft-Nilsen and Robert Simmon, NASA GSFC, based on data provided by NOAA
Disaster, Controversy--Are You Prepared for the Worst?
ERIC Educational Resources Information Center
Heller, Robert W.; And Others
1991-01-01
Provides demographic profiles from "Executive Educator's" fourth annual survey of U.S. school executives. Regarding disaster preparedness, only a small percentage of all districts in earthquake-prone areas have earthquake and fire action plans. Concerning controversial issues, teaching about substance abuse, child abuse, and teen suicide meets…
NASA Technical Reports Server (NTRS)
2002-01-01
2000 continues to be the worst fire season in the United States in decades. By August 8, 2000, fires in Montana and Idaho had burned more than 250,000 acres. Resources were stretched so thin that Army and Marine soldiers were recruited to help fight the fires. President Clinton visited Payette National Forest to lend moral support to the firefighters. Dense smoke from Idaho and western Montana is visible stretching all the way to North and South Dakota in this image from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS). The image was taken on August 7, 2000. Although the primary mission of SeaWiFS is to measure the biology of the ocean, it also provides stunning color imagery of the Earth's surface. For more information about fires in the U.S., visit the National Interagency Fire Center. To learn more about using satellites to monitor fires, visit Global Fire Monitoring and New Technology for Monitoring Fires from Space in the Earth Observatory. Provided by the SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 5, Appendix D
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS 5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Average input high current, worst case input high current, output low current, and data setup time are some of the results presented.
Do Principals Fire the Worst Teachers?
ERIC Educational Resources Information Center
Jacob, Brian A.
2011-01-01
This article takes advantage of a unique policy change to examine how principals make decisions regarding teacher dismissal. In 2004, the Chicago Public Schools (CPS) and Chicago Teachers Union signed a new collective bargaining agreement that gave principals the flexibility to dismiss probationary teachers for any reason and without the…
Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
1997-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, μ, computes a stability margin which directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The μ margins are robust margins which indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins demonstrate flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
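Exact μ computation requires dedicated robust-control tools, but the underlying notion of a worst-case margin (the smallest uncertainty magnitude that destabilizes the model) can be sketched by brute force on a toy system of our own construction, not the F/A-18 model:

import numpy as np

# Toy 2-state model: stable nominal dynamics A0 plus one uncertain stiffness term.
A0 = np.array([[0.0, 1.0], [-4.0, -0.2]])   # lightly damped nominal system
B = np.array([[0.0, 0.0], [1.0, 0.0]])      # where the uncertainty enters

def is_stable(delta):
    # Stable iff every eigenvalue of the perturbed matrix has Re < 0.
    return bool(np.all(np.linalg.eigvals(A0 + delta * B).real < 0.0))

deltas = np.linspace(0.0, 10.0, 2001)
unstable = [d for d in deltas if not is_stable(d)]
print("worst-case margin ~", unstable[0] if unstable else "> 10")   # ~4.0 here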
Reusable Solid Rocket Motor Nozzle Joint-4 Thermal Analysis
NASA Technical Reports Server (NTRS)
Clayton, J. Louie
2001-01-01
This study provides for development and test verification of a thermal model used for prediction of joint heating environments, structural temperatures and seal erosions in the Space Shuttle Reusable Solid Rocket Motor (RSRM) Nozzle Joint-4. The heating environments are a result of rapid pressurization of the joint free volume assuming a leak path has occurred in the filler material used for assembly gap close out. Combustion gases flow along the leak path from nozzle environment to joint O-ring gland resulting in local heating to the metal housing and erosion of seal materials. Analysis of this condition was based on usage of the NASA Joint Pressurization Routine (JPR) for environment determination and the Systems Improved Numerical Differencing Analyzer (SINDA) for structural temperature prediction. Model generated temperatures, pressures and seal erosions are compared to hot fire test data for several different leak path situations. Investigated in the hot fire test program were nozzle joint-4 O-ring erosion sensitivities to leak path width in both open and confined joint geometries. Model predictions were in generally good agreement with the test data for the confined leak path cases. Worst case flight predictions are provided using the test-calibrated model. Analysis issues are discussed based on model calibration procedures.
On the estimation of the worst-case implant-induced RF-heating in multi-channel MRI.
Córcoles, Juan; Zastrow, Earl; Kuster, Niels
2017-06-21
The increasing use of multiple radiofrequency (RF) transmit channels in magnetic resonance imaging (MRI) systems makes it necessary to rigorously assess the risk of RF-induced heating. This risk is especially aggravated with inclusions of medical implants within the body. The worst-case RF-heating scenario is achieved when the local tissue deposition in the at-risk region (generally in the vicinity of the implant electrodes) reaches its maximum value while MRI exposure is compliant with predefined general specific absorption rate (SAR) limits or power requirements. This work first reviews the common approach to estimate the worst-case RF-induced heating in multi-channel MRI environment, based on the maximization of the ratio of two Hermitian forms by solving a generalized eigenvalue problem. It is then shown that the common approach is not rigorous and may lead to an underestimation of the worst-case RF-heating scenario when there is a large number of RF transmit channels and there exist multiple SAR or power constraints to be satisfied. Finally, this work derives a rigorous SAR-based formulation to estimate a preferable worst-case scenario, which is solved by casting a semidefinite programming relaxation of this original non-convex problem, whose solution closely approximates the true worst-case including all SAR constraints. Numerical results for 2, 4, 8, 16, and 32 RF channels in a 3T-MRI volume coil for a patient with a deep-brain stimulator under a head imaging exposure are provided as illustrative examples.
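The "common approach" reviewed above maximizes a ratio of two Hermitian forms x^H A x / x^H B x over the channel weights x, which is a generalized Hermitian eigenvalue problem. A minimal sketch with random stand-in matrices (the true A and B would come from SAR models, not a random generator):

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n_channels = 8

def random_hermitian_pd(n):
    # Random Hermitian positive-definite stand-in for a SAR matrix.
    m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return m.conj().T @ m + n * np.eye(n)

A = random_hermitian_pd(n_channels)   # local SAR near the implant (to maximize)
B = random_hermitian_pd(n_channels)   # global SAR / power constraint (normalizer)

# max over x of (x^H A x) / (x^H B x) is the largest generalized eigenvalue;
# the matching eigenvector is the worst-case channel excitation.
eigenvalues, eigenvectors = eigh(A, B)
print("worst-case ratio:", eigenvalues[-1])
worst_excitation = eigenvectors[:, -1]

The paper's contribution is precisely that this single-ratio formulation can underestimate the worst case when many SAR and power constraints must hold simultaneously, which is why it moves to a semidefinite relaxation.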
On the estimation of the worst-case implant-induced RF-heating in multi-channel MRI
NASA Astrophysics Data System (ADS)
Córcoles, Juan; Zastrow, Earl; Kuster, Niels
2017-06-01
The increasing use of multiple radiofrequency (RF) transmit channels in magnetic resonance imaging (MRI) systems makes it necessary to rigorously assess the risk of RF-induced heating. This risk is especially aggravated by the inclusion of medical implants within the body. The worst-case RF-heating scenario is achieved when the local power deposition in tissue in the at-risk region (generally in the vicinity of the implant electrodes) reaches its maximum value while the MRI exposure remains compliant with predefined general specific absorption rate (SAR) limits or power requirements. This work first reviews the common approach to estimating the worst-case RF-induced heating in a multi-channel MRI environment, based on maximizing the ratio of two Hermitian forms by solving a generalized eigenvalue problem. It is then shown that the common approach is not rigorous and may lead to an underestimation of the worst-case RF-heating scenario when there is a large number of RF transmit channels and multiple SAR or power constraints must be satisfied. Finally, this work derives a rigorous SAR-based formulation to estimate a preferable worst-case scenario, solved by casting a semidefinite programming relaxation of the original non-convex problem; the relaxation's solution closely approximates the true worst case under all SAR constraints. Numerical results for 2, 4, 8, 16, and 32 RF channels in a 3T-MRI volume coil, for a patient with a deep-brain stimulator under a head imaging exposure, are provided as illustrative examples.
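As an illustration of the two formulations discussed above, the sketch below shows the reviewed "common approach" in a few lines of Python: the worst-case excitation is taken as the top generalized eigenvector of a pair of Hermitian forms. The SAR matrices here are random placeholders, not field-simulation data, and with only a single global constraint this eigenvalue solution is exact; the abstract's point is that with many simultaneous constraints it can underestimate the worst case, which is where the semidefinite relaxation comes in.

# Hedged sketch of the "common approach" the abstract reviews: the worst-case
# RF excitation maximizes a ratio of Hermitian forms
#   max_v  (v^H A v) / (v^H B v)
# where A models local power deposition near the implant and B models the
# global SAR (or power) constraint. Both matrices are random placeholders.
import numpy as np
from scipy.linalg import eigh

n_channels = 8
rng = np.random.default_rng(0)

def random_hermitian_psd(n):
    """Random positive-semidefinite Hermitian matrix standing in for a SAR matrix."""
    m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return m @ m.conj().T

A = random_hermitian_psd(n_channels)                        # local deposition form
B = random_hermitian_psd(n_channels) + np.eye(n_channels)   # global constraint form (made PD)

# Generalized eigenproblem A v = lambda B v; the top eigenvector maximizes the ratio.
vals, vecs = eigh(A, B)
v_worst = vecs[:, -1]
ratio = np.real(v_worst.conj() @ A @ v_worst) / np.real(v_worst.conj() @ B @ v_worst)
print(f"worst-case ratio ~ {ratio:.3f} (= top generalized eigenvalue {vals[-1]:.3f})")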
Do Principals Fire the Worst Teachers? NBER Working Paper No. 15715
ERIC Educational Resources Information Center
Jacob, Brian A.
2010-01-01
This paper takes advantage of a unique policy change to examine how principals make decisions regarding teacher dismissal. In 2004, the Chicago Public Schools (CPS) and Chicago Teachers Union (CTU) signed a new collective bargaining agreement that gave principals the flexibility to dismiss probationary teachers for any reason and without the…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-14
... submittal identifies organic carbon emissions from natural wildfires as the primary contributor to... 20% worst days in Denali National Park were composed of organic carbon from natural fires. Alaska... Organic Matter Carbon (OMC) and Elemental Carbon (EC), it attributes all OMC and EC in the Denali region...
Code of Federal Regulations, 2012 CFR
2012-10-01
... crosses a major river or other navigable waters, which, because of the velocity of the river flow and vessel traffic on the river, would require a more rapid response in case of a worst case discharge or..., because of its velocity and vessel traffic, would require a more rapid response in case of a worst case...
Code of Federal Regulations, 2014 CFR
2014-10-01
... crosses a major river or other navigable waters, which, because of the velocity of the river flow and vessel traffic on the river, would require a more rapid response in case of a worst case discharge or..., because of its velocity and vessel traffic, would require a more rapid response in case of a worst case...
Code of Federal Regulations, 2013 CFR
2013-10-01
... crosses a major river or other navigable waters, which, because of the velocity of the river flow and vessel traffic on the river, would require a more rapid response in case of a worst case discharge or..., because of its velocity and vessel traffic, would require a more rapid response in case of a worst case...
Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization
NASA Technical Reports Server (NTRS)
Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.
2014-01-01
Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial Design of Experiments (DoE) is selected appropriately.
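For readers unfamiliar with EGO, the sketch below shows the Expected Improvement criterion that drives the method's adaptive sampling, written here for maximizing a worst-case hot temperature. The surrogate mean and standard deviation are simple placeholders standing in for a kriging model fit to the DoE runs, and the beta-angle parameterization is only an assumed example of an orbit parameter.

# Hedged sketch of the Expected Improvement (EI) criterion at the heart of EGO.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_so_far):
    """EI for maximization: E[max(f - best, 0)] under f ~ N(mu, sigma^2)."""
    sigma = np.maximum(sigma, 1e-12)          # guard against zero predictive variance
    z = (mu - best_so_far) / sigma
    return (mu - best_so_far) * norm.cdf(z) + sigma * norm.pdf(z)

# Candidate orbit parameters (e.g., beta angle) and placeholder surrogate predictions:
beta = np.linspace(0.0, 90.0, 181)
mu = 20.0 + 0.5 * beta - 0.004 * beta**2      # placeholder surrogate mean (deg C)
sigma = 2.0 * np.ones_like(beta)              # placeholder predictive std dev
ei = expected_improvement(mu, sigma, best_so_far=mu.max())  # incumbent best, here the surrogate max
print("next orbit to run: beta =", beta[np.argmax(ei)])

The modification described in the abstract amounts to running a multi-start optimizer over this EI surface so that several high-EI orbits can be identified and evaluated in parallel at each iteration.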
The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions.
Qu, Shaojian; Ji, Ying
2016-01-01
In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero sum game model where each player has more than one competing objective. Our "worst-case weighted multi-objective game" model supposes that each player has a set of weights to its objectives and wishes to minimize its maximum weighted sum objectives, where the maximization is with respect to the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call "robust-weighted Nash equilibrium". We prove that the robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of all players given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). For an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. In comparison with the existing weighted approach, we show that our method is more robust and can be used more efficiently in real-world applications.
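To make the inner "worst-case weighting" concrete: for a fixed strategy profile, evaluating one player's worst-case weighted sum over a polytope weight set is a linear program, as in the hedged sketch below. The objective values and the extra weight constraint are invented for illustration; computing the equilibrium itself requires the MPEC described in the abstract.

# Hedged sketch: the inner maximization  max_w  w . f  over a weight polytope
#   { w : sum(w) = 1, w >= 0, A_ub w <= b_ub }  is a linear program.
import numpy as np
from scipy.optimize import linprog

f = np.array([3.0, 1.0, 2.0])        # player's objective values under some strategy profile
A_ub = np.array([[1.0, 0.0, 0.0]])   # example extra polytope constraint: w1 <= 0.5
b_ub = np.array([0.5])

res = linprog(c=-f,                  # maximize f.w  ==  minimize -f.w
              A_ub=A_ub, b_ub=b_ub,
              A_eq=np.ones((1, 3)), b_eq=np.array([1.0]),
              bounds=[(0.0, None)] * 3)
print("worst-case weights:", res.x, "worst-case weighted sum:", -res.fun)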
An SEU resistant 256K SOI SRAM
NASA Astrophysics Data System (ADS)
Hite, L. R.; Lu, H.; Houston, T. W.; Hurta, D. S.; Bailey, W. E.
1992-12-01
A novel SEU (single event upset) resistant SRAM (static random access memory) cell has been implemented in a 256K SOI (silicon on insulator) SRAM that has attractive performance characteristics over the military temperature range of -55 to +125 C. These include a worst-case access time of 40 ns with an active power of only 150 mW at 25 MHz, and a worst-case minimum WRITE pulse width of 20 ns. Measured SEU performance gives an Adams 10 percent worst-case error rate of 3.4 x 10^-11 errors/bit-day using the CRUP code with a conservative first-upset LET threshold. Modeling does show that higher bipolar gain than that measured on a sample from the SRAM lot would produce a lower error rate. Measurements show the worst-case supply voltage for SEU to be 5.5 V. Analysis has shown this to be primarily caused by the drain voltage dependence of the beta of the SOI parasitic bipolar transistor. Based on this, SEU experiments with SOI devices should include measurements as a function of supply voltage, rather than only at the traditional 4.5 V, to determine the worst-case condition.
Chi, Ching-Chi; Wang, Shu-Hui
2014-01-01
Compared to conventional therapies, biologics are more effective but expensive in treating psoriasis. To evaluate the efficacy and cost-efficacy of biologic therapies for psoriasis. We conducted a meta-analysis to calculate the efficacy of etanercept, adalimumab, infliximab, and ustekinumab for at least 75% reduction in the Psoriasis Area and Severity Index score (PASI 75) and Physician's Global Assessment clear/minimal (PGA 0/1). The cost-efficacy was assessed by calculating the incremental cost-effectiveness ratio (ICER) per subject achieving PASI 75 and PGA 0/1. The incremental efficacy regarding PASI 75 was 55% (95% confidence interval (95% CI) 38%-72%), 63% (95% CI 59%-67%), 71% (95% CI 67%-76%), 67% (95% CI 62%-73%), and 72% (95% CI 68%-75%) for etanercept, adalimumab, infliximab, and ustekinumab 45 mg and 90 mg, respectively. The corresponding 6-month ICER regarding PASI 75 was $32,643 (best case $24,936; worst case $47,246), $21,315 (best case $20,043; worst case $22,760), $27,782 (best case $25,954; worst case $29,440), $25,055 (best case $22,996; worst case $27,075), and $46,630 (best case $44,765; worst case $49,373), respectively. The results regarding PGA 0/1 were similar. Infliximab and ustekinumab 90 mg had the highest efficacy. Meanwhile, adalimumab had the best cost-efficacy, followed by ustekinumab 45 mg and infliximab.
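The cost-efficacy arithmetic behind these figures is simple: the ICER per responder is the incremental cost divided by the incremental efficacy. The sketch below reproduces the adalimumab numbers under the assumption, which the output supports, that the best and worst cases come from the confidence-interval bounds of the efficacy; the 6-month incremental cost ($13,428.45) is back-calculated for illustration rather than taken from the paper.

# Hedged sketch of the ICER-per-responder calculation; the cost input is hypothetical.
def icer(incremental_cost, efficacy, efficacy_lo, efficacy_hi):
    return {
        "base":  incremental_cost / efficacy,
        "best":  incremental_cost / efficacy_hi,   # higher efficacy -> lower cost per responder
        "worst": incremental_cost / efficacy_lo,
    }

# Adalimumab efficacy from the abstract: 63% (95% CI 59%-67%).
print(icer(13428.45, 0.63, 0.59, 0.67))

Running this gives approximately $21,315 (base case), $20,042 (best case), and $22,760 (worst case), closely matching the adalimumab ICERs reported above.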
Bangladesh: currently the worst, but possibly the future's best.
Brown, Garrett
2015-02-01
Garment workers in Bangladesh producing clothing for international brands have experienced repeated factory fires and building collapses in the last 10 years, resulting in more than 1,600 deaths and hundreds of disabling injuries. After the Tazreen Fashion fire in December 2012 and the Rana Plaza building collapse in April 2013, more than 190 international clothing brands and retailers signed an "Accord on Fire and Building Safety" with two international union federations. Full implementation of the provisions of the Accord would change "business as usual" in Bangladesh's garment industry and set a positive example for other countries and other industries with global supply chains. The components, challenges, and controversies of the Accord are detailed in the article. © 2015 SAGE Publications.
30 CFR 254.47 - Determining the volume of oil of your worst case discharge scenario.
Code of Federal Regulations, 2011 CFR
2011-07-01
... associated with the facility. In determining the daily discharge rate, you must consider reservoir characteristics, casing/production tubing sizes, and historical production and reservoir pressure data. Your...) For exploratory or development drilling operations, the size of your worst case discharge scenario is...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y; Wang, X; Li, H
Purpose: Proton therapy is more sensitive to uncertainties than photon treatments because the finite range of protons depends on tissue density. The worst case scenario (WCS) method originally proposed by Lomax has been adopted at our institute for robustness analysis of IMPT plans. This work demonstrates that the WCS method is sufficient to account for the uncertainties that could be encountered during daily clinical treatment. Methods: A fast, approximate dose calculation method was developed to calculate the dose for an IMPT plan under different setup and range uncertainties. The effects of two factors, the inverse-square factor and range uncertainty, are explored. The WCS robustness analysis method was evaluated using this fast dose calculation method. The worst-case dose distribution was generated by shifting the isocenter by 3 mm along the x, y, and z directions and modifying stopping power ratios by ±3.5%. A total of 1000 randomly perturbed cases in proton range and the x, y, and z directions were created, and the corresponding dose distributions were calculated using this approximate method. DVHs and dosimetric indexes of all 1000 perturbed cases were calculated and compared with the WCS result. Results: The distributions of dosimetric indexes of the 1000 perturbed cases were generated and compared with the WCS results. For D95 of the CTVs, at least 97% of the 1000 perturbed cases showed higher values than the worst case scenario. For D5 of the CTVs, at least 98% of perturbed cases had lower values than the worst case scenario. Conclusion: By extensively calculating the dose distributions under random uncertainties, the WCS method was verified to be reliable in evaluating the robustness of MFO IMPT plans for H&N patients. This extensive sampling approach using a fast approximate method could be used to evaluate the effects of different factors on the robustness of IMPT plans in the future.
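A minimal sketch of this verification idea follows, using an invented toy dose response rather than a clinical dose engine: sample many random perturbations within the same bounds used for the worst-case scenario, evaluate a dosimetric index for each, and check what fraction of sampled cases stays on the safe side of the WCS value.

# Hedged sketch: random perturbation sampling versus a worst-case-scenario bound.
import numpy as np

rng = np.random.default_rng(42)

def d95_ctv(shift_mm, range_err):
    """Toy D95 response: nominal 100%, degraded by setup shift and range error."""
    return 100.0 - 0.8 * np.linalg.norm(shift_mm) - 120.0 * abs(range_err)

# WCS per the abstract: +/-3 mm isocenter shifts and +/-3.5% stopping-power error.
wcs_d95 = d95_ctv(np.array([3.0, 3.0, 3.0]), 0.035)

# 1000 random cases within the same bounds.
shifts = rng.uniform(-3.0, 3.0, size=(1000, 3))
ranges = rng.uniform(-0.035, 0.035, size=1000)
d95 = np.array([d95_ctv(s, r) for s, r in zip(shifts, ranges)])

frac_above = np.mean(d95 > wcs_d95)
print(f"fraction of perturbed cases with D95 above the WCS value: {frac_above:.1%}")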
Factors involved in dental surgery fires: a review of the literature.
VanCleave, Andrea M; Jones, James E; McGlothlin, James D; Saxen, Mark A; Sanders, Brian J; Walker, LaQuia A
2014-01-01
Surgical fires are well-characterized, readily preventable, potentially devastating operating room catastrophes that continue to occur from 20 to 100 times per year or, by one estimate, up to 600 times per year in US operating rooms, sometimes with fatal results. The most significant risk factors for surgical fires involve (a) the use of an ignition source, such as laser or electrocautery equipment, in or around an oxygen-enriched environment in the head, neck, and upper torso area and (b) the concurrent delivery of supplemental oxygen, especially via nasal cannula. Nonetheless, while these 2 conditions occur very commonly in dental surgery, especially in pediatric dental surgery where sedation and anesthesia are regularly indicated, there is a general absence of documented dental surgical fires in the literature. Barring the possibility of underreporting for fear of litigation, this may suggest that there is another mechanism or mechanisms present in dental or pediatric dental surgery that mitigates this worst-case risk of surgical fires. Some possible explanations for this include: greater fire safety awareness by dental practitioners, incidental ventilation of oxygen-enriched environments in patient oral cavities due to breathing, or suction used by dental practitioners during procedures. This review of the literature provides a background to suggest that the practice of using intraoral suction in conjunction with the use of supplemental oxygen during dental procedures may alter the conditions needed for the initiation of intraoral fires. To date, there appear to be no published studies describing the ability of intraoral suctioning devices to alter the ambient oxygen concentration in an intraoral environment. In vivo models that would allow examination of intraoral suction on the ambient oxygen concentration in a simulated intraoral environment may then provide a valuable foundation for evaluating the safety of current clinical dental surgical practices, particularly in regard to the treatment of children.
Fire Signatures of Materials Used in Spacecraft Construction
NASA Technical Reports Server (NTRS)
Taylor, Christina
2003-01-01
The focus of my work this summer was fire safety, specifically determining fire signatures from the combustion of materials commonly found in the construction of spacecraft. This project was undertaken with the aim of addressing concerns for health and safety onboard spacecraft. Under certain conditions, burning electronics produce surprisingly large amounts of acrid smoke, release fine airborne particles and expel condensable aerosols. Similarly, some wire insulation and packing material evolves smoke when in contact with a hot surface. In the limited, enclosed space available on spacecraft, these combustion products may pose a nuisance at the very least - at worst, a hazard to health or equipment. There is also a concern for fire safety in early detection on spacecraft. Our goal for the summer was to determine the most effective methods to test the materials, develop a protocol for sampling, and generate samples for analysis. We restricted our testing to electronic components, packaging and insulation materials, and wire insulation materials.
Campus Security under the Microscope
ERIC Educational Resources Information Center
Pelletier, Stephen
2008-01-01
A university president's worst nightmare can take any number of forms. The lone shooter run amok on campus. The freight-train sound of a tornado bearing down on a dormitory. A river cresting its banks, about to flood a college town. From robberies and assaults to fires and chemical spills, the list goes on and on. Campus security and safety…
Preventing Catastrophes from Data Loss
ERIC Educational Resources Information Center
Goldsborough, Reid
2004-01-01
What's the worst thing that can happen to your computer? Worse than a hard disk crash, virus infection, spam assault, denial-of-service attack, hacker take-over, fire, flood, or other human, mechanical or natural disaster is a faulty backup when you really need it. If the computer blows up, as long as your data is backed up securely, you can…
Materials Science Research Rack-1 Fire Suppressant Distribution Test Report
NASA Technical Reports Server (NTRS)
Wieland, P. O.
2002-01-01
Fire suppressant distribution testing was performed on the Materials Science Research Rack-1 (MSRR-1), a furnace facility payload that will be installed in the U.S. Lab module of the International Space Station. Unlike racks that were tested previously, the MSRR-1 uses the Active Rack Isolation System (ARIS) to reduce vibration on experiments, so the effects of ARIS on fire suppressant distribution were unknown. Two tests were performed to map the distribution of CO2 fire suppressant throughout a mockup of the MSRR-1 designed to have the same component volumes and flowpath restrictions as the flight rack. For the first test, the average maximum CO2 concentration for the rack was 60 percent, achieved within 45 s of discharge initiation, meeting the requirement to reach 50 percent throughout the rack within 1 min. For the second test, one of the experiment mockups was removed to provide a worst-case configuration, and the average maximum CO2 concentration for the rack was 58 percent. Comparing the results of this testing with results from previous testing leads to several general conclusions that can be used to evaluate future racks. The MSRR-1 will meet the requirements for fire suppressant distribution. Primary factors that affect the ability to meet the CO2 distribution requirements are the free air volume in the rack and the total area and distribution of openings in the rack shell. The length of the suppressant flowpath and degree of tortuousness has little correlation with CO2 concentration. The total area of holes in the rack shell could be significantly increased. The free air volume could be significantly increased. To ensure the highest maximum CO2 concentration, the PFE nozzle should be inserted to the stop on the nozzle.
Vanderborght, Jan; Tiktak, Aaldrik; Boesten, Jos J T I; Vereecken, Harry
2011-03-01
For the registration of pesticides in the European Union, model simulations for worst-case scenarios are used to demonstrate that leaching concentrations to groundwater do not exceed a critical threshold. A worst-case scenario is a combination of soil and climate properties for which predicted leaching concentrations are higher than a certain percentile of the spatial concentration distribution within a region. The derivation of scenarios is complicated by uncertainty about soil and pesticide fate parameters. As the ranking of climate and soil property combinations according to predicted leaching concentrations is different for different pesticides, the worst-case scenario for one pesticide may misrepresent the worst case for another pesticide, which leads to 'scenario uncertainty'. Pesticide fate parameter uncertainty led to higher concentrations in the higher percentiles of spatial concentration distributions, especially for distributions in smaller and more homogeneous regions. The effect of pesticide fate parameter uncertainty on the spatial concentration distribution was small when compared with the uncertainty of local concentration predictions and with the scenario uncertainty. Uncertainty in pesticide fate parameters and scenario uncertainty can be accounted for using higher percentiles of spatial concentration distributions and considering a range of pesticides for the scenario selection. Copyright © 2010 Society of Chemical Industry.
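The "scenario uncertainty" mechanism can be illustrated with a small Monte Carlo sketch: two pesticides that respond differently to the same soil and climate properties rank the candidate cells differently, so the cell at the 90th spatial percentile for one compound can sit at a much lower percentile for the other. All leaching values below are synthetic placeholders.

# Hedged sketch of scenario uncertainty across pesticides; data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_cells = 500                                   # soil/climate combinations in a region

# Synthetic per-cell leaching predictions for two pesticides with different
# sensitivities to the same soil and climate factors.
soil_factor = rng.lognormal(0.0, 0.5, n_cells)
climate_factor = rng.lognormal(0.0, 0.5, n_cells)
conc_a = soil_factor**1.5 * climate_factor**0.5
conc_b = soil_factor**0.5 * climate_factor**1.5

def scenario_cell(conc, q=90):
    """Index of the cell closest to the q-th spatial percentile of leaching."""
    return int(np.argmin(np.abs(conc - np.percentile(conc, q))))

cell_a = scenario_cell(conc_a)
pct_in_b = 100.0 * np.mean(conc_b <= conc_b[cell_a])
print(f"pesticide A's 90th-percentile scenario cell sits at the "
      f"{pct_in_b:.0f}th percentile for pesticide B")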
The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions
Qu, Shaojian; Ji, Ying
2016-01-01
In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero sum game model where each player has more than one competing objective. Our “worst-case weighted multi-objective game” model supposes that each player has a set of weights to its objectives and wishes to minimize its maximum weighted sum objectives, where the maximization is with respect to the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call “robust-weighted Nash equilibrium”. We prove that the robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of all players given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). For an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. In comparison with the existing weighted approach, we show that our method is more robust and can be used more efficiently in real-world applications. PMID:26820512
30 CFR 254.47 - Determining the volume of oil of your worst case discharge scenario.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the daily discharge rate, you must consider reservoir characteristics, casing/production tubing sizes, and historical production and reservoir pressure data. Your scenario must discuss how to respond to... drilling operations, the size of your worst case discharge scenario is the daily volume possible from an...
Liu, Wei; Liao, Zhongxing; Schild, Steven E; Liu, Zhong; Li, Heng; Li, Yupeng; Park, Peter C; Li, Xiaoqiang; Stoker, Joshua; Shen, Jiajian; Keole, Sameer; Anand, Aman; Fatyga, Mirek; Dong, Lei; Sahoo, Narayan; Vora, Sujay; Wong, William; Zhu, X Ronald; Bues, Martin; Mohan, Radhe
2015-01-01
We compared conventionally optimized intensity modulated proton therapy (IMPT) treatment plans against worst-case scenario optimized treatment plans for lung cancer. The comparison of the 2 IMPT optimization strategies focused on the resulting plans' ability to retain dose objectives under the influence of patient setup, inherent proton range uncertainty, and dose perturbation caused by respiratory motion. For each of the 9 lung cancer cases, 2 treatment plans were created that accounted for treatment uncertainties in 2 different ways. The first used the conventional method: delivery of prescribed dose to the planning target volume that is geometrically expanded from the internal target volume (ITV). The second used a worst-case scenario optimization scheme that addressed setup and range uncertainties through beamlet optimization. The plan optimality and plan robustness were calculated and compared. Furthermore, the effects on dose distributions of changes in patient anatomy attributable to respiratory motion were investigated for both strategies by comparing the corresponding plan evaluation metrics at the end-inspiration and end-expiration phase and absolute differences between these phases. The mean plan evaluation metrics of the 2 groups were compared with 2-sided paired Student t tests. Without respiratory motion considered, we affirmed that worst-case scenario optimization is superior to planning target volume-based conventional optimization in terms of plan robustness and optimality. With respiratory motion considered, worst-case scenario optimization still achieved more robust dose distributions to respiratory motion for targets and comparable or even better plan optimality (D95% ITV, 96.6% vs 96.1% [P = .26]; D5%-D95% ITV, 10.0% vs 12.3% [P = .082]; D1% spinal cord, 31.8% vs 36.5% [P = .035]). Worst-case scenario optimization led to superior solutions for lung IMPT. Despite the fact that worst-case scenario optimization did not explicitly account for respiratory motion, it produced motion-resistant treatment plans. However, further research is needed to incorporate respiratory motion into IMPT robust optimization. Copyright © 2015 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
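The worst-case scenario optimization referred to here is, in essence, a minimax problem over uncertainty scenarios. The sketch below shows the idea on a toy problem with random dose-influence matrices (not clinical data): beamlet weights are chosen to minimize the penalty of whichever scenario is currently worst.

# Hedged sketch of minimax (worst-case) beamlet-weight optimization.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n_voxels, n_beamlets, n_scenarios = 20, 10, 9
D = rng.uniform(0.0, 1.0, (n_scenarios, n_voxels, n_beamlets))  # per-scenario dose influence
d_rx = np.ones(n_voxels)                                        # prescription (normalized)

def worst_case_objective(w):
    """Max over scenarios of the mean squared deviation from prescription."""
    w = np.maximum(w, 0.0)                    # crude handling of non-negative weights
    per_scenario = [np.mean((Ds @ w - d_rx) ** 2) for Ds in D]
    return max(per_scenario)

w0 = np.full(n_beamlets, 0.2)
res = minimize(worst_case_objective, w0, method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-8})
print("worst-scenario penalty after optimization:", res.fun)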
Selection of Worst-Case Pesticide Leaching Scenarios for Pesticide Registration
NASA Astrophysics Data System (ADS)
Vereecken, H.; Tiktak, A.; Boesten, J.; Vanderborght, J.
2010-12-01
The use of pesticides, fertilizers and manure in intensive agriculture may have a negative impact on the quality of ground- and surface water resources. Legislative action has been undertaken in many countries to protect surface and groundwater resources from contamination by surface-applied agrochemicals. Of particular concern are pesticides. The registration procedure plays an important role in the regulation of pesticide use in the European Union. In order to register a certain pesticide use, the notifier needs to prove that the use does not entail a risk of groundwater contamination. Therefore, leaching concentrations of the pesticide need to be assessed using model simulations for so-called worst-case scenarios. In the current procedure, a worst-case scenario represents a parameterized pesticide fate model for a certain soil and a certain time series of weather conditions that tries to represent all relevant processes, such as transient water flow, root water uptake, pesticide transport, sorption, decay and volatilisation, as accurately as possible. Since this model has been parameterized for only one soil and weather time series, it is uncertain whether it represents a worst-case condition for a certain pesticide use. We discuss an alternative approach that uses a simpler model requiring less detailed information about the soil and weather conditions but still representing the effect of soil and climate on pesticide leaching, using information that is available for the entire European Union. A comparison between the two approaches demonstrates that the higher precision that the detailed model provides for the prediction of pesticide leaching at a certain site is counteracted by its lower accuracy in representing a worst-case condition. The simpler model predicts leaching concentrations less precisely at a certain site but has complete coverage of the area, so that it selects a worst-case condition more accurately.
Wildfire contribution to world-wide desertification.
NASA Astrophysics Data System (ADS)
Neary, D.; Wittenberg, L.; Bautista, S.; Ffolliott, P.
2009-04-01
Wildfire is a natural phenomenon that began with the development of terrestrial vegetation in a lightning-filled atmosphere. Sediments from the Carboniferous Period (307-359 million years before the present) contain evidence of charcoal from post-fire ash slurry flows. As human populations developed in the Pleistocene and Holocene epochs, mankind transformed fire into one of its oldest tools. Human-ignited fires and natural fires ignited by lightning altered and steered the trajectories of ecosystem development in most parts of the world. Humans are now the primary source of forest and grass fire ignitions throughout the world. As human populations have increased and industrialized in the past two centuries, fire ignitions and burned areas have increased due to both the sheer numbers of people and anthropogenic changes in the global climate. Recent scientific findings have bolstered the hypothesis that climate change is resulting in fire seasons starting earlier, lasting longer, burning greater areas, and being more severe. Computer models point to the Western U.S., Mediterranean nations, and Brazil as "hot spots" that will get extremes at their worst. The climatic change to drier and warmer conditions has the potential to aggravate wildfire conditions, resulting in burning over longer seasons, larger areas of vegetation conflagration, and higher fire severities. Wildfire is now driving desertification in some of the forest lands in the western United States. The areas of wildfire in the Southwest USA have increased dramatically in the past two decades, from <10,000 ha/yr in the early 20th Century to over 230,000 ha/yr in the first decade of the 21st Century. Individual wildfires are now larger and produce higher severity burns than in the past. A combination of natural drought, climate change, excessive fuel loads, and increased ignition sources has produced the perfect conditions for fire-induced desertification. Portugal suffered the worst and second-worst wildfire seasons in a three-year period (2003-2005). In 2005, 338,262 ha of forest land burned, a 77% increase over the 10-year burn average of 189,500 ha. Desertification is about the loss of the land's proper hydrologic function, biological productivity, and other ecosystem services as a result of human activities and climate change. It affects one third of the earth's surface and over a billion people. In the past, desertification was considered a problem of only arid, semi-arid, and dry sub-humid areas. However, humid zones can undergo desertification under the wrong combination of human impacts. The Amazon region is an example of where forest harvesting, shifting cut-and-burn agriculture, and large-scale grazing are producing desertification of a tropical rain forest on a large scale. Some of the environmental consequences of wildfires are vegetation destruction, plant species and type shifts, exotic plant invasions, wildlife habitat destruction, soil erosion, floods, watershed function decline, water supply disruption, and air pollution. All of these are immediate impacts. Some impacts will persist beyond the careers and lifetimes of individuals. Small, isolated areas do not produce noticeable desertification, but the cumulative effect of multiple, large-area, adjacent fires can be landscape-level desertification. This paper examines wildfire contributions to desertification in regions of the world that are prone to wildfire and climate change.
Taylor, Lauren J; Nabozny, Michael J; Steffens, Nicole M; Tucholka, Jennifer L; Brasel, Karen J; Johnson, Sara K; Zelenski, Amy; Rathouz, Paul J; Zhao, Qianqian; Kwekkeboom, Kristine L; Campbell, Toby C; Schwarze, Margaret L
2017-06-01
Although many older adults prefer to avoid burdensome interventions with limited ability to preserve their functional status, aggressive treatments, including surgery, are common near the end of life. Shared decision making is critical to achieve value-concordant treatment decisions and minimize unwanted care. However, communication in the acute inpatient setting is challenging. To evaluate the proof of concept of an intervention to teach surgeons to use the Best Case/Worst Case framework as a strategy to change surgeon communication and promote shared decision making during high-stakes surgical decisions. Our prospective pre-post study was conducted from June 2014 to August 2015, and data were analyzed using a mixed methods approach. The data were drawn from decision-making conversations between 32 older inpatients with an acute nonemergent surgical problem, 30 family members, and 25 surgeons at 1 tertiary care hospital in Madison, Wisconsin. A 2-hour training session to teach each study-enrolled surgeon to use the Best Case/Worst Case communication framework. We scored conversation transcripts using OPTION 5, an observer measure of shared decision making, and used qualitative content analysis to characterize patterns in conversation structure, description of outcomes, and deliberation over treatment alternatives. The study participants were patients aged 68 to 95 years (n = 32), 44% of whom had 5 or more comorbid conditions; family members of patients (n = 30); and surgeons (n = 17). The median OPTION 5 score improved from 41 preintervention (interquartile range, 26-66) to 74 after Best Case/Worst Case training (interquartile range, 60-81). Before training, surgeons described the patient's problem in conjunction with an operative solution, directed deliberation over options, listed discrete procedural risks, and did not integrate preferences into a treatment recommendation. After training, surgeons using Best Case/Worst Case clearly presented a choice between treatments, described a range of postoperative trajectories including functional decline, and involved patients and families in deliberation. Using the Best Case/Worst Case framework changed surgeon communication by shifting the focus of decision-making conversations from an isolated surgical problem to a discussion about treatment alternatives and outcomes. This intervention can help surgeons structure challenging conversations to promote shared decision making in the acute setting.
40 CFR 300.324 - Response to worst case discharges.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 28 2011-07-01 2011-07-01 false Response to worst case discharges. 300.324 Section 300.324 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND, EMERGENCY PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS NATIONAL OIL AND HAZARDOUS SUBSTANCES POLLUTION...
40 CFR 300.324 - Response to worst case discharges.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 29 2012-07-01 2012-07-01 false Response to worst case discharges. 300.324 Section 300.324 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND, EMERGENCY PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS NATIONAL OIL AND HAZARDOUS SUBSTANCES POLLUTION...
40 CFR 300.324 - Response to worst case discharges.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 29 2013-07-01 2013-07-01 false Response to worst case discharges. 300.324 Section 300.324 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND, EMERGENCY PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS NATIONAL OIL AND HAZARDOUS SUBSTANCES POLLUTION...
40 CFR 300.324 - Response to worst case discharges.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 28 2014-07-01 2014-07-01 false Response to worst case discharges. 300.324 Section 300.324 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SUPERFUND, EMERGENCY PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS NATIONAL OIL AND HAZARDOUS SUBSTANCES POLLUTION...
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 1
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
Electrical characterization and qualification tests were performed on the RCA MWS5001D, 1024 x 1-bit, CMOS random access memory. Characterization tests were performed on five devices. The tests included functional tests, an AC parametric worst-case pattern selection test, determination of the worst-case transition for setup and hold times, and a series of shmoo plots. The qualification tests were performed on 32 devices and included a 2000-hour burn-in with electrical tests performed at 0 hours and after 168, 1000, and 2000 hours of burn-in. The tests performed included functional tests and AC and DC parametric tests. All of the tests in the characterization phase, with the exception of the worst-case transition test, were performed at ambient temperatures of 25, -55, and 125 C. The worst-case transition test was performed at 25 C. The pre-burn-in electrical tests were performed at 25, -55, and 125 C. All burn-in endpoint tests were performed at 25, -40, -55, 85, and 125 C.
Calacal, Gayvelline C; Delfin, Frederick C; Tan, Michelle Music M; Roewer, Lutz; Magtanong, Danilo L; Lara, Myra C; Fortun, Raquel dR; De Ungria, Maria Corazon A
2005-09-01
In a fire tragedy in Manila in December 1998, one of the worst such incidents, 23 children were reported dead; their identities could not initially be established, resulting in the burial of still-unidentified bodies. Underscoring the importance of identifying each of the human remains, the bodies were exhumed 3 months after the tragedy. We describe here our work, the first national case handled by local laboratories in which conventional and molecular-based techniques were successfully applied in forensic identification. The study reports analysis of DNA obtained from skeletal remains exposed to conditions of burning, burial, and exhumation. DNA typing methods using autosomal and Y-chromosomal short tandem repeat (Y-STR) markers reinforced postmortem examinations using conventional identification techniques. The strategy resulted in the identification of 18 of the 21 human remains analyzed, overcoming challenges encountered due to the absence of established procedures for the recovery of mass disaster remains. There was incomplete antemortem information to match the postmortem data obtained from the remains of 3 female child victims. Two victims were readily identified due to the availability of antemortem tissues. In the absence of this biologic material, parentage testing was performed using reference blood samples collected from parents and relatives. Data on patrilineal lineage based on common Y-STR haplotypes augmented autosomal DNA typing, particularly in deficiency cases.
Suppression of Low Strain Rate Nonpremixed Flames by an Agent
NASA Technical Reports Server (NTRS)
Olson, Sandra L. (Technical Monitor); Hamins, A.; Bundy, M.; Oh, C. B.; Park, J.; Puri, I. K.
2004-01-01
The extinction and structure of non-premixed methane/air flames were investigated in normal gravity and microgravity through the comparison of experiments and calculations using a counterflow configuration. From a fire safety perspective, low strain rate conditions are important for several reasons. In normal gravity, many fires start from small ignition sources where the convective flow and strain rates are weak. Fires in microgravity conditions, such as a manned spacecraft, may also occur in near-quiescent conditions where strain rates are very low. When designing a fire suppression system, worst-case conditions should be considered. Most diffusion flames become more robust as the strain rate is decreased. The goal of this project is to investigate the extinction limits of non-premixed flames using various agents and to compare reduced gravity and normal gravity conditions. Experiments at the NASA Glenn Research Center's 2.2-second drop tower were conducted to attain extinction and temperature measurements in low-strain non-premixed flames. Extinction measurements using nitrogen added to the fuel stream were performed for global strain rates from 7/s to 50/s. The results confirmed the "turning point" behavior observed previously by Maruta et al. in a 10 s drop tower. The maximum nitrogen volume fraction in the fuel stream needed to assure extinction for all strain rates was measured to be 0.855 ± 0.016, associated with the turning point determined to occur at a strain rate of 15/s. The critical nitrogen volume fraction in the fuel stream needed for extinction of 0-g flames was measured to be higher than that of 1-g flames.
"The Fire That Is Beginning to Stand": Teaching Historical Trauma at Stone Child College
ERIC Educational Resources Information Center
Allery, V. P.
2017-01-01
History at its best helps the present make sense of the past. History at its best tells the nation's story through the voices of all the people. These voices enlighten and provide wise counsel for the present, creating healthy and creative communities. History at its worst not only ignores the different voices, but eliminates them altogether. The…
Preventing Catastrophes from Data Loss
ERIC Educational Resources Information Center
Goldsborough, Reid
2004-01-01
What's the worst thing that can happen to a computer? Worse than a hard disk crash, virus infection, spam assault, denial-of-service attack, hacker take-over, fire, flood or some other human, mechanical or natural disaster is a faulty backup when it is really needed. If the computer blows up, as long as the data is backed up securely, it can be…
Query Optimization in Distributed Databases.
1982-10-01
general, the strategy a31 a11 a 3 is more time consuming than the strategy a, a, and usually we do not use it. Since the semijoin of R.XJ> RS requires... is the study of the analytic behavior of those heuristic algorithms. Although some analytic results of worst-case and average-case analysis are difficult to obtain, some...
ANOTHER LOOK AT THE FAST ITERATIVE SHRINKAGE/THRESHOLDING ALGORITHM (FISTA)*
Kim, Donghwan; Fessler, Jeffrey A.
2017-01-01
This paper provides a new way of developing the “Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)” [3] that is widely used for minimizing composite convex functions with a nonsmooth term such as the ℓ1 regularizer. In particular, this paper shows that FISTA corresponds to an optimized approach to accelerating the proximal gradient method with respect to a worst-case bound of the cost function. This paper then proposes a new algorithm that is derived by instead optimizing the step coefficients of the proximal gradient method with respect to a worst-case bound of the composite gradient mapping. The proof is based on the worst-case analysis called Performance Estimation Problem in [11]. PMID:29805242
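For reference, a minimal implementation of the standard FISTA recursion analyzed in the paper, applied to l1-regularized least squares with soft-thresholding as the proximal step; the test problem is randomly generated and the step size uses the usual Lipschitz constant of the smooth gradient.

# Minimal FISTA sketch for  min_x  0.5 * ||A x - b||^2 + lam * ||x||_1.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad 0.5||Ax-b||^2
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        x_next = soft_threshold(y - (A.T @ (A @ y - b)) / L, lam / L)
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum step
        x, t = x_next, t_next
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
print("nonzeros recovered:", np.count_nonzero(np.abs(fista(A, b, lam=0.1)) > 1e-3))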
Grieger, Khara D; Hansen, Steffen F; Sørensen, Peter B; Baun, Anders
2011-09-01
Conducting environmental risk assessment of engineered nanomaterials has been an extremely challenging endeavor thus far. Moreover, recent findings from the nano-risk scientific community indicate that it is unlikely that many of these challenges will be easily resolved in the near future, especially given the vast variety and complexity of nanomaterials and their applications. As an approach to help optimize environmental risk assessments of nanomaterials, we apply the Worst-Case Definition (WCD) model to identify best estimates for worst-case conditions of environmental risks of two case studies which use engineered nanoparticles, namely nZVI in soil and groundwater remediation and C(60) in an engine oil lubricant. Results generated from this analysis may ultimately help prioritize research areas for environmental risk assessments of nZVI and C(60) in these applications as well as demonstrate the use of worst-case conditions to optimize future research efforts for other nanomaterials. Through the application of the WCD model, we find that the most probable worst-case conditions for both case studies include i) active uptake mechanisms, ii) accumulation in organisms, iii) ecotoxicological response mechanisms such as reactive oxygen species (ROS) production and cell membrane damage or disruption, iv) surface properties of nZVI and C(60), and v) acute exposure tolerance of organisms. Additional estimates of worst-case conditions for C(60) also include the physical location of C(60) in the environment from surface run-off, cellular exposure routes for heterotrophic organisms, and the presence of light to amplify adverse effects. Based on results of this analysis, we recommend the prioritization of research for the selected applications within the following areas: organism active uptake ability of nZVI and C(60) and ecotoxicological response end-points and response mechanisms including ROS production and cell membrane damage, full nanomaterial characterization taking into account detailed information on nanomaterial surface properties, and investigations of dose-response relationships for a variety of organisms. Copyright © 2011 Elsevier B.V. All rights reserved.
Reducing Probabilistic Weather Forecasts to the Worst-Case Scenario: Anchoring Effects
ERIC Educational Resources Information Center
Joslyn, Susan; Savelli, Sonia; Nadav-Greenberg, Limor
2011-01-01
Many weather forecast providers believe that forecast uncertainty in the form of the worst-case scenario would be useful for general public end users. We tested this suggestion in 4 studies using realistic weather-related decision tasks involving high winds and low temperatures. College undergraduates, given the statistical equivalent of the…
30 CFR 553.13 - How much OSFR must I demonstrate?
Code of Federal Regulations, 2014 CFR
2014-07-01
... OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.13... the following table: COF worst case oil-spill discharge volume Applicable amount of OSFR Over 1,000... worst case oil-spill discharge of 1,000 bbls or less if the Director notifies you in writing that the...
30 CFR 553.13 - How much OSFR must I demonstrate?
Code of Federal Regulations, 2012 CFR
2012-07-01
... OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.13... the following table: COF worst case oil-spill discharge volume Applicable amount of OSFR Over 1,000... worst case oil-spill discharge of 1,000 bbls or less if the Director notifies you in writing that the...
30 CFR 553.13 - How much OSFR must I demonstrate?
Code of Federal Regulations, 2013 CFR
2013-07-01
... OIL SPILL FINANCIAL RESPONSIBILITY FOR OFFSHORE FACILITIES Applicability and Amount of OSFR § 553.13... the following table: COF worst case oil-spill discharge volume Applicable amount of OSFR Over 1,000... worst case oil-spill discharge of 1,000 bbls or less if the Director notifies you in writing that the...
Martin, Adrian; Schiavi, Emanuele; Eryaman, Yigitcan; Herraiz, Joaquin L; Gagoski, Borjan; Adalsteinsson, Elfar; Wald, Lawrence L; Guerin, Bastien
2016-06-01
A new framework for the design of parallel transmit (pTx) pulses is presented, introducing constraints for local and global specific absorption rate (SAR) in the presence of errors in the radiofrequency (RF) transmit chain. The first step is the design of a pTx RF pulse with explicit constraints for global and local SAR. Then, the worst possible SAR associated with that pulse due to RF transmission errors ("worst-case SAR") is calculated. Finally, this information is used to re-calculate the pulse with lower SAR constraints, iterating this procedure until its worst-case SAR is within safety limits. Analysis of an actual pTx RF transmit chain revealed amplitude errors as high as 8% (20%) and phase errors above 3° (15°) for spokes (spiral) pulses. Simulations show that, using the proposed framework, pulses can be designed with controlled "worst-case SAR" in the presence of errors of this magnitude at a minor cost in excitation profile quality. Our worst-case SAR-constrained pTx design strategy yields pulses with local and global SAR within the safety limits even in the presence of RF transmission errors. This strategy is a natural way to incorporate SAR safety factors in the design of pTx pulses. Magn Reson Med 75:2493-2504, 2016. © 2015 Wiley Periodicals, Inc.
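The framework's outer loop can be sketched as follows. Both routines below are placeholders: design_pulse stands in for the constrained pTx pulse optimization and worst_case_sar for the error-model evaluation, with the 8% amplitude error taken from the figures quoted above; the multiplicative tightening rule is an assumed, simple choice rather than the paper's exact update.

# Hedged sketch of the iterate-until-compliant loop for worst-case SAR control.
def design_pulse(sar_limit):
    """Placeholder: returns a pulse whose nominal SAR equals the given design limit."""
    return {"nominal_sar": sar_limit}

def worst_case_sar(pulse, amp_err=0.08):
    """Placeholder: inflate nominal SAR by a factor bounding the RF-chain errors."""
    return pulse["nominal_sar"] * (1.0 + amp_err) ** 2 * 1.05

safety_limit = 10.0           # e.g., a local SAR limit in W/kg
sar_limit = safety_limit
for _ in range(20):
    pulse = design_pulse(sar_limit)
    wc = worst_case_sar(pulse)
    if wc <= safety_limit:
        break
    sar_limit *= safety_limit / wc    # tighten the design constraint and redesign
print(f"design limit {sar_limit:.2f} W/kg gives worst-case SAR {wc:.2f} W/kg")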
40 CFR Appendix D to Part 112 - Determination of a Worst Case Discharge Planning Volume
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 23 2013-07-01 2013-07-01 false Determination of a Worst Case Discharge Planning Volume D Appendix D to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS OIL POLLUTION PREVENTION Pt. 112, App. D Appendix D to Part 112—Determination of a...
40 CFR Appendix D to Part 112 - Determination of a Worst Case Discharge Planning Volume
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Determination of a Worst Case Discharge Planning Volume D Appendix D to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS OIL POLLUTION PREVENTION Pt. 112, App. D Appendix D to Part 112—Determination of a...
40 CFR Appendix D to Part 112 - Determination of a Worst Case Discharge Planning Volume
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 22 2014-07-01 2013-07-01 true Determination of a Worst Case Discharge Planning Volume D Appendix D to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS OIL POLLUTION PREVENTION Pt. 112, App. D Appendix D to Part 112—Determination of a...
40 CFR Appendix D to Part 112 - Determination of a Worst Case Discharge Planning Volume
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 22 2011-07-01 2011-07-01 false Determination of a Worst Case Discharge Planning Volume D Appendix D to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS OIL POLLUTION PREVENTION Pt. 112, App. D Appendix D to Part 112—Determination of a...
40 CFR Appendix D to Part 112 - Determination of a Worst Case Discharge Planning Volume
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 23 2012-07-01 2012-07-01 false Determination of a Worst Case Discharge Planning Volume D Appendix D to Part 112 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS OIL POLLUTION PREVENTION Pt. 112, App. D Appendix D to Part 112—Determination of a...
Robust Flutter Margin Analysis that Incorporates Flight Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Martin J.
1998-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
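For reference, the structured singular value used here has the standard robust-control definition below, where M is the scaled plant interconnection and Delta ranges over the structured uncertainty set; this notation is conventional rather than taken from the abstract. The smallest destabilizing perturbation has norm 1/mu, which is the robust flutter margin with respect to the defined uncertainty.

\[
  \mu_{\boldsymbol{\Delta}}(M) \;=\;
  \left( \min \left\{ \bar{\sigma}(\Delta) \;:\;
    \Delta \in \boldsymbol{\Delta},\; \det(I - M\Delta) = 0 \right\} \right)^{-1},
\]

with mu defined as zero when no admissible Delta makes I - M Delta singular.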
Wildfire air pollution hazard during the 21st century
NASA Astrophysics Data System (ADS)
Knorr, Wolfgang; Dentener, Frank; Lamarque, Jean-François; Jiang, Leiwen; Arneth, Almut
2017-07-01
Wildfires pose a significant risk to human livelihoods and are a substantial health hazard due to emissions of toxic smoke. Previous studies have shown that climate change, increasing atmospheric CO2, and human demographic dynamics can lead to substantially altered wildfire risk in the future, with fire activity increasing in some regions and decreasing in others. The present study re-examines these results from the perspective of air pollution risk, focussing on emissions of airborne particulate matter (PM2.5), combining an existing ensemble of simulations using a coupled fire-dynamic vegetation model with current observation-based estimates of wildfire emissions and simulations with a chemical transport model. Currently, wildfire PM2.5 emissions exceed those from anthropogenic sources in large parts of the world. We further analyse two extreme sets of future wildfire emissions in a socio-economic, demographic climate change context and compare them to anthropogenic emission scenarios reflecting current and ambitious air pollution legislation. In most regions of the world, ambitious reductions of anthropogenic air pollutant emissions have the potential to limit mean annual pollutant PM2.5 levels to comply with World Health Organization (WHO) air quality guidelines for PM2.5. Worst-case future wildfire emissions are not likely to interfere with these annual goals, largely due to fire seasonality, as well as a tendency of wildfire sources to be situated in areas of intermediate population density, as opposed to anthropogenic sources that tend to be highest at the highest population densities. However, during the high-fire season, we find many regions where future PM2.5 pollution levels can reach dangerous levels even for a scenario of aggressive reduction of anthropogenic emissions.
Tight Leash Likely on Turnaround Aid: Radical Steps Proposed as Price for Title I Grants
ERIC Educational Resources Information Center
McNeil, Michele
2009-01-01
U.S. Secretary of Education Arne Duncan said that he plans to demand radical steps--such as firing most of a school's staff or converting it to a charter school--as the price of admission in directing $3.5 billion in new school improvement aid to the nation's 5,000 worst-performing schools. In sharp contrast to the current free-flowing nature of…
Less than severe worst case accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, G.A.
1996-08-01
Many systems can provide tremendous benefit if operating correctly, produce only an inconvenience if they fail to operate, but have extreme consequences if they are only partially disabled such that they operate erratically or prematurely. In order to assure safety, systems are often tested against the most severe environments and accidents that are considered possible, to ensure either safe operation or safe failure. However, it is often the less severe environments that result in the "worst case accident," since these are the conditions in which part of the system may be exposed or rendered unpredictable prior to total system failure. Some examples of less severe mechanical, thermal, and electrical environments which may actually be worst case are described as cautions for others in industries with high-consequence operations or products.
DEVELOPMENT OF A LOW-COST INFERENTIAL NATURAL GAS ENERGY FLOW RATE PROTOTYPE RETROFIT MODULE
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. Kelner; T.E. Owen; D.L. George
2004-03-01
In 1998, Southwest Research Institute® began a multi-year project co-funded by the Gas Research Institute (GRI) and the U.S. Department of Energy. The project goal is to develop a working prototype instrument module for natural gas energy measurement. The module will be used to retrofit a natural gas custody transfer flow meter for energy measurement, at a cost an order of magnitude lower than a gas chromatograph. Development and evaluation of the prototype retrofit natural gas energy flow meter in 2000-2001 included: (1) evaluation of the inferential gas energy analysis algorithm using supplemental gas databases and anticipated worst-case gas mixtures; (2) identification and feasibility review of potential sensing technologies for nitrogen diluent content; (3) experimental performance evaluation of infrared absorption sensors for carbon dioxide diluent content; and (4) procurement of a custom ultrasonic transducer and redesign of the ultrasonic pulse reflection correlation sensor for precision speed-of-sound measurements. A prototype energy meter module containing improved carbon dioxide and speed-of-sound sensors was constructed and tested in the GRI Metering Research Facility at SwRI. Performance of this module using transmission-quality natural gas and gas containing supplemental carbon dioxide up to 9 mol% resulted in gas energy determinations well within the inferential algorithm worst-case tolerance of ±2.4 Btu/scf (nitrogen diluent gas measured by gas chromatograph). A two-week field test was performed at a gas-fired power plant to evaluate the inferential algorithm and the data acquisition requirements needed to adapt the prototype energy meter module to practical field site conditions.
MISR Views a Fire-Scarred Landscape
NASA Technical Reports Server (NTRS)
2000-01-01
This MISR image pair shows 'before and after' views of the area around the Hanford Nuclear Reservation near Richland, Washington. On June 27, 2000, a fire in the dry sagebrush was sparked by an automobile crash. The flames were fanned by hot summer winds. By the day after the accident, about 100,000 acres had burned, and the fire's spread forced the closure of highways and loss of homes.
These images, from Terra orbits 2176 and 3341, were obtained by MISR's vertical-viewing (nadir) camera. Compare the area just above and to the right of the line of cumulus clouds in the May 15 image with the same area imaged on August 3. The darkened burn scar measures approximately 35 kilometers across. The Columbia River is seen wending its way around the area, and the Snake River branches off to the right. According to Idaho's National Interagency Fire Center, the US has been experiencing the worst fire season since 1996. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Responding to bushfire risk: the need for transformative adaptation
NASA Astrophysics Data System (ADS)
O'Neill, Saffron J.; Handmer, John
2012-03-01
The 2009 ‘Black Saturday’ bushfires led to 172 civilian deaths and were proclaimed one of Australia’s worst natural disasters. The Victorian Bushfires Royal Commission was set up in the wake of the fires to investigate the circumstances surrounding each fatality. Here, results from an analysis undertaken for the Commission to examine the household preparedness policy ‘Prepare, Stay and Defend, or Leave Early’ (‘Stay or Go’), plus an examination of the Commission’s recommendations, are explored in the broader context of adaptation to bushfire. We find Victoria ill adapted to complex bushfire risk events like Black Saturday, owing to changing settlement patterns and the known vulnerabilities of populations living in fire-prone areas, and increasingly so in the future as climate change extends fire seasons and increases their severity. We suggest that uncertainty needs to be better acknowledged and managed in fire risk situations, and that responsibility for fire preparedness should be more justly distributed. We suggest that a transformation in adaptation is required to effectively manage complex bushfire risk events like Black Saturday, and provide four key ways in which transformation in bushfire preparedness could be achieved.
Fire Season 2015 in Alaska Set to Break Records
2017-12-08
Fires have raged throughout Alaska in 2015. The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Aqua satellite acquired this image on July 14, 2015. Actively burning areas, detected by the thermal bands on MODIS, are outlined in red. According to the most recent update (July 16, 2015) from the Alaska Interagency Coordination Center, about 304 fires were actively burning when MODIS imaged the area. To date, fires have charred a total of 4,854,924 acres in Alaska. The worst fire season in Alaska's history was in 2004; at this point in time, 2015 is a month ahead of the 2004 totals, putting it on track to surpass them. The amount of acreage burned in Alaska during June 2015 shattered the previous record, set in June 2004, by more than 700,000 acres, delivering a sobering piece of news for Alaskan residents. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team
Integrated Optoelectronic Networks for Application-Driven Multicore Computing
2017-05-08
hybrid photonic torus, the all-optical Corona crossbar, and the hybrid hierarchical Firefly crossbar. • The key challenges for waveguide photonics... improves SXR but with relatively higher EDP overhead. Our evaluation results indicate that the encoding schemes improve worst-case SXR in Corona and... photonic crossbar architectures (Corona and Firefly) indicate that our approach improves worst-case signal-to-noise ratio (SNR) by up to 51.7
Method of Generating Transient Equivalent Sink and Test Target Temperatures for Swift BAT
NASA Technical Reports Server (NTRS)
Choi, Michael K.
2004-01-01
The NASA Swift mission has a 600-km altitude and a 22-degree maximum inclination. The sun angle varies from 45 degrees to 180 degrees in normal operation. As a result, environmental heat fluxes absorbed by the Burst Alert Telescope (BAT) radiator and loop heat pipe (LHP) compensation chambers (CCs) vary transiently, and therefore the equivalent sink temperatures for the radiator and CCs also vary transiently. In thermal performance verification testing in vacuum, the radiator and CCs radiated heat to sink targets. This paper presents an analytical technique for generating orbit transient equivalent sink temperatures and a technique for generating transient sink target temperatures for the radiator and LHP CCs. Using these techniques, transient target temperatures for the radiator and LHP CCs were generated for three thermal environmental cases: worst hot case, worst cold case, and cooldown and warmup between the worst hot case in sunlight and the worst cold case in eclipse, and three different heat transport values: 128 W, 255 W, and 382 W. The 128 W case assumed that the two LHPs share the 255 W load equally, so each transports about 128 W to the radiator. The 255 W case assumed that one LHP fails, so that the remaining LHP transports all the waste heat from the detector array to the radiator. The 382 W case assumed that one LHP fails, so that the remaining LHP transports all the waste heat from the detector array to the radiator, with a 50% design margin. All these transient target temperatures were successfully implemented in the engineering test unit (ETU) LHP and flight LHP thermal performance verification tests in vacuum.
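The conversion from absorbed environmental flux to an equivalent sink temperature follows from a simple radiative balance, T_sink = (q_abs / (ε σ))^(1/4). A minimal sketch of that step, with hypothetical flux values and surface properties rather than Swift BAT data:

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equivalent_sink_temperature(q_absorbed, emissivity):
    """Radiative-equilibrium sink temperature (K) for a surface absorbing
    q_absorbed (W/m^2) of environmental flux and emitting IR with the
    given emissivity."""
    return (q_absorbed / (emissivity * SIGMA)) ** 0.25

# Hypothetical orbit-transient absorbed fluxes (W/m^2); real values would
# come from a radiation analysis of the actual orbit and sun angle.
fluxes = [310.0, 280.0, 150.0, 40.0, 5.0, 90.0, 260.0]
for step, q in enumerate(fluxes):
    t_c = equivalent_sink_temperature(q, emissivity=0.85) - 273.15
    print(f"step {step}: Tsink = {t_c:.1f} C")
```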
1991-01-01
Experience in developing integrated optical devices, nonlinear magnetic-optic materials, high frequency modulators, computer-aided modeling and sophisticated... high-level presentation and distributed control models for integrating heterogeneous mechanical engineering applications and tools. The design is focused... statistically accurate worst case device models for circuit simulation. Present methods of worst case device design are ad hoc and do not allow the
A semi-mechanistic model of dead fine fuel moisture for Temperate and Mediterranean ecosystems
NASA Astrophysics Data System (ADS)
Resco de Dios, Víctor; Fellows, Aaron; Boer, Matthias; Bradstock, Ross; Nolan, Rachel; Goulden, Michel
2014-05-01
Fire is a major disturbance in terrestrial ecosystems globally. It has an enormous economic and social cost, and in the worst cases leads to fatalities. The moisture content of the vegetation (fuel moisture) is one of the main determinants of fire risk. Predicting the moisture content of dead, fine fuel (< 2.5 cm in diameter) is particularly important, as this is often the component of the fuel complex that matters most for fire propagation. A variety of drought indices and empirical and mechanistic models have been proposed to model fuel moisture. A commonality across these different approaches is that they have been validated neither across large temporal datasets nor across broadly different vegetation types. Here, we present the results of a study performed at 6 locations in California, USA (5 sites) and New South Wales, Australia (1 site), where 10-hour fuel moisture content was measured continuously every 30 minutes during one full year at each site. We observed that drought indices did not accurately predict fuel moisture, and that empirical and mechanistic models both needed site-specific calibrations, which hinders their global application as indices of fuel moisture. We developed a novel, single-equation, semi-mechanistic model based on atmospheric vapor-pressure deficit. Across sites and years, the mean absolute error (MAE) of predicted fuel moisture was 4.7%. MAE dropped below 1% in the critical range of fuel moisture below 10%. The simplicity, accuracy and precision of our model make it suitable for a wide range of applications, from operational purposes to global vegetation models.
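As an illustration of the semi-mechanistic approach, the sketch below derives vapour-pressure deficit from temperature and humidity via the Tetens formula and feeds it into an exponential-decay moisture function. The functional form and coefficients here are placeholders for illustration, not necessarily the authors' fitted equation.

```python
import math

def vpd_kpa(temp_c, rh_pct):
    """Vapour-pressure deficit (kPa) from air temperature (C) and relative
    humidity (%), using the Tetens saturation vapour pressure formula."""
    es = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # kPa
    return es * (1.0 - rh_pct / 100.0)

def fuel_moisture_from_vpd(vpd, m0=5.43, m1=52.91, k=0.64):
    """Dead fine fuel moisture (% dry weight) as an exponentially decaying
    function of VPD (kPa). Coefficients are illustrative placeholders."""
    return m0 + m1 * math.exp(-k * vpd)

# Example: hot, dry afternoon vs. humid morning.
for t, rh in [(35.0, 15.0), (15.0, 85.0)]:
    d = vpd_kpa(t, rh)
    print(f"T={t} C, RH={rh}% -> VPD={d:.2f} kPa, "
          f"FM={fuel_moisture_from_vpd(d):.1f}%")
```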
NASA Technical Reports Server (NTRS)
Strutzenberg, Louise L.; Putman, Gabriel C.
2011-01-01
The Ares I Scale Model Acoustics Test (ASMAT) is a series of live-fire tests of scaled rocket motors meant to simulate the conditions of the Ares I launch configuration. These tests have provided a well documented set of high fidelity measurements useful for validation including data taken over a range of test conditions and containing phenomena like Ignition Over-Pressure and water suppression of acoustics. Expanding from initial simulations of the ASMAT setup in a held down configuration, simulations have been performed using the Loci/CHEM computational fluid dynamics software for ASMAT tests of the vehicle at 5 ft. elevation (100 ft. real vehicle elevation) with worst case drift in the direction of the launch tower. These tests have been performed without water suppression and have compared the acoustic emissions for launch structures with and without launch mounts. In addition, simulation results have also been compared to acoustic and imagery data collected from similar live-fire tests to assess the accuracy of the simulations. Simulations have shown a marked change in the pattern of emissions after removal of the launch mount with a reduction in the overall acoustic environment experienced by the vehicle and the formation of highly directed acoustic waves moving across the platform deck. Comparisons of simulation results to live-fire test data showed good amplitude and temporal correlation and imagery comparisons over the visible and infrared wavelengths showed qualitative capture of all plume and pressure wave evolution features.
Local measles vaccination gaps in Germany and the role of vaccination providers.
Eichner, Linda; Wjst, Stephanie; Brockmann, Stefan O; Wolfers, Kerstin; Eichner, Martin
2017-08-14
Measles elimination in Europe is an urgent public health goal, yet despite the efforts of its member states, vaccination gaps and outbreaks occur. This study explores local vaccination heterogeneity in kindergartens and municipalities of a German county. Data on children from mandatory school enrolment examinations in 2014/15 in Reutlingen county were used. Children with unknown vaccination status were either removed from the analysis (best case) or assumed to be unvaccinated (worst case). Vaccination data were translated into expected outbreak probabilities. Physicians and kindergartens with statistically outstanding numbers of under-vaccinated children were identified. A total of 170 (7.1%) of 2388 children did not provide a vaccination certificate; 88.3% (worst case) or 95.1% (best case) were vaccinated at least once against measles. Based on the worst case vaccination coverage, <10% of municipalities and <20% of kindergartens were sufficiently vaccinated to be protected against outbreaks. Excluding children without a vaccination certificate (best case) leads to over-optimistic views: the overall outbreak probability in case of a measles introduction lies between 39.5% (best case) and 73.0% (worst case). Four paediatricians were identified who accounted for 41 of 109 unvaccinated children and for 47 of 138 incomplete vaccinations; GPs showed significantly higher rates of missing vaccination certificates and unvaccinated or under-vaccinated children than paediatricians. Missing vaccination certificates pose a severe problem regarding the interpretability of vaccination data. Although the coverage for at least one measles vaccination is higher in the studied county than in most South German counties and higher than the European average, many severe and potentially dangerous vaccination gaps occur locally. If other federal German states and EU countries show similar vaccination variability, measles elimination may not succeed in Europe.
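A standard way to translate coverage into an outbreak probability is the single-introduction branching-process result; the sketch below applies it with an assumed vaccine efficacy and a measles-like R0, and is not necessarily the exact translation used in the study.

```python
def effective_r(r0, coverage, efficacy=0.95):
    """Effective reproduction number: R0 scaled by the susceptible
    fraction remaining after vaccination (all-or-nothing vaccine)."""
    return r0 * (1.0 - coverage * efficacy)

def outbreak_probability(r_eff):
    """Single-introduction branching-process result (geometric offspring
    distribution): a major outbreak occurs with probability 1 - 1/R_eff
    when R_eff > 1 and cannot take off otherwise."""
    return 1.0 - 1.0 / r_eff if r_eff > 1.0 else 0.0

R0 = 15.0  # measles-like basic reproduction number (assumed)
for label, coverage in (("worst case", 0.883), ("best case", 0.951)):
    r = effective_r(R0, coverage)
    print(f"{label}: coverage={coverage:.1%}, R_eff={r:.2f}, "
          f"P(outbreak)={outbreak_probability(r):.1%}")
```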
NASA Astrophysics Data System (ADS)
Úbeda, Xavier; Sarricolea, Pablo
2016-11-01
This paper reviews the literature examining the wildfire phenomenon in Chile. Since ancient times, Chile's wildfires have shaped the country's landscape, but today, as in many other parts of the world, the fire regime - pattern, frequency and intensity - has grown at an alarming rate. In 2014, > 8000 fires were responsible for burning c. 130,000 ha, making it the worst year in Chile's recent history. The reasons for this increase appear to be the increment in the area planted with flammable species; the rejection of these landscape modifications on the part of local communities that target these plantations in arson attacks; and, the adoption of intensive forest management practices resulting in the accumulation of a high fuel load. These trends have left many native species in a precarious situation and forest plantation companies under considerable financial pressure. An additional problem is posed by fires at the wildland urban interface (WUI), threatening those inhabitants that live in Chile's most heavily populated cities. The prevalence of natural fires in Chile; the relationship between certain plant species and fire in terms of seed germination strategies and plant adaptation; the relationship between fire and invasive species; and, the need for fire prevention systems and territorial plans that include fire risk assessments are some of the key aspects discussed in this article. Several of the questions raised will require further research, including just how fire-dependent the ecosystems in Chile are, how the forest at the WUI can be better managed to prevent human and material damage, and how best to address the social controversy that pits the Mapuche population against the timber companies.
Private Rogers L. Taylor: Prisoner of the Japanese
2015-04-01
cooking, its body closely resembled that of a human baby. Other soldiers recall their memories regarding the local fare on Bataan. Lajzer recounted... Horse meat stunk so bad it was revolting. The cooks would boil it and then fry it over an open fire so it could be eaten. ... Believe me, mules... gathering wood and water for cooking, but the worst was the burial detail, which Taylor begrudgingly performed. This is not a detail he spoke of
Worst case estimation of homology design by convex analysis
NASA Technical Reports Server (NTRS)
Yoshikawa, N.; Elishakoff, Isaac; Nakagiri, S.
1998-01-01
The methodology of homology design is investigated for the optimum design of advanced structures for which the achievement of delicate tasks with the aid of an active control system is demanded. The proposed formulation of homology design, based on finite element sensitivity analysis, necessarily requires the specification of external loadings. A formulation to evaluate the worst case for homology design caused by uncertain fluctuation of loadings is presented by means of the convex model of uncertainty, in which uncertainty variables are assigned to discretized nodal forces and are confined within a conceivable convex hull given as a hyperellipse. The worst case of the distortion from the objective homologous deformation is estimated by the Lagrange multiplier method, searching for the point that maximizes the error index on the boundary of the convex hull. The validity of the proposed method is demonstrated in a numerical example using an eleven-bar truss structure.
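For a linear error index over a hyperellipsoidal convex hull, the Lagrange stationarity condition has a closed-form solution; a minimal sketch on a hypothetical three-variable load model (not the paper's finite element formulation):

```python
import numpy as np

def worst_case_on_ellipsoid(g, W):
    """Maximize the linear error index g^T x over the convex hull
    {x : x^T W x <= 1}. The Lagrange condition g = 2*lam*W*x gives the
    closed form x* = W^{-1} g / sqrt(g^T W^{-1} g); the maximum value is
    sqrt(g^T W^{-1} g), attained on the boundary."""
    Winv_g = np.linalg.solve(W, g)
    worst_value = np.sqrt(g @ Winv_g)
    return Winv_g / worst_value, worst_value

# Hypothetical 3-DOF example: W shapes the admissible load fluctuations,
# g is the sensitivity of the homology-distortion index to nodal forces.
W = np.diag([4.0, 1.0, 0.25])
g = np.array([1.0, 2.0, 1.0])
x_star, worst = worst_case_on_ellipsoid(g, W)
print("worst-case load direction:", x_star, " worst distortion index:", worst)
```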
Biomechanical behavior of a cemented ceramic knee replacement under worst case scenarios
NASA Astrophysics Data System (ADS)
Kluess, D.; Mittelmeier, W.; Bader, R.
2009-12-01
In connection with technological advances in the manufacturing of medical ceramics, a newly developed ceramic femoral component was introduced in total knee arthroplasty (TKA). The motivation to consider ceramics in TKA is based on the allergological and tribological benefits proven in total hip arthroplasty. Owing to the brittleness and reduced fracture toughness of ceramic materials, the biomechanical performance has to be examined intensively. Apart from standard testing, we calculated the implant performance under different worst-case scenarios, including malposition, bone defects and stumbling. A finite element model was developed to calculate the implant performance in situ. The worst-case conditions revealed principal stresses 12.6 times higher during stumbling than during normal gait. Nevertheless, none of the calculated principal stresses exceeded the critical strength of the ceramic material used. The analysis of malposition showed the necessity of exact alignment of the implant components.
Dima, Giovanna; Verzera, Antonella; Grob, Koni
2011-11-01
Party plates made of recycled paperboard with a polyolefin film on the food contact surface (more often polypropylene than polyethylene) were tested for migration of mineral oil into various foods under reasonable worst-case conditions. The worst case was identified as a slice of fried meat placed onto the plate while hot and allowed to cool for 1 h. As it caused the acceptable daily intake (ADI) specified by the Joint FAO/WHO Expert Committee on Food Additives (JECFA) to be exceeded, it is concluded that recycled paperboard is generally acceptable for party plates only when separated from the food by a functional barrier. Migration data obtained with oil as a simulant at 70°C were compared to the migration into foods. A contact time of 30 min was found to reasonably cover the worst case determined in food.
Davis, Michael J; Janke, Robert
2018-01-04
The effect of limitations in the structural detail available in a network model on contamination warning system (CWS) design was examined in case studies using the original and skeletonized network models for two water distribution systems (WDSs). The skeletonized models were used as proxies for incomplete network models. CWS designs were developed by optimizing sensor placements for worst-case and mean-case contamination events. Designs developed using the skeletonized network models were transplanted into the original network model for evaluation. CWS performance was defined as the number of people who ingest more than some quantity of a contaminant in tap water before the CWS detects the presence of contamination. Lack of structural detail in a network model can result in CWS designs that (1) provide considerably less protection against worst-case contamination events than that obtained when a more complete network model is available and (2) yield substantial underestimates of the consequences associated with a contamination event. Nevertheless, CWSs developed using skeletonized network models can provide useful reductions in consequences for contaminants whose effects are not localized near the injection location. Mean-case designs can yield worst-case performances similar to those for worst-case designs when there is uncertainty in the network model. Improvements in network models for WDSs have the potential to yield significant improvements in CWS designs as well as more realistic evaluations of those designs. Although such improvements would be expected to yield improved CWS performance, the expected improvements in CWS performance have not been quantified previously. The results presented here should be useful to those responsible for the design or implementation of CWSs, particularly managers and engineers in water utilities, and encourage the development of improved network models.
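Sensor placement problems of this kind are commonly approximated greedily. A toy sketch with a hypothetical scenario-impact table (not the paper's network data), minimizing worst-case or mean-case consequences:

```python
def greedy_sensor_placement(impact, n_sensors, objective="worst"):
    """Greedy sensor placement for a contamination warning system (CWS).
    impact[i] maps candidate node -> consequence (e.g., people exposed)
    if contamination scenario i is first detected at that node; nodes a
    scenario never reaches are omitted. Each added sensor is the one that
    most reduces the worst-case (or mean-case) consequence."""
    nodes = sorted(set().union(*(s.keys() for s in impact)))
    chosen = []

    def consequence(sensors):
        per_scenario = []
        for s in impact:
            detected = [s[n] for n in sensors if n in s]
            # Undetected scenarios incur their full (maximum) consequence.
            per_scenario.append(min(detected) if detected else max(s.values()))
        return (max(per_scenario) if objective == "worst"
                else sum(per_scenario) / len(per_scenario))

    for _ in range(n_sensors):
        best = min((n for n in nodes if n not in chosen),
                   key=lambda n: consequence(chosen + [n]))
        chosen.append(best)
    return chosen, consequence(chosen)

# Toy example: 3 contamination scenarios over 4 candidate sensor nodes.
impact = [{"A": 100, "B": 40, "C": 70},
          {"B": 90, "D": 20},
          {"A": 50, "C": 30, "D": 60}]
print(greedy_sensor_placement(impact, n_sensors=2))
```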
Nuclear winter from gulf war discounted
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, E.
Would a major conflagration in Kuwait's oil fields trigger a climate catastrophe akin to the 'nuclear winter' that got so much attention in the 1980s? This question prompted a variety of opinions. The British Meteorological Office and researchers at Lawrence Livermore National Laboratory concluded that the effect of smoke from major oil fires in Kuwait on global temperatures is likely to be small; however, the obscuration of sunlight might significantly reduce surface temperatures locally. Michael MacCracken, leader of the researchers at Livermore, predicts that the worst plausible oil fires in the Gulf would produce a cloud of pollution about as severe as that found on a bad day at the Los Angeles airport. The results of some mathematical modeling by the Livermore research group are reported.
Sensitivity of worst-case storm surge considering the influence of climate change
NASA Astrophysics Data System (ADS)
Takayabu, Izuru; Hibino, Kenshi; Sasaki, Hidetaka; Shiogama, Hideo; Mori, Nobuhito; Shibutani, Yoko; Takemi, Tetsuya
2016-04-01
There are two standpoints when assessing risk caused by climate change. One is disaster prevention: for this purpose, we need probabilistic information on meteorological elements from a sufficient number of ensemble simulations. The other is disaster mitigation: for this purpose, we have to use a very high resolution, sophisticated model to represent a worst-case event in detail. If we could use enough computing resources to drive many ensemble runs with a very high resolution model, we could address both themes at once. However, resources are limited in most cases, and we have to trade off resolution against the number of simulations when designing the experiment. Applying the PGWD (Pseudo Global Warming Downscaling) method is one solution for analyzing a worst-case event in detail. Here we introduce an example of finding the influence of climate change on a worst-case storm surge by applying PGWD to super typhoon Haiyan (Takayabu et al., 2015). A 1-km grid WRF model could represent both the intensity and the structure of a super typhoon. With the PGWD method, we can only estimate the influence of climate change on the development of the typhoon; changes in typhoon genesis cannot be estimated. Finally, we ran the SU-WAT model (which includes a shallow-water equation model) to obtain the storm surge height. The result indicates that the height of the storm surge increased by up to 20% owing to 150 years of climate change.
Assessing Satellite-Based Fire Data for use in the National Emissions Inventory
NASA Technical Reports Server (NTRS)
Soja, Amber J.; Al-Saadi, Jassim; Giglio, Louis; Randall, Dave; Kittaka, Chieko; Pouliot, George; Kordzi, Joseph J.; Raffuse, Sean; Pace, Thompson G.; Pierce, Thomas E.;
2009-01-01
Biomass burning is significant to emission estimates because: (1) it can be a major contributor of particulate matter and other pollutants; (2) it is one of the most poorly documented of all sources; (3) it can adversely affect human health; and (4) it has been identified as a significant contributor to climate change through feedbacks with the radiation budget. Additionally, biomass burning can be a significant contributor to a region's inability to achieve the National Ambient Air Quality Standards for PM 2.5 and ozone, particularly on the top 20% worst air quality days. The United States does not have a standard methodology to track fire occurrence or area burned, which are essential components of estimating fire emissions. Satellite imagery is available almost instantaneously and has great potential to enhance emission estimates and their timeliness. This investigation compares satellite-derived fire data to ground-based data to assign statistical error and helps provide confidence in these data. The largest fires are identified by all satellites and their spatial domain is accurately sensed. MODIS provides enhanced spatial and temporal information, and GOES ABBA data are able to capture more small agricultural fires. A methodology is presented that combines these satellite data in near-real-time to produce a product that captures 81 to 92% of the total area burned by wildfire, prescribed, agricultural and rangeland burning. Each satellite possesses distinct temporal and spatial capabilities that permit the detection of unique fires that could be omitted if using data from only one satellite.
Wildland Fire Induced Heating of Dome 375 Perma-Con®
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flores, Eugene Michael
AET-1 was tasked by ADEM with determining the temperature rise in the contents of drums stored in the Dome 375 Perma-Con® at TA-54 given a wildland fire. The wildland fire causes radiative and convective heating on the Perma-Con® exterior. The wildland fire time histories for the radiative and convective heating environment were provided to AET-1 by EES-16. If the calculated temperature rise results in a drum content temperature over 40 °C, then ADEM desires a design solution to ensure the peak temperature remains below 40 °C. An axisymmetric FE simulation was completed to determine the peak temperature of the contents of a drum stored in the Dome 375 Perma-Con® during a wildland fire event. Three wildland fire time histories for the radiative and convective heat transfer were provided by EES-16 and were inputs for the FE simulation. The maximum drum content temperature was found to be 110 °C using inputs from the SiteG_2ms_4ign_wind_from_west.xlsx time history and not including the SWB in the model. Including the SWB in the model results in a peak drum content temperature of 61 °C for the SiteG_2ms_4ign_wind_from_west.xlsx inputs. EES-16 decided that by using fuel mitigation efforts, such as mowing the grass and shrubs near the Perma-Con®, they could reduce the shrub/grass fuel loading near the Perma-Con® from 1.46 kg/m² to 0.146 kg/m², and by using a less conservative fuel loading for the debris field inside the Dome 375 perimeter, reduce it from 0.58 kg/m² to 0.058 kg/m² in their model. They also greatly increased the resolution of their radiation model and tightened their model’s required convergence value. Using this refined input, the maximum drum content temperature was found to be 28 °C with no SWB present in the model. Additionally, this refined input model was modified to include worst-case emissivity values for the concrete, drum and Perma-Con® interior, along with adding a 91-second-long residual radiative heat flux of 2,000 W/m² to the end of the refined wildland fire input. For this case the maximum drum content temperature was found to be 32 °C. For Rev. 2 of this calculation, an additional simulation was run that included a cable fire heat flux on the exterior of the Perma-Con®, calculated by FP-DO. Including the cable fire heat flux in the model without the SWB resulted in a peak drum content temperature of 43 °C. Including the SWB in the simulation with the cable fire heat flux resulted in a peak drum content temperature of 35 °C.
Electrical Evaluation of RCA MWS5001D Random Access Memory, Volume 4, Appendix C
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. Statistical analysis data is supplied along with write pulse width, read cycle time, write cycle time, and chip enable time data.
Electrical Evaluation of RCA MWS5501D Random Access Memory, Volume 2, Appendix A
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
The electrical characterization and qualification test results are presented for the RCA MWS5001D random access memory. The tests included functional tests, AC and DC parametric tests, AC parametric worst-case pattern selection test, determination of worst-case transition for setup and hold times, and a series of schmoo plots. The address access time, address readout time, the data hold time, and the data setup time are some of the results surveyed.
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.
2009-01-01
Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
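One common way to define a worst-case environment at a confidence level is to combine a fitted event-size distribution with a Poisson occurrence model: P(all events ≤ x) = exp(−λT(1−F(x))). The sketch below assumes a lognormal F for simplicity, which is cruder than the distributions used in the ESP model itself:

```python
import math

def lognorm_cdf(x, mu, sigma):
    """CDF of a lognormal distribution with parameters mu, sigma."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

def worst_case_fluence(mu, sigma, rate_per_yr, mission_yrs, confidence):
    """Smallest fluence x such that, with the given confidence, no event
    during the mission exceeds x, assuming Poisson event occurrence and
    lognormal event fluences: exp(-rate*T*(1 - F(x))) >= confidence.
    Solved for x by bisection in log space."""
    target_tail = -math.log(confidence) / (rate_per_yr * mission_yrs)
    lo, hi = math.exp(mu - 10.0 * sigma), math.exp(mu + 10.0 * sigma)
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if 1.0 - lognorm_cdf(mid, mu, sigma) > target_tail:
            lo = mid
        else:
            hi = mid
    return mid

# Hypothetical lognormal fit of event fluences and a hypothetical event rate.
x90 = worst_case_fluence(mu=20.0, sigma=1.5, rate_per_yr=8.0,
                         mission_yrs=5.0, confidence=0.90)
print(f"90% confidence worst-case event fluence: {x90:.3g} protons/cm^2")
```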
A Multidimensional Assessment of Children in Conflictual Contexts: The Case of Kenya
ERIC Educational Resources Information Center
Okech, Jane E. Atieno
2012-01-01
Children in Kenya's Kisumu District Primary Schools (N = 430) completed three measures of trauma. Respondents completed the "My Worst Experience Scale" (MWES; Hyman and Snook 2002) and its supplement, the "School Alienation and Trauma Survey" (SATS; Hyman and Snook 2002), sharing their worst experiences overall and specifically…
NASA Technical Reports Server (NTRS)
Nishimura, T.
1975-01-01
This paper proposes a worst-error analysis for dealing with problems of estimating spacecraft trajectories in deep space missions. Navigation filters in use assume either constant or stochastic (Markov) models for their estimated parameters. When the actual behavior of these parameters does not follow the pattern of the assumed model, the filters sometimes perform very poorly. To prepare for such pathological cases, the worst errors of both batch and sequential filters are investigated based on incremental sensitivity studies of these filters. By finding critical switching instances of non-gravitational accelerations, intensive tracking can be carried out around those instances. Also, the worst errors in the target plane provide a measure for assigning the propellant budget for trajectory corrections. Thus the worst-error study presents useful information as well as practical criteria for establishing the maneuver and tracking strategy of spacecraft missions.
Smoke pollution disrupted biodiversity during the 2015 El Niño fires in Southeast Asia
NASA Astrophysics Data System (ADS)
Y-H Lee, Benjamin P.; Davies, Zoe G.; Struebig, Matthew J.
2017-09-01
Forest and peatland fires during the 2015 El Niño drought were amongst the worst on record in Southeast Asia. They were a major contributor of carbon emissions across the region, with the associated smoke-induced haze causing an air pollution crisis that affected millions of people. We present evidence of air pollution impacts on biodiversity. Using daily acoustic recordings in central Singapore, we monitored the dawn chorus before, during and after the haze event. We demonstrate that levels of ecological community acoustic activity dropped dramatically during the haze, and that this decline was significantly associated with levels of air pollution considered ‘unhealthy’ to the human population. Acoustic disruption was apparent across four common indices of soundscape activity, with only a partial recovery to pre-haze levels observed 16 weeks after the smoke had dissipated. These impacts on ecological communities were likely to be even more severe closer to the fires, where air pollution levels were reported to be 15-fold greater than those recorded in Singapore. Our results indicate that large-scale air pollution crises may have hitherto underestimated and potentially far-reaching impacts on biodiversity, especially in parts of the world prone to extensive forest fires.
A System For Load Isolation And Precision Pointing
NASA Astrophysics Data System (ADS)
Keckler, Claude R.; Hamilton, Brian J.
1983-11-01
A system capable of satisfying the accuracy and stability requirements dictated by Shuttle-borne payloads utilizing large optics has been under joint NASA/Sperry development. This device, denoted the Annular Suspension and Pointing System, employs a unique combination of conventional gimbals and magnetic bearing actuators, thereby providing for the "complete" isolation of the payload from its external environment, as well as for extremely accurate and stable pointing (≈0.01 arcseconds). This effort has been pursued through the fabrication and laboratory evaluation of engineering model hardware. Results from these tests have been instrumental in generating high fidelity computer simulations of this load isolation and precision pointing system, and in permitting confident predictions of the system's on-orbit performance. Applicability of this system to the Solar Optical Telescope mission has been examined using the computer simulation. The worst-case pointing error predicted for this payload while subjected to vernier reaction control system thruster firings and crew motions aboard Shuttle was approximately 0.006 arcseconds.
Mühlbacher, Axel C; Kaczynski, Anika; Zweifel, Peter; Johnson, F Reed
2016-12-01
Best-worst scaling (BWS), also known as maximum-difference scaling, is a multiattribute approach to measuring preferences. BWS aims at the analysis of preferences regarding a set of attributes, their levels or alternatives. It is a stated-preference method based on the assumption that respondents are capable of judging the best and the worst (or the most and least important, respectively) out of three or more elements of a choice set. As is true of discrete choice experiments (DCE) generally, BWS avoids the known weaknesses of rating and ranking scales while holding the promise of generating additional information by making respondents choose twice, namely the best as well as the worst criteria. A systematic literature review found 53 BWS applications in health and healthcare. This article expounds the possibilities of application, the underlying theoretical concepts and the implementation of BWS in its three variants: 'object case', 'profile case' and 'multiprofile case'. The paper surveys BWS methods, covering study design, experimental design and data analysis. Moreover, the article discusses the strengths and weaknesses of the three types of BWS distinguished and offers an outlook. A companion paper focuses on special issues of theory and statistical inference confronting BWS in preference measurement.
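The simplest BWS analysis is count-based: an item's best-minus-worst count, normalised by how often it appeared, serves as a preference score. A minimal 'object case' sketch with made-up choice tasks:

```python
from collections import Counter

def best_worst_scores(tasks):
    """Count-based BWS analysis ('object case'): each task is a
    (shown_items, best_pick, worst_pick) tuple. The best-minus-worst
    count, normalised by how often an item appeared, approximates the
    item's position on a common preference scale."""
    best, worst, shown = Counter(), Counter(), Counter()
    for items, b, w in tasks:
        shown.update(items)
        best[b] += 1
        worst[w] += 1
    return {i: (best[i] - worst[i]) / shown[i] for i in shown}

# Hypothetical choice tasks over healthcare attributes.
tasks = [
    (("waiting time", "side effects", "efficacy"), "efficacy", "waiting time"),
    (("cost", "efficacy", "side effects"), "efficacy", "cost"),
    (("cost", "waiting time", "side effects"), "side effects", "cost"),
]
print(best_worst_scores(tasks))
```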
Dispersion modeling of accidental releases of toxic gases - utility for the fire brigades.
NASA Astrophysics Data System (ADS)
Stenzel, S.; Baumann-Stanzer, K.
2009-09-01
Several air dispersion models are available for predicting and simulating the hazard areas associated with accidental releases of toxic gases. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for effective presentation of results. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, and for real-time risk assessment and management. The research project RETOMOD (reference scenario calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Viennese fire brigade, OMV Refining & Marketing GmbH and Synex Ries & Greßlehner GmbH. RETOMOD was funded by the KIRAS safety research program of the Austrian Ministry of Transport, Innovation and Technology (www.kiras.at). The main tasks of this project were: 1. A sensitivity study and optimization of the meteorological input for modeling the hazard areas (human exposure) during accidental toxic releases. 2. A comparison of several model packages (based on reference scenarios) in order to estimate their utility for the fire brigades. For the purpose of our study the following models were tested and compared: ALOHA (Areal Locations of Hazardous Atmospheres, EPA), MEMPLEX (Keudel av-Technik GmbH), Trace (Safer Systems), Breeze (Trinity Consultants) and SAM (Engineering Office Lohmeyer). A set of reference scenarios for chlorine, ammonia, butane and petrol was run with the models above in order to predict and estimate the human exposure during the event. Furthermore, the application of INCA, the observation-based analysis and forecasting system developed at ZAMG, to toxic release cases was investigated. INCA (Integrated Nowcasting through Comprehensive Analysis) data are calculated operationally at 1 km horizontal resolution, based on the ALADIN weather forecast model. The meteorological fields analyzed with INCA include temperature, humidity, wind, precipitation, cloudiness and global radiation. In the frame of the project, INCA data were compared with measurements from the meteorological observation network at traffic-near sites in Vienna. INCA analyses and very short term forecast fields (up to 6 hours) were found to be an advanced means of providing on-line meteorological input for the model packages used by the fire brigade. Since the input requirements differ from model to model, and the outputs are based on unequal criteria for toxic area and exposure, a high degree of caution is required in interpreting the model results - especially in the case of low wind speeds, stable atmospheric conditions, and flow deflection by buildings in urban areas or by complex topography.
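Hazard-area estimates of this kind ultimately rest on dispersion formulas such as the Gaussian plume. The sketch below uses the ground-level centreline form with rough Briggs-style rural class-D coefficients and hypothetical release figures; it is far cruder than the packages compared in the project:

```python
import math

def plume_concentration(q_g_s, u_m_s, x_m):
    """Ground-level centreline concentration (g/m^3) at downwind distance
    x_m from a continuous ground-level point release of q_g_s (g/s) in a
    wind of u_m_s (m/s): C = Q / (pi * u * sigma_y * sigma_z).
    sigma_y and sigma_z use rough Briggs rural class-D (neutral) fits."""
    sigma_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)
    return q_g_s / (math.pi * u_m_s * sigma_y * sigma_z)

# Hypothetical release: 5 kg/s with a 3 m/s wind; scan for where the
# concentration falls below an assumed 20 mg/m^3 threshold of concern.
for x in (200, 500, 1000, 2000, 5000):
    c_mg = plume_concentration(5000.0, 3.0, x) * 1000.0
    print(f"x = {x:5d} m: C = {c_mg:8.1f} mg/m^3")
```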
Effect of Impact Location on the Response of Shuttle Wing Leading Edge Panel 9
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Spellman, Regina L.; Hardy, Robin C.; Fasanella, Edwin L.; Jackson, Karen E.
2005-01-01
The objective of this paper is to compare the results of several simulations performed to determine the worst-case location for a foam impact on the Space Shuttle wing leading edge. The simulations were performed using the commercial non-linear transient dynamic finite element code LS-DYNA. These simulations represent the first in a series of parametric studies performed to support the selection of the worst-case impact scenario. Panel 9 was selected for this study to enable comparisons with previous simulations performed during the Columbia Accident Investigation. The projectile for this study is a 5.5-in cube of typical external tank foam weighing 0.23 lb. Seven locations spanning the panel surface were impacted with the foam cube. For each of these cases, the foam was traveling at 1000 ft/s directly aft, along the orbiter X-axis. The results compared across the parametric studies included strains, contact forces, and material energies. The results show that the worst-case impact location was on the top surface, near the apex.
Physical explosion analysis in heat exchanger network design
NASA Astrophysics Data System (ADS)
Pasha, M.; Zaini, D.; Shariff, A. M.
2016-06-01
The failure of shell-and-tube heat exchangers is extensively experienced by the chemical process industries. Such a failure can cause a loss of production for a long duration. Moreover, loss of containment from a heat exchanger could potentially lead to a credible event such as a fire, explosion or toxic release. There is a need to analyse the possible worst-case effects originating from loss of containment of a heat exchanger at the early design stage. A physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through Object Linking and Embedding (OLE) automation for this analysis. Aspen HYSYS V8.0 was used as the simulation platform in this work. A typical heat exchanger network for a steam reforming and shift conversion process is presented as a case study. This analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated in a more precise manner by using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.
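The vessel-burst models cited start from the stored energy of the compressed gas. The sketch below takes the closely related Brode-energy and TNT-equivalence route (an assumption for illustration, with hypothetical vessel figures), ending at the Hopkinson-Cranz scaled distance used to read overpressure charts:

```python
import math

def brode_energy_j(p_burst_pa, p_ambient_pa, volume_m3, gamma=1.4):
    """Stored energy of a bursting gas-filled vessel (Brode's equation):
    E = (p1 - p0) * V / (gamma - 1)."""
    return (p_burst_pa - p_ambient_pa) * volume_m3 / (gamma - 1.0)

def tnt_equivalent_kg(energy_j, yield_factor=0.4):
    """TNT mass releasing the same blast energy, taking 4.68 MJ/kg for
    TNT and an assumed mechanical yield factor."""
    return yield_factor * energy_j / 4.68e6

def scaled_distance(r_m, w_tnt_kg):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3) (m/kg^(1/3)),
    the usual entry point into blast overpressure charts."""
    return r_m / w_tnt_kg ** (1.0 / 3.0)

# Hypothetical heat-exchanger shell: 4 m^3 at ~25 bar(g) bursting to atmosphere.
E = brode_energy_j(26.0e5, 1.013e5, 4.0)
W = tnt_equivalent_kg(E)
print(f"E = {E/1e6:.1f} MJ, TNT equivalent = {W:.1f} kg, "
      f"Z at 30 m = {scaled_distance(30.0, W):.1f} m/kg^(1/3)")
```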
Environmental impact of a teratogenic actinide: a case study of americium-241
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J.; Yang, J.Y.
1985-10-16
Americium-241 is widely used as a radiation source, but it also has some potential risk if taken into the body. Although the radiotoxicity of americium-241 is small compared to other transuranic actinides, its effects on the reproductive system and on development of the placenta are more damaging than the effects of plutonium-239. A previous report based on a worst-case scenario involving a hypothetical fire accident in a contaminated facility indicated that there could have been a significant impact on nearby residents from a unit release of americium-241 via atmospheric dispersion. However, because the facility is located in a rural region where most drinking water supplies are drawn from private wells, it is believed that deposition of americium-241 from the atmosphere might also have impacts via the groundwater pathway by infiltration of rainwater. In this analysis, a three-dimensional analytical mathematical model is used to assess several aspects of americium-241 contamination of groundwater, including radioactive transformation, advection, dispersion, and soil sorption. Simulation results indicate that no significant radiological impacts would occur to the nearby residents via the groundwater pathway. 15 refs., 2 figs., 2 tabs.
NASA Astrophysics Data System (ADS)
Dobre, Mariana; Elliot, William J.; Brooks, Erin S.; Smith, Tim
2016-04-01
Wildfires can have major adverse effects on municipal water sources. Local governments need methods to evaluate fire risk and to develop mitigation procedures. The Sooke Lake Reservoir is the primary source of water for the city of Victoria, BC, and the concern is that sediment delivered from upland burned areas could have a detrimental impact on the reservoir and the water supply. We conducted a sediment delivery modeling pilot study on a portion of the Sooke Lake Reservoir (specifically, the Trestle Creek Management Unit (TCMU)) to evaluate the potential impacts of wildfire on sediment delivery from hillslopes and sub-catchments. We used a process-based hydrologic and soil erosion model, the Water Erosion Prediction Project geospatial interface (GeoWEPP), to predict the sediment delivery from specific return-period design storms for two burn severity scenarios: a real (low-intensity burn severity) and a worst (high-intensity burn severity) case scenario. The GeoWEPP model allows users to simulate streamflow and erosion from hillslope polygons within a watershed. The model requires information on the topographic, soil and vegetative characteristics for each hillslope, plus a weather file. WEPP default values and several assumptions were necessary to apply the model where data were missing. Based on a 10-m DEM we delineated 16 watersheds within the TCMU area. A long-term 100-year daily climate file was generated for this analysis using the CLIGEN model, based on the historical observations recorded at Concrete, WA, in the United States, and adjusted for the observed monthly precipitation in the Sooke Basin. We ran 100-year simulations and calculated yearly and event-based return periods (for 2, 5, 10, 20, 25, and 50 years) for each of the 16 watersheds. Overall, the WEPP simulations indicate that, in these coastal, maritime climates with relatively low rainfall intensities, the storms most likely to produce the greatest runoff and sediment load occur in the winter when the soils are not water repellent. The erosion rates varied from 0.34 tonnes/ha/year to 37.3 tonnes/ha/year, with the most vulnerable slopes being those associated with steep, shallow soils. The summation over all watersheds of the 10-year return period annual delivered sediment is 17% greater for the worst-case scenario during winter months than the total sediment delivery for the real-case scenario. Despite the data limitations, this analysis provides insight into the critical watersheds that will be major source areas of sediment following a wildfire. Watershed managers can use this information to plan and prioritize post-wildfire rehabilitation strategies and actions to minimize the risk of sediment delivery from the hillslopes that generate the greatest amount of sediment.
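Event-based return periods can be read off a 100-year simulated series with plotting positions. A minimal sketch using the Weibull formula T = (n+1)/rank on hypothetical annual sediment yields:

```python
import random

def return_periods(annual_values):
    """Empirical return periods from an annual series using Weibull
    plotting positions: T = (n + 1) / rank, with rank 1 for the
    largest value."""
    ranked = sorted(annual_values, reverse=True)
    n = len(ranked)
    return [((n + 1) / rank, value) for rank, value in enumerate(ranked, 1)]

# Hypothetical 100-year simulated annual sediment yields (tonnes/ha).
random.seed(1)
yields = [random.lognormvariate(0.5, 1.0) for _ in range(100)]
for T, value in return_periods(yields):
    if round(T) in (2, 5, 10, 20, 25, 50):
        print(f"~{T:.0f}-yr annual sediment yield: {value:.1f} t/ha")
```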
Faith, Daniel P.
2015-01-01
The phylogenetic diversity measure ('PD') quantifies the relative feature diversity of different subsets of taxa from a phylogeny. At the level of feature diversity, PD supports the broad goal of biodiversity conservation to maintain living variation and option values. PD calculations at the level of lineages and features include those integrating probabilities of extinction, providing estimates of expected PD. This approach has known advantages over the evolutionarily distinct and globally endangered (EDGE) methods. Expected PD methods also have limitations. An alternative notion of expected diversity, expected functional trait diversity, relies on an alternative non-phylogenetic model and allows inferences of diversity at the level of functional traits. Expected PD also faces challenges in helping to address phylogenetic tipping points and worst-case PD losses. Expected PD may not choose conservation options that best avoid worst-case losses of long branches from the tree of life. We can expand the range of useful calculations based on expected PD, including methods for identifying phylogenetic key biodiversity areas. PMID:25561672
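Expected PD has a simple form under independent extinctions: each branch contributes its length weighted by the probability that at least one descendant taxon survives. A toy sketch with a hypothetical tree and survival probabilities:

```python
def expected_pd(tree, p_survive):
    """Expected phylogenetic diversity under independent extinctions:
    E[PD] = sum_b L_b * (1 - prod_{taxa under b} (1 - p_survive[taxon])).
    `tree` maps each branch to (length, set of descendant taxa)."""
    total = 0.0
    for length, taxa in tree.values():
        p_none_survive = 1.0
        for taxon in taxa:
            p_none_survive *= 1.0 - p_survive[taxon]
        total += length * (1.0 - p_none_survive)
    return total

# Toy tree ((A,B),C); branch lengths and survival probabilities are made up.
tree = {
    "A": (1.0, {"A"}), "B": (1.0, {"B"}), "C": (2.0, {"C"}),
    "AB": (0.5, {"A", "B"}),
}
p_survive = {"A": 0.9, "B": 0.2, "C": 0.6}
print(f"expected PD = {expected_pd(tree, p_survive):.3f}")
```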
NASA Technical Reports Server (NTRS)
Xapsos, M. A.; Barth, J. L.; Stassinopoulos, E. G.; Burke, E. A.; Gee, G. B.
1999-01-01
The effects that solar proton events have on microelectronics and solar arrays are important considerations for spacecraft in geostationary and polar orbits and for interplanetary missions. Designers of spacecraft and mission planners are required to assess the performance of microelectronic systems under a variety of conditions. A number of useful approaches exist for predicting information about solar proton event fluences and, to a lesser extent, peak fluxes. This includes the cumulative fluence over the course of a mission, the fluence of a worst-case event during a mission, the frequency distribution of event fluences, and the frequency distribution of large peak fluxes. Naval Research Laboratory (NRL) and NASA Goddard Space Flight Center, under the sponsorship of NASA's Space Environments and Effects (SEE) Program, have developed a new model for predicting cumulative solar proton fluences and worst-case solar proton events as functions of mission duration and user confidence level. This model is called the Emission of Solar Protons (ESP) model.
Thermal surface characteristics of coal fires 1 results of in-situ measurements
NASA Astrophysics Data System (ADS)
Zhang, Jianzhong; Kuenzer, Claudia
2007-12-01
Natural underground coal fires are fires in coal seams occurring subsurface. The fires are ignited through a process named spontaneous combustion, which occurs through a natural reaction but is usually triggered by human activity. Coal mining exposes coal to the air. This leads to the exothermal oxidation of the carbon in the coal with the air's oxygen to CO₂ and - under certain circumstances - to spontaneous combustion. Coal fires occur in many countries worldwide; however, currently the Chinese coal mining industry faces the biggest problems with coal fires. Coal fires destroy the valuable coal resource and furthermore lead to many environmental degradation phenomena, such as the deterioration of surrounding vegetation, land subsidence and the emission of toxic gases (CO, N₂O). They additionally contribute to the emission of greenhouse-relevant gases such as CO₂ and CH₄ to the atmosphere. In this paper we present thermal characteristics of coal fires as measured in situ during a field campaign to the Wuda coal fire area in south-central Inner Mongolia, China. Thermal characteristics include temperature anomaly measurements at the surface, spatial surface temperature profiles of fire areas and unaffected background areas, diurnal temperature profiles, and temperature measurements inside coal fire induced cracks in the overlying bedrock. For all the measurements, the effects of uneven solar heating through influences of slope and aspect are considered. Our findings show that coal fires result in strong or subtle thermal surface anomalies. Especially the latter can easily be influenced by heating of the surrounding background material through solar influences. Temperature variation of background rocks with different albedo, slope, aspect or vegetation cover can substantially influence the detectability of thermal anomalies. In the worst case, coal fire related thermal anomalies can be completely masked by solar patterns during the daytime. Thus, night-time analysis is the most suitable for thermal anomaly mapping of underground coal fires, although this is not always feasible. The heat of underground coal fires progresses only very slowly through conduction in the rock material. Anomalies of coal fires completely covered by solid unfractured bedrock are very weak and were only measured during the night. The thermal pattern of underground coal fires manifested on the surface during the daytime is thus the pattern of cracks and vents, which occur due to the volume loss underground and which support radiative and convective energy transport of hot gases. Temperatures inside a coal fire can hardly be measured and can only be recorded if the glowing coal is exposed through a wider crack in the overlying bedrock. Direct coal fire temperatures measured ranged between 233 °C and 854 °C. The results presented can substantially support the planning of thermal mapping campaigns and analyses of coal fire thermal anomalies in remotely sensed data, and can provide initial and boundary conditions for coal fire related numerical modeling. In a second paper, "Thermal characteristics of coal fires 2: results of measurements on simulated coal fires" [Zhang J., Kuenzer C., Tetzlaff A., Oettl D., Zhukov B., Wagner W., 2007. Journal of Applied Geophysics, doi:10.1016/j.jappgeo.2007.08.003], we report on thermal characteristics of coal fires simulated under simplified conditions. The simulated setup allowed us to measure even more parameters under undisturbed conditions - especially inside-fire temperatures. Furthermore, we could demonstrate the differences between open surface coal fires and covered underground coal fires. Thermal signals of coal fires in near-range thermal remotely sensed imagery from an observation tower and from an airplane are presented and discussed.
Robust guaranteed-cost adaptive quantum phase estimation
NASA Astrophysics Data System (ADS)
Roy, Shibdas; Berry, Dominic W.; Petersen, Ian R.; Huntington, Elanor H.
2017-05-01
Quantum parameter estimation plays a key role in many fields like quantum computation, communication, and metrology. Optimal estimation allows one to achieve the most precise parameter estimates, but requires accurate knowledge of the model. Any inevitable uncertainty in the model parameters may heavily degrade the quality of the estimate. It is therefore desired to make the estimation process robust to such uncertainties. Robust estimation was previously studied for a varying phase, where the goal was to estimate the phase at some time in the past, using the measurement results from both before and after that time within a fixed time interval up to current time. Here, we consider a robust guaranteed-cost filter yielding robust estimates of a varying phase in real time, where the current phase is estimated using only past measurements. Our filter minimizes the largest (worst-case) variance in the allowable range of the uncertain model parameter(s) and this determines its guaranteed cost. It outperforms in the worst case the optimal Kalman filter designed for the model with no uncertainty, which corresponds to the center of the possible range of the uncertain parameter(s). Moreover, unlike the Kalman filter, our filter in the worst case always performs better than the best achievable variance for heterodyne measurements, which we consider as the tolerable threshold for our system. Furthermore, we consider effective quantum efficiency and effective noise power, and show that our filter provides the best results by these measures in the worst case.
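The worst-case-over-an-interval evaluation that motivates the guaranteed-cost design can be illustrated with a scalar Kalman filter whose design model is mismatched to the true decay rate. The sketch below (an Ornstein-Uhlenbeck phase model with assumed noise intensities, not the paper's filter) sweeps the uncertain parameter and reports the largest stationary error variance:

```python
import numpy as np

def stationary_error_variance(kappa_true, kappa_nom, Q=1.0, R=0.1):
    """Stationary phase-estimation error variance when a steady-state
    Kalman filter designed for an Ornstein-Uhlenbeck phase with decay
    kappa_nom runs against a plant with true decay kappa_true. The joint
    (x, xhat) dynamics are linear, so the stationary covariance S solves
    the Lyapunov equation A S + S A^T + W = 0."""
    P = R * (-kappa_nom + np.sqrt(kappa_nom**2 + Q / R))  # design Riccati
    K = P / R                                             # filter gain
    A = np.array([[-kappa_true, 0.0], [K, -(kappa_nom + K)]])
    W = np.diag([Q, K**2 * R])  # process and fed-through measurement noise
    n = A.shape[0]
    S = np.linalg.solve(np.kron(A, np.eye(n)) + np.kron(np.eye(n), A),
                        -W.reshape(-1)).reshape(n, n)
    return S[0, 0] - 2.0 * S[0, 1] + S[1, 1]  # Var(x - xhat)

# Worst case over an assumed uncertainty interval for the decay rate.
kappas = np.linspace(0.5, 2.0, 31)
worst = max(stationary_error_variance(k, kappa_nom=1.25) for k in kappas)
print(f"worst-case error variance over the interval: {worst:.4f}")
```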
Alfa, M J; Olson, N
2016-05-01
To determine which simulated-use test soils met the worst-case organic levels and viscosity of clinical secretions, and had the best adhesive characteristics. Levels of protein, carbohydrate and haemoglobin, and vibrational viscosity of clinical endoscope secretions were compared with test soils including ATS, ATS2015, Edinburgh, Edinburgh-M (modified), Miles, 10% serum and coagulated whole blood. ASTM D3359 was used for adhesion testing. Cleaning of a single-channel flexible intubation endoscope was tested after simulated use. The worst-case levels of protein, carbohydrate and haemoglobin, and the viscosity of clinical material were 219,828 μg/mL, 9296 μg/mL, 9562 μg/mL and 6 cP, respectively. Whole blood, ATS2015 and Edinburgh-M were pipettable, with viscosities of 3.4 cP, 9.0 cP and 11.9 cP, respectively. ATS2015 and Edinburgh-M best matched the worst-case clinical parameters, but ATS had the best adhesion with 7% removal (36.7% for Edinburgh-M). Edinburgh-M and ATS2015 showed similar soiling and removal characteristics from the surface and lumen of a flexible intubation endoscope. Of the test soils evaluated, ATS2015 and Edinburgh-M were found to be good choices for the simulated use of endoscopes, as their composition and viscosity most closely matched worst-case clinical material. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Olson, Scott A.
1996-01-01
Contraction scour for all modelled flows ranged from 0.1 to 3.1 ft. The worst-case contraction scour occurred at the incipient-overtopping discharge. Abutment scour at the left abutment ranged from 10.4 to 12.5 ft, with the worst case occurring at the 500-year discharge. Abutment scour at the right abutment ranged from 25.3 to 27.3 ft, with the worst case occurring at the incipient-overtopping discharge. The worst-case total scour also occurred at the incipient-overtopping discharge. The incipient-overtopping discharge was between the 100- and 500-year discharges. Additional information on scour depths and depths to armoring is included in the section titled "Scour Results". Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives "excessively conservative estimates of scour depths" (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Field, Robert D; van der Werf, Guido R; Fanin, Thierry; Fetzer, Eric J; Fuller, Ryan; Jethva, Hiren; Levy, Robert; Livesey, Nathaniel J; Luo, Ming; Torres, Omar; Worden, Helen M
2016-08-16
The 2015 fire season and related smoke pollution in Indonesia was more severe than the major 2006 episode, making it the most severe season observed by the NASA Earth Observing System satellites that go back to the early 2000s, namely active fire detections from the Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS), MODIS aerosol optical depth, Terra Measurement of Pollution in the Troposphere (MOPITT) carbon monoxide (CO), Aqua Atmospheric Infrared Sounder (AIRS) CO, Aura Ozone Monitoring Instrument (OMI) aerosol index, and Aura Microwave Limb Sounder (MLS) CO. The MLS CO in the upper troposphere showed a plume of pollution stretching from East Africa to the western Pacific Ocean that persisted for 2 mo. Longer-term records of airport visibility in Sumatra and Kalimantan show that 2015 ranked after 1997 and alongside 1991 and 1994 as among the worst episodes on record. Analysis of yearly dry season rainfall from the Tropical Rainfall Measurement Mission (TRMM) and rain gauges shows that, due to the continued use of fire to clear and prepare land on degraded peat, the Indonesian fire environment continues to have nonlinear sensitivity to dry conditions during prolonged periods with less than 4 mm/d of precipitation, and this sensitivity appears to have increased over Kalimantan. Without significant reforms in land use and the adoption of early warning triggers tied to precipitation forecasts, these intense fire episodes will reoccur during future droughts, usually associated with El Niño events.
NASA Astrophysics Data System (ADS)
Field, Robert D.; van der Werf, Guido R.; Fanin, Thierry; Fetzer, Eric J.; Fuller, Ryan; Jethva, Hiren; Levy, Robert; Livesey, Nathaniel J.; Luo, Ming; Torres, Omar; Worden, Helen M.
2016-08-01
The 2015 fire season and related smoke pollution in Indonesia was more severe than the major 2006 episode, making it the most severe season observed by the NASA Earth Observing System satellites that go back to the early 2000s, namely active fire detections from the Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS), MODIS aerosol optical depth, Terra Measurement of Pollution in the Troposphere (MOPITT) carbon monoxide (CO), Aqua Atmospheric Infrared Sounder (AIRS) CO, Aura Ozone Monitoring Instrument (OMI) aerosol index, and Aura Microwave Limb Sounder (MLS) CO. The MLS CO in the upper troposphere showed a plume of pollution stretching from East Africa to the western Pacific Ocean that persisted for 2 mo. Longer-term records of airport visibility in Sumatra and Kalimantan show that 2015 ranked after 1997 and alongside 1991 and 1994 as among the worst episodes on record. Analysis of yearly dry season rainfall from the Tropical Rainfall Measurement Mission (TRMM) and rain gauges shows that, due to the continued use of fire to clear and prepare land on degraded peat, the Indonesian fire environment continues to have nonlinear sensitivity to dry conditions during prolonged periods with less than 4 mm/d of precipitation, and this sensitivity appears to have increased over Kalimantan. Without significant reforms in land use and the adoption of early warning triggers tied to precipitation forecasts, these intense fire episodes will reoccur during future droughts, usually associated with El Niño events.
NASA Technical Reports Server (NTRS)
Field, Robert D.; van der Werf, Guido R.; Fanin, Thierry; Fetzer, Eric; Fuller, Ryan; Jethva, Hiren; Levy, Robert; Livesey, Nathaniel; Luo, Ming; Torres, Omar;
2016-01-01
The 2015 fire season and related smoke pollution in Indonesia was more severe than the major 2006 episode, making it the most severe season observed by the NASA Earth Observing System satellites that go back to the early 2000s, namely active fire detections from the Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS), MODIS aerosol optical depth, Terra Measurement of Pollution in the Troposphere (MOPITT) carbon monoxide (CO), Aqua Atmospheric Infrared Sounder (AIRS) CO, Aura Ozone Monitoring Instrument (OMI) aerosol index, and Aura Microwave Limb Sounder (MLS) CO. The MLS CO in the upper troposphere showed a plume of pollution stretching from East Africa to the western Pacific Ocean that persisted for two months. Longer-term records of airport visibility in Sumatra and Kalimantan show that 2015 ranked after 1997 and alongside 1991 and 1994 as among the worst episodes on record. Analysis of yearly dry season rainfall from the Tropical Rainfall Measurement Mission (TRMM) and rain gauges shows that, due to the continued use of fire to clear and prepare land on degraded peat, the Indonesian fire environment continues to have non-linear sensitivity to dry conditions during prolonged periods with less than 4 mm/day of precipitation, and this sensitivity appears to have increased over Kalimantan. Without significant reforms in land use and the adoption of early warning triggers tied to precipitation forecasts, these intense fire episodes will re-occur during future droughts, usually associated with El Niño events.
Field, Robert D.; van der Werf, Guido R.; Fanin, Thierry; Fetzer, Eric J.; Fuller, Ryan; Jethva, Hiren; Levy, Robert; Livesey, Nathaniel J.; Luo, Ming; Torres, Omar; Worden, Helen M.
2016-01-01
The 2015 fire season and related smoke pollution in Indonesia was more severe than the major 2006 episode, making it the most severe season observed by the NASA Earth Observing System satellites that go back to the early 2000s, namely active fire detections from the Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS), MODIS aerosol optical depth, Terra Measurement of Pollution in the Troposphere (MOPITT) carbon monoxide (CO), Aqua Atmospheric Infrared Sounder (AIRS) CO, Aura Ozone Monitoring Instrument (OMI) aerosol index, and Aura Microwave Limb Sounder (MLS) CO. The MLS CO in the upper troposphere showed a plume of pollution stretching from East Africa to the western Pacific Ocean that persisted for 2 mo. Longer-term records of airport visibility in Sumatra and Kalimantan show that 2015 ranked after 1997 and alongside 1991 and 1994 as among the worst episodes on record. Analysis of yearly dry season rainfall from the Tropical Rainfall Measurement Mission (TRMM) and rain gauges shows that, due to the continued use of fire to clear and prepare land on degraded peat, the Indonesian fire environment continues to have nonlinear sensitivity to dry conditions during prolonged periods with less than 4 mm/d of precipitation, and this sensitivity appears to have increased over Kalimantan. Without significant reforms in land use and the adoption of early warning triggers tied to precipitation forecasts, these intense fire episodes will reoccur during future droughts, usually associated with El Niño events. PMID:27482096
Hollis, Geoff
2018-04-01
Best-worst scaling is a judgment format in which participants are presented with a set of items and have to choose the superior and inferior items in the set. Best-worst scaling generates a large quantity of information per judgment because each judgment allows for inferences about the rank value of all unjudged items. This property of best-worst scaling makes it a promising judgment format for research in psychology and natural language processing concerned with estimating the semantic properties of tens of thousands of words. A variety of different scoring algorithms have been devised in the previous literature on best-worst scaling. However, due to problems of computational efficiency, these scoring algorithms cannot be applied efficiently to cases in which thousands of items need to be scored. New algorithms are presented here for converting responses from best-worst scaling into item scores for thousands of items (many-item scoring problems). These scoring algorithms are validated through simulation and empirical experiments, and considerations related to noise, the underlying distribution of true values, and trial design are identified that can affect the relative quality of the derived item scores. The newly introduced scoring algorithms consistently outperformed scoring algorithms used in the previous literature on scoring many-item best-worst data.
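The simplest scoring scheme for best-worst data, the counting baseline that the paper's more sophisticated algorithms improve upon, can be written in a few lines. The trial format and item names below are hypothetical:

```python
# Minimal "value" scoring for best-worst data:
# score(item) = (#times chosen best - #times chosen worst) / #times shown.
from collections import defaultdict

trials = [
    # (items shown, item chosen best, item chosen worst)
    (["calm", "angry", "table", "joy"], "joy", "angry"),
    (["table", "joy", "fear", "calm"], "joy", "fear"),
    (["fear", "calm", "angry", "table"], "calm", "fear"),
]

shown = defaultdict(int)   # how often each item appeared in a trial
net = defaultdict(int)     # best picks minus worst picks
for items, best, worst in trials:
    for it in items:
        shown[it] += 1
    net[best] += 1
    net[worst] -= 1

scores = {it: net[it] / shown[it] for it in shown}
for it, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{it:6s} {s:+.2f}")
```

The many-item scoring problem arises because, with tens of thousands of words, each item is shown only a few times, so naive counts are noisy; the paper's algorithms are designed for exactly that regime.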
Scientific and social challenges for the management of fire-prone wildland-urban interfaces
NASA Astrophysics Data System (ADS)
Gill, A. Malcolm; Stephens, Scott L.
2009-09-01
At their worst, fires at the rural-urban or wildland-urban interface cause tragic loss of human lives and homes, but mitigating these fire effects through management elicits many social and scientific challenges. This paper addresses four interconnected management challenges posed by socially disastrous landscape fires. The issues concern various assets (particularly houses, human life and biodiversity), fuel treatments, and fire and human behaviours. The topics considered are: 'asset protection zones'; 'defensible space' and urban fire spread in relation to house ignition and loss; 'stay-or-go' policy and the prediction of time available for safe egress; and the possible conflict between the creation of defensible space and wildland management objectives. The first scientific challenge is to model the effective width of an asset protection zone of an urban area. The second is to consider the effect of vegetation around a house, potentially defensible space, on fire arrival at the structure. The third scientific challenge is to present stakeholders with accurate information on rates of spread, and where the fire front is located, so as to allow them to plan safe egress or preparation time in their particular circumstances. The fourth scientific challenge is to be able to predict the effects of fires on wildland species composition. Associated with each scientific challenge is a social challenge: for the first two scientific challenges, the social challenge is to co-ordinate fuel management within and between the urban and rural or wildland sides of the interface. For the third scientific challenge, the social challenge is to be aware of, and appropriately use, fire danger information so that the potential for safe egress from a home can be estimated most accurately. Finally, the fourth social challenge is for local residents of wildland-urban interfaces with an interest in biodiversity conservation to understand the effects of fire regimes on biodiversity, thereby assisting hard-pressed wildland managers to make informed choices.
Viner, Brian J; Jannik, Tim; Hepworth, Allan; Adetona, Olorunfemi; Naeher, Luke; Eddy, Teresa; Doman, Eric; Blake, John
2018-02-01
The contaminated ground surface at Savannah River Site (SRS) is a result of the decades of work that has been performed maintaining the country's nuclear stockpile and performing research and development on nuclear materials. The volatilization of radionuclides during wildfire results in airborne particles that are dispersed within the smoke plume and may result in doses to downwind firefighters and the public. To better understand the risk that these smoke plumes present, we have characterized four regions at SRS in terms of their fuel characteristics and radiological contamination on the ground. Combined with general meteorological conditions describing typical and extreme burn conditions, we have simulated potential fires in these regions and predicted the potential radiological dose that could be received by firefighting personnel and the public surrounding the SRS. In all cases, the predicted cumulative dose was a small percent of the US Department of Energy regulatory limit (0.25 mSv). These predictions were conservative and assumed that firefighters would be exposed for the duration of their shift and the public would be exposed for the entire day over the duration of the burn. Realistically, firefighters routinely rotate off the firefront during their shift and the public would likely remain indoors much of the day. However, we show that even under worst-case conditions the regulatory limits are not exceeded. We can infer that the risks associated with wildfires would not be expected to cause cumulative doses above the level of concern to either responding personnel or the offsite public. Copyright © 2017 Elsevier Ltd. All rights reserved.
Viner, Brian J.; Jannik, Tim; Hepworth, Allan; ...
2017-11-22
The contaminated ground surface at Savannah River Site (SRS) is a result of the decades of work that has been performed maintaining the country's nuclear stockpile and performing research and development on nuclear materials. The volatilization of radionuclides during wildfire results in airborne particles that are dispersed within the smoke plume and may result in doses to downwind firefighters and the public. To better understand the risk that these smoke plumes present, we have characterized four regions at SRS in terms of their fuel characteristics and radiological contamination on the ground. Combined with general meteorological conditions describing typical and extreme burn conditions, we have simulated potential fires in these regions and predicted the potential radiological dose that could be received by firefighting personnel and the public surrounding the SRS. In all cases, the predicted cumulative dose was a small percent of the US Department of Energy regulatory limit (0.25 mSv). These predictions were conservative and assumed that firefighters would be exposed for the duration of their shift and the public would be exposed for the entire day over the duration of the burn. Realistically, firefighters routinely rotate off the firefront during their shift and the public would likely remain indoors much of the day. However, we show that even under worst-case conditions the regulatory limits are not exceeded. In conclusion, we can infer that the risks associated with wildfires would not be expected to cause cumulative doses above the level of concern to either responding personnel or the offsite public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viner, Brian J.; Jannik, Tim; Hepworth, Allan
The contaminated ground surface at Savannah River Site (SRS) is a result of the decades of work that has been performed maintaining the country's nuclear stockpile and performing research and development on nuclear materials. The volatilization of radionuclides during wildfire results in airborne particles that are dispersed within the smoke plume and may result in doses to downwind firefighters and the public. To better understand the risk that these smoke plumes present, we have characterized four regions at SRS in terms of their fuel characteristics and radiological contamination on the ground. Combined with general meteorological conditions describing typical and extreme burn conditions, we have simulated potential fires in these regions and predicted the potential radiological dose that could be received by firefighting personnel and the public surrounding the SRS. In all cases, the predicted cumulative dose was a small percent of the US Department of Energy regulatory limit (0.25 mSv). These predictions were conservative and assumed that firefighters would be exposed for the duration of their shift and the public would be exposed for the entire day over the duration of the burn. Realistically, firefighters routinely rotate off the firefront during their shift and the public would likely remain indoors much of the day. However, we show that even under worst-case conditions the regulatory limits are not exceeded. In conclusion, we can infer that the risks associated with wildfires would not be expected to cause cumulative doses above the level of concern to either responding personnel or the offsite public.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMillan, W. W.; Pierce, R.; Sparling, L. C.
2010-01-05
Quantifying the impacts of remote sources on individual air quality exceedances remains a significant challenge for air quality forecasting. One goal of the 2006 Texas Air Quality Study (TEXAQS II) was to assess the impact of distant sources on air quality in east Texas. From 23-30 August 2006, retrievals of tropospheric carbon monoxide (CO) from NASA's Atmospheric InfraRed Sounder (AIRS) reveal the transport of CO from fires in the United States Pacific Northwest to Houston, Texas. This transport occurred behind a cold front and contributed to the worst ozone exceedance period of the summer in the Houston area. We present supporting satellite observations from the NASA A-Train constellation of the vertical distribution of smoke aerosols and CO. Ground-based in situ CO measurements in Oklahoma and Texas track the CO plume as it moves south and indicate mixing of the aloft plume to the surface by turbulence in the nocturnal boundary layer and convection during the day. Ground-based aerosol speciation and lidar observations do not find appreciable smoke aerosol transport for this case. However, MODIS aerosol optical depths and model simulations indicate some smoke aerosols were transported from the Pacific Northwest through Texas to the Gulf of Mexico. Chemical transport and forward trajectory models confirm the three major observations: (1) the AIRS-envisioned CO transport, (2) the satellite-determined smoke plume height, and (3) the timing of the observed surface CO increases. Further, the forward trajectory simulations find that two of the largest Pacific Northwest fires likely had the most significant impact.
NASA Technical Reports Server (NTRS)
Simon, M. K.; Polydoros, A.
1981-01-01
This paper examines the performance of coherent QPSK and QASK systems combined with FH or FH/PN spread spectrum techniques in the presence of partial-band multitone or noise jamming. The worst-case jammer and worst-case performance are determined as functions of the signal-to-background noise ratio (SNR) and signal-to-jammer power ratio (SJR). Asymptotic results for high SNR are shown to have a linear dependence between the jammer's optimal power allocation and the system error probability performance.
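The worst-case partial-band jammer can be found numerically for the textbook simplification of this setup: coherent BPSK/QPSK bit error rate with background noise neglected, where a jammer spreading its power over a band fraction rho yields Pb(rho) = rho·Q(sqrt(2·rho·Eb/NJ)). The sketch below maximizes over rho; it is this standard model, not the paper's full QASK analysis:

```python
# Numerical search for the worst-case partial-band noise jamming fraction rho.
import numpy as np
from scipy.stats import norm

def pb(rho, ebnj_db):
    """Average bit error probability vs. jammed band fraction rho."""
    ebnj = 10 ** (ebnj_db / 10.0)
    return rho * norm.sf(np.sqrt(2.0 * rho * ebnj))  # Q(x) = norm.sf(x)

for ebnj_db in [5, 10, 15, 20]:
    rhos = np.linspace(1e-4, 1.0, 100000)
    p = pb(rhos, ebnj_db)
    i = np.argmax(p)
    print(f"Eb/NJ = {ebnj_db:2d} dB: worst-case rho = {rhos[i]:.3f}, "
          f"Pb = {p[i]:.3e}")
```

At high signal-to-jammer ratios the optimizing fraction scales roughly inversely with Eb/NJ, and the worst-case error probability decays only linearly rather than exponentially in SJR, which is what makes the worst-case jammer so damaging.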
Optimal Analyses for 3×n AB Games in the Worst Case
NASA Astrophysics Data System (ADS)
Huang, Li-Te; Lin, Shun-Shii
The past decades have witnessed a growing interest in research on deductive games such as Mastermind and AB game. Because of the complicated behavior of deductive games, tree-search approaches are often adopted to find their optimal strategies. In this paper, a generalized version of deductive games, called 3×n AB games, is introduced. Traditional tree-search approaches are not appropriate for this problem, however, since they can only solve instances with small n. For larger values of n, a systematic approach is necessary. Therefore, intensive analyses of optimal worst-case play of 3×n AB games are conducted, and a method called structural reduction, which aims at characterizing the worst situation in this game, is developed in the study. Furthermore, a formula for calculating the optimal number of guesses required for arbitrary values of n is derived and proven.
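As a concrete baseline for the tree-search approaches mentioned above, here is a minimal one-ply minimax guess selector for a small deductive game. The Mastermind-style (A, B) feedback and the 3-digit codes over n symbols loosely echo 3×n AB games, but the details are illustrative only, not the paper's structural-reduction method:

```python
# One-ply minimax guess selection for a small deductive game: pick the guess
# whose worst feedback class leaves the fewest remaining candidate secrets.
from itertools import permutations
from collections import Counter

def feedback(guess, secret):
    """(A, B): A = right digit in right place, B = right digit, wrong place."""
    a = sum(g == s for g, s in zip(guess, secret))
    common = sum((Counter(guess) & Counter(secret)).values())
    return a, common - a

n = 5                                   # symbols 0..4
codes = list(permutations(range(n), 3)) # codes with distinct digits

def minimax_guess(candidates):
    best, best_worst = None, len(candidates) + 1
    for g in codes:
        classes = Counter(feedback(g, s) for s in candidates)
        worst = max(classes.values())   # size of worst feedback class
        if worst < best_worst:
            best, best_worst = g, worst
    return best, best_worst

g, w = minimax_guess(codes)
print(f"first guess {g} leaves at most {w} of {len(codes)} candidates")
```

Full optimal play repeats this expansion down the game tree, which is exactly the exponential cost that motivates the paper's closed-form analysis for large n.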
How can health systems research reach the worst-off? A conceptual exploration.
Pratt, Bridget; Hyder, Adnan A
2016-11-15
Health systems research is increasingly being conducted in low and middle-income countries (LMICs). Such research should aim to reduce health disparities between and within countries as a matter of global justice. For such research to do so, ethical guidance that is consistent with egalitarian theories of social justice proposes it ought to (amongst other things) focus on worst-off countries and research populations. Yet who constitutes the worst-off is not well-defined. By applying existing work on disadvantage from political philosophy, the paper demonstrates that (at least) two options exist for how to define the worst-off upon whom equity-oriented health systems research should focus: those who are worst-off in terms of health or those who are systematically disadvantaged. The paper describes in detail how both concepts can be understood and what metrics can be relied upon to identify worst-off countries and research populations at the sub-national level (groups, communities). To demonstrate how each can be used, the paper considers two real-world cases of health systems research and whether their choice of country (Uganda, India) and research population in 2011 would have been classified as amongst the worst-off according to the proposed concepts. The two proposed concepts can classify different countries and sub-national populations as worst-off. It is recommended that health researchers (or other actors) use the concept that best reflects their moral commitments: namely, to perform research focused on reducing health inequalities or systematic disadvantage more broadly. If addressing the latter, it is recommended that they rely on the multidimensional poverty approach rather than the income approach to identify worst-off populations.
Faith, Daniel P
2015-02-19
The phylogenetic diversity measure ('PD') measures the relative feature diversity of different subsets of taxa from a phylogeny. At the level of feature diversity, PD supports the broad goal of biodiversity conservation to maintain living variation and option values. PD calculations at the level of lineages and features include those integrating probabilities of extinction, providing estimates of expected PD. This approach has known advantages over the evolutionarily distinct and globally endangered (EDGE) methods. Expected PD methods also have limitations. An alternative notion of expected diversity, expected functional trait diversity, relies on an alternative non-phylogenetic model and allows inferences of diversity at the level of functional traits. Expected PD also faces challenges in helping to address phylogenetic tipping points and worst-case PD losses. Expected PD may not choose conservation options that best avoid worst-case losses of long branches from the tree of life. We can expand the range of useful calculations based on expected PD, including methods for identifying phylogenetic key biodiversity areas. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
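Expected PD has a simple computational core: each branch contributes its length weighted by the probability that at least one descendant taxon survives. A minimal sketch, with a toy tree, invented branch lengths and extinction probabilities, and extinctions assumed independent:

```python
# Expected phylogenetic diversity: a branch is retained (and its length
# counted) unless every taxon descending from it goes extinct.
from math import prod

# branch -> (length, set of descendant taxa); values are made up
branches = {
    "root->A":  (2.0, {"A"}),
    "root->BC": (1.0, {"B", "C"}),
    "BC->B":    (0.5, {"B"}),
    "BC->C":    (0.8, {"C"}),
}
p_ext = {"A": 0.1, "B": 0.6, "C": 0.7}  # per-taxon extinction probabilities

expected_pd = sum(
    length * (1.0 - prod(p_ext[t] for t in taxa))
    for length, taxa in branches.values()
)
print(f"expected PD = {expected_pd:.3f}")
# e.g. root->BC is lost only if both B and C go extinct: weight 1 - 0.6*0.7
```

The worst-case concern raised in the abstract is visible even here: a long branch with several moderately threatened descendants can have high expected PD yet still be lost entirely in an unlucky scenario.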
NASA Astrophysics Data System (ADS)
Haji Hosseinloo, Ashkan; Turitsyn, Konstantin
2016-04-01
Vibration energy harvesting has been shown to be a promising power source for many small-scale applications, mainly because of the considerable reduction in the energy consumption of modern electronics and the scalability issues of conventional batteries. However, energy harvesters may not be as robust as conventional batteries, and their performance can drastically deteriorate in the presence of uncertainty in their parameters. Hence, the study of uncertainty propagation and optimization under uncertainty is essential for proper and robust performance of harvesters in practice. While previous studies have focused on expectation optimization, we propose a new and more practical optimization perspective: optimization for the worst-case (minimum) power. We formulate the problem in a generic fashion and, as a simple example, apply it to a linear piezoelectric energy harvester. We study the effect of parametric uncertainty in its natural frequency, load resistance, and electromechanical coupling coefficient on its worst-case power, and then optimize for it under different confidence levels. The results show a significant improvement in the worst-case power of the harvester designed this way compared to that of a naively (deterministically) optimized harvester.
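The worst-case perspective amounts to a max-min program: choose the design to maximize the minimum power over the uncertain parameter range. The sketch below applies this to a generic resonance-type power expression invented for illustration, not the paper's piezoelectric model:

```python
# Max-min design: pick the design variable (a normalized load, here acting as
# extra damping) to maximize the worst-case power over an uncertain
# excitation-to-natural-frequency ratio rho. Toy model, invented numbers.
import numpy as np

def power(r, rho, zeta=0.05):
    """Toy harvested power for a resonator with frequency ratio rho,
    mechanical damping zeta, and normalized electrical load r."""
    return r * rho**2 / ((1 - rho**2) ** 2 + (2 * zeta * rho + r * rho) ** 2)

rhos = np.linspace(0.9, 1.1, 201)       # uncertain frequency-ratio range
loads = np.linspace(0.01, 1.0, 500)     # candidate designs

worst_power = np.array([min(power(r, rho) for rho in rhos) for r in loads])
r_star = loads[np.argmax(worst_power)]
print(f"robust load r = {r_star:.3f}, guaranteed power >= {worst_power.max():.3f}")

# Compare: design optimized only for the nominal rho = 1.
nominal = np.array([power(r, 1.0) for r in loads])
r_nom = loads[np.argmax(nominal)]
print(f"nominal-optimal load r = {r_nom:.3f}, worst-case power "
      f"{min(power(r_nom, rho) for rho in rhos):.3f}")
```

The nominal design tunes sharply to resonance and collapses when the natural frequency drifts; the robust design accepts a flatter, lower peak in exchange for a guaranteed floor, which is the trade-off the abstract describes.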
Combining instruction prefetching with partial cache locking to improve WCET in real-time systems.
Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng
2013-01-01
Caches play an important role in embedded systems, bridging the performance gap between fast processors and slow memory, and prefetching mechanisms have been proposed to further improve cache performance. In real-time systems, however, caches complicate Worst-Case Execution Time (WCET) analysis because of their unpredictable behavior. Modern embedded processors often provide a locking mechanism to improve the timing predictability of the instruction cache, but locking the whole cache may degrade cache performance and increase the WCET of the real-time application. In this paper, we propose an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and, in turn, the worst-case execution time. Evaluations on typical real-time applications show that the partial cache locking mechanism yields remarkable WCET improvements over static analysis and full cache locking.
Combining Instruction Prefetching with Partial Cache Locking to Improve WCET in Real-Time Systems
Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng
2013-01-01
Caches play an important role in embedded systems, bridging the performance gap between fast processors and slow memory, and prefetching mechanisms have been proposed to further improve cache performance. In real-time systems, however, caches complicate Worst-Case Execution Time (WCET) analysis because of their unpredictable behavior. Modern embedded processors often provide a locking mechanism to improve the timing predictability of the instruction cache, but locking the whole cache may degrade cache performance and increase the WCET of the real-time application. In this paper, we propose an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and, in turn, the worst-case execution time. Evaluations on typical real-time applications show that the partial cache locking mechanism yields remarkable WCET improvements over static analysis and full cache locking. PMID:24386133
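Why partial locking can beat full locking is easy to see with a back-of-envelope model: lock only the code whose avoided worst-case misses buy the most cycles per cache line, and leave the rest of the cache dynamic (and available to prefetching). The block names, numbers, and simple additive timing model below are hypothetical, not the paper's analysis:

```python
# Greedy partial cache locking: rank blocks on the worst-case path by avoided
# miss cycles per cache line, and lock them while lines remain.
# Each block: (worst-case executions, miss penalty in cycles, lines occupied)
blocks = {
    "loop_body":  (1000, 20, 2),
    "interrupt":  (200, 20, 1),
    "error_path": (50, 20, 2),
    "init":       (1, 20, 4),
}
CACHE_LINES = 4  # lines reserved for locking; the rest stays dynamic

ranked = sorted(blocks.items(),
                key=lambda kv: kv[1][0] * kv[1][1] / kv[1][2], reverse=True)
free, locked, saved = CACHE_LINES, [], 0
for name, (execs, penalty, lines) in ranked:
    if lines <= free:
        locked.append(name)
        free -= lines
        saved += execs * penalty  # locked code never misses on the WCET path
print(f"locked: {locked}, WCET bound reduced by {saved} cycles")
```

Full locking would spend lines on rarely executed blocks like init, while leaving nothing dynamic; partial locking keeps the guarantee where it pays and lets prefetching handle the rest.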
Kuijpers, Laura Maria Francisca; Maltha, Jessica; Guiraud, Issa; Kaboré, Bérenger; Lompo, Palpouguini; Devlieger, Hugo; Van Geet, Chris; Tinto, Halidou; Jacobs, Jan
2016-06-02
Plasmodium falciparum infection may cause severe anaemia, particularly in children. When planning a diagnostic study on children suspected of severe malaria in sub-Saharan Africa, the question arose of how much blood could be safely sampled; intended blood volumes (blood cultures and EDTA blood) were 6 mL (children aged <6 years) and 10 mL (6-12 years). A previous review [Bull World Health Organ. 89: 46-53. 2011] recommended not to exceed 3.8% of total blood volume (TBV). In a simulation exercise using data of children previously enrolled in a study of severe malaria and bacteraemia in Burkina Faso, the impact of this 3.8% safety guideline was evaluated. For a total of 666 children aged >2 months to <12 years, data on age, weight and haemoglobin value (Hb) were available. For each child, the estimated TBV (TBVe) (mL) was calculated by multiplying the body weight (kg) by the factor 80 (mL/kg). Next, TBVe was corrected for the degree of anaemia to obtain the functional TBV (TBVf). The correction factor was the ratio 'Hb of the child divided by the reference Hb'; both the lowest ('best case') and highest ('worst case') reference Hb values were used. Next, the exact volume that a 3.8% proportion of this TBVf would represent was calculated, and this volume was compared to the blood volumes intended to be sampled. When applied to the Burkina Faso cohort, the simulation exercise showed that in 5.3% (best case) and 11.4% (worst case) of children the blood volume intended to be sampled would exceed the volume defined by the 3.8% safety guideline. The highest proportions were in the age groups 2-6 months (19.0%; worst-case scenario) and 6 months-2 years (15.7%; worst-case scenario). A positive rapid diagnostic test for P. falciparum was associated with an increased risk of violating the safety guideline in the worst-case scenario (p = 0.016). Blood sampling in children for research in P. falciparum endemic settings may easily violate the proposed safety guideline when applied to TBVf. Ethics committees and researchers should be wary of this and take appropriate precautions.
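The guideline check itself is a two-step calculation, sketched below. The child's weight and Hb, and the best/worst-case reference Hb values, are invented for illustration (the paper's exact reference values are not reproduced here):

```python
# Safety check from the paper: functional total blood volume is the
# weight-based estimate (80 mL/kg) scaled by the child's Hb relative to a
# reference Hb; the sample must stay within 3.8% of that.
def max_safe_sample_ml(weight_kg, hb_child, hb_ref):
    tbv_e = 80.0 * weight_kg             # estimated total blood volume, mL
    tbv_f = tbv_e * (hb_child / hb_ref)  # corrected for anaemia
    return 0.038 * tbv_f                 # 3.8% safety guideline

# A hypothetical 5 kg infant with Hb 5 g/dL, sampled at the 6 mL intended
# for children under 6 years; low vs. high reference Hb give best/worst case.
weight, hb, intended = 5.0, 5.0, 6.0
for label, hb_ref in [("best case (low ref Hb)", 11.0),
                      ("worst case (high ref Hb)", 14.0)]:
    limit = max_safe_sample_ml(weight, hb, hb_ref)
    flag = "OK" if intended <= limit else "EXCEEDS guideline"
    print(f"{label}: limit {limit:.1f} mL vs intended {intended} mL -> {flag}")
```

With these toy numbers the same child passes under the best-case reference Hb and fails under the worst case, which is exactly the sensitivity to the reference value that the abstract reports.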
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boone, James L.; Rakestraw, Danny L.; Rautenstrauch, Kurt R.
1997-12-16
GOPHERUS AGASSIZII (Desert Tortoise). Predation. A variety of predators, most notably coyotes (Canis latrans) and Common Ravens (Corvus corax), have been reported to prey on hatchling desert tortoises (Ernst et al. 1994, Turtles of the United States and Canada, Smithsonian Institution Press, Washington, D.C., 578 pp.). Here, we report an observation of a hatchling tortoise, fitted with a radiotransmitter, that was preyed upon by native fire ants (Solenopsis sp.) in the eastern Mojave Desert at Yucca Mountain, Nevada (36°50' N, 116°25' W). On 8/27/94, tortoise No. 9315 (carapace length = 45 mm, age = 5 d) was found alive with eyes, chin, and parts of the head and legs being eaten by ants. The tortoise was alive but lethargic, and responded little when touched. Eight of 74 other radiomarked hatchlings monitored at Yucca Mountain during 1992-1994 were found dead with fire ants on their carcasses 3-7 days after the hatchlings emerged from their nests. It is not known whether those tortoises were killed by ants or were being scavenged when found. While imported fire ants (S. invicta) have long been known to kill hatchling gopher tortoises (G. polyphemus; Mount 1981. J. Alabama Acad. Sci. 52: 71-78), native fire ants have not previously been implicated as predators of desert tortoises. However, only 1 of 75 (or at worst 9 of 75) hatchlings was killed by fire ants, suggesting that although fire ants do kill hatchlings, they were not important predators of desert tortoises during this study. Tortoise specimens were deposited at the University of California at Berkeley.
Discussions On Worst-Case Test Condition For Single Event Burnout
NASA Astrophysics Data System (ADS)
Liu, Sandra; Zafrani, Max; Sherman, Phillip
2011-10-01
This paper discusses the failure characteristics of single-event burnout (SEB) in power MOSFETs based on analyzing quasi-stationary avalanche simulation curves. The analyses show that the worst-case test condition for SEB would be to use the ion with the highest mass, which results in the highest transient current due to charge deposition and displacement damage. The analyses also show that it is possible to build power MOSFETs that will not exhibit SEB even when tested with the heaviest ion, which has been verified by heavy-ion test data on SEB-sensitive and SEB-immune devices.
Applications of surface metrology in firearm identification
NASA Astrophysics Data System (ADS)
Zheng, X.; Soons, J.; Vorburger, T. V.; Song, J.; Renegar, T.; Thompson, R.
2014-01-01
Surface metrology is commonly used to characterize functional engineering surfaces. The technologies developed offer opportunities to improve forensic toolmark identification. Toolmarks are created when a hard surface, the tool, comes into contact with a softer surface and causes plastic deformation. Toolmarks are commonly found on fired bullets and cartridge cases. Trained firearms examiners use these toolmarks to link an evidence bullet or cartridge case to a specific firearm, which can lead to a criminal conviction. Currently, identification is typically based on qualitative visual comparison by a trained examiner using a comparison microscope. In 2009, a report by the National Academies called this method into question. Amongst other issues, they questioned the objectivity of visual toolmark identification by firearms examiners. The National Academies recommended the development of objective toolmark identification criteria and confidence limits. The National Institute of Standards and Technology (NIST) has applied its experience in surface metrology to develop objective identification criteria, measurement methods, and reference artefacts for toolmark identification. NIST developed the Standard Reference Material SRM 2460 standard bullet and SRM 2461 standard cartridge case to facilitate quality control and traceability of identifications performed in crime laboratories. Objectivity is improved through measurement of surface topography and application of unambiguous surface similarity metrics, such as the maximum value (ACCFMAX) of the areal cross correlation function. Case studies were performed on consecutively manufactured tools, such as gun barrels and breech faces, to demonstrate that, even in this worst-case scenario, all the tested tools imparted unique surface topographies that were identifiable. These studies provide scientific support for toolmark evidence admissibility in criminal court cases.
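The similarity metric named above is, at its core, the peak of a normalized areal cross-correlation between two topography maps. A minimal FFT-based sketch on synthetic height maps follows; NIST's actual pipeline (filtering, form removal, careful registration) is omitted:

```python
# Peak normalized cross-correlation (an ACCF_max-style score) of two
# equal-size surface height maps, computed over all lateral shifts via FFT.
import numpy as np

def accf_max(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    # Circular cross-correlation via the correlation theorem; a zero-padded
    # version would avoid wrap-around at the cost of a little more code.
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    return corr.max() / a.size   # 1.0 = identical up to a lateral shift

rng = np.random.default_rng(0)
topo = rng.standard_normal((128, 128))            # synthetic toolmark surface
same = np.roll(topo, (5, -9), axis=(0, 1)) + 0.2 * rng.standard_normal((128, 128))
other = rng.standard_normal((128, 128))           # unrelated surface
print(f"same toolmark, shifted + noise: {accf_max(topo, same):.2f}")
print(f"different surfaces:            {accf_max(topo, other):.2f}")
```

A genuine match scores near 1 even after lateral misalignment and measurement noise, while unrelated surfaces score near 0, which is what makes a threshold on this statistic usable as an objective identification criterion.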
Bartnicki, Jerzy; Amundsen, Ingar; Brown, Justin; Hosseini, Ali; Hov, Øystein; Haakenstad, Hilde; Klein, Heiko; Lind, Ole Christian; Salbu, Brit; Szacinski Wendel, Cato C; Ytre-Eide, Martin Album
2016-01-01
The Russian nuclear submarine K-27 suffered a loss of coolant accident in 1968 and, with nuclear fuel in both reactors, it was scuttled in 1981 in the outer part of Stepovogo Bay located on the eastern coast of Novaya Zemlya. The inventory of spent nuclear fuel on board the submarine is of concern because it represents a potential source of radioactive contamination of the Kara Sea, and a criticality accident with potential for long-range atmospheric transport of radioactive particles cannot be ruled out. To address these concerns and to provide a better basis for evaluating possible radiological impacts of potential releases in case a salvage operation is initiated, we assessed the atmospheric transport of radionuclides and deposition in Norway from a hypothetical criticality accident on board the K-27. To achieve this, a long-term (33 years) meteorological database was prepared and used for selection of the worst-case meteorological scenarios for each of three selected locations of the potential accident. Next, the dispersion model SNAP was run with the source term for the worst-case accident scenario and the selected meteorological scenarios. The results showed predictions to be very sensitive to the estimation of the source term for the worst-case accident, and especially to the sizes and densities of released radioactive particles. The results indicated that a large area of Norway could be affected, but that the deposition in Northern Norway would be considerably higher than in other areas of the country. The simulations showed that deposition from the worst-case scenario of a hypothetical K-27 accident would be at least two orders of magnitude lower than the deposition observed in Norway following the Chernobyl accident. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Busch, Martin H J; Vollmann, Wolfgang; Grönemeyer, Dietrich H W
2006-05-26
Active magnetic resonance imaging implants, for example stents, stent grafts or vena cava filters, are constructed as wireless inductively coupled transmit and receive coils. They are built as a resonator tuned to the Larmor frequency of a magnetic resonance system. The resonator can be added to or incorporated within the implant. This technology can counteract the shielding caused by eddy currents inside the metallic implant structure, which may allow diagnostic imaging of the implant lumen (in-stent stenosis or thrombosis, for example). The electromagnetic rf-pulses during magnetic resonance imaging induce a current in the circuit path of the resonator. A partial rupture of the circuit path provoked by material fatigue, or a broken wire with touching surfaces, can set up a relatively high resistance over a very short distance, which may behave as a point-like power source, a hot spot, inside the body part in which the resonator is implanted. This local power loss inside a small volume can reach one quarter of the total power loss of the intact resonating circuit, which itself is proportional to the product of the resonator volume and the quality factor, and depends as well on the orientation of the resonator with respect to the main magnetic field and on the imaging sequence the resonator is exposed to. First, an analytical solution for a hot spot in thermal equilibrium is described. This analytical solution with a definite hot spot power loss represents the worst-case scenario for thermal equilibrium inside a homogeneous medium without cooling effects. Starting from these worst-case assumptions, additional, more realistic conditions are considered in a numerical simulation, which may make the results less critical. The analytical solution as well as the numerical simulations use the experimental experience of the maximum hot spot power loss of implanted resonators with a definite volume during magnetic resonance imaging investigations. The finite volume analysis calculates the time-developing temperature maps for the model of a broken linear metallic wire embedded in tissue. Half of the total hot spot power loss is assumed to diffuse into each wire part at the location of the defect. The energy is distributed from there by heat conduction. Additionally, the effect of blood perfusion and blood flow is taken into account in some simulations, because the simultaneous appearance of all worst-case conditions, especially the absence of blood perfusion and blood flow near the hot spot, is very unlikely for vessel implants. The analytical solution as worst-case scenario, as well as the finite volume analysis for near-worst-case situations, show non-negligible volumes with critical temperature increases for some of the modeled hot spot situations. MR investigations with a high rf-pulse density lasting less than a minute can establish volumes of several cubic millimeters with temperature increases high enough to start cell destruction. Longer exposure times can involve volumes larger than 100 mm3. Even temperature increases in the range of thermal ablation are reached for substantial volumes. MR sequence exposure time and hot spot power loss are the primary factors influencing the volume with critical temperature increases.
Wire radius and wire material, as well as the physiological parameters blood perfusion and blood flow inside larger vessels, reduce the volume with critical temperature increases, but do not exclude a volume with critical tissue heating for resonators with a large product of resonator volume and quality factor. The worst-case scenario assumes thermal equilibrium for a hot spot embedded in homogeneous tissue without any cooling due to blood perfusion or flow. The finite volume analysis can calculate the results for conditions near to, but not exactly at, the worst case. For both cases a substantial volume can reach a critical temperature increase in a short time. The analytical solution, as the absolute worst case, points out that resonators with a small product of inductance volume and quality factor (Q·Vind < 2 cm3) are definitely safe. Stents for coronary vessels or resonators used as tracking devices for interventional procedures therefore carry no risk of high temperature increases. The finite volume analysis shows that even conditions not close to the worst case reach physiologically critical temperature increases for implants with a large product of inductance volume and quality factor (Q·Vind > 10 cm3). Such resonators exclude patients from exactly the MRI investigation these devices are made for.
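The analytical worst case in this abstract is classical steady-state conduction from a point source in an infinite homogeneous medium, for which the temperature rise at distance r from a hot spot dissipating power P is ΔT(r) = P/(4·π·k·r). A quick evaluation with typical assumed values (the tissue conductivity and hot-spot power below are standard order-of-magnitude choices, not the paper's exact figures):

```python
# Steady-state temperature rise around a point-like hot spot in tissue,
# no perfusion or flow: dT(r) = P / (4 * pi * k * r).
import math

k = 0.5   # thermal conductivity of soft tissue, W/(m*K) (assumed)
P = 0.25  # hot-spot power loss, W (order of 1/4 of the resonator loss)

for r_mm in [0.5, 1.0, 2.0, 5.0]:
    dT = P / (4.0 * math.pi * k * (r_mm * 1e-3))
    print(f"r = {r_mm:4.1f} mm: steady-state temperature rise {dT:6.1f} K")
```

Millimeter-scale radii already see tens of kelvin of steady-state rise at a quarter watt, which is why perfusion and flow, the main cooling terms neglected in the worst case, matter so much in the numerical simulations.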
Busch, Martin HJ; Vollmann, Wolfgang; Grönemeyer, Dietrich HW
2006-01-01
Background Active magnetic resonance imaging implants, for example stents, stent grafts or vena cava filters, are constructed as wireless inductively coupled transmit and receive coils. They are built as a resonator tuned to the Larmor frequency of a magnetic resonance system. The resonator can be added to or incorporated within the implant. This technology can counteract the shielding caused by eddy currents inside the metallic implant structure, which may allow diagnostic imaging of the implant lumen (in-stent stenosis or thrombosis, for example). The electromagnetic rf-pulses during magnetic resonance imaging induce a current in the circuit path of the resonator. A partial rupture of the circuit path provoked by material fatigue, or a broken wire with touching surfaces, can set up a relatively high resistance over a very short distance, which may behave as a point-like power source, a hot spot, inside the body part in which the resonator is implanted. This local power loss inside a small volume can reach ¼ of the total power loss of the intact resonating circuit, which itself is proportional to the product of the resonator volume and the quality factor, and depends as well on the orientation of the resonator with respect to the main magnetic field and on the imaging sequence the resonator is exposed to. Methods First, an analytical solution for a hot spot in thermal equilibrium is described. This analytical solution with a definite hot spot power loss represents the worst-case scenario for thermal equilibrium inside a homogeneous medium without cooling effects. Starting from these worst-case assumptions, additional, more realistic conditions are considered in a numerical simulation, which may make the results less critical. The analytical solution as well as the numerical simulations use the experimental experience of the maximum hot spot power loss of implanted resonators with a definite volume during magnetic resonance imaging investigations. The finite volume analysis calculates the time-developing temperature maps for the model of a broken linear metallic wire embedded in tissue. Half of the total hot spot power loss is assumed to diffuse into each wire part at the location of the defect. The energy is distributed from there by heat conduction. Additionally, the effect of blood perfusion and blood flow is taken into account in some simulations, because the simultaneous appearance of all worst-case conditions, especially the absence of blood perfusion and blood flow near the hot spot, is very unlikely for vessel implants. Results The analytical solution as worst-case scenario, as well as the finite volume analysis for near-worst-case situations, show non-negligible volumes with critical temperature increases for some of the modeled hot spot situations. MR investigations with a high rf-pulse density lasting less than a minute can establish volumes of several cubic millimeters with temperature increases high enough to start cell destruction. Longer exposure times can involve volumes larger than 100 mm3. Even temperature increases in the range of thermal ablation are reached for substantial volumes. MR sequence exposure time and hot spot power loss are the primary factors influencing the volume with critical temperature increases.
Wire radius and wire material, as well as the physiological parameters blood perfusion and blood flow inside larger vessels, reduce the volume with critical temperature increases, but do not exclude a volume with critical tissue heating for resonators with a large product of resonator volume and quality factor. Conclusion The worst-case scenario assumes thermal equilibrium for a hot spot embedded in homogeneous tissue without any cooling due to blood perfusion or flow. The finite volume analysis can calculate the results for conditions near to, but not exactly at, the worst case. For both cases a substantial volume can reach a critical temperature increase in a short time. The analytical solution, as the absolute worst case, points out that resonators with a small product of inductance volume and quality factor (Q·Vind < 2 cm3) are definitely safe. Stents for coronary vessels or resonators used as tracking devices for interventional procedures therefore carry no risk of high temperature increases. The finite volume analysis shows that even conditions not close to the worst case reach physiologically critical temperature increases for implants with a large product of inductance volume and quality factor (Q·Vind > 10 cm3). Such resonators exclude patients from exactly the MRI investigation these devices are made for. PMID:16729878
Jow, Uei-Ming; Ghovanloo, Maysam
2012-12-21
We present a design methodology for an overlapping hexagonal planar spiral coil (hex-PSC) array, optimized to create a homogeneous magnetic field for wireless power transmission to randomly moving objects. The modular hex-PSC array has been implemented in the form of three parallel conductive layers, for which an iterative optimization procedure defines the PSC geometries. Since the overlapping hex-PSCs in different layers have different characteristics, the worst-case coil-coupling condition should be designed to provide the maximum power transfer efficiency (PTE) in order to minimize the spatial fluctuations of the received power. In the worst case, the transmitter (Tx) hex-PSC is overlapped by six PSCs and surrounded by six other adjacent PSCs. Using a receiver (Rx) coil, 20 mm in radius, at a coupling distance of 78 mm and a maximum lateral misalignment of 49.1 mm (1/√3 of the PSC radius), we can receive power at a PTE of 19.6% from the worst-case PSC. Furthermore, we have studied the effects of Rx coil tilting and concluded that the PTE degrades significantly when θ > 60°. Solutions are: 1) activating two adjacent overlapping hex-PSCs simultaneously with out-of-phase excitations to create horizontal magnetic flux, and 2) including a small energy storage element in the Rx module to maintain power in the worst-case scenarios. To verify the proposed design methodology, we have developed the EnerCage system, which aims to power biological instruments attached to or implanted in the bodies of freely behaving small animal subjects in long-term electrophysiology experiments within large experimental arenas.
Failed State 2030: Nigeria - A Case Study
2011-02-01
disastrous ecological conditions in its Niger Delta region, and is fighting one of the modern world's worst legacies of political and economic corruption. A nation with more than 350 ethnic groups, 250 languages, and three distinct religious... happening in the world. The discussion herein is a mix of cultural sociology, political science, economics, military science (sometimes called...
NASA Technical Reports Server (NTRS)
Avila, Arturo
2011-01-01
Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst-case fashion to yield the most hot- or cold-biased temperature; these simulations thus represent the upper and lower bounds. This, effectively, is the JPL thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes, along with any temperature requirement violations, is documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of Modeling and Simulation (M&S) credibility, and the reporting of M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study to determine whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.
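The stacking described here has a simple arithmetic core: bias every uncertain parameter to its unfavorable bound and add the effects, as opposed to the less conservative root-sum-square (RSS) combination often used for "expected" uncertainty. A toy illustration with linearized sensitivities (all parameter names and numbers invented):

```python
# Worst-case stacking vs. RSS combination of thermal uncertainties.
import math

T_nominal = 20.0  # predicted temperature, deg C (invented)
# parameter -> temperature change at its unfavorable (hot) bound, deg C
sensitivities = {
    "blanket effective emittance": 4.0,
    "interface conductance":       2.5,
    "solar absorptance":           3.0,
    "environmental heating":       2.0,
}

stacked = T_nominal + sum(sensitivities.values())                 # bounding
rss = T_nominal + math.sqrt(sum(d**2 for d in sensitivities.values()))

print(f"nominal         : {T_nominal:.1f} C")
print(f"worst-case stack: {stacked:.1f} C")  # design-margin estimate
print(f"RSS combination : {rss:.1f} C")      # less conservative 'expected'
```

The gap between the stacked and RSS values is exactly the kind of margin-versus-expectation comparison the abstract says is used to estimate uncertainty in the worst-case results.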
2012-04-30
DoD SERC, Aeronautics & Astronautics, 5/16/2012, NPS 9th Annual Acquisition Research Symposium. [Figure residue removed: plots of the probability to complete a mission versus time (mins) for architecture 1 and architecture 2, and of worst-case behavior in each architecture as a function of the percentage of system failures.]
NASA Technical Reports Server (NTRS)
Retallick, F. D.
1980-01-01
Directly-fired, separately-fired, and oxygen-augmented MHD power plants incorporating a disk geometry for the MHD generator were studied. The base parameters defined for four near-optimum-performance MHD steam power systems of various types are presented. The finally selected systems consisted of (1) two directly-fired cases, one at 1920 K (2996 F) preheat and the other at 1650 K (2500 F) preheat, (2) a separately-fired case where the air is preheated to the same level as the higher-temperature directly-fired case, and (3) an oxygen-augmented case with the same generator inlet temperature of 2839 K (4650 F) as the high-temperature directly-fired and separately-fired cases. Supersonic Mach numbers at the generator inlet, gas inlet swirl, and constant Hall field operation were specified based on disk generator optimization. System pressures were based on optimization of MHD net power. Supercritical reheat steam plants were used in all cases. Open and closed cycle component costs are summarized and compared.
Fine-Scale Structure Design for 3D Printing
NASA Astrophysics Data System (ADS)
Panetta, Francis Julian
Modern additive fabrication technologies can manufacture shapes whose geometric complexities far exceed what existing computational design tools can analyze or optimize. At the same time, falling costs have placed these fabrication technologies within the average consumer's reach. Especially for inexpert designers, new software tools are needed to take full advantage of 3D printing technology. This thesis develops such tools and demonstrates the exciting possibilities enabled by fine-tuning objects at the small scales achievable by 3D printing. The thesis applies two high-level ideas to invent these tools: two-scale design and worst-case analysis. The two-scale design approach addresses the problem that accurately simulating (let alone optimizing) the full-resolution geometry sent to the printer requires orders of magnitude more computational power than currently available. However, we can decompose the design problem into a small-scale problem (designing tileable structures achieving a particular deformation behavior) and a macroscale problem (deciding where to place these structures in the larger object). This separation is particularly effective, since structures for every useful behavior can be designed once, stored in a database, then reused for many different macroscale problems. Worst-case analysis refers to determining how likely an object is to fracture by studying the worst possible scenario: the forces most efficiently breaking it. This analysis is needed when the designer has insufficient knowledge or experience to predict what forces an object will undergo, or when the design is intended for use in many different scenarios unknown a priori. The thesis begins by summarizing the physics and mathematics necessary to rigorously approach these design and analysis problems. Specifically, the second chapter introduces linear elasticity and periodic homogenization. The third chapter presents a pipeline to design microstructures achieving a wide range of effective isotropic elastic material properties on a single-material 3D printer. It also proposes a macroscale optimization algorithm placing these microstructures to achieve deformation goals under prescribed loads. The thesis then turns to worst-case analysis, first considering the macroscale problem: given a user's design, the fourth chapter aims to determine the distribution of pressures over the surface creating the highest stress at any point in the shape. Solving this problem exactly is difficult, so we introduce two heuristics: one to focus our efforts on only regions likely to concentrate stresses and another converting the pressure optimization into an efficient linear program. Finally, the fifth chapter introduces worst-case analysis at the microscopic scale, leveraging the insight that the structure of periodic homogenization enables us to solve the problem exactly and efficiently. Then we use this worst-case analysis to guide a shape optimization, designing structures with prescribed deformation behavior that experience minimal stresses in generic use.
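The linear-program conversion mentioned for the worst-case pressure problem rests on superposition: in linear elasticity, the stress at any monitored point is linear in the applied surface pressures, so maximizing it under pressure bounds and a total load budget is an LP per stress point. A self-contained sketch, with a random influence matrix standing in for FEM-derived data:

```python
# Worst-case surface load as a linear program: stress at monitored point i is
# s_i = A[i] @ p, so the pressures maximizing s_i under bounds form an LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n_stress, n_pressure = 6, 40
A = rng.standard_normal((n_stress, n_pressure))  # stress influence matrix

p_max = 1.0    # per-node pressure bound
budget = 10.0  # total applied load budget: sum(p) <= budget

worst = -np.inf
for i in range(n_stress):
    # maximize A[i] @ p  <=>  minimize -A[i] @ p
    res = linprog(-A[i],
                  A_ub=np.ones((1, n_pressure)), b_ub=[budget],
                  bounds=[(0.0, p_max)] * n_pressure,
                  method="highs")
    worst = max(worst, -res.fun)
print(f"worst-case stress over all monitored points: {worst:.2f}")
```

In the thesis's setting the expensive part is assembling the influence data and deciding which stress points to monitor, which is what the stress-concentration heuristic prunes; the LP itself is cheap.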
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahi-Anwar, M; Young, S; Lo, P
Purpose: A method to discriminate different types of renal cell carcinoma (RCC) was developed using attenuation values observed in multiphasic contrast-enhanced CT. This work evaluates the sensitivity of this RCC discrimination task at different CT radiation dose levels. Methods: We selected 5 cases of kidney lesion patients who had undergone four-phase CT scans covering the abdomen to the iliac crest. Through an IRB-approved study, the scans were conducted on 64-slice CT scanners (Definition AS/Definition Flash, Siemens Healthcare) using automatic tube-current modulation (TCM). The protocol included an initial baseline unenhanced scan, followed by three post-contrast-injection phases. CTDIvol (32 cm phantom) measured between 9 and 35 mGy for any given phase. As a preliminary study, we limited the scope to the cortico-medullary phase, shown previously to be the most discriminative phase. A previously validated method was used to simulate reduced-dose acquisitions by adding noise to raw CT sinogram data, emulating corresponding images at simulated doses of 50%, 25%, and 10%. To discriminate the lesion subtype, ROIs were placed in the most enhancing region of the lesion. The mean HU value of an ROI was extracted and used to assign the worst-case RCC subtype, ranked in the order of clear cell, papillary, chromophobe and the benign oncocytoma. Results: Two patients exhibited a change of worst-case RCC subtype between original and simulated scans, at 25% and 10% doses. In one case, the worst-case RCC subtype changed from oncocytoma to chromophobe at 10% and 25% doses, while the other case changed from oncocytoma to clear cell at 10% dose. Conclusion: Based on preliminary results from an initial cohort of 5 patients, worst-case RCC subtypes remained constant at all simulated dose levels except for 2 patients. Further study on more patients will be needed to confirm our findings. Institutional research agreement, Siemens Healthcare; past recipient, research grant support, Siemens Healthcare; consultant, Toshiba America Medical Systems; consultant, Samsung Electronics; NIH grant support from U01 CA181156.
NASA Astrophysics Data System (ADS)
Field, R. D.; van der Werf, G.; Fanin, T.; Fetzer, E. J.; Fuller, R. A.; Jethva, H. T.; Levy, R. C.; Livesey, N. J.; Luo, M.; Torres, O.; Worden, H. M.
2016-12-01
The 2015 fire season and related smoke pollution in Indonesia was more severe than the major 2006 episode, making it the most severe season observed by the NASA Earth Observing System satellites, whose records go back to the early 2000s: active fire detections from the Terra and Aqua Moderate Resolution Imaging Spectroradiometers (MODIS), MODIS aerosol optical depth, Terra Measurements of Pollution in the Troposphere (MOPITT) carbon monoxide (CO), Aqua Atmospheric Infrared Sounder (AIRS) CO, Aura Ozone Monitoring Instrument (OMI) aerosol index, and Aura Microwave Limb Sounder (MLS) CO. The MLS CO in the upper troposphere showed a plume of pollution stretching from East Africa to the western Pacific Ocean that persisted for two months. Longer-term records of airport visibility in Sumatra and Kalimantan show that 2015 ranked after 1997 and alongside 1991 and 1994 as among the worst episodes on record. Analysis of yearly dry-season rainfall from the Tropical Rainfall Measuring Mission (TRMM) and rain gauges shows that, due to the continued use of fire to clear and prepare land on degraded peat, the Indonesian fire environment continues to have a non-linear sensitivity to dry conditions during prolonged periods with less than 4 mm/day of precipitation, and this sensitivity appears to have increased over Kalimantan. Without significant reforms in land use and the adoption of early-warning triggers tied to precipitation forecasts, these intense fire episodes will recur during future droughts, usually associated with El Niño events. Characterization of this significant event was only possible with EOS data, especially from the A-Train instruments.
Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino
2017-03-01
Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the performance, in terms of plan quality and robustness, of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to designing IMPT plans for five patients: two with prostate cancer, one with skull-base cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-base and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases, in which all uncertainty scenarios were able to satisfy tight dose constraints, while NLP performed better in more difficult cases, in which tight dose limits were hard to satisfy under most uncertainty scenarios. For robust optimization, the worst case dose approach was less sensitive to uncertainties than was the minmax approach for the prostate and skull-base cancer patients, whereas the minmax approach was superior for the head and neck cancer patients. The robustness of the IMPT plans was remarkably better after robust optimization than after PTV-based optimization, and the NLP-PTV-based optimization outperformed the LP-PTV-based optimization regarding robustness of clinical target volume coverage. In addition, plans generated using LP-based methods had notably fewer scanning spots than did those generated using NLP-based methods. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
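To make the "minmax" idea concrete, here is a minimal LP sketch (hypothetical influence matrices, not a clinical treatment-planning system): nonnegative spot weights are chosen to minimize the worst-case deviation of scenario doses from the prescription, written in the epigraph form that scipy.optimize.linprog accepts.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_vox, n_spots, n_scen = 30, 12, 5
# hypothetical dose-influence matrices, one per uncertainty scenario
D = [np.abs(rng.normal(1.0, 0.3, (n_vox, n_spots))) for _ in range(n_scen)]
presc = np.ones(n_vox)                      # prescribed dose, arbitrary units

# variables x = [w (n_spots), t (1)]; minimize the worst-case deviation t
c = np.r_[np.zeros(n_spots), 1.0]
A_ub, b_ub = [], []
for Ds in D:
    A_ub.append(np.c_[ Ds, -np.ones((n_vox, 1))])   #  Ds @ w - t <= presc
    b_ub.append(presc)
    A_ub.append(np.c_[-Ds, -np.ones((n_vox, 1))])   # -Ds @ w - t <= -presc
    b_ub.append(-presc)

res = linprog(c, A_ub=np.vstack(A_ub), b_ub=np.hstack(b_ub),
              bounds=[(0, None)] * (n_spots + 1))
print("worst-case deviation:", res.x[-1])
```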
Yang, Tsong-Shing; Chi, Ching-Chi; Wang, Shu-Hui; Lin, Jing-Chi; Lin, Ko-Ming
2016-10-01
Biologic therapies are more effective but more costly than conventional therapies in treating psoriatic arthritis. To evaluate the cost-efficacy of etanercept, adalimumab and golimumab therapies in treating active psoriatic arthritis in a Taiwanese setting, we conducted a meta-analysis of randomized placebo-controlled trials to calculate the incremental efficacy of etanercept, adalimumab and golimumab, respectively, in achieving the Psoriatic Arthritis Response Criteria (PsARC) and a 20% improvement in the American College of Rheumatology score (ACR20). The base-, best-, and worst-case incremental cost-effectiveness ratios (ICERs) for one subject to achieve PsARC and ACR20 were calculated. The annual ICERs per PsARC responder were US$27 047 (best scenario US$16 619; worst scenario US$31 350), US$39 339 (best scenario US$31 846; worst scenario US$53 501) and US$27 085 (best scenario US$22 716; worst scenario US$33 534) for etanercept, adalimumab and golimumab, respectively. The annual ICERs per ACR20 responder were US$27 588 (best scenario US$20 900; worst scenario US$41 800), US$39 339 (best scenario US$25 236; worst scenario US$83 595) and US$33 534 (best scenario US$27 616; worst scenario US$44 013) for etanercept, adalimumab and golimumab, respectively. In a Taiwanese setting, etanercept had the lowest annual costs per PsARC and ACR20 responder, while adalimumab had the highest. © 2015 Asia Pacific League of Associations for Rheumatology and Wiley Publishing Asia Pty Ltd.
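The ICER arithmetic underlying these figures is simple: incremental annual cost divided by incremental response probability. A minimal sketch with invented inputs (not the paper's data):

```python
def annual_icer(cost_new, cost_old, resp_new, resp_old):
    """Incremental cost-effectiveness ratio: extra annual cost divided by the
    extra probability of response (i.e., cost per additional responder)."""
    return (cost_new - cost_old) / (resp_new - resp_old)

# Illustrative numbers only: a biologic costing US$15,000/yr more than
# conventional therapy, with 55% vs 15% PsARC response rates, gives
# US$37,500 per additional PsARC responder.
print(annual_icer(15000, 0, 0.55, 0.15))
```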
Manzello, Samuel L; Suzuki, Sayaka; Nii, Daisaku
2017-03-01
Structure ignition by wind-driven firebrand showers is an important fire spread mechanism in large outdoor fires. Experiments were conducted with three common mulch types (shredded hardwood mulch, Japanese cypress wood chips, and pine bark nuggets) placed adjacent to realistic-scale re-entrant corners. In the first series of experiments, mulch beds were placed adjacent to a re-entrant corner constructed with wood studs and lined with oriented strand board (OSB) as the sheathing. The premise behind conducting experiments with no siding treatments applied was that bare OSB in contact with mulch would be a worst-case scenario, and therefore a wall assembly in the state most vulnerable to mulch ignition. In the second series of experiments, vinyl siding was applied to the re-entrant corner assemblies (wood studs/OSB/moisture barrier/vinyl siding), and the influence of vertical separation distance (102 mm or 203 mm) on wall ignition from adjacent mulch beds was determined. The vertical separation distance was maintained by applying gypsum board to the base of the re-entrant corner. The siding itself did not influence the ignition process for the mulch beds, as the mulch beds were the first to ignite from the firebrand showers. In all experiments, firebrands produced smoldering ignition in the mulch beds; this transitioned to flaming ignition, and the re-entrant corner assembly was exposed to the flaming mulch beds. With no siding treatments applied, the flaming mulch beds ignited the re-entrant corner, and ignition was observed to propagate to the back side of the re-entrant corner assembly under all wind speeds (6 m/s to 8 m/s). For the re-entrant corners fitted with vinyl siding, the mulch type, vertical separation distance, and wind speed were important parameters in whether flaming ignition propagated to the back side of the re-entrant corner assembly. Mulches clearly pose an ignition hazard to structures in large outdoor fires.
Wire-reinforced endotracheal tube fire during tracheostomy -A case report-.
Shin, Young Duck; Lim, Seung-Woon; Bae, Jin Ho; Yim, Kyoung Hoon; Sim, Jae Hwan; Kwon, Eun Jung
2012-08-01
Every operation carries the risk of a fire emergency, especially a tracheostomy. When a flammable gas meets a source of heat, the danger of fire is considerable. A tracheal tube filled with a high concentration of oxygen is also a major risk factor for fire. Intra-tracheal tube fire is a rare yet critical emergency with catastrophic consequences. Thus, numerous precautions are taken during a tracheostomy, such as the use of a special tube to prevent laser damage, filling the tube cuff with normal saline instead of air, and dilution of FiO2 with helium or nitrogen. Among the recorded cases of tube fires, most were initiated at the balloon or the tip. In the present case report, however, we encountered a fire that originated from the wire.
Designing and Maintaining a Communication Consulting Relationship: A Fire Officer Case Study
ERIC Educational Resources Information Center
Cragan, John F.
2008-01-01
This case study describes a 35-year communication consulting relationship with the Illinois Fire Chiefs' Association. This case explains the fire chiefs' educational problems, the five-step method for creating an educational curriculum for fire officers, and the five-step procedure for continuous evaluation of the curriculum. Finally, an…
Potential Operating Room Fire Hazard of Bone Cement.
Sibia, Udai S; Connors, Kevin; Dyckman, Sarah; Zahiri, Hamid R; George, Ivan; Park, Adrian E; MacDonald, James H
Approximately 600 operating room (OR) fires are reported annually. Despite extensive fire safety education and training, complete elimination of OR fires still has not been achieved. Each fire requires an ignition source, a fuel source, and an oxidizer. In this case report, we describe the potential fire hazard of bone cement in the OR. A total knee arthroplasty was performed with a standard medial parapatellar arthrotomy. Tourniquet control was used. After bone cement was applied to the prepared tibial surface, the surgeon used an electrocautery device to resect residual lateral meniscus tissue, and started a fire in the operative field. The surgeon suffocated the fire with a dry towel and prevented injury to the patient. We performed a PubMed search with a cross-reference search for relevant papers and found no case reports outlining bone cement as a potential fire hazard in the OR. To our knowledge, this is the first case report identifying bone cement as a fire hazard. OR fires related to bone cement can be eliminated by correctly assessing the setting time of the cement and avoiding application sites during electrocautery.
A Worst-Case Approach for On-Line Flutter Prediction
NASA Technical Reports Server (NTRS)
Lind, Rick C.; Brenner, Martin J.
1998-01-01
Worst-case flutter margins may be computed for a linear model with respect to a set of uncertainty operators using the structured singular value. This paper considers an on-line implementation to compute these robust margins in a flight test program. Uncertainty descriptions are updated at test points to account for unmodeled time-varying dynamics of the airplane by ensuring the robust model is not invalidated by measured flight data. Robust margins computed with respect to this uncertainty remain conservative to the changing dynamics throughout the flight. A simulation demonstrates that this method can improve the efficiency of flight testing by accurately predicting the flutter margin, improving safety while reducing the necessary flight time.
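The structured singular value itself is expensive to compute exactly; a standard upper bound instead minimizes the scaled maximum singular value over positive diagonal scalings. A rough numerical sketch of that bound on a random complex matrix (illustrative only, not a flight-test implementation):

```python
# Upper bound mu(M) <= min_D sigma_max(D M D^-1) over positive diagonal D,
# optimized here over the log-scalings with a generic derivative-free method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

def scaled_sv(logd):
    d = np.exp(logd)                                   # positive diagonal scalings
    return np.linalg.norm(np.diag(d) @ M @ np.diag(1.0 / d), 2)

res = minimize(scaled_sv, np.zeros(4), method="Nelder-Mead")
print("sigma_max(M)  :", np.linalg.norm(M, 2))
print("mu upper bound:", res.fun)
```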
Minimax Quantum Tomography: Estimators and Relative Entropy Bounds.
Ferrie, Christopher; Blume-Kohout, Robin
2016-03-04
A minimax estimator has the minimum possible error ("risk") in the worst case. We construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case, but superior accuracy on most states.
Sustainability and the origins of wildland fire research
Diane M. Smith
2015-01-01
When looking for the origins of wildland fire research in the Forest Service, forester C. E. (Mike) Hardy makes the case for 1922, when Harry Gisborne became the agency's first full-time fire researcher. Leading fire historian Stephen Pyne argues that fire research originated under the leadership of Coert DuBois, who oversaw the first fire case study in 1911, the first...
Boehmler, Erick M.; Severance, Timothy
1997-01-01
Contraction scour for all modelled flows ranged from 3.8 to 6.1 ft. The worst-case contraction scour occurred at the 500-year discharge. Abutment scour ranged from 4.0 to 6.7 ft. The worst-case abutment scour also occurred at the 500-year discharge. Pier scour ranged from 9.1 to 10.2 ft. The worst-case pier scour occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
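For reference, a sketch of the Froehlich live-bed abutment scour equation cited in these reports, in its HEC-18 form as I recall it; the coefficients should be verified against HEC-18 before any real use, and units must be consistent (here, feet).

```python
def froehlich_abutment_scour(ya, L_prime, Fr, K1=1.0, K2=1.0):
    """Froehlich live-bed abutment scour (HEC-18 form, from memory):
    ys = ya * (2.27 * K1 * K2 * (L'/ya)**0.43 * Fr**0.61 + 1).
    ya: average flow depth at the abutment; L': embankment length projected
    normal to flow; Fr: approach Froude number; K1, K2: shape/skew factors."""
    return ya * (2.27 * K1 * K2 * (L_prime / ya) ** 0.43 * Fr ** 0.61 + 1.0)

# Illustrative numbers only: 6 ft depth, 50 ft projected embankment, Fr = 0.4
print(round(froehlich_abutment_scour(6.0, 50.0, 0.4), 1), "ft")
```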
Olson, Scott A.; Hammond, Robert E.
1996-01-01
Contraction scour for all modelled flows ranged from 0.0 to 0.9 ft. The worst-case contraction scour occurred at the 500-year discharge. Abutment scour at the left abutment ranged from 3.1 to 10.3 ft, with the worst case occurring at the 500-year discharge. Abutment scour at the right abutment ranged from 6.4 to 10.4 ft, with the worst case occurring at the 100-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Burns, Ronda L.; Medalie, Laura
1997-01-01
Contraction scour for the modelled flows ranged from 1.0 to 2.7 ft. The worst-case contraction scour occurred at the incipient-overtopping discharge. Abutment scour ranged from 8.4 to 17.6 ft. The worst-case abutment scour for the right abutment occurred at the incipient-overtopping discharge. For the left abutment, the worst-case abutment scour occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Burns, R.L.; Medalie, Laura
1998-01-01
Contraction scour for all modelled flows ranged from 0.0 to 2.1 ft. The worst-case contraction scour occurred at the 500-year discharge. Left abutment scour ranged from 6.7 to 8.7 ft. The worst-case left abutment scour occurred at the incipient roadway-overtopping discharge. Right abutment scour ranged from 7.8 to 9.5 ft. The worst-case right abutment scour occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and Davis, 1995, p. 46). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Jain, Dhruv; Tikku, Gargi; Bhadana, Pallavi; Dravid, Chandrashekhar; Grover, Rajesh Kumar
2017-08-01
We investigated World Health Organization (WHO) grading and pattern-of-invasion-based histological schemes as independent predictors of disease-free survival in oral squamous carcinoma patients. Tumor resection slides of eighty-seven oral squamous carcinoma patients [pTNM: I&II/III&IV, 32/55] were evaluated. Besides examining various patterns of invasion, the invasive front grade and the predominant and worst (highest) WHO grades were recorded. For worst WHO grading, the poor-undifferentiated component was estimated semi-quantitatively at the advancing tumor edge (invasive growth front) in histology sections. Tumor recurrence was observed in 31 (35.6%) cases. The 2-year disease-free survival was 47% [median: 656; follow-up: 14-1450 days]. Using receiver operating characteristic curves, we defined a poor-undifferentiated component exceeding 5% of the tumor as the cutoff for assigning an oral squamous carcinoma as grade-3 when following worst WHO grading. Kaplan-Meier curves for disease-free survival revealed prognostic associations with nodal involvement, tumor size, worst WHO grading, most common pattern of invasion, and invasive pattern grading score (the sum of the two most predominant patterns of invasion). In further multivariate analysis, tumor size (>2.5 cm) and worst WHO grading (grade-3 tumors) independently predicted reduced disease-free survival [HR 2.85, P=0.028 and HR 3.37, P=0.031, respectively]. Inter-observer agreement was moderate for observers who semi-quantitatively estimated the percentage of poor-undifferentiated morphology in oral squamous carcinomas. Our results support the value of the semi-quantitative method of assigning tumors as grade-3 under worst WHO grading for predicting reduced disease-free survival. Despite limitations, of the various histological tumor stratification schemes, WHO grading holds adjunctive value for its prognostic role, ease, and universal familiarity. Copyright © 2017 Elsevier Inc. All rights reserved.
Sørensen, Peter B; Thomsen, Marianne; Assmuth, Timo; Grieger, Khara D; Baun, Anders
2010-08-15
This paper helps bridge the gap between scientists and other stakeholders in the areas of human and environmental risk management of chemicals and engineered nanomaterials. This connection is needed due to the evolution of stakeholder awareness and scientific progress related to human and environmental health which involves complex methodological demands on risk management. At the same time, the available scientific knowledge is also becoming more scattered across multiple scientific disciplines. Hence, the understanding of potentially risky situations is increasingly multifaceted, which again challenges risk assessors in terms of giving the 'right' relative priority to the multitude of contributing risk factors. A critical issue is therefore to develop procedures that can identify and evaluate worst case risk conditions which may be input to risk level predictions. Therefore, this paper suggests a conceptual modelling procedure that is able to define appropriate worst case conditions in complex risk management. The result of the analysis is an assembly of system models, denoted the Worst Case Definition (WCD) model, to set up and evaluate the conditions of multi-dimensional risk identification and risk quantification. The model can help optimize risk assessment planning by initial screening level analyses and guiding quantitative assessment in relation to knowledge needs for better decision support concerning environmental and human health protection or risk reduction. The WCD model facilitates the evaluation of fundamental uncertainty using knowledge mapping principles and techniques in a way that can improve a complete uncertainty analysis. Ultimately, the WCD is applicable for describing risk contributing factors in relation to many different types of risk management problems since it transparently and effectively handles assumptions and definitions and allows the integration of different forms of knowledge, thereby supporting the inclusion of multifaceted risk components in cumulative risk management. Copyright 2009 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mandrà, Salvatore; Giacomo Guerreschi, Gian; Aspuru-Guzik, Alán
2016-07-01
We present an exact quantum algorithm for solving the Exact Satisfiability problem, which belongs to the important NP-complete complexity class. The algorithm is based on an intuitive approach that can be divided into two parts: the first step consists in the identification and efficient characterization of a restricted subspace that contains all the valid assignments of the Exact Satisfiability problem, while the second part performs a quantum search in this restricted subspace. The quantum algorithm can be used either to find a valid assignment (or to certify that no solution exists) or to count the total number of valid assignments. The worst-case query complexities are bounded by O(√(2^(n−M′))) and O(2^(n−M′)), respectively, where n is the number of variables and M′ the number of linearly independent clauses. Remarkably, the proposed quantum algorithm turns out to be faster than any known exact classical algorithm for solving dense formulas of Exact Satisfiability. As a concrete application, we provide the worst-case complexity for the Hamiltonian cycle problem obtained after mapping it to a suitable Occupation problem. Specifically, we show that the time complexity of the proposed quantum algorithm is bounded by O(2^(n/4)) for 3-regular undirected graphs, where n is the number of nodes. The same worst-case complexity holds for (3,3)-regular bipartite graphs. As a reference, the current best classical algorithm has a (worst-case) running time bounded by O(2^(31n/96)). Finally, when compared to heuristic techniques for Exact Satisfiability problems, the proposed quantum algorithm is faster than classical WalkSAT and Adiabatic Quantum Optimization for random instances with a density of constraints close to the satisfiability threshold, the regime in which instances are typically hardest to solve. The proposed quantum algorithm can be straightforwardly extended to the generalized version of Exact Satisfiability known as the Occupation problem. The general version of the algorithm is presented and analyzed.
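A trivial numeric comparison of the quoted bounds (constants ignored, and the M′ values chosen arbitrarily at M′ = n/2 to mimic a dense formula) shows how the quantum exponent (n − M′)/2 compares with the classical 31n/96:

```python
# Compare worst-case exponents: quantum ~ 2^((n - M')/2), classical ~ 2^(31n/96).
for n in (60, 90, 120):
    m_indep = n // 2                 # arbitrary illustration of a dense formula
    q_exp = (n - m_indep) / 2
    c_exp = 31 * n / 96
    print(f"n={n}: quantum ~ 2^{q_exp:.1f}, classical ~ 2^{c_exp:.1f}")
```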
Large-scale weather dynamics during the 2015 haze event in Singapore
NASA Astrophysics Data System (ADS)
Djamil, Yudha; Lee, Wen-Chien; Tien Dat, Pham; Kuwata, Mikinori
2017-04-01
The 2015 haze event in Southeast Asia is widely considered the period of the worst air quality in the region in more than a decade. The source of the haze was forest and peatland fires on the Indonesian islands of Sumatra and Kalimantan. The fires mostly stemmed from the forest-clearance practice known as slash and burn, used to convert land to palm oil plantations. Although such clearance occurs seasonally, in 2015 it was made worse by the impact of a strong El Niño. The long period of drier atmosphere over the region due to El Niño made fires easier to ignite and spread and more difficult to stop. The biomass emissions from the forest and peatland fires caused a large-scale haze pollution problem on both islands that spread further into neighboring countries such as Singapore and Malaysia. In Singapore, for about two months (September-October 2015), air quality was at unhealthy levels. This condition caused socioeconomic losses such as school closures, cancellation of outdoor events, and health issues, with total losses estimated at S$700 million. The unhealthy level of Singapore's air quality is defined by the rising pollutant standard index (PSI > 120) due to the haze arrival; it even reached a hazardous level (PSI = 300) for several days. PSI is a metric of air quality in Singapore that aggregates six pollutants (SO2, PM10, PM2.5, NO2, CO and O3). In this study, we focused on PSI variability at weekly to biweekly time scales (periodicity < 30 days), since these are the least understood compared with the diurnal and seasonal scales. We identified three dominant time scales of PSI (about 5, 10 and 20 days) using the wavelet method and investigated their large-scale atmospheric structures. The PSI-associated large-scale horizontal structures of column moisture over the Indo-Pacific basin are dominated by easterly propagating gyres at the synoptic (macro) scale for the 5-day (10- and 20-day) time scales. The propagating gyres manifest as a cyclical column-moisture-flux trajectory around the Singapore region, and some of their phases are identified as responsible for transporting the haze from its source to Singapore. The haze source was identified by compositing the number of hotspots in grid space based on the three PSI time scales. Further discussion of equatorial waves during the haze event will also be presented.
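A minimal sketch of extracting dominant time scales with a continuous wavelet transform, using the PyWavelets package on a synthetic daily series standing in for the real PSI record (the 5/10/20-day modes below are injected by construction, so the analysis is purely illustrative):

```python
import numpy as np
import pywt  # PyWavelets

t = np.arange(120)                                    # ~4 months of daily PSI
psi = (20 * np.sin(2 * np.pi * t / 5) + 15 * np.sin(2 * np.pi * t / 10)
       + 10 * np.sin(2 * np.pi * t / 20) + 100)       # hypothetical PSI series

scales = np.arange(1, 40)
coefs, freqs = pywt.cwt(psi, scales, "morl", sampling_period=1.0)
power = (np.abs(coefs) ** 2).mean(axis=1)             # time-averaged wavelet power
top = np.argsort(power)[-3:]
print("strongest periods (days):", np.sort(1.0 / freqs[top]))
```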
Managing risk in a challenging financial environment.
Kaufman, Kenneth
2008-08-01
Five strategies can help hospital financial leaders balance their organizations' financial and risk positions: Understand the hospital's financial condition; Determine the desired level of risk; Consider total risk; Use a portfolio approach; Explore best-case/worst-case scenarios to measure risk.
Macoveciuc, Ioana; Márquez-Grant, Nicholas; Horsfall, Ian; Zioupos, Peter
2017-06-01
Burning of human remains is one method used by perpetrators to conceal fatal trauma, and expert opinions regarding the degree to which skeletal evidence is concealed are often disparate. This experiment aimed to reduce this incongruence in the forensic anthropological interpretation of burned human remains and thereby contribute to the development of research methodologies sufficiently robust to withstand forensic scrutiny in the courtroom. We tested the influence of thermal alteration on pre-existing sharp and blunt trauma on twenty juvenile sheep radii in the laboratory using an automated impact testing system and an electric furnace. The testing conditions simulated a worst-case scenario in which remains with pre-existing sharp or blunt trauma were exposed to burning, with an intentional vehicular fire scenario in mind. All impact parameters as well as the burning conditions were based on those most commonly encountered in forensic cases and were maintained constant throughout the experiment. The results showed that signatures associated with sharp and blunt force trauma were not masked by heat exposure, highlighting the potential for future standardization of fracture analysis in burned bone. Our results further emphasize the recommendations given by other experts on handling, processing and recording burned remains at the crime scene and mortuary. Copyright © 2017 Elsevier B.V. All rights reserved.
Ash from huge Australian bushfires in 2009 circled the globe
NASA Astrophysics Data System (ADS)
Kumar, Mohi
2011-06-01
On 7 February 2009, record high temperatures, low rainfall and humidity, and fast-blowing winds allowed sparks in the bush near the Australian city of Melbourne to ignite fires across much of the southeastern region of the state of Victoria. In just a few days, more than 4500 square kilometers had burned and 173 people had died in what has been called the worst natural disaster in Australian history. The fires released so much smoke that daytime on 7 February was plunged into darkness in Melbourne. Indeed, soot particles and other aerosols are known to scatter and absorb solar radiation. However, airborne particles released by fires are typically thought to remain in the atmosphere close to their sources. In fact, climate models pay little attention to the scattering and absorbing effects of fire-borne aerosols because they are not believed to reach altitudes above 10 kilometers, in the stratosphere, where circulation patterns would distribute a plume of pollution around the globe, possibly leading to global cooling effects. Ash from volcanic plumes has long been considered the sole mechanism by which aerosols and gases could be injected into the stratosphere from the Earth's surface. However, Australia's bushfires of 2009 showed otherwise. (Journal of Geophysical Research-Atmospheres, doi:10.1029/2010JD015162, 2011)
Alcohol based surgical prep solution and the risk of fire in the operating room: a case report
Batra, Sumit; Gupta, Rajiv
2008-01-01
A few cases of fire in the operating room are reported in the literature. The factors that may initiate these fires are many and include alcohol-based surgical prep solutions, electrosurgical equipment, and flammable drapes. We report a case of fire in the operating room while operating on a patient with a burst fracture of the C6 vertebra with quadriplegia. The fire was caused by incomplete drying of an alcohol-based surgical prep solution on the covering drapes. This paper discusses potential preventive measures to minimize the incidence of fire in the operating room. PMID:18439304
A Fully Coupled Multi-Rigid-Body Fuel Slosh Dynamics Model Applied to the Triana Stack
NASA Technical Reports Server (NTRS)
London, K. W.
2001-01-01
A somewhat general multibody model is presented that accounts for energy dissipation associated with fuel slosh and which unifies some of the existing more specialized representations. This model is used to predict the nutation growth time constant for the Triana spacecraft, or Stack, consisting of the Triana Observatory mated with the Gyroscopic Upper Stage, or GUS (which includes the solid rocket motor, SRM, booster). At the nominal spin rate of 60 rpm and with 145 kg of hydrazine propellant on board, a time constant of 116 s is predicted for worst-case sloshing of a spherical slug model, compared to 1,681 s (nominal) and 1,043 s (worst case) for sloshing of a three-degree-of-freedom pendulum model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, W; Schild, S; Bues, M
Purpose: We compared conventionally optimized intensity-modulated proton therapy (IMPT) treatment plans against worst-case robustly optimized treatment plans for lung cancer. The comparison of the two IMPT optimization strategies focused on the resulting plans' ability to retain dose objectives under the influence of patient set-up, inherent proton range uncertainty, and dose perturbation caused by respiratory motion. Methods: For each of the 9 lung cancer cases, two treatment plans were created accounting for treatment uncertainties in two different ways: the first used the conventional method, delivery of prescribed dose to the planning target volume (PTV) that is geometrically expanded from the internal target volume (ITV); the second employed the worst-case robust optimization scheme that addresses set-up and range uncertainties through beamlet optimization. The plan optimality and plan robustness were calculated and compared. Furthermore, the effects on dose distributions of changes in patient anatomy due to respiratory motion were investigated for both strategies by comparing the corresponding plan evaluation metrics at the end-inspiration and end-expiration phases and the absolute differences between these phases. The mean plan evaluation metrics of the two groups were compared using two-sided paired t-tests. Results: Without respiratory motion considered, we affirmed that worst-case robust optimization is superior to PTV-based conventional optimization in terms of plan robustness and optimality. With respiratory motion considered, robust optimization still leads to dose distributions more robust to respiratory motion for targets, and comparable or even better plan optimality [D95% ITV: 96.6% versus 96.1% (p=0.26), D5% - D95% ITV: 10.0% versus 12.3% (p=0.082), D1% spinal cord: 31.8% versus 36.5% (p=0.035)]. Conclusion: Worst-case robust optimization led to superior solutions for lung IMPT. Despite the fact that robust optimization did not explicitly account for respiratory motion, it produced motion-resistant treatment plans. However, further research is needed to incorporate respiratory motion into IMPT robust optimization.
Debris flow run-out simulation and analysis using a dynamic model
NASA Astrophysics Data System (ADS)
Melo, Raquel; van Asch, Theo; Zêzere, José L.
2018-02-01
Only two months after a huge forest fire occurred in the upper part of a valley located in central Portugal, several debris flows were triggered by intense rainfall. The event caused infrastructural and economic damage, although no lives were lost. The present research aims to simulate the run-out of two debris flows that occurred during the event and to calculate, via back-analysis, the rheological parameters and the excess rain involved. A dynamic model was used that integrates surface runoff, concentrated erosion along the channels, and propagation and deposition of flow material. Afterwards, the model was validated using 32 debris flows triggered during the same event that were not considered for calibration. The rheological and entrainment parameters obtained for the most accurate simulation were then used to run three scenarios of debris flow run-out at the basin scale. The results were compared against the existing buildings exposed in the study area, and the worst-case scenario showed a potential inundation that may affect 345 buildings.
Fractionated analysis of paired-electrode nerve recordings.
Fiore, Lorenzo; Lorenzetti, Walter; Ratti, Giovannino; Geppetti, Laura
2003-12-30
Multi-unit activity recorded from two electrodes positioned at a distance on a nerve may be analysed by cross-correlation, but units similar in direction and velocity of propagation cannot be distinguished and separately evaluated by this method. To overcome this limit, we added two features, represented by the impulse amplitudes of the paired recordings, to the dimension given by the impulse delay. The analysis was fractionated according to the new dimensions. In experimental recordings from the locomotor appendage of the lobster Homarus americanus, the fractionated analysis proved capable of identifying the contributions of single active units, even if these were superimposed and indiscernible in the global cross-correlation histogram. Up to 5 motor and 10 sensory units could be identified. The shape of the paired impulses was evaluated by an averaging procedure. Analogous evaluations on simulated recordings made it possible to estimate the influences exerted on performance by variations in noise level and in the number and firing rate of active units. The global signal could be resolved into single units even under the worst conditions. Accuracy in evaluating the amount of unit activity varied, exceeding 90% in about half of the cases tested; a similar performance was attained by evaluation of the impulse shapes.
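The delay dimension of this analysis can be illustrated with plain cross-correlation: a sketch estimating the conduction delay between two synthetic electrode signals (hypothetical 2 kHz sampling and spike density; not the authors' fractionated algorithm):

```python
import numpy as np

fs = 2_000                                     # samples per second (hypothetical)
rng = np.random.default_rng(0)
a = (rng.random(fs) < 0.01).astype(float)      # 1 s of unit impulses, electrode 1
true_lag = 5                                   # 2.5 ms conduction delay
b = np.roll(a, true_lag) + rng.normal(0, 0.05, fs)  # delayed, noisy electrode 2

xc = np.correlate(b, a, mode="full")
lag = int(np.argmax(xc)) - (len(a) - 1)        # lag of b relative to a, in samples
print(f"estimated delay: {lag / fs * 1e3:.1f} ms")
```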
RSRM TP-H1148 Main Grain Propellant Crack Initiation Evaluation
NASA Technical Reports Server (NTRS)
Earnest, Todd E.
2005-01-01
Pressurized TP-H1148 propellant fracture toughness testing was performed to assess the potential for initiation of visually undetectable cracks in the RSRM forward segment transition region during motor ignition. Two separate test specimens were used in this evaluation. Testing was performed in cold-gas and hot-fire environments, and under both static and dynamic pressurization conditions. Analysis of test results demonstrates safety factors against initiation of visually undetectable cracks in excess of 8.0. The Reusable Solid Rocket Motor (RSRM) forward segment is cast with PBAN propellant (TP-H1148) to form an 11-point star configuration that transitions to a tapered center-perforated bore (see Figure 1). The geometry of the transition region between the fin valleys and the bore causes a localized area of high strain during horizontal storage. Updated analyses using worst-case mechanical properties at 40°F and improved modeling techniques indicated a slight reduction in safety margins over previous predictions. Although there is no history of strain-induced cracks or flaws in the transition region propellant, a proactive test effort was initiated to better understand the implications of the new analysis, primarily the resistance of TP-H1148 propellant to crack initiation during RSRM ignition.
The year-long unprecedented European heat and drought of 1540 - a worst case
NASA Astrophysics Data System (ADS)
Wetter, Oliver
2015-04-01
The heat waves of 2003 in Western Europe and 2010 in Russia, commonly labelled as rare climatic anomalies outside of previous experience, are often taken as harbingers of more frequent extremes in a global-warming-influenced future. However, a recent reconstruction of spring-summer temperatures for Western Europe indicated that temperatures in 1540 were likely significantly higher. In order to check the plausibility of this result, we investigated the severity of the 1540 drought, invoking the known soil desiccation-temperature feedback. Based on more than 300 first-hand documentary weather report sources originating from an area of 2 to 3 million km2, we show that Europe was affected by an unprecedented 11-month-long megadrought. The estimated number of precipitation days and the precipitation amount for Central and Western Europe in 1540 are significantly lower than the 100-year minima of the instrumental measurement period for spring, summer and autumn. This result is supported by independent documentary evidence about extremely low river flows and Europe-wide wild-, forest- and settlement fires. We found that an event of this severity cannot be simulated by state-of-the-art climate models.
Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety
NASA Astrophysics Data System (ADS)
Mikula, J. F. Kip
2005-12-01
This paper explores and defines the currently accepted concept and philosophy of safety improvement based on reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory, a reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model or a Fault Tree Analysis (FTA) of the system, or on an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in this calculation are hardware failure rates gleaned from past similar programs. As part of this paper, a fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. Then a safety determination and enhancement approach based on hazard, worst-case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is included. This approach is defined and detailed using the same example case study as in the REBST case study. In the end, it is concluded that an approach combining the two theories works best to reduce safety risk.
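The REBST-style reliability calculation that the paper critiques can be illustrated with a toy fault tree: the top-event probability is assembled from independent basic events through AND/OR gates. The failure probabilities below are made-up placeholders, not program data.

```python
def p_or(*ps):
    """Probability that at least one independent input event occurs."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """Probability that all independent input events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

valve_fails, sensor_fails, operator_misses = 1e-3, 5e-4, 1e-2
# top event: valve fails AND (sensor fails OR operator misses the alarm)
print(p_and(valve_fails, p_or(sensor_fails, operator_misses)))
```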
Estimated cost of universal public coverage of prescription drugs in Canada
Morgan, Steven G.; Law, Michael; Daw, Jamie R.; Abraham, Liza; Martin, Danielle
2015-01-01
Background: With the exception of Canada, all countries with universal health insurance systems provide universal coverage of prescription drugs. Progress toward universal public drug coverage in Canada has been slow, in part because of concerns about the potential costs. We sought to estimate the cost of implementing universal public coverage of prescription drugs in Canada. Methods: We used published data on prescribing patterns and costs by drug type, as well as source of funding (i.e., private drug plans, public drug plans and out-of-pocket expenses), in each province to estimate the cost of universal public coverage of prescription drugs from the perspectives of government, private payers and society as a whole. We estimated the cost of universal public drug coverage based on its anticipated effects on the volume of prescriptions filled, products selected and prices paid. We selected these parameters based on current policies and practices seen either in a Canadian province or in an international comparator. Results: Universal public drug coverage would reduce total spending on prescription drugs in Canada by $7.3 billion (worst-case scenario $4.2 billion, best-case scenario $9.4 billion). The private sector would save $8.2 billion (worst-case scenario $6.6 billion, best-case scenario $9.6 billion), whereas costs to government would increase by about $1.0 billion (worst-case scenario $5.4 billion net increase, best-case scenario $2.9 billion net savings). Most of the projected increase in government costs would arise from a small number of drug classes. Interpretation: The long-term barrier to the implementation of universal pharmacare owing to its perceived costs appears to be unjustified. Universal public drug coverage would likely yield substantial savings to the private sector with comparatively little increase in costs to government. PMID:25780047
The Effect of Prescribed Burns and Wildfire on Vegetation in Bastrop State Park, TX
NASA Astrophysics Data System (ADS)
Justice, C. J.
2014-12-01
In 2011, central Texas had its worst drought since the 1950s. This, in conjunction with the strong winds produced by Tropical Storm Lee, created conditions that made possible the Bastrop County Complex Fire in September 2011. These record-breaking wildfires burned over 95% of the 6,565-acre Bastrop State Park (BSP). Since 2003, BSP had been using prescribed burns as a management practice to reduce fuel load and prevent high-severity wildfires. Although these prescribed fires did not prevent the 2011 wildfires, they may have mitigated their effects. This study considered the effect of prescribed burn history and wildfire burn severity on vegetation recovery in BSP since the 2011 wildfire. The hypotheses of this study are that prescribed burn history and wildfire burn severity, separately and jointly, have affected post-wildfire vegetation. To test these hypotheses, data were collected in 2013 from 46 plots across BSP using the Fire Effects Monitoring and Inventory (FIREMON) protocol to determine herbaceous plant density, shrub density, overstory density, and midstory tree density. Data were analyzed using analyses of variance (ANOVA) to determine the effects of prescribed fire and wildfire severity on these vegetation measurements. It was found that more severely burned plots had more herbaceous plants, fewer midstory trees, and lower shrub densities than less severely burned plots. Contrary to an initial hypothesis, there were few relationships between prescribed burn history and wildfire effects. The only significant effect detected for prescribed burning was the positive effect of prescribed fire on midstory tree density, but only for plots that were not severely burned in the wildfire. In this system, burn severity had a greater effect on post-wildfire vegetation than prescribed burns.
Matthew P. Thompson; Patrick Freeborn; Jon D. Rieck; Dave Calkin; Julie W. Gilbertson-Day; Mark A. Cochrane; Michael S. Hand
2016-01-01
We present a case study of the Las Conchas Fire (2011) to explore the role of previously burned areas (wildfires and prescribed fires) on suppression effectiveness and avoided exposure. Methodological innovations include characterisation of the joint dynamics of fire growth and suppression activities, development of a fire line effectiveness framework, and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strauss, Henry
This research was mostly concerned with asymmetric vertical displacement event (AVDE) disruptions, which are the worst case scenario for producing a large asymmetric wall force. This is potentially a serious problem in ITER.
Multiple usage of the CD PLUS/UNIX system: performance in practice.
Volkers, A C; Tjiam, I A; van Laar, A; Bleeker, A
1995-01-01
In August 1994, the CD PLUS/Ovid literature retrieval system based on UNIX was activated for the Faculty of Medicine and Health Sciences of Erasmus University in Rotterdam, the Netherlands. There were up to 1,200 potential users. Tests were carried out to determine the extent to which searching for literature was affected by other end users of the system. In the tests, search times and download times were measured in relation to a varying number of continuously active workstations. Results indicated a linear relationship between search times and the number of active workstations. In the "worst case" situation with sixteen active workstations, the time required for record retrieval increased by a factor of sixteen and downloading time by a factor of sixteen over the "best case" of no other active stations. However, because the worst case seldom, if ever, happens in real life, these results are considered acceptable. PMID:8547902
Carter, D A; Hirst, I L
2000-01-07
This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology, described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis that will be necessary to enable HSE to give appropriate advice.
Simulation of wind-driven dispersion of fire pollutants in a street canyon using FDS.
Pesic, Dusica J; Blagojevic, Milan Dj; Zivkovic, Nenad V
2014-01-01
Air quality in urban areas attracts great attention due to increasing pollutant emissions and their negative effects on human health and the environment. Numerous studies, such as those by Mouilleau and Champassith (J Loss Prevent Proc 22(3): 316-323, 2009), Xie et al. (J Hydrodyn 21(1): 108-117, 2009), and Yassin (Environ Sci Pollut Res 20(6): 3975-3988, 2013), focus on air pollutant dispersion with no or weak buoyancy effects. A few studies, such as those by Hu et al. (J Hazard Mater 166(1): 394-406, 2009; J Hazard Mater 192(3): 940-948, 2011; J Civ Eng Manag (2013)), focus on the fire-induced dispersion of pollutants with buoyant heat release rates in the range of 0.5 to 20 MW. However, the air pollution source can often be concentrated and intense, as a consequence of a hazardous-materials fire. Transportation of fuel through urban areas occurs regularly, because it is often impossible to find alternative supply routes, and it is accompanied by the risk of fire accidents. Accident prevention strategies require analysis of the worst scenarios, in which fire products endanger the exposed population and environment. The aim of this article is to analyze the impact of wind flow on air pollution and human vulnerability to fire products in a street canyon. For simulation of a gasoline tanker truck fire resulting from a multivehicle accident, the computational fluid dynamics large eddy simulation method has been used. Numerical results show that in the absence of wind the fire products flow vertically upward, without touching the walls of the buildings. However, when the wind velocity reaches a critical value, the products touch the walls of the buildings on both sides of the street canyon. The concentrations of carbon monoxide and soot decrease, whereas the carbon dioxide concentration increases, with height above the street canyon ground level. The longitudinal concentration of the pollutants inside the street increases with the wind velocity at the roof level of the street canyon.
40 CFR 90.119 - Certification procedure-testing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... must select the duty cycle that will result in worst-case emission results for certification. For any... facility, in which case instrumentation and equipment specified by the Administrator must be made available... manufacturers may not use any equipment, instruments, or tools to identify malfunctioning, maladjusted, or...
Ivanoff, Michael A.
1997-01-01
Contraction scour for all modelled flows ranged from 2.1 to 4.2 ft. The worst-case contraction scour occurred at the 500-year discharge. Left abutment scour ranged from 14.3 to 14.4 ft. The worst-case left abutment scour occurred at the incipient roadway-overtopping and 500-year discharges. Right abutment scour ranged from 15.3 to 18.5 ft. The worst-case right abutment scour occurred at the 100-year and the incipient roadway-overtopping discharges. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
In situ LTE exposure of the general public: Characterization and extrapolation.
Joseph, Wout; Verloock, Leen; Goeminne, Francis; Vermeeren, Günter; Martens, Luc
2012-09-01
In situ radiofrequency (RF) exposure of the different RF sources is characterized in Reading, United Kingdom, and an extrapolation method to estimate worst-case long-term evolution (LTE) exposure is proposed. All electric field levels satisfy the International Commission on Non-Ionizing Radiation Protection (ICNIRP) reference levels with a maximal total electric field value of 4.5 V/m. The total values are dominated by frequency modulation (FM). Exposure levels for LTE of 0.2 V/m on average and 0.5 V/m maximally are obtained. Contributions of LTE to the total exposure are limited to 0.4% on average. Exposure ratios from 0.8% (LTE) to 12.5% (FM) are obtained. An extrapolation method is proposed and validated to assess the worst-case LTE exposure. For this method, the reference signal (RS) and secondary synchronization signal (S-SYNC) are measured and extrapolated to the worst-case value using an extrapolation factor. The influence of the traffic load and output power of the base station on the in situ RS and S-SYNC signals is lower than 1 dB for all power and traffic load settings, showing that these signals can be used for the extrapolation method. The maximal extrapolated field value for LTE exposure equals 1.9 V/m, which is 32 times below the ICNIRP reference level for electric fields. Copyright © 2012 Wiley Periodicals, Inc.
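As an illustration of the extrapolation idea, the sketch below scales a measured reference-signal field up to a hypothetical worst-case field and compares it with an ICNIRP reference level. The specific extrapolation factor (square root of an assumed subcarrier count) and the example field value are assumptions for illustration, not the paper's calibrated procedure.

```python
import math

# Illustrative sketch of RS-based worst-case extrapolation. The true
# extrapolation factor depends on the LTE configuration; here we assume
# it is sqrt(number of subcarriers) for a hypothetical 10 MHz carrier.
def extrapolate_worst_case(e_rs_v_per_m, n_subcarriers=600):
    """Scale a measured reference-signal field to a worst-case field."""
    return e_rs_v_per_m * math.sqrt(n_subcarriers)

e_max = extrapolate_worst_case(0.078)   # measured RS field (assumed value)
icnirp_ref = 61.0                       # V/m, ICNIRP general-public level above 2 GHz
print(f"worst-case E = {e_max:.2f} V/m, "
      f"{icnirp_ref / e_max:.0f}x below the ICNIRP reference level")
```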
Worst-Case Energy Efficiency Maximization in a 5G Massive MIMO-NOMA System.
Chinnadurai, Sunil; Selvaprabhu, Poongundran; Jeong, Yongchae; Jiang, Xueqin; Lee, Moon Ho
2017-09-18
In this paper, we examine the robust beamforming design to tackle the energy efficiency (EE) maximization problem in a 5G massive multiple-input multiple-output (MIMO)-non-orthogonal multiple access (NOMA) downlink system with imperfect channel state information (CSI) at the base station. A novel joint user pairing and dynamic power allocation (JUPDPA) algorithm is proposed to minimize the inter-user interference and also to enhance the fairness between the users. This work assumes imperfect CSI by adding uncertainties to the channel matrices with a worst-case model, i.e., the ellipsoidal uncertainty model (EUM). A fractional non-convex optimization problem is formulated to maximize the EE subject to the transmit power constraints and the minimum rate requirement for the cell edge user. The designed problem is difficult to solve due to its nonlinear fractional objective function. We first employ the properties of fractional programming to transform the non-convex problem into its equivalent parametric form. Then, an efficient iterative algorithm based on the constrained concave-convex procedure (CCCP) is proposed that solves and achieves convergence to a stationary point of the above problem. Finally, Dinkelbach's algorithm is employed to determine the maximum energy efficiency. Comprehensive numerical results illustrate that the proposed scheme attains higher worst-case energy efficiency as compared with the existing NOMA schemes and the conventional orthogonal multiple access (OMA) scheme.
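Since the abstract leans on Dinkelbach's algorithm, a minimal sketch of that general fractional-programming routine may help. The subproblem solver below is a toy grid search over a scalar ratio, standing in for (not reproducing) the paper's CCCP-based beamforming step.

```python
# Minimal sketch of Dinkelbach's algorithm for maximizing f(x)/g(x):
# repeatedly solve the parametric subproblem max_x f(x) - lam*g(x) and
# update lam = f(x)/g(x) until F(lam) = max_x [f(x) - lam*g(x)] ~ 0.
def dinkelbach(solve_subproblem, lam0=0.0, tol=1e-6, max_iter=100):
    lam = lam0
    for _ in range(max_iter):
        x, f_val, g_val = solve_subproblem(lam)   # argmax of f(x) - lam*g(x)
        if abs(f_val - lam * g_val) < tol:        # optimality condition
            return x, lam                         # lam* equals max f/g
        lam = f_val / g_val                       # Newton-like update
    return x, lam

# Toy usage: maximize (4x - x^2) / (1 + x) over a grid of x values.
def toy_subproblem(lam):
    xs = [i / 100 for i in range(0, 401)]
    best = max(xs, key=lambda x: (4 * x - x * x) - lam * (1 + x))
    return best, 4 * best - best * best, 1 + best

print(dinkelbach(toy_subproblem))   # converges to x ~ 1.24, ratio ~ 1.53
```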
MP3 player listening sound pressure levels among 10 to 17 year old students.
Keith, Stephen E; Michaud, David S; Feder, Katya; Haider, Ifaz; Marro, Leonora; Thompson, Emma; Marcoux, Andre M
2011-11-01
Using a manikin, equivalent free-field sound pressure level measurements were made from the portable digital audio players of 219 subjects, aged 10 to 17 years (93 males) at their typical and "worst-case" volume levels. Measurements were made in different classrooms with background sound pressure levels between 40 and 52 dBA. After correction for the transfer function of the ear, the median equivalent free field sound pressure levels and interquartile ranges (IQR) at typical and worst-case volume settings were 68 dBA (IQR = 15) and 76 dBA (IQR = 19), respectively. Self-reported mean daily use ranged from 0.014 to 12 h. When typical sound pressure levels were considered in combination with the average daily duration of use, the median noise exposure level, Lex, was 56 dBA (IQR = 18) and 3.2% of subjects were estimated to exceed the most protective occupational noise exposure level limit in Canada, i.e., 85 dBA Lex. Under worst-case listening conditions, 77.6% of the sample was estimated to listen to their device at combinations of sound pressure levels and average daily durations for which there is no known risk of permanent noise-induced hearing loss, i.e., ≤ 75 dBA Lex. Sources and magnitudes of measurement uncertainties are also discussed.
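The Lex figures above follow from the standard normalization of an equivalent continuous level to a nominal 8 h day. A minimal sketch, assuming the usual 3 dB exchange-rate formula:

```python
import math

# Daily noise-exposure normalization: Lex references the listening level
# to an 8 h working day (3 dB exchange rate).
def lex_8h(leq_dba, daily_hours):
    return leq_dba + 10 * math.log10(daily_hours / 8.0)

# e.g. the median typical level of 68 dBA listened to for 2 h/day
print(f"Lex = {lex_8h(68, 2):.1f} dBA")   # ~62 dBA, below the 85 dBA limit
```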
Boehmler, Erick M.; Weber, Matthew A.
1997-01-01
Contraction scour for all modelled flows ranged from 0.0 to 0.3 ft. The worst-case contraction scour occurred at the incipient overtopping discharge, which was less than the 100-year discharge. Abutment scour ranged from 6.2 to 9.4 ft. The worst-case abutment scour for the right abutment was 9.4 ft at the 100-year discharge. The worst-case abutment scour for the left abutment was 8.6 ft at the incipient overtopping discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Burns, Ronda L.; Degnan, James R.
1997-01-01
Contraction scour for all modelled flows ranged from 2.6 to 4.6 ft. The worst-case contraction scour occurred at the incipient roadway-overtopping discharge. The left abutment scour ranged from 11.6 to 12.1 ft. The worst-case left abutment scour occurred at the incipient roadway-overtopping discharge. The right abutment scour ranged from 13.6 to 17.9 ft. The worst-case right abutment scour occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in Tables 1 and 2. A cross-section of the scour computed at the bridge is presented in Figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 46). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Bergschmidt, Philipp; Dammer, Rebecca; Zietz, Carmen; Finze, Susanne; Mittelmeier, Wolfram; Bader, Rainer
2016-06-01
Evaluation of the adhesive strength of femoral components to the bone cement is a relevant parameter for predicting implant safety. In the present experimental study, three types of cemented femoral components (metallic, ceramic, and silica/silane-layered ceramic) of the bicondylar Multigen Plus knee system, implanted on composite femora, were analysed. A pull-off test of the femoral components was performed after different loading and cementing conditions (four groups, each with n=3 metallic, ceramic, and silica/silane-layered ceramic components). Pull-off forces were comparable for the metallic and the silica/silane-layered ceramic femoral components (mean 4769 N and 4298 N) under the standard test condition, whereas uncoated ceramic femoral components showed reduced pull-off forces (mean 2322 N). Loading under worst-case conditions decreased adhesive strength through loosening of the implant-bone cement interface for the uncoated metallic and ceramic femoral components, whereas silica/silane-coated ceramic components remained stably fixed even under worst-case conditions. Loading under high flexion angles can induce interfacial tensile stress, which could promote early implant loosening. In conclusion, a silica/silane coating layer on the femoral component increased its adhesive strength to bone cement. Thicker cement mantles (>2 mm) reduce the adhesive strength of the femoral component and can increase the risk of cement break-off.
Validation of a contemporary prostate cancer grading system using prostate cancer death as outcome.
Berney, Daniel M; Beltran, Luis; Fisher, Gabrielle; North, Bernard V; Greenberg, David; Møller, Henrik; Soosay, Geraldine; Scardino, Peter; Cuzick, Jack
2016-05-10
Gleason scoring (GS) has major deficiencies, and a novel system of five grade groups (GS⩽6; 3+4; 4+3; 8; ⩾9) has recently been agreed and included in the WHO 2016 classification. Although verified in radical prostatectomies using PSA relapse for outcome, it has not been validated using prostate cancer death as an outcome in biopsy series. There is debate whether an 'overall' or 'worst' GS in biopsy series should be used. Nine hundred and eighty-eight prostate cancer biopsy cases were identified between 1990 and 2003, and treated conservatively. Diagnosis and grade were assigned to each core as well as an overall grade. Follow-up for prostate cancer death was until 31 December 2012. A log-rank test assessed univariable differences between the five grade groups based on overall and worst grade seen, and univariable and multivariable Cox proportional hazards regression was used to quantify differences in outcome. Using both 'worst' and 'overall' GS yielded highly significant results on univariate and multivariate analysis, with overall GS slightly but insignificantly outperforming worst GS. There was a strong correlation between the five grade groups and prostate cancer death. This is the largest conservatively treated prostate cancer cohort with long-term follow-up and contemporary assessment of grade. It validates the formation of five grade groups and suggests that the 'worst' grade is a valid prognostic measure.
Stochastic Robust Mathematical Programming Model for Power System Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Cong; Changhyeok, Lee; Haoyong, Chen
2016-01-01
This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.
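A toy numerical sketch of the objective described above (the expectation across scenarios of each scenario's worst case) may make the structure concrete; all scenario data here are invented for illustration, not from the paper.

```python
# Stochastic robust objective: expectation over scenarios of the
# worst-case cost within each scenario's uncertainty set.
scenarios = [
    # (probability, costs over that scenario's uncertainty realizations)
    (0.5, [100.0, 120.0, 110.0]),
    (0.3, [ 90.0, 140.0, 100.0]),
    (0.2, [130.0, 125.0, 150.0]),
]

expected_worst_case = sum(p * max(costs) for p, costs in scenarios)
print(expected_worst_case)   # 0.5*120 + 0.3*140 + 0.2*150 = 132.0
```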
NASA Astrophysics Data System (ADS)
Uriarte, M.
2008-12-01
Recent increases in the price of oil have generated much interest in biofuel development. Despite the increasing demand, the social and environmental impacts of large-scale adoption of biofuels at both regional and national scales remain understudied, especially in developing economies. Here we use municipality-level data for the state of São Paulo in Brazil to explore the effects of fires associated with sugarcane cultivation on the respiratory health of the elderly and children. We examined the effects of fires occurring in the same year in which respiratory cases were reported as well as chronic effects associated with long-term cultivation of sugarcane. Across the state, respiratory morbidity attributable to fires accounted for 113 elderly and 317 child cases, approximately 1.8% of total cases in each group. Although no chronic effects of fire were detected for the elderly group, an additional 650 child cases can be attributed to the long-term cultivation of sugarcane, increasing to 5.4% the share of child cases that can be attributed to fire. For municipalities with more than 50% of the land in sugarcane, the percentage increased to 15% and 12%, respectively, for the elderly and children. An additional 209 child cases could also be attributed to past exposure to fires associated with sugarcane, suggesting that in total 38% of child respiratory cases in these municipalities could be attributed to current or chronic exposure to fires. The harmful effects of cane-associated fires on health are a burden not only for the public health system but also for household economies. This type of information should be incorporated into land-use decisions and discussions of biofuel sustainability.
Minimax Quantum Tomography: Estimators and Relative Entropy Bounds
Ferrie, Christopher; Blume-Kohout, Robin
2016-03-04
A minimax estimator has the minimum possible error ("risk") in the worst case. Here we construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used. We trace this deficiency to sampling mismatch: future observations that determine risk may come from a different sample space than the past data that determine the estimate. This mismatch makes minimax estimators very biased, and we propose a computationally tractable alternative with similar behavior in the worst case, but superior accuracy on most states.
NASA Technical Reports Server (NTRS)
Coakley, P.; Kitterer, B.; Treadaway, M.
1982-01-01
Charging and discharging characteristics of dielectric samples exposed to 1-25 keV and 25-100 keV electrons in a laboratory environment are reported. The materials examined comprised OSR, Mylar, Kapton, perforated Kapton, and Alphaquartz, serving as models for materials employed on spacecraft in geosynchronous orbit. The tests were performed in a vacuum chamber with electron guns whose beams were rastered over the entire surface of the planar samples. The specimens were examined in low-impedance-grounded, high-impedance-grounded, and isolated configurations. The worst-case and average peak discharge currents were observed to be independent of the incident electron energy, the time-dependent changes in the worst-case discharge peak current were independent of the energy, and the predischarge surface potentials were negligibly dependent on the energy of the incident monoenergetic electrons.
Worst-case space radiation environments for geocentric missions
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.; Seltzer, S. M.
1976-01-01
Worst-case possible annual radiation fluences of energetic charged particles in the terrestrial space environment, and the resultant depth-dose distributions in aluminum, were calculated in order to establish absolute upper limits to the radiation exposure of spacecraft in geocentric orbits. The results are a concise set of data intended to aid in the determination of the feasibility of a particular mission. The data may further serve as guidelines in the evaluation of standard spacecraft components. Calculations were performed for each significant particle species populating or visiting the magnetosphere, on the basis of volume occupied by or accessible to the respective species. Thus, magnetospheric space was divided into five distinct regions using the magnetic shell parameter L, which gives the approximate geocentric distance (in earth radii) of a field line's equatorial intersect.
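The closing sentence refers to the dipole relation behind the magnetic shell parameter; a minimal sketch of that mapping, assuming a pure dipole field:

```python
import math

# For a dipole field, a field line crossing geocentric distance r
# (in Earth radii) at magnetic latitude lam satisfies r = L * cos(lam)**2,
# so L is the equatorial crossing distance of that field line.
def l_shell(r_earth_radii, mag_lat_deg):
    return r_earth_radii / math.cos(math.radians(mag_lat_deg)) ** 2

print(l_shell(1.1, 45.0))   # a low-altitude point at 45 deg maps to L = 2.2
```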
``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis
NASA Astrophysics Data System (ADS)
Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin
Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.
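The paper's type-based inference is not reproduced here, but the underlying amortised ("credit") accounting can be illustrated with the classic two-stack queue, where each enqueue banks a credit that later pays for moving its element:

```python
# Classic amortised-analysis illustration (not the paper's type system):
# a queue built from two stacks. A single dequeue may cost O(n), but each
# enqueue deposits one credit that pays for its element's later move, so
# the amortised cost per operation is O(1).
class TwoStackQueue:
    def __init__(self):
        self.front, self.back = [], []

    def enqueue(self, x):
        self.back.append(x)          # cost 1, plus 1 credit banked on x

    def dequeue(self):
        if not self.front:
            while self.back:         # moves are paid by the banked credits
                self.front.append(self.back.pop())
        return self.front.pop()

q = TwoStackQueue()
for i in range(3):
    q.enqueue(i)
print(q.dequeue(), q.dequeue())      # 0 1
```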
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyette, J.A.; Breck, J.E.; Coleman, P.R.
1986-03-01
The purpose is to provide an assessment of the potential health and environmental impacts of continuing to store M55 rockets filled with nerve agent GB or VX at their current storage locations at Anniston Army Depot in Alabama, Lexington-Blue Grass Depot Activity in Kentucky, Pine Bluff Arsenal in Arkansas, Tooele Army Depot in Utah, and Umatilla Depot Activity in Oregon. The assessment considers the possible impacts of (1) normal storage (with no release to the environment) and (2) two postulated accidents on the air quality, ground and surface water, aquatic ecology, terrestrial ecology, human health, and cultural and socioeconomic resources in and around the various storage depots. The analysis considers three basic scenarios during storage: (1) normal operations; (2) a minor spill of agent (the contents of one rocket released to the biosphere); and (3) a maximum credible event or MCE. The MCE is an igloo fire resulting in the aerosolization of a small (in the case of GB) or an extremely small (in the case of VX) percentage of the igloo's nerve agent contents to the biosphere. The extremely low probabilities of such accidents, which are reported elsewhere, are noted. Our assessments of the impacts of a minor spill and of an MCE consider two sets of meteorological conditions: conservative most likely and worst-case. In addition, we assume that an agent plume would travel toward the area of highest population density. 21 figs., 47 tabs.
Hepatitis A and E Co-Infection with Worst Outcome.
Saeed, Anjum; Cheema, Huma Arshad; Assiri, Asaad
2016-06-01
Infections are still a major problem in developing countries like Pakistan because of poor sewage disposal and economic restraints. Acute viral hepatitis A and E are not uncommon in the pediatric age group because of unhygienic food handling and poor sewage disposal, but the majority recover well without any complications. Co-infections are rare occurrences, and physicians need to be well aware while managing such conditions to avoid the worst outcomes. Co-infection with hepatitis A and E is reported occasionally in the literature; other concurrent infections, such as hepatitis A with Salmonella and hepatotropic viruses like viral hepatitis B and C, are also present in the literature. Co-infections should be kept in consideration when someone presents with atypical symptoms or an unusual disease course, as in the present case. We report here a girl who had acute hepatitis A and E concurrent infection and presented with hepatic encephalopathy and had the worst outcome, despite all supportive measures being taken.
Implementation of School Health Promotion: Consequences for Professional Assistance
ERIC Educational Resources Information Center
Boot, N. M. W. M.; de Vries, N. K.
2012-01-01
Purpose: This case study aimed to examine the factors influencing the implementation of health promotion (HP) policies and programs in secondary schools and the consequences for professional assistance. Design/methodology/approach: Group interviews were held in two schools that represented the best and worst case of implementation of a health…
Compression in the Superintendent Ranks
ERIC Educational Resources Information Center
Saron, Bradford G.; Birchbauer, Louis J.
2011-01-01
Sadly, the fiscal condition of school systems is now not only troublesome but in some cases has surpassed all expectations for the worst-case scenario. Among the states, one common response is to drop funding for public education to inadequate levels, leading to permanent program cuts, school closures, staff layoffs, district dissolutions and…
Fire and the Design of Educational Buildings. Building Bulletin 7. Sixth Edition.
ERIC Educational Resources Information Center
Department of Education and Science, London (England).
This bulletin offers guidance on English school premises regulations applying to safety protection against fires in the following general areas: means of escape in case of fire; precautionary measures to prevent fire; fire warning systems and fire fighting; fire spreading speed; structures and materials resistant to fires; and damage control. It…
40 CFR 85.2115 - Notification of intent to certify.
Code of Federal Regulations, 2013 CFR
2013-07-01
... testing and durability demonstration represent worst case with respect to emissions of all those... submitted by the aftermarket manufacturer to: Mod Director, MOD (EN-340F), Attention: Aftermarket Parts, 401...
40 CFR 85.2115 - Notification of intent to certify.
Code of Federal Regulations, 2012 CFR
2012-07-01
... testing and durability demonstration represent worst case with respect to emissions of all those... submitted by the aftermarket manufacturer to: Mod Director, MOD (EN-340F), Attention: Aftermarket Parts, 401...
40 CFR 85.2115 - Notification of intent to certify.
Code of Federal Regulations, 2014 CFR
2014-07-01
... testing and durability demonstration represent worst case with respect to emissions of all those... submitted by the aftermarket manufacturer to: Mod Director, MOD (EN-340F), Attention: Aftermarket Parts, 401...
Code of Federal Regulations, 2011 CFR
2011-01-01
... size and type will vary only with climate, the number of stories, and the choice of simulation tool... practice for some climates or buildings, but represent a reasonable worst case of energy cost resulting...
Code of Federal Regulations, 2012 CFR
2012-01-01
... size and type will vary only with climate, the number of stories, and the choice of simulation tool... practice for some climates or buildings, but represent a reasonable worst case of energy cost resulting...
Code of Federal Regulations, 2013 CFR
2013-01-01
... size and type will vary only with climate, the number of stories, and the choice of simulation tool... practice for some climates or buildings, but represent a reasonable worst case of energy cost resulting...
Code of Federal Regulations, 2014 CFR
2014-01-01
... size and type will vary only with climate, the number of stories, and the choice of simulation tool... practice for some climates or buildings, but represent a reasonable worst case of energy cost resulting...
Code of Federal Regulations, 2010 CFR
2010-01-01
... size and type will vary only with climate, the number of stories, and the choice of simulation tool... practice for some climates or buildings, but represent a reasonable worst case of energy cost resulting...
Adaptive Attitude Control of the Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Muse, Jonathan
2010-01-01
An H(sub infinity)-NMA architecture for the Crew Launch Vehicle was developed in a state feedback setting. The minimal-complexity adaptive law was shown to improve baseline performance relative to a performance metric based on Crew Launch Vehicle design requirements for almost all of the worst-on-worst dispersion cases. The adaptive law was able to maintain stability for some dispersions that are unstable with the nominal control law. Due to the nature of the H(sub infinity)-NMA architecture, the augmented adaptive control signal has low bandwidth, which is a great benefit for a manned launch vehicle.
Manzello, Samuel L.; Suzuki, Sayaka; Nii, Daisaku
2015-01-01
Structure ignition by wind-driven firebrand showers is an important fire spread mechanism in large outdoor fires. Experiments were conducted with three common mulch types (shredded hardwood mulch, Japanese Cypress wood chips, and pine bark nuggets) placed adjacent to realistic-scale re-entrant corners. In the first series of experiments, mulch beds were placed adjacent to a re-entrant corner constructed with wood studs and lined with oriented strand board (OSB) as the sheathing. The premise behind conducting experiments with no siding treatments applied was that bare OSB in contact with mulch would be a worst-case scenario, and therefore a wall assembly in the state most vulnerable to mulch ignition. In the second series of experiments, vinyl siding was applied to the re-entrant corner assemblies (wood studs/OSB/moisture barrier/vinyl siding), and the influence of vertical separation distance (102 mm or 203 mm) on wall ignition from adjacent mulch beds was determined. The vertical separation distance was maintained by applying gypsum board to the base of the re-entrant corner. The siding itself did not influence the ignition process for the mulch beds, as the mulch beds were the first to ignite from the firebrand showers. In all experiments, firebrands produced smoldering ignition in the mulch beds, which transitioned to flaming ignition, and the re-entrant corner assembly was exposed to the flaming mulch beds. With no siding treatments applied, the flaming mulch beds ignited the re-entrant corner, and ignition was observed to propagate to the back side of the re-entrant corner assembly under all wind speeds (6 m/s to 8 m/s). With respect to the re-entrant corners fitted with vinyl siding, the mulch type, vertical separation distance, and wind speed were important parameters as to whether flaming ignition was observed to propagate to the back side of a re-entrant corner assembly. Mulches clearly pose an ignition hazard to structures in large outdoor fires. PMID:28184098
2017-03-01
GENDER INTEGRATION IN THE CAREER FIRE SERVICES: A COMPARATIVE CASE STUDY OF MEN IN NURSING, by Anna L. Schermerhorn-Collins, March 2017. Gender integration in the career fire services is examined through a comparative case study of men in nursing. Research is based in academic and historical accounts, in addition to the use of participant-observation.
You can use this free software program to complete the Off-site Consequence Analyses (both worst case scenarios and alternative scenarios) required under the Risk Management Program rule, so that you don't have to do calculations by hand.
49 CFR 194.105 - Worst case discharge.
Code of Federal Regulations, 2013 CFR
2013-10-01
...:
Prevention measure | Standard | Credit (percent)
Secondary containment >100% | NFPA 30 | 50
Built/repaired to API standards | API STD 620/650/653 | 10
Overfill protection standards | API RP 2350 | 5
Testing/cathodic protection | API...
49 CFR 194.105 - Worst case discharge.
Code of Federal Regulations, 2010 CFR
2010-10-01
...:
Prevention measure | Standard | Credit (percent)
Secondary containment >100% | NFPA 30 | 50
Built/repaired to API standards | API STD 620/650/653 | 10
Overfill protection standards | API RP 2350 | 5
Testing/cathodic protection | API...
49 CFR 194.105 - Worst case discharge.
Code of Federal Regulations, 2014 CFR
2014-10-01
...:
Prevention measure | Standard | Credit (percent)
Secondary containment >100% | NFPA 30 | 50
Built/repaired to API standards | API STD 620/650/653 | 10
Overfill protection standards | API RP 2350 | 5
Testing/cathodic protection | API...
49 CFR 194.105 - Worst case discharge.
Code of Federal Regulations, 2012 CFR
2012-10-01
...:
Prevention measure | Standard | Credit (percent)
Secondary containment >100% | NFPA 30 | 50
Built/repaired to API standards | API STD 620/650/653 | 10
Overfill protection standards | API RP 2350 | 5
Testing/cathodic protection | API...
49 CFR 194.105 - Worst case discharge.
Code of Federal Regulations, 2011 CFR
2011-10-01
...:
Prevention measure | Standard | Credit (percent)
Secondary containment >100% | NFPA 30 | 50
Built/repaired to API standards | API STD 620/650/653 | 10
Overfill protection standards | API RP 2350 | 5
Testing/cathodic protection | API...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pace, J.V. III; Cramer, S.N.; Knight, J.R.
1980-09-01
Calculations of the skyshine gamma-ray dose rates from three spent fuel storage pools under worst case accident conditions have been made using the discrete ordinates code DOT-IV and the Monte Carlo code MORSE and have been compared to those of two previous methods. The DNA 37N-21G group cross-section library was utilized in the calculations, together with the Claiborne-Trubey gamma-ray dose factors taken from the same library. Plots of all results are presented. It was found that the dose was a strong function of the iron thickness over the fuel assemblies, the initial angular distribution of the emitted radiation, and the photon source near the top of the assemblies. 16 refs., 11 figs., 7 tabs.
LANDSAT-D MSS/TM tuned orbital jitter analysis model LDS900
NASA Technical Reports Server (NTRS)
Pollak, T. E.
1981-01-01
The final LANDSAT-D orbital dynamic math model (LSD900), comprised of all test-validated substructures, was used to evaluate the jitter response of the MSS/TM experiments. A dynamic forced response analysis was performed at both the MSS and TM locations on all structural modes considered (through 200 Hz). The analysis determined the roll angular response of the MSS/TM experiments to excitation generated by component operation. Cross-axis and cross-experiment responses were also calculated. The excitations were analytically represented by seven- and nine-term Fourier series approximations, for the MSS and TM experiments respectively, which enabled linear harmonic solution techniques to be applied to the response calculations. Single worst-case jitter was estimated by variations of the eigenvalue spectrum of model LSD900. The probability of any worst-case mode occurrence was investigated.
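The harmonic solution technique mentioned above admits a compact sketch: evaluate a frequency-response function at each Fourier term of the excitation and superpose the results. The modal constants and Fourier coefficients below are placeholders, not LSD900 data.

```python
import cmath

# Linear harmonic solution: for each Fourier term of the disturbance,
# evaluate a single-DOF receptance at that harmonic and superpose the
# real parts to recover the steady-state time response.
def harmonic_response(t, terms, m=1.0, c=0.02, k=400.0):
    """terms: list of (amplitude, angular frequency rad/s, phase) tuples."""
    x = 0.0
    for F, w, ph in terms:
        H = 1.0 / complex(k - m * w * w, c * w)      # receptance at w
        x += (F * H * cmath.exp(1j * (w * t + ph))).real
    return x

fourier_terms = [(1.0, 10.0, 0.0), (0.3, 20.0, 0.5)]  # assumed series
print(harmonic_response(0.1, fourier_terms))
```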
PERSPECTIVE: Fire on the fringe
NASA Astrophysics Data System (ADS)
Pyne, Stephen J.
2009-09-01
For the past two decades fire agencies have grappled with a seemingly new and intractable problem. Like the return of smallpox or polio, an issue they thought had vanished reappeared in virulent form. Year by year, the unthinkable became the undeniable: all across many industrial nations settlements began to burn. The earliest formal study followed the 1983 Ash Wednesday fires that swept through southeastern Australia [1]. That report remains definitive: nearly every subsequent inquiry has reaffirmed its conclusions about how houses actually burn and what remedial measures could counter the destruction [2, 3]. In many respects these insights simply adapted to nominal `wildlands' the lessons long learned for urban fire protection. Ban combustible roofing. Plug openings where embers might enter buildings. Establish defensible spaces. Provide firefighters. The larger concern was that wild landscapes and cityscapes were being intermixed in dangerous and unprecedented ways, like some kind of environmental matter and anti-matter. That mingling assumed two different forms. One was typical of developed nations with extensive wildlands in which suburban (or exurban) sprawl pushed against reserved landscapes. In 1987 researchers with the US Forest Service coined a name for this variant, the awkwardly labeled `wildland/urban interface' (WUI) or I-zone [4]. The second pattern found its best expression in Mediterranean Europe. Here agricultural lands were being abandoned, and then partially reclaimed by exurbanites [5]. The upshot for both was an explosion of fuels, houses (and communities) not built according to standard fire codes, and the absence of formal fire brigades [6]. The solution seemed obvious: install standard fire protection measures. More broadly, remove the houses or remove the wildlands. The apparitional fires would vanish as had urban conflagrations before them. In effect, define the problem as one that existing engineering, or techniques upgraded by further research, could solve. The drivers behind sprawl were fundamentally irrational: they resided in such inchoate urgings as aesthetics, a desire to `live in nature', a longing for personal privacy and social isolation. Correction required the imposition of science-based reason onto the scene, which argued for research. What you propose as a solution depends on how you define the problem. Houses were burning and residents too often dying; this was clearly a threat to public safety, an incitement for political action, and an incentive for research. But what were the causes? Scholarly disciplines and national traditions defined it differently. Europeans thought the issue fundamentally social. The breakdown in the old landscape created a disorder of which free-burning fire was a manifestation. This was in keeping with a long heritage of European thinking that identified fire with unrest and that argued that fire control was primarily a matter of social control. People needed to reassert their presence on the land. Those countries with large public estates such as Australia and the US conceived the problem in a converse way. At issue was the unwise (and unwarranted) encroachment of people into the bush. An ideal response would be to banish people from the fringe regions. Fire is `natural' and belongs in wildlands: it is people who upset the order of things.
While government has a duty to shield its citizens from harm, it should not allow such measures to destroy nature preserves or the capacity of fire to propagate through them. People have to learn to `live with' fire. In both cases the prevailing assumption is that science will identify solutions, which society will apply. Yet here we have a case of countries implicitly pointing their national sciences in different directions because of their distinctive histories. It would seem that history as a discipline might also have something to contribute to this discourse both in terms of tracking land use and of explicating ideas about how people and land ought to coexist. And along with history one might add those other scholarships that analyze cultural values, beliefs and mores, and the relation of institutions - science among them - to their sustaining societies. They are not there. An intellectual border, a kind of WUI, divides them. Grudgingly, research has accommodated some sociology and economics, partly in the hope that they will help agencies educate the general population about the proper way to cope, that is, they might bring some rationality, as the agencies understand it, to an issue awash with free-floating folly. As for other disciplines, they belong on the other side of the fringe. Yet they too might redefine the topic in ways that add practical heft to our understanding. One might, for example, compare the contemporary wave of migrations to previous ones. For countries like the US and Australia, the anomalous interface becomes a replay of earlier colonization. Then, the dynamic was an agricultural frontier that chewed up landscapes, cast fire about, and saw combustible settlements burn lethally to the ground. Now, the process involves an urban out-migration that stuffs landscapes with exurban enclaves and regrown vegetation, that unwisely tries to ostracize all fire, and that is witnessing a macabre reburning of new communities. Industrial countries, that is, are recolonizing their once-rural landscapes, and as long as that process continues, so will wild fires. The older frontier went aflame where land use, informed by prevailing economies, met a favorable climate. Today's frontier is likewise most active where a global climate meets a globalizing economy. The older frontier was not equally dangerous; some sites suffered repeatedly, and some not at all. So, today, the worst outbreaks are regional: southeastern Australia, especially the Victorian mountains; California, which accounts for nearly 85% of America's losses; northwest Iberia, particularly the mountains of Portugal and the overgrown landscape of Galicia. While international in scope, the real hazards reside in particular places, and while telegenically graphic, the economic losses elsewhere are no worse than those caused by tornadoes. The pressures for the earlier frontier were deep, and often damaged both land and settlers, but until the momentum had exhausted itself, there was little reform possible. So it may take the Great Recession, or worse, to stem the flow of money that has underwritten the colonization of subprime landscapes. Besides, sprawl is interbreeding with whatever hazard it meets. Fire in the I-zone is less damaging than sprawl in floodplains, coastal plains, or earthquake zones. Over the past 20 years, the responsible agencies have largely succeeded in learning how to protect people and houses when fires break out. The tenets of Fire Wise, Fire Safe, and Community Fireguard are widely known.
A new kind of landscape is emerging. The worst hazards reside in the older communities that need retrofitting. A fatal plague is becoming a seasonal nuisance. But an appeal to other scholarships might - still can - illuminate the powers and limits of the proposed remediations, which ultimately rely for their success on cultural acceptance. Fire is about context: it synthesizes its surroundings. Yet the only research context allowed is a universalist science, such that the science of south Australia can join that of Catalonia and of Missoula, Montana. It does not mingle with other scholarship. In this way we have come to understand in marvelous detail how houses burn, but not why houses are there in the first place. We understand how to prevent roofs from igniting during ember attacks, but not how to cope with sprawl's attack on the landscape. So long as we leave fire on the fringe of scholarship, it will roar through the fringes of our new-settled countryside. References [1] Wilson A A G and Ferguson I S 1986 Predicting the probability of house survival during bushfires J. Environ. Management 23 259-70 [2] Gill A M and Stephens S L 2009 Scientific and social challenges for the management of fire-prone wildland-urban interfaces Environ. Res. Lett. 4 034014 [3] Cohen J D 2008 The wildland-urban interface fire problem: a consequence of the fire exclusion paradigm Forest History Today (Fall) 20-6 [4] Sommers W T 2008 The emergence of the wildland-urban interface concept Forest History Today (Fall) 12-9 [5] Pereira J S et al (eds) 2006 Incêndios Florestais em Portugal. Caracterização, Impactes e Prevenção (Lisbon: Instituto Superior de Agronomia) [6] Pyne S 2008 Spark and sprawl: a world tour Forest History Today (Fall) 4-11
An Alaskan Theater Airlift Model.
1982-02-19
overt attack on American soil. In any case, such a reaction represents the worst-case scenario in that theater forces would be denied the advantages of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-15
... Service (NPS) for the Florida leafwing and the pine rockland ecosystem, in general. Sea Level Rise... habitat. In the best case scenario, which assumes low sea level rise, high financial resources, proactive... human population. In the worst case scenario, which assumes high sea level rise, low financial resources...
A Different Call to Arms: Women in the Core of the Communications Revolution.
ERIC Educational Resources Information Center
Rush, Ramona R.
A "best case" model for the role of women in the postindustrial communications era predicts positive leadership roles based on the preindustrial work characteristics of cooperation and consensus. A "worst case" model finds women entrepreneurs succumbing to the competitive male ethos and extracting the maximum amount of work…
Impacts of single and recurrent wildfires on topsoil moisture regime
NASA Astrophysics Data System (ADS)
González-Pelayo, Oscar; Malvar, Maruxa; van den Elsen, Erik; Hosseini, Mohammadreza; Coelho, Celeste; Ritsema, Coen; Bautista, Susana; Keizer, Jacob
2017-04-01
Increasing fire recurrence in Mediterranean-basin forests is well established under future climate scenarios, driven by land-use change and climate predictions. As a result, shifts from mature pine woodlands to shrub rangelands are of major importance for the buffer functions of forest ecosystems, since historical patterns of established vegetation help recovery from fire disturbances. This fact, together with the predicted expansion of drought periods, will affect feedback processes in vegetation patterns, since water availability in these seasons is driven by post-fire local soil properties. Although fire impacts on soil properties and water availability have been widely studied using fire severity as the main factor, little research has addressed post-fire soil moisture patterns with fire recurrence as a key explanatory variable. This research investigated, in pine woodlands of north central Portugal, the short-term consequences (one year after a fire) of wildfire recurrence on surface soil moisture content (SMC) and on effective soil water (SWEFF, a parameter that combines actual daily soil moisture, soil field capacity (FC), and permanent wilting point (PWP)). The study set-up includes analyses of two fire recurrence scenarios (1x- and 4x-burnt since 1975), at the patch level (shrub patch/interpatch), and at two soil depths (2.5 and 7.5 cm) in a nested approach. Understanding how fire recurrence affects soil water over space and time is the main goal of this research. The use of soil moisture sensors in a nested approach, the rainfall features, and analyses of basic soil properties such as soil organic matter, texture, bulk density, pF curves, soil water repellency, and soil surface components establish which factors play the largest role in controlling soil moisture behavior. The main results showed, on a seasonal and yearly basis, no differences in SMC with increasing fire recurrence (1x- vs 4x-burnt) or between patch/interpatch microsites at either soil depth. However, on a yearly basis and during soil drying cycles, less effective water was found in the surface layers of the 4x-burnt soils and in shrub interpatches, reflecting the worse soil hydrological conditions (PWP) and the increasing percentage of abiotic soil surface components with increasing fire recurrence. Our results suggest that the inclusion of soil hydrological properties, such as pF curves, in the calculation of soil water effectiveness is a better indicator of water availability than volumetric SMC per se. Moreover, the nested-approach methodology stresses how fire recurrence, expected increases in summer drought spells, and the increasing dominance of abiotic soil surface components are the factors that most influence soil eco-hydrological functioning in fire-prone ecosystems. Furthermore, this research points out how poor post-fire soil structural quality in plant interpatches could provoke looping feedback processes triggering desertification also in humid Mediterranean forestlands.
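The abstract does not give SWEFF's exact formula; one common normalization combining the same three quantities is the relative extractable water, sketched below under the assumption that SWEFF behaves similarly.

```python
# Hedged sketch: a common way to combine actual moisture with field
# capacity (FC) and permanent wilting point (PWP) is the relative
# extractable water; whether SWEFF uses exactly this normalization
# is an assumption here, not stated in the abstract.
def relative_extractable_water(theta, theta_fc, theta_pwp):
    return max(0.0, (theta - theta_pwp) / (theta_fc - theta_pwp))

print(relative_extractable_water(theta=0.18, theta_fc=0.30, theta_pwp=0.10))
# -> 0.4, i.e. 40% of the plant-available range remains
```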
A critique of the historical-fire-regime concept in conservation.
Freeman, Johanna; Kobziar, Leda; Rose, Elizabeth White; Cropper, Wendell
2017-10-01
Prescribed fire is widely accepted as a conservation tool because fire is essential to the maintenance of native biodiversity in many terrestrial communities. Approaches to this land-management technique vary greatly among continents, and sharing knowledge internationally can inform application of prescribed fire worldwide. In North America, decisions about how and when to apply prescribed fire are typically based on the historical-fire-regime concept (HFRC), which holds that replicating the pattern of fires ignited by lightning or preindustrial humans best promotes native species in fire-prone regions. The HFRC rests on 3 assumptions: it is possible to infer historical fire regimes accurately; fire-suppressed communities are ecologically degraded; and reinstating historical fire regimes is the best course of action despite the global shift toward novel abiotic and biotic conditions. We examined the underpinnings of these assumptions by conducting a literature review on the use of historical fire regimes to inform the application of prescribed fire. We found that the practice of inferring historical fire regimes for entire regions or ecosystems often entails substantial uncertainty and can yield equivocal results; ecological outcomes of fire suppression are complex and may not equate to degradation, depending on the ecosystem and context; and habitat fragmentation, invasive species, and other modern factors can interact with fire to produce novel and in some cases negative ecological outcomes. It is therefore unlikely that all 3 assumptions will be fully upheld for any landscape in which prescribed fire is being applied. Although the HFRC is a valuable starting point, it should not be viewed as the sole basis for developing prescribed fire programs. Rather, fire prescriptions should also account for other specific, measurable ecological parameters on a case-by-case basis. To best achieve conservation goals, researchers should seek to understand contemporary fire-biota interactions across trophic levels, functional groups, spatial and temporal scales, and management contexts. © 2017 Society for Conservation Biology.
Mistry, Jayalaxshmi; Bilbao, Bibiana A; Berardi, Andrea
2016-06-05
Fire plays an increasingly significant role in tropical forest and savanna ecosystems, contributing to greenhouse gas emissions and impacting on biodiversity. Emerging research shows the potential role of Indigenous land-use practices for controlling deforestation and reducing CO2 emissions. Analysis of satellite imagery suggests that Indigenous lands have the lowest incidence of wildfires, significantly contributing to maintaining carbon stocks and enhancing biodiversity. Yet acknowledgement of Indigenous peoples' role in fire management and control is limited, and in many cases dismissed, especially in policy-making circles. In this paper, we review existing data on Indigenous fire management and impact, focusing on examples from tropical forest and savanna ecosystems in Venezuela, Brazil and Guyana. We highlight how the complexities of community owned solutions for fire management are being lost as well as undermined by continued efforts on fire suppression and firefighting, and emerging approaches to incorporate Indigenous fire management into market- and incentive-based mechanisms for climate change mitigation. Our aim is to build a case for supporting Indigenous fire practices within all scales of decision-making by strengthening Indigenous knowledge systems to ensure more effective and sustainable fire management. This article is part of the themed issue 'The interaction of fire and mankind'. © 2016 The Author(s).
Fire deaths in aircraft without the crashworthy fuel system.
Springate, C S; McMeekin, R R; Ruehle, C J
1989-10-01
Cases reported to the Armed Forces Institute of Pathology were examined for occupants of helicopters without the crashworthy fuel system (CWFS) who survived crashes but died as a result of postcrash fires. There were 16 fire deaths in the 9 such accidents that occurred between January 1976 and April 1984. All of these victims would have survived if there had been no postcrash fire. Partial body destruction by fire probably prevented inclusion of many other cases. The dramatic reduction in fire deaths and injuries due to installation of the CWFS in Army helicopters is discussed. The authors conclude that fire deaths and injuries in aircraft accidents could almost be eliminated by fitting current and future aircraft with the CWFS.
RMP Guidance for Offsite Consequence Analysis
Offsite consequence analysis (OCA) consists of a worst-case release scenario and alternative release scenarios. OCA is required from facilities with chemicals above threshold quantities. RMP*Comp software can be used to perform calculations described here.
Near real-time wildfire mapping using spatially-refined satellite data: The rim fire case study
Patricia Oliva; Wilfrid Schroeder
2015-01-01
Fire incident teams depend on accurate fire diagnostics and predictive data to guide daily positioning and tactics of fire crews. Currently, the U.S. Department of Agriculture - Forest Service National Infrared Operations (NIROPs) nighttime airborne data provides daily information about the fire front and total fire affected area of priority fires to the incident teams...
Risk to the public from carbon fibers released in civil aircraft accidents
NASA Technical Reports Server (NTRS)
1980-01-01
Because carbon fibers are strong, stiff, and lightweight, they are attractive for use in composite structures. Because they also have high electrical conductivity, free carbon fibers settling on electrical conductors can cause malfunctions. If released from the composite by burning, the fibers may become a hazard to exposed electrical and electronic equipment. As part of a Federal study of the potential hazard associated with the use of carbon fibers, NASA assessed the public risk associated with crash fire accidents of civil aircraft. The NASA study projected a dramatic increase in the use of carbon composites in civil aircraft and developed technical data to support the risk assessment. Personal injury was found to be extremely unlikely. In 1993, the year chosen as a focus for the study, the expected annual cost of damage caused by released carbon fibers is only $1000. Even the worst-case carbon fiber incident simulated (costing $178,000 once in 34,000 years) was relatively low-cost compared with the usual air transport accident cost. On the basis of these observations, the NASA study concluded that exploitation of composites should continue, that additional protection of avionics is unnecessary, and that development of alternate materials specifically to overcome this problem is not justified.
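The cost figures quoted above can be sanity-checked with a one-line annualized-risk calculation:

```python
# An event costing $178,000 expected once in 34,000 years contributes
# only a few dollars per year to the annualized risk, consistent with
# the study's conclusion that the hazard is small.
cost_per_event = 178_000.0
return_period_years = 34_000.0
print(f"${cost_per_event / return_period_years:.2f} per year")  # ~$5.24
```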
An uncommon case of random fire-setting behavior associated with Todd paralysis: a case report.
Kanehisa, Masayuki; Morinaga, Katsuhiko; Kohno, Hisae; Maruyama, Yoshihiro; Ninomiya, Taiga; Ishitobi, Yoshinobu; Tanaka, Yoshihiro; Tsuru, Jusen; Hanada, Hiroaki; Yoshikawa, Tomoya; Akiyoshi, Jotaro
2012-08-31
The association between fire-setting behavior and psychiatric or medical disorders remains poorly understood. Although a link between fire-setting behavior and various organic brain disorders has been established, associations between fire setting and focal brain lesions have not yet been reported. Here, we describe the case of a 24-year-old first-time arsonist who suffered Todd's paralysis prior to the onset of a bizarre and random fire-setting behavior. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory disturbances with leg weakness. A video-EEG recording demonstrated a close relationship between the focal motor impairment and a clear-cut epileptic ictal discharge involving the bilateral motor cortical areas. The SPECT result was statistically analyzed by comparison with standard SPECT images obtained from our institute (easy Z-score imaging system; eZIS). eZIS revealed hypoperfusion in the cingulate cortex and basal ganglia, and hyperperfusion in the frontal cortex. A neuropsychological test battery revealed lower than normal scores for executive function, attention, and memory, consistent with frontal lobe dysfunction. The fire-setting behavior and Todd's paralysis, together with an unremarkable performance on tests measuring executive function fifteen months prior, suggest a causal relationship between this organic brain lesion and the fire-setting behavior. The case describes a rare and as yet unreported association between random, impulse-driven fire-setting behavior and brain damage, and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism.
Planning Education for Regional Economic Integration: The Case of Paraguay and MERCOSUR.
ERIC Educational Resources Information Center
McGinn, Noel
This paper examines the possible impact of MERCOSUR on Paraguay's economic and educational systems. MERCOSUR is a trade agreement among Argentina, Brazil, Paraguay, and Uruguay, under which terms all import tariffs among the countries will be eliminated by 1994. The countries will enter into a common economic market. The worst-case scenario…
Asteroid Bennu Temperature Maps for OSIRIS-REx Spacecraft and Instrument Thermal Analyses
NASA Technical Reports Server (NTRS)
Choi, Michael K.; Emery, Josh; Delbo, Marco
2014-01-01
A thermophysical model has been developed to generate asteroid Bennu surface temperature maps for OSIRIS-REx spacecraft and instrument thermal design and analyses at the Critical Design Review (CDR). Two-dimensional temperature maps for the worst hot and worst cold cases are used in Thermal Desktop to assure adequate thermal design margins. To minimize the complexity of the Bennu geometry in Thermal Desktop, it is modeled as a sphere instead of the radar-derived shape. The post-CDR updated thermal inertia and a modified approach show that the new surface temperature predictions are more benign. Therefore the CDR Bennu surface temperature predictions are conservative.
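For orientation, the simplest spherical model of this kind is an instantaneous radiative-equilibrium map, T = [(1 - A) S cos(i) / (eps * sigma)]^(1/4) on the dayside. The Python sketch below uses placeholder albedo, emissivity, and heliocentric-distance values, not OSIRIS-REx design numbers, and ignores the thermal inertia that the actual thermophysical model carries:

    import numpy as np

    # Instantaneous radiative equilibrium on a spherical asteroid:
    # (1 - A) * S * cos(i) = eps * sigma * T^4 on the dayside, where i is
    # the solar incidence angle. A, eps, and R_AU are placeholder values.
    SIGMA = 5.670e-8                   # Stefan-Boltzmann constant, W m^-2 K^-4
    S_1AU = 1361.0                     # solar constant at 1 au, W m^-2
    A, EPS, R_AU = 0.045, 0.90, 1.126

    lat = np.radians(np.linspace(-90.0, 90.0, 91))
    lon = np.radians(np.linspace(-180.0, 180.0, 181))
    LON, LAT = np.meshgrid(lon, lat)

    cos_i = np.clip(np.cos(LAT) * np.cos(LON), 0.0, None)   # night side -> 0
    flux = (1.0 - A) * (S_1AU / R_AU**2) * cos_i
    T = (flux / (EPS * SIGMA)) ** 0.25                      # K (0 K at night in this toy model)
    print(f"subsolar temperature: {T.max():.0f} K")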
NASA Astrophysics Data System (ADS)
Mustapa, S. A. S.; Ramli Sulong, N. H.
2017-06-01
This research addresses the contribution of hybrid fillers incorporating palm oil clinker (POC), a novel bio-filler, in solvent-borne intumescent fire-protective coatings for steel. The hybrid fillers with POC were mixed with an appropriate amount of additives and acrylic binder to produce the intumescent coatings. The coatings were characterized using the Bunsen burner test, surface spread of flame, thermogravimetric analysis, field emission scanning electron microscopy, static immersion, and Instron micro-tester equipment. The specimen with POC as a single filler significantly enhanced the fire-protection performance of the intumescent coating owing to the high thermal stability of POC, with less than 10% temperature difference compared with the specimens containing hybrid fillers. In the flame-spread classification, Class 1 is the best classification, while Class 4 is the worst and is considered high risk. All specimens were classified as Class 1, since the final spread of flame was less than 165 mm. Among the hybrid-filler compositions, the specimen consisting of POC/Al(OH)3/TiO2 significantly improved the water resistance of the coating due to the low solubility of Al(OH)3 in water, while the specimen containing Mg(OH)2 had higher mechanical strength due to the strong bonding between the metal surface and the acrylic binder/Mg(OH)2 filler. It was found that coatings incorporating all the hybrid fillers gave excellent fire-protection performance with good thermal stability, water resistance, and mechanical properties. It can be concluded that the selection of an appropriate composition of fillers and binder highly influences intumescent coating performance.
Lessons learnt from a factory fire with asbestos-containing fallout.
Bridgman, S A
1999-06-01
Fallout containing asbestos from a factory fire at Tranmere, Wirral, England, landed on a highly populated urban area, with an estimated 16,000 people living in the worst-affected area, which included a shipbuilding community. There was considerable public concern over the health impact of the acute environmental incident, and great media interest. A descriptive study was carried out of the acute environmental incident, its management, and the difficulties encountered. Practical lessons learnt include the need for: increased fire-fighter awareness of potential adverse health effects from asbestos in the structure of buildings; early involvement of both Local Authority environmental health and National Health Service public health departments; creation of a systematic local database of potential environmental health hazards in the structure of buildings as well as their contents; 24-hour on-call arrangements with laboratories expert in analyses of fire fallout; rapid quantitative analyses of multiple environmental samples; a district written policy on handling asbestos incidents; systematic assessment of fright and media factors in the public impact of an incident; dedicated public help-lines open long hours; consistent evidence-based public messages from all those communicating with the public; measurement of asbestos levels in the street and homes for public reassurance; local and health authorities' subscription to an environmental incident support service; formation of an acute environmental incident team to jointly manage and publicly report on airborne acute environmental incidents; and clear government definition of the responsibilities of different agencies. This paper provides a description of important lessons learnt during an acute environmental incident with asbestos-containing fallout. It will be helpful to those involved in the practical planning for and management of future incidents.
Availability Simulation of AGT Systems
DOT National Transportation Integrated Search
1975-02-01
The report discusses the analytical and simulation procedures that were used to evaluate the effects of failure in a complex dual-mode transportation system based on a worst-case steady-state condition. The computed results are an availability figure ...
Carbon monoxide screen for signalized intersections COSIM, version 3.0 : technical documentation.
DOT National Transportation Integrated Search
2008-07-01
The Illinois Department of Transportation (IDOT) currently uses the computer screening model Illinois : CO Screen for Intersection Modeling (COSIM) to estimate worst-case CO concentrations for proposed roadway : projects affecting signalized intersec...
40 CFR 68.25 - Worst-case release scenario analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...
40 CFR 68.25 - Worst-case release scenario analysis.
Code of Federal Regulations, 2011 CFR
2011-07-01
... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...
40 CFR 68.25 - Worst-case release scenario analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...
40 CFR 68.25 - Worst-case release scenario analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...
40 CFR 68.25 - Worst-case release scenario analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...
RMP Guidance for Warehouses - Chapter 4: Offsite Consequence Analysis
Offsite consequence analysis (OCA) informs government and the public about potential consequences of an accidental toxic or flammable chemical release at your facility, and consists of a worst-case release scenario and alternative release scenarios.
RMP Guidance for Chemical Distributors - Chapter 4: Offsite Consequence Analysis
How to perform the OCA for regulated substances, informing the government and the public about potential consequences of an accidental chemical release at your facility. Includes calculations for worst-case scenario, alternative scenarios, and endpoints.
... damage to the tissue and bone supporting the teeth. In the worst cases, you can lose teeth. In gingivitis, the gums become red and swollen. ... flossing and regular cleanings by a dentist or dental hygienist. Untreated gingivitis can lead to periodontitis. If ...
NASA Technical Reports Server (NTRS)
Bury, Kristen M.; Kerslake, Thomas W.
2008-01-01
NASA's new Orion Crew Exploration Vehicle has geometry that orients the reaction control system (RCS) thrusters such that they can impinge upon the surface of Orion's solar array wings (SAW). Plume impingement can cause Paschen discharge, chemical contamination, thermal loading, erosion, and force loading on the SAW surface, especially when the SAWs are in a worst-case orientation (pointed 45° towards the aft end of the vehicle). Preliminary plume impingement assessment methods were needed to determine whether in-depth, time-consuming calculations were required to assess power loss. Simple methods for assessing power loss as a result of these anomalies were developed to determine whether plume-impingement-induced power losses were below the assumed contamination loss budget of 2 percent. This paper details the methods that were developed and applies them to Orion's worst-case orientation.
Response of the North American corn belt to climate warming, CO2
NASA Astrophysics Data System (ADS)
1983-08-01
The climate of the North American corn belt was characterized to estimate the effects of climatic change on that agricultural region. Heat and moisture characteristics of the current corn belt were identified and mapped based on a simulated climate for a doubling of atmospheric CO2 concentrations. The result was a map of the projected corn belt corresponding to the simulated climatic change. Such projections were made with and without an allowance for earlier planting dates that could occur under a CO2-induced climatic warming. Because the direct effects of CO2 increases on plants, improvements in farm technology, and plant breeding are not considered, the resulting projections represent an extreme or worst case. The results indicate that even for such a worst case, climatic conditions favoring corn production would not extend very far into Canada. Climatic buffering effects of the Great Lakes would apparently retard northeastward shifts in corn-belt location.
NASA Technical Reports Server (NTRS)
Lee, P. J.
1985-01-01
For a frequency-hopped noncoherent MFSK communication system without jammer state information (JSI) in a worst-case partial-band jamming environment, it is well known that the use of a conventional unquantized metric results in very poor performance. In this paper, a 'normalized' unquantized energy metric is suggested for such a system. It is shown that with this metric, one can save 2-3 dB in required signal energy over a system with a hard-decision metric without JSI for the same desired performance. When this very robust metric is compared to the conventional unquantized energy metric with JSI, the loss in required signal energy is shown to be small. Thus, the use of this normalized metric provides performance comparable to systems for which JSI is known. Cutoff rate and bit error rate with dual-k coding are used as the performance measures.
Centaur Propellant Thermal Conditioning Study
NASA Technical Reports Server (NTRS)
Blatt, M. H.; Pleasant, R. L.; Erickson, R. C.
1976-01-01
A wicking investigation revealed that passive thermal conditioning was feasible and provided a considerable weight advantage over active systems using throttled vent fluid in a Centaur D-1S launch vehicle. Experimental wicking correlations were obtained using empirical revisions to the analytical flow model. Thermal subcoolers were evaluated parametrically as a function of tank pressure and NPSP. Results showed that the RL10 category I engine was the best candidate for boost pump replacement, and the option showing the lowest weight penalty employed passively cooled acquisition devices, thermal subcoolers, dry ducts between burns, and pumping of subcooler coolant back into the tank. A mixing correlation was identified for sizing the thermodynamic vent system mixer. Worst-case mixing requirements were determined by surveying the Centaur D-1T, D-1S, IUS, and space tug vehicles. Vent system sizing was based upon the worst-case requirements. Thermodynamic vent system/mixer weights were determined for each vehicle.
Updated model assessment of pollution at major U. S. airports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamartino, R.J.; Rote, D.M.
1979-02-01
The air quality impact of aircraft at and around Los Angeles International Airport (LAX) was simulated for hours of peak aircraft operation and 'worst case' pollutant dispersion conditions by using an updated version of the Argonne Airport Vicinity Air Pollution model; field programs at LAX, O'Hare, and John F. Kennedy International Airports determined the 'worst case' conditions. Maximum carbon monoxide concentrations at LAX were low relative to National Ambient Air Quality Standards; relatively high and widespread hydrocarbon concentrations indicated that aircraft emissions may aggravate oxidant problems near the airport; nitrogen oxide concentrations were close to the levels set in proposed standards. Data on typical time-in-mode for departing and arriving aircraft, the 8/4/77 diurnal variation in airport activity, and carbon monoxide concentration isopleths are given, and the update factors in the model are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundaram, Sriram; Grenat, Aaron; Naffziger, Samuel
Power management techniques can be effective at extracting more performance and energy efficiency out of mature systems on chip (SoCs). For instance, the peak performance of microprocessors is often limited by worst-case technology (Vmax), infrastructure (thermal/electrical), and microprocessor usage assumptions. Performance/watt of microprocessors also typically suffers from guard bands associated with the test and binning processes as well as worst-case aging/lifetime degradation. Similarly, on multicore processors, shared voltage rails tend to limit the peak performance achievable in low-thread-count workloads. In this paper, we describe five power management techniques that maximize per-part performance under the aforementioned constraints. Using these techniques, we demonstrate a net performance increase of up to 15% depending on the application and TDP of the SoC, implemented on 'Bristol Ridge,' a 28-nm CMOS, dual-core x86 accelerated processing unit.
VEGA Launch Vehicle Dynamic Environment: Flight Experience and Qualification Status
NASA Astrophysics Data System (ADS)
Di Trapani, C.; Fotino, D.; Mastrella, E.; Bartoccini, D.; Bonnet, M.
2014-06-01
The VEGA Launch Vehicle (LV) is equipped during flight with more than 400 sensors (pressure transducers, accelerometers, microphones, strain gauges...) aimed at capturing the physical phenomena occurring during the mission. The main objective of these sensors is to verify that the flight conditions are compliant with the launch vehicle and satellite qualification status and to characterize the phenomena that occur during flight. During VEGA development, several test campaigns were performed in order to characterize its dynamic environment and identify the worst-case conditions, but only with flight data analysis is it possible to confirm the worst cases identified and to check the compliance of the operative-life conditions with the components' qualification status. The scope of the present paper is to show a comparison of the sinusoidal dynamic phenomena that occurred during VEGA's first and second flights and to give a summary of the launch vehicle qualification status.
NASA Astrophysics Data System (ADS)
Bury, Kristen M.; Kerslake, Thomas W.
2008-06-01
NASA's new Orion Crew Exploration Vehicle has geometry that orients the reaction control system (RCS) thrusters such that they can impinge upon the surface of Orion's solar array wings (SAW). Plume impingement can cause Paschen discharge, chemical contamination, thermal loading, erosion, and force loading on the SAW surface, especially when the SAWs are in a worst-case orientation (pointed 45° towards the aft end of the vehicle). Preliminary plume impingement assessment methods were needed to determine whether in-depth, time-consuming calculations were required to assess power loss. Simple methods for assessing power loss as a result of these anomalies were developed to determine whether plume-impingement-induced power losses were below the assumed contamination loss budget of 2 percent. This paper details the methods that were developed and applies them to Orion's worst-case orientation.
An interior-point method-based solver for simulation of aircraft parts riveting
NASA Astrophysics Data System (ADS)
Stefanova, Maria; Yakunin, Sergey; Petukhova, Margarita; Lupuleac, Sergey; Kokkolaras, Michael
2018-05-01
The particularities of simulating the aircraft parts riveting process necessitate the solution of a large number of contact problems. A primal-dual interior-point method-based solver is proposed for solving such problems efficiently. The proposed method features a worst-case polynomial complexity bound of O(√n log(1/ε)) on the number of iterations, where n is the dimension of the problem and ε is a threshold related to the desired accuracy. In practice, convergence is often faster than this worst-case bound, which makes the method applicable to large-scale problems. The computational challenge is solving the system of linear equations, because the associated matrix is ill-conditioned. To that end, the authors introduce a preconditioner and a strategy for determining effective initial guesses based on the physics of the problem. Numerical results are compared with ones obtained using the Goldfarb-Idnani algorithm. The results demonstrate the efficiency of the proposed method.
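As a concrete illustration of this class of method (not the authors' preconditioned solver), a bare primal-dual path-following iteration for a toy bound-constrained quadratic program, minimize 0.5*x'Qx + c'x subject to x >= 0, might look like the following sketch:

    import numpy as np

    def qp_ipm(Q, c, tol=1e-8, max_iter=50):
        """Minimize 0.5*x'Qx + c'x subject to x >= 0 with a basic
        primal-dual path-following interior-point method."""
        n = len(c)
        x, z = np.ones(n), np.ones(n)       # strictly interior start
        for _ in range(max_iter):
            mu = x @ z / n                  # duality measure
            r_d = Q @ x + c - z             # dual residual
            if mu < tol and np.linalg.norm(r_d) < tol:
                break
            sigma = 0.1                     # centering parameter
            # Reduced Newton system: (Q + diag(z/x)) dx = -r_d + sigma*mu/x - z
            dx = np.linalg.solve(Q + np.diag(z / x), -r_d + sigma * mu / x - z)
            dz = sigma * mu / x - z - (z / x) * dx
            # Fraction-to-the-boundary rule keeps x and z strictly positive.
            alpha = 1.0
            for v, dv in ((x, dx), (z, dz)):
                if (dv < 0).any():
                    alpha = min(alpha, 0.99 * np.min(-v[dv < 0] / dv[dv < 0]))
            x, z = x + alpha * dx, z + alpha * dz
        return x

    # Smoke test: min 0.5*(x0^2 + x1^2) - x0 + x1, x >= 0  ->  x* = (1, 0)
    print(qp_ipm(np.eye(2), np.array([-1.0, 1.0])).round(4))

The (Q + diag(z/x)) system solved each iteration is where ill-conditioning appears as components of x and z approach zero, which is what motivates the preconditioner in the paper.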
Statistical analysis of QC data and estimation of fuel rod behaviour
NASA Astrophysics Data System (ADS)
Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.
1991-02-01
The behaviour of fuel rods while in the reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter, are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (a worst-case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative, but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst-case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of the statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
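The contrast between the two approaches fits in a few lines. In this Python sketch the nominal dimensions and tolerances are invented for illustration; it compares the worst-case tolerance stack of the pellet-cladding diametral gap with a distribution-based quantile of known coverage:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Illustrative nominal dimensions and 3-sigma tolerances in mm
    # (invented numbers, not real fuel rod specifications).
    pellet_d = rng.normal(8.05, 0.005, N)    # pellet outer diameter
    clad_id  = rng.normal(8.22, 0.008, N)    # cladding inner diameter
    gap = clad_id - pellet_d                 # diametral gap

    # Worst-case dataset: stack the unfavorable tolerance limits.
    worst_case = (8.22 - 3 * 0.008) - (8.05 + 3 * 0.005)

    print(f"worst-case stacked gap     : {worst_case:.3f} mm")
    print(f"Monte Carlo 0.135% quantile: {np.quantile(gap, 0.00135):.3f} mm")
    # The MC quantile has a known coverage; the worst-case stack is more
    # conservative by an amount that is otherwise hard to quantify.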
A Graph Based Backtracking Algorithm for Solving General CSPs
NASA Technical Reports Server (NTRS)
Pang, Wanlin; Goodwin, Scott D.
2003-01-01
Many AI tasks can be formalized as constraint satisfaction problems (CSPs), which involve finding values for variables subject to constraints. While solving a CSP is an NP-complete task in general, tractable classes of CSPs have been identified based on the structure of the underlying constraint graphs. Much effort has been spent on exploiting structural properties of the constraint graph to improve the efficiency of finding a solution. These efforts contributed to the development of a class of CSP solving algorithms called decomposition algorithms. The strength of CSP decomposition is that its worst-case complexity depends on the structural properties of the constraint graph and is usually better than the worst-case complexity of search methods. Its practical application is limited, however, since it cannot be applied if the CSP is not decomposable. In this paper, we propose a graph-based backtracking algorithm called omega-CDBT, which shares the merits and overcomes the weaknesses of both the decomposition and search approaches.
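For reference, the baseline that both approaches improve on is plain chronological backtracking over a binary-constraint graph. A minimal sketch (this is the generic algorithm, not omega-CDBT itself):

    def consistent(assign, constraints):
        # Check every binary constraint whose endpoints are both assigned.
        return all(pred(assign[u], assign[v])
                   for (u, v), pred in constraints.items()
                   if u in assign and v in assign)

    def backtrack(assign, variables, domains, constraints):
        # Chronological backtracking: extend the assignment one variable
        # at a time, undoing the choice when it leads to a dead end.
        if len(assign) == len(variables):
            return dict(assign)
        var = next(v for v in variables if v not in assign)
        for val in domains[var]:
            assign[var] = val
            if consistent(assign, constraints) and \
               (sol := backtrack(assign, variables, domains, constraints)):
                return sol
            del assign[var]
        return None

    # 3-colour a small graph: adjacent nodes must take different colours.
    nodes = ["a", "b", "c", "d"]
    edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
    doms = {v: ["red", "green", "blue"] for v in nodes}
    cons = {e: (lambda x, y: x != y) for e in edges}
    print(backtrack({}, nodes, doms, cons))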
Rella, R; Sturaro, A; Parvoli, G; Ferrara, D; Casellato, U; Vadalà, G
2005-01-01
In Italy, every summer forest fires attract public attention due to the number of victims, the intensity of the fires, the areas devastated, the environmental damage, and the loss of property. Excluding some fires of natural origin, other causes are related to the social, economic, and productive profile of the territory. The erroneous expectation is that wooded areas destroyed by fire can then be used for private interests. Often a fire started to clear a small area can completely change the expected result, producing disaster, loss of property, destruction of entire forests and resident fauna, and killing innocent people. In this case report, the reconstruction of an arson scene, the analytical techniques used, and the results obtained are illustrated, with the aim of sharing with other research laboratories the current knowledge on forest fires.
Tongue-tied: Confused meanings for common fire terminology can lead to fuels mismanagement
Theresa B. Jain; Russell T. Graham; David S. Pilliod
2004-01-01
The ineffective and inconsistent use of terminology among fire managers, scientists, resource managers and the public is a constant problem in resource management. In fire management and fire science, the terms fire severity, burn severity and fire intensity are defined in a variety of ways, used inconsistently and, in some cases, interchangeably.
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time of untenable conditions, a range of design fires is considered based on different fire growth rates, after which the uncertainty of the onset time of untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can then be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
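A stripped-down version of that pipeline is sketched below: branch probabilities define scenario probabilities, and Monte Carlo sampling of evacuation time against the onset time of untenable conditions (ASET) gives the consequence term. All probabilities and distributions here are invented placeholders, not values from the case study:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Two protection-system branches give four fire scenarios; the branch
    # probabilities are illustrative only.
    p_detect, p_suppress = 0.90, 0.95
    scenarios = {
        ("detect", "suppress"):       p_detect * p_suppress,
        ("detect", "no-suppress"):    p_detect * (1 - p_suppress),
        ("no-detect", "suppress"):    (1 - p_detect) * p_suppress,
        ("no-detect", "no-suppress"): (1 - p_detect) * (1 - p_suppress),
    }

    # Consequence term: probability that evacuation time (pre-movement +
    # travel) exceeds the onset time of untenable conditions (ASET).
    aset    = rng.lognormal(np.log(600.0), 0.3, N)   # seconds
    premove = rng.lognormal(np.log(120.0), 0.5, N)   # seconds
    travel  = 180.0                                  # seconds, fixed here
    p_fail = float(np.mean(premove + travel > aset))

    # Risk: only unsuppressed scenarios are assumed to reach untenability.
    risk = sum(p * p_fail for s, p in scenarios.items() if "no-suppress" in s)
    print(f"P(evacuation > ASET) = {p_fail:.3f}; life-safety risk ~ {risk:.4f}")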
Analysis of geometric moments as features for firearm identification.
Md Ghani, Nor Azura; Liong, Choong-Yeun; Jemain, Abdul Aziz
2010-05-20
The task of identifying firearms from forensic ballistics specimens has been exacting in crime investigation for the last two decades. Every firearm, regardless of its size, make, and model, has its own unique 'fingerprint'. These fingerprints are transferred to the fired bullet and cartridge case when a firearm is fired. The components involved in producing these unique characteristics are the firing chamber, breech face, firing pin, ejector, extractor, and the rifling of the barrel. These unique characteristics are the critical features in identifying firearms, allowing investigators to decide which particular firearm fired the bullet. Traditionally, the comparison of ballistic evidence has been a tedious and time-consuming process requiring highly skilled examiners. Therefore, the main objective of this study is the extraction and identification of suitable features from firing pin impressions in cartridge case images for firearm recognition. Previous studies have shown that the firing pin impression of the cartridge case is one of the most important characteristics used for identifying an individual firearm. In this study, data were gathered using 747 cartridge case images captured from five different pistols of type 9mm Parabellum Vektor SP1, made in South Africa. All the cartridge case images were segmented into three regions, forming three different sets of images: the firing pin impression image, the centre of the firing pin impression image, and the ring of the firing pin impression image. Geometric moments up to the sixth order were then generated from each part of the images to form a set of numerical features. These 48 features were found to be significantly different using the MANOVA test. This high-dimensional feature set was then reduced to only 11 significant features using correlation analysis. Classification results using cross-validation under discriminant analysis show that 96.7% of the images were classified correctly. These results demonstrate the value of the geometric moments technique for producing a set of numerical features on which the identification of firearms can be based.
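The feature-extraction step is straightforward to state: raw geometric moments m_pq = sum over x, y of x^p * y^q * I(x, y). A minimal NumPy sketch on a synthetic image (the paper's exact feature set and region segmentation are not reproduced here):

    import numpy as np

    def raw_moments(img, max_order=6):
        """Raw geometric moments m_pq = sum_x sum_y x^p * y^q * I(x, y)
        for all p + q <= max_order, returned as a dict keyed by (p, q)."""
        ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        return {(p, q): float(np.sum((xs ** p) * (ys ** q) * img))
                for p in range(max_order + 1)
                for q in range(max_order + 1 - p)}

    # Toy "firing pin impression": a bright off-centre square on a dark field.
    img = np.zeros((64, 64))
    img[20:40, 28:48] = 1.0
    m = raw_moments(img)
    cx, cy = m[(1, 0)] / m[(0, 0)], m[(0, 1)] / m[(0, 0)]
    print(f"centroid: ({cx:.1f}, {cy:.1f}); moments computed: {len(m)}")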
Motor vehicle-related burns: a review of 107 cases.
Papaevangelou, J; Batchelor, J S; Roberts, A H
1995-02-01
Motor vehicles are a major cause of morbidity and mortality. Burn injuries sustained from motor vehicles form a small but important subgroup. The authors have reviewed the case notes of 107 patients with motor vehicle-related burns over a 13-year period. Ages ranged from 18 months to 65 years and the male-to-female ratio was 4:1. The mechanisms of injury were variable, although four major categories could be identified, accounting for 83 per cent of the cases. Car fires following road traffic accidents formed the largest group, accounting for 48.5 per cent of cases. The remaining three groups were: motorcycle-related burns following road traffic accidents (6.5 per cent of cases), garage fire-related burns (15 per cent of cases) and car radiator-related burns (13 per cent of cases). Garage fire-related burns had the highest mortality of the four groups (25 per cent). This study demonstrated that garage fire burns are an important subgroup of motor vehicle-related burns.
A stochastic Forest Fire Model for future land cover scenarios assessment
NASA Astrophysics Data System (ADS)
Fiorucci, P.; Holmes, T.; Gaetani, F.; D'Andrea, M.
2009-04-01
Land cover change and forest fire interaction under climate and socio-economic change is one of the main issues of the 21st century. The capability of defining future scenarios of land cover and fire regime allows forest managers to better understand the best actions to be carried out and their long-term effects. In this paper a new methodology for land cover change simulation under climate change and fire disturbance is presented and discussed. The methodology is based on the assumption that forest fires exhibit a power-law frequency-area distribution. The well-known Forest Fire Model (FFM), which is an example of self-organized criticality, is able to reproduce this behavior. Starting from this observation, a modified version of the FFM has been developed. The new model, called the Modified Forest Fire Model (MFFM), introduces several new features. A stochastic model for vegetation growth and regrowth after fire occurrence has been implemented for different kinds of vegetation. In addition, a stochastic fire propagation model taking into account topography and vegetation cover has been introduced. The MFFM has been developed with the purpose of estimating vegetation cover changes and fire regimes over a time window of many years for a given spatial region. Two different case studies have been carried out. The first case study concerns Liguria (Italy), a region of 5400 km2 lying between the Cote d'Azur, France, and Tuscany, Italy, on the northwest coast of the Tyrrhenian Sea. This region is characterized by a Mediterranean fire regime. The second case study has been carried out in California (Florida) on a region having similar area and characterized by similar climate conditions. In both cases the model represents the actual fire regime well in terms of power-law parameters, providing interesting results about future land cover scenarios under climate, land use, and socio-economic change.
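For context, the classic Drossel-Schwabl FFM that the MFFM modifies fits in a few dozen lines. This Python sketch omits the vegetation-growth, topography, and propagation extensions described above, and its parameters are arbitrary:

    import numpy as np

    rng = np.random.default_rng(2)
    EMPTY, TREE = 0, 1
    n, p_grow, f_light = 128, 0.05, 1e-4    # grid size, growth and lightning rates
    grid = np.zeros((n, n), dtype=np.int8)
    fire_sizes = []

    def burn_cluster(i, j):
        """Flood-fill the connected tree cluster at (i, j); return its size."""
        stack, size = [(i, j)], 0
        while stack:
            a, b = stack.pop()
            if 0 <= a < n and 0 <= b < n and grid[a, b] == TREE:
                grid[a, b] = EMPTY
                size += 1
                stack += [(a - 1, b), (a + 1, b), (a, b - 1), (a, b + 1)]
        return size

    for _ in range(5_000):
        # Growth: each empty cell sprouts a tree with probability p_grow.
        grid[(grid == EMPTY) & (rng.random((n, n)) < p_grow)] = TREE
        # Lightning: a struck tree burns its whole cluster instantaneously.
        for i, j in np.argwhere((grid == TREE) & (rng.random((n, n)) < f_light)):
            if grid[i, j] == TREE:
                fire_sizes.append(burn_cluster(i, j))

    print(f"{len(fire_sizes)} fires; largest burned {max(fire_sizes)} cells")

A histogram of fire_sizes on log-log axes recovers the power-law frequency-area behavior the methodology relies on.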
Fire Safety for Retired Adults: Participant's Coursebook.
ERIC Educational Resources Information Center
Walker (Bonnie) and Associates, Inc., Crofton, MD.
The risk of dying from fire increases substantially among older adults. This document contains a collection of fire safety information for elderly people. Information includes procedures to follow in case of fire and early warning technologies such as smoke alarms. The booklet describes potential sources of fires (smoking, home heating, kitchens,…
ASTM F1717 standard for the preclinical evaluation of posterior spinal fixators: can we improve it?
La Barbera, Luigi; Galbusera, Fabio; Villa, Tomaso; Costa, Francesco; Wilke, Hans-Joachim
2014-10-01
Preclinical evaluation of spinal implants is a necessary step to ensure their reliability and safety before implantation. The American Society for Testing and Materials reapproved the F1717 standard for the assessment of mechanical properties of posterior spinal fixators, which simulates a vertebrectomy model and recommends mimicking vertebral bodies using polyethylene blocks. This set-up should represent clinical use, but available data in the literature are scarce. Anatomical parameters depending on the spinal level were compared to published data or measurements from biplanar stereoradiography of 13 patients. Other mechanical variables describing implant design were considered, and all parameters were investigated using a numerical parametric finite element model. Stress values were calculated by considering either the combination of the average values for each parameter or their worst-case combination, depending on the spinal level. The standard set-up represents the anatomy of an instrumented average thoracolumbar segment quite well. The stress on the pedicle screw is significantly influenced by the lever arm of the applied load, the unsupported screw length, the position of the centre of rotation of the functional spine unit, and the pedicular inclination with respect to the sagittal plane. The worst-case combination of parameters demonstrates that devices implanted below T5 could potentially undergo higher stresses than those described by the standard (a maximum increase of 22.2% at L1). We propose to revise F1717 in order to describe the anatomical worst-case condition we found at the L1 level: this will guarantee higher safety of the implant for a wider population of patients. © IMechE 2014.
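The lever-arm sensitivity reported here follows the familiar cantilever-bending scaling sigma = M*c/I with M = F * lever_arm, so stress grows linearly with the lever arm (a 22% longer arm gives roughly 22% more stress). A Python sketch with illustrative numbers, not F1717 values or the paper's finite element geometry:

    import numpy as np

    # Cantilever-bending estimate: sigma = M*c/I with M = F * lever_arm.
    F = 100.0                        # applied load, N (illustrative)
    d = 6.0e-3                       # screw shank diameter, m (illustrative)
    I = np.pi * d**4 / 64.0          # second moment of area, m^4
    c = d / 2.0                      # outer-fibre distance, m

    for lever_mm in (25.0, 30.5):    # nominal vs. a 22% longer anatomical arm
        sigma = (F * lever_mm * 1e-3) * c / I
        print(f"lever arm {lever_mm} mm -> bending stress {sigma/1e6:.0f} MPa")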
A Method for Assessing Material Flammability for Micro-Gravity Environments
NASA Technical Reports Server (NTRS)
Steinhaus, T.; Olenick, S. M.; Sifuentes, A.; Long, R. T.; Torero, J. L.
1999-01-01
On a spacecraft, one of the greatest fears during a mission is the outbreak of a fire. Since spacecraft are enclosed spaces and depend highly on technical electronics, a small fire could cause a large amount of damage. NASA uses upward flame spread as a "worst case scenario" evaluation for materials and the Heat and Visible Smoke Release Rates Test to assess the damage potential of a fire. Details of these tests and the protocols followed are provided by the "Flammability, Odor, Offgassing, and Compatibility Requirements and Test Procedures for Materials in Environments that Support Combustion" document. As pointed out by Ohlemiller and Villa, the upward flame spread test does not address the effect of external radiation on ignition and spread. External radiation, such as that coming from an overheated electrical component, is a plausible fire scenario in a space facility and could result in a reversal of the flammability rankings derived from the upward flame spread test. The "Upward Flame Propagation Test" has been the subject of strong criticism in the last few years. In many cases, theoretical exercises and experimental results have demonstrated the possibility of a reversal in material flammability rankings from normal to micro-gravity. Furthermore, the need to incorporate information on the effects of external radiation and opposed flame spread when ranking materials based on their potential to burn in micro-gravity has been emphasized. Experiments conducted in a 2.2-second drop tower with an ethane burner in an air cross flow have emphasized that burning at the trailing edge is deterred in micro-gravity due to the decreased oxygen transport. For very low air flow velocities (U < 0.005 m/s) the flame envelops the burner, and a slight increase in velocity results in extinction of the trailing edge (U > 0.01 m/s). Only for U > 0.1 m/s is extinction observed at the leading edge (blow-off). Three-dimensional numerical calculations performed for thin cellulose centrally ignited with an axisymmetric source have shown that in the presence of a forced flow slower than 0.035 m/s, flames spread only in opposition to the flow. Extinction is observed at the trailing edge with no concurrent propagation. Experiments conducted by the same authors in the JAMIC 10-second drop tower verified these calculations. Reducing the oxygen supply to the flame also results in a decrease of the Damköhler number, which might lead to extinction. Greyson et al. and Ferkul conducted experiments in micro-gravity (5-second drop tower) with thin paper and observed that at very low flow velocities concurrent flame spread will stop propagating and the flame will reduce in size and extinguish. They noted that quenching differs significantly from blow-off in that the upstream leading edge will remain anchored to the burn-out edge.
Passive fire building protection system evaluation (case study: millennium ict centre)
NASA Astrophysics Data System (ADS)
Rahman, Vinky; Stephanie
2018-03-01
A passive fire protection system is a system that refers to the building design, in terms of both architecture and structure. This system usually consists of structural protection that protects the structure of the building, prevents the spread of fire, and facilitates the evacuation process in case of fire. Millennium ICT Centre is the largest electronics shopping center in Medan, Indonesia. As a public building that accommodates crowds, this building needs a fire protection system that complies with the standards. Therefore, the purpose of this study is to evaluate the passive fire protection system of the Millennium ICT Centre building. The study was conducted by describing the facts of the building as well as through direct observation at the research location. The collected data were then processed using the AHP (Analytical Hierarchy Process) method in the weighting process to obtain the reliability value of the passive fire protection system. The results showed that some components of a passive fire protection system are present in the building, but some are still unqualified.
Learning Search Control Knowledge for Deep Space Network Scheduling
NASA Technical Reports Server (NTRS)
Gratch, Jonathan; Chien, Steve; DeJong, Gerald
1993-01-01
While the general class of most scheduling problems is NP-hard in worst-case complexity, in practice, for specific distributions of problems and constraints, domain-specific solutions have been shown to perform much better than exponential time.
Availability Analysis of Dual Mode Systems
DOT National Transportation Integrated Search
1974-04-01
The analytical procedures presented define a method of evaluating the effects of failures in a complex dual-mode system based on a worst-case steady-state analysis. The computed result is an availability figure of merit and not an absolute prediction...
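The steady-state building block of such an analysis is the per-element availability A = MTBF / (MTBF + MTTR), combined multiplicatively in series and as 1 - (1 - A)^2 across a redundant pair. A Python sketch with invented figures, not values from the report:

    # Steady-state availability: A = MTBF / (MTBF + MTTR) per element;
    # series elements multiply, redundant pairs combine as 1 - (1 - A)^2.
    def avail(mtbf_h, mttr_h):
        return mtbf_h / (mtbf_h + mttr_h)

    vehicle  = avail(2_000.0, 1.5)                       # illustrative figures
    guideway = avail(10_000.0, 4.0)
    control  = 1.0 - (1.0 - avail(5_000.0, 2.0)) ** 2    # duplex pair

    system = vehicle * guideway * control                # series combination
    print(f"system availability ~ {system:.5f}")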
Part of a May 1999 series on the Risk Management Program Rule and issues related to chemical emergency management. Explains hazard versus risk, worst-case and alternative release scenarios, flammable endpoints and toxic endpoints.
General RMP Guidance - Chapter 4: Offsite Consequence Analysis
This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.
INCORPORATING NONCHEMICAL STRESSORS INTO CUMULATIVE RISK ASSESSMENTS
The risk assessment paradigm has begun to shift from assessing single chemicals using "reasonable worst case" assumptions for individuals to considering multiple chemicals and community-based models. Inherent in community-based risk assessment is examination of all stressors a...
30 CFR 254.26 - What information must I include in the “Worst case discharge scenario” appendix?
Code of Federal Regulations, 2011 CFR
2011-07-01
... limits of current technology, for the range of environmental conditions anticipated at your facility; and... Society for Testing and Materials (ASTM) publication F625-94, Standard Practice for Describing...
30 CFR 254.26 - What information must I include in the “Worst case discharge scenario” appendix?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., materials, support vessels, and strategies listed are suitable, within the limits of current technology, for... equipment. Examples of acceptable terms include those defined in American Society for Testing and Materials...
Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson
2008-01-01
We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...
Uncertainties of wild-land fires emission in AQMEII phase 2 case study
NASA Astrophysics Data System (ADS)
Soares, J.; Sofiev, M.; Hakkarainen, J.
2015-08-01
The paper discusses the main uncertainties of the wild-land fire emission estimates used in the AQMEII phase 2 case study. The wild-land fire emission of particulate matter for the summer fire season of 2010 in Eurasia was generated by the Integrated System for wild-land Fires (IS4FIRES). The emission calculation procedure included two steps: a bottom-up emission compilation from the radiative energy of individual fires observed by the MODIS instrument on board the Terra and Aqua satellites, and a top-down calibration of emission factors based on the comparison between observations and modelled results. The approach inherits various uncertainties originating from imperfect information on fires, inaccuracies of the inverse problem solution, and simplifications in the fire description. These are analysed with regard to the Eurasian fires in 2010. It is concluded that the total emission is likely to be over-estimated by up to 50%, with the accuracy of individual-fire emissions likely to vary over a wide range. The first results of the new IS4FIRESv2 products and fire-resolving modelling are discussed in application to the 2010 events. It is shown that the new emission estimates have similar patterns but are lower than the IS4FIRESv1 values.
Characteristics of worst hour rainfall rate for radio wave propagation modelling in Nigeria
NASA Astrophysics Data System (ADS)
Osita, Ibe; Nymphas, E. F.
2017-10-01
Radio waves, especially in the millimeter-wave band, are known to be attenuated by rain. Radio engineers and designers need to be able to predict the time of day when a radio signal will be attenuated so as to provide measures to mitigate this effect. This is achieved by characterizing the rainfall intensity for a particular region of interest by worst month and worst hour of the day. This paper characterizes rainfall in Nigeria by worst year, worst month, and worst hour. It is shown that for the period of study, 2008 and 2009 are the worst years, while September is the most frequent worst month at most of the stations. The evening hours (local time) are the worst hours of the day at virtually all the stations.
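One simple empirical way to derive such a characterization from station data is to rank calendar months and hours of day by how often the rain rate exceeds a threshold. The Python sketch below runs on synthetic data standing in for station measurements; real worst-month statistics (cf. ITU-R recommendations) are more elaborate:

    import numpy as np
    import pandas as pd

    # Synthetic one-minute rain-rate series standing in for station data.
    idx = pd.date_range("2008-01-01", "2009-12-31 23:59", freq="min")
    rng = np.random.default_rng(3)
    s = pd.Series(rng.gamma(0.02, 40.0, len(idx)), index=idx)   # mm/h

    thresh = 10.0                    # mm/h exceedance threshold (illustrative)
    exceed = s > thresh
    worst_month = exceed.groupby(exceed.index.month).mean().idxmax()
    worst_hour  = exceed.groupby(exceed.index.hour).mean().idxmax()
    print(f"worst month: {worst_month}; worst hour: {worst_hour}:00 LT")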
NASA Astrophysics Data System (ADS)
Athanasopoulou, E.; Giannakopoulos, C.; Vogel, H.; Rieger, D.; Knote, C.; Hatzaki, M.; Vogel, B.; Karali, A.
2012-04-01
During 2007, Greece experienced an extreme summer and the worst natural hazard in its modern history. Soil dehydration, following a prolonged dry period in combination with hot temperatures and strong winds, yielded favorable conditions for the ignition and spread of wild fires that burnt approximately 200,000 ha of vegetated land (Founda and Giannakopoulos, 2009; Sifakis et al., 2011). The relationship between meteorology and fire potential can be provided by the Canadian Fire Weather Index (FWI), which has already been found applicable to the fire activity of the Mediterranean region (Carvalho et al., 2008). However, lack of meteorological data or remote fire spots can be sources of uncertainty in fire risk estimation. In addition to the direct fire damage, these fires produced large quantities of gaseous air pollutants and particles (PM10) dispersed over the area of Greece. Indeed, PM10 concentration measurements showed two pollution episodes over Athens during late August and early September 2007 (Liu et al., 2009). Nevertheless, these measurements neither show the large spatial extent of fire effects nor reveal their important role in atmospheric chemistry. In the current study, the application of the atmospheric model COSMO-ART is used to investigate the issues addressed above. COSMO-ART (Vogel et al., 2009) is a regional chemistry transport model (ART stands for Aerosols and Reactive Trace gases) online-coupled to the COSMO regional numerical weather prediction and climate model (Baldauf et al., 2011). The current simulations are performed between August 15 and September 15 over Greece with a horizontal resolution of 2.8 km and a vertical extent of up to 20 km. The initial and boundary meteorological conditions are derived from a coarser COSMO simulation performed by the German Weather Service. Fire emissions are retrieved from the Global Fire Emissions Database version 3 (van der Werf et al., 2010). The anthropogenic emission database used is TNO/MACC (Kuenen et al., 2011), while biogenic emissions are calculated online (Vogel et al., 1995). The FWI is calculated from air temperature, relative humidity, wind speed, and precipitation data obtained from the Hellenic National Meteorological Service for several sites in proximity to the fire event areas. In parallel, these data serve as evaluation for the respective model predictions. The satisfactory comparison results enable the FWI calculation using the model data over the burnt areas, where observations are missing. The effect of these fire events on atmospheric chemistry is estimated by analyzing the predictions not only for the mainly affected primary species (carbon monoxide, methane, non-methane hydrocarbons, nitrogen oxides and elemental carbon), but also for the secondary pollutants (ozone, organic and nitrate aerosol). The competence of the COSMO-ART mass predictions is evaluated by comparing PM10 outputs with published literature results. The weather conditions during the 2007 wildfire events have already been assessed as a typical summertime meteorological regime during the latter part of the century (Founda and Giannakopoulos, 2009). Therefore, the results presented here can be viewed as representative of a fire event likely to occur by then. Acknowledgement: This work was supported by the EU project CLIMRUN under contract FP7-ENV-2010-265192.
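Of the FWI sub-indices, the Initial Spread Index (ISI) illustrates how the system responds to the hot, dry, windy conditions described above. The Python sketch below implements only the standard Van Wagner (1987) ISI equations; the full FWI additionally requires the moisture codes (DMC, DC) and the buildup index:

    import numpy as np

    def isi(ffmc, wind_kmh):
        """Initial Spread Index from the Canadian FWI system
        (standard Van Wagner 1987 equations)."""
        m = 147.2 * (101.0 - ffmc) / (59.5 + ffmc)      # fine fuel moisture, %
        f_wind = np.exp(0.05039 * wind_kmh)
        f_moist = 91.9 * np.exp(-0.1386 * m) * (1.0 + m**5.31 / 4.93e7)
        return 0.208 * f_wind * f_moist

    # Moderate conditions vs. hot, dry, windy conditions like late August 2007.
    for ffmc, w in ((85.0, 10.0), (94.0, 30.0)):
        print(f"FFMC={ffmc}, wind={w} km/h -> ISI={isi(ffmc, w):.1f}")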
Stressful life events and catechol-O-methyl-transferase (COMT) gene in bipolar disorder.
Hosang, Georgina M; Fisher, Helen L; Cohen-Woods, Sarah; McGuffin, Peter; Farmer, Anne E
2017-05-01
A small body of research suggests that gene-environment interactions play an important role in the development of bipolar disorder. The aim of the present study is to contribute to this work by exploring the relationship between stressful life events and the catechol-O-methyl-transferase (COMT) Val 158 Met polymorphism in bipolar disorder. Four hundred eighty-two bipolar cases and 205 psychiatrically healthy controls completed the List of Threatening Experiences Questionnaire. Bipolar cases reported the events experienced 6 months before their worst depressive and manic episodes; controls reported those events experienced 6 months prior to their interview. The genotypic information for the COMT Val 158 Met variant (rs4680) was extracted from GWAS analysis of the sample. The impact of stressful life events was moderated by the COMT genotype for the worst depressive episode using a Val dominant model (adjusted risk difference = 0.09, 95% confidence intervals = 0.003-0.18, P = .04). For the worst manic episodes no significant interactions between COMT and stressful life events were detected. This is the first study to explore the relationship between stressful life events and the COMT Val 158 Met polymorphism focusing solely on bipolar disorder. The results of this study highlight the importance of the interplay between genetic and environmental factors for bipolar depression. © 2017 Wiley Periodicals, Inc.
Grant J. Williamson; Lynda D. Prior; Matt Jolly; Mark A. Cochrane; Brett P. Murphy; David M. J. S. Bowman
2016-01-01
Climate dynamics at diurnal, seasonal and inter-annual scales shape global fire activity, although difficulties of assembling reliable fire and meteorological data with sufficient spatio-temporal resolution have frustrated quantification of this variability. Using Australia as a case study, we combine data from 4760 meteorological stations with 12 years of satellite-...
Laser Transformation Hardening of Firing Zone Cutout Cams.
1981-06-01
bath nitriding to case harden firing zone cutout cams for the Mk 10 Guided Missile Launcher System (GMLS). These cams, machined of 4340 steel...
[Report figures include: Laser Beam Step Pattern; Hardness Profile, 4340 Steel]
ERIC Educational Resources Information Center
Zirkel, Sabrina; Pollack, Terry M.
2016-01-01
We present a case analysis of the controversy and public debate generated from a school district's efforts to address racial inequities in educational outcomes by diverting special funds from the highest performing students seeking elite college admissions to the lowest performing students who were struggling to graduate from high school.…
2008-03-01
[Contents fragment: VII. Adversarial Tripolarity; VIII. Fallen Nuclear Dominoes] ...In the power dimension, it is possible to imagine a best case (deep concert) and a worst case (adversarial tripolarity) and some less extreme outcomes, one... vanquished and the sub-regions have settled into relative stability). 5. Adversarial U.S.-Russia-China tripolarity: In this world, the regional...
ERIC Educational Resources Information Center
Marginson, Simon
This study examined the character of the emerging systems of corporate management in Australian universities and their effects on academic and administrative practices, focusing on relations of power. Case studies were conducted at 17 individual universities of various types. In each institution, interviews were conducted with senior…
Elementary Social Studies in 2005: Danger or Opportunity?--A Response to Jeff Passe
ERIC Educational Resources Information Center
Libresco, Andrea S.
2006-01-01
From the emphasis on lower-level test-prep materials to the disappearance of the subject altogether, elementary social studies is, in the best case scenario, being tested and, thus, taught with a heavy emphasis on recall; and, in the worst-case scenario, not being taught at all. In this article, the author responds to Jeff Passe's views on…
Thermal Analysis of a Metallic Wing Glove for a Mach-8 Boundary-Layer Experiment
NASA Technical Reports Server (NTRS)
Gong, Leslie; Richards, W. Lance
1998-01-01
A metallic 'glove' structure has been built and attached to the wing of the Pegasus (trademark) space booster. An experiment on the upper surface of the glove has been designed to help validate boundary-layer stability codes in a free-flight environment. Three-dimensional thermal analyses have been performed to ensure that the glove structure design would be within allowable temperature limits in the experiment test section of the upper skin of the glove. Temperature results obtained from the design-case analysis show a peak temperature at the leading edge of 490 F. For the upper surface of the glove, approximately 3 in. back from the leading edge, temperature calculations indicate transition occurs at approximately 45 sec into the flight profile. A worst-case heating analysis has also been performed to ensure that the glove structure would not have any detrimental effects on the primary objective of the Pegasus launch. A peak temperature of 805 F has been calculated on the leading edge of the glove structure. The temperatures predicted from the design case are well within the temperature limits of the glove structure, and the worst-case heating analysis temperature results are acceptable for the mission objectives.
NASA Astrophysics Data System (ADS)
Van Zandt, James R.
2012-05-01
Steady-state performance of a tracking filter is traditionally evaluated immediately after a track update. However, there is commonly a further delay (e.g., processing and communications latency) before the tracks can actually be used. We analyze the accuracy of extrapolated target tracks for four tracking filters: the Kalman filter with the Singer maneuver model and worst-case correlation time, with piecewise constant white acceleration, and with continuous white acceleration, and the reduced state filter proposed by Mookerjee and Reifler [1, 2]. Performance evaluation of a tracking filter is significantly simplified by appropriate normalization. For the Kalman filter with the Singer maneuver model, the steady-state RMS error immediately after an update depends on only two dimensionless parameters [3]. By assuming a worst-case value of target acceleration correlation time, we reduce this to a single parameter without significantly changing the filter performance (within a few percent for air tracking) [4]. With this simplification, we find for all four filters that the RMS errors for the extrapolated state are functions of only two dimensionless parameters. We provide simple analytic approximations in each case.
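The extrapolation step itself is model-independent bookkeeping: propagate the post-update covariance through the latency, P(tau) = F(tau) P F(tau)' + Q(tau). A one-dimensional constant-velocity Python sketch with continuous white acceleration (the numbers are illustrative, not the normalized parameters of the paper):

    import numpy as np

    # Growth of track error during a post-update latency tau, for a 1-D
    # constant-velocity model with continuous white acceleration of power
    # spectral density q: P_after = F P F' + Q(tau).
    q = 10.0                                   # (m/s^2)^2 * s, illustrative
    P = np.array([[25.0, 5.0], [5.0, 9.0]])    # post-update covariance

    for tau in (0.0, 0.5, 1.0, 2.0):           # seconds of latency
        F = np.array([[1.0, tau], [0.0, 1.0]])
        Q = q * np.array([[tau**3 / 3.0, tau**2 / 2.0],
                          [tau**2 / 2.0, tau]])
        Pe = F @ P @ F.T + Q
        print(f"tau={tau:3.1f} s -> RMS position error {np.sqrt(Pe[0, 0]):.2f} m")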
Comprehensive all-sky search for periodic gravitational waves in the sixth science run LIGO data
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Bejger, M.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Cheeseboro, B. D.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Creighton, T.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dasgupta, A.; Da Silva Costa, C. F.; Dattilo, V.; Dave, I.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Devine, R. C.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. 
C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Fenyvesi, E.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gaur, G.; Gehrels, N.; Gemme, G.; Geng, P.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Henry, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hofman, D.; Holt, K.; Holz, D. E.; Hopkins, P.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jian, L.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kapadia, S. J.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chi-Woong; Kim, Chunglee; Kim, J.; Kim, K.; Kim, N.; Kim, W.; Kim, Y.-M.; Kimbrell, S. J.; King, E. J.; King, P. J.; Kissel, J. S.; Klein, B.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Laxen, M.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Lewis, J. B.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Lombardi, A. L.; London, L. T.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magaña Zertuche, L.; Magee, R. M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. 
L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, A.; Miller, B. B.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Nedkova, K.; Nelemans, G.; Nelson, T. J. N.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Perri, L. M.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Qiu, S.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Rizzo, M.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, J. D.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O. E. S.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Setyawati, Y.; Shaddock, D. 
A.; Shaffer, T.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sunil, S.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tomlinson, C.; Tonelli, M.; Tornasi, Z.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Weßels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Woehler, J.; Worden, J.; Wright, J. L.; Wu, D. S.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yu, H.; Yvert, M.; ZadroŻny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration
2016-08-01
We report on a comprehensive all-sky search for periodic gravitational waves in the frequency band 100-1500 Hz and with a frequency time derivative in the range of [-1.18, +1.00]×10⁻⁸ Hz/s. Such a signal could be produced by a nearby spinning and slightly nonaxisymmetric isolated neutron star in our galaxy. This search uses the data from the initial LIGO sixth science run and covers a larger parameter space with respect to any past search. A Loosely Coherent detection pipeline was applied to follow up weak outliers in both Gaussian (95% recovery rate) and non-Gaussian (75% recovery rate) bands. No gravitational wave signals were observed, and upper limits were placed on their strength. Our smallest upper limit on worst-case (linearly polarized) strain amplitude h0 is 9.7×10⁻²⁵ near 169 Hz, while at the high end of our frequency range we achieve a worst-case upper limit of 5.5×10⁻²⁴. Both cases refer to all sky locations and the entire range of frequency derivative values.
Zika virus in French Polynesia 2013-14: anatomy of a completed outbreak.
Musso, Didier; Bossin, Hervé; Mallet, Henri Pierre; Besnard, Marianne; Broult, Julien; Baudouin, Laure; Levi, José Eduardo; Sabino, Ester C; Ghawche, Frederic; Lanteri, Marion C; Baud, David
2018-05-01
The Zika virus crisis exemplified the risk associated with emerging pathogens and was a reminder that preparedness for the worst-case scenario, although challenging, is needed. Herein, we review all data reported during the unexpected emergence of Zika virus in French Polynesia in late 2013. We focus on the new findings reported during this outbreak, especially the first description of severe neurological complications in adults and the retrospective description of CNS malformations in neonates, the isolation of Zika virus in semen, the potential for blood-transfusion transmission, mother-to-child transmission, and the development of new diagnostic assays. We describe the effect of this outbreak on health systems, the implementation of vector-borne control strategies, and the line of communication used to alert the international community of the new risk associated with Zika virus. This outbreak highlighted the need for careful monitoring of all unexpected events that occur during an emergence, to implement surveillance and research programmes in parallel to management of cases, and to be prepared for the worst-case scenario.
Fire fighting and its influence on the body.
Rossi, René
2003-08-15
Working conditions for fire fighters can be described according to the environment temperature and the incident radiant heat flux. Measurements for this study in buildings for fire fighting training have shown that fire fighters are typically exposed to radiant heat fluxes of between 5 and 10 kW m⁻² during this kind of exercise. The heat load can nevertheless be much higher; in one case, 42 kW m⁻² was measured. The temperatures reached between 100 and 190 °C at 1 m above ground, going up to 278 °C in one case. Human trials were performed with 17 fire fighters. After exercises (about 15 min) in a heated room, the mean core temperature of the fire fighters rose by 0.6 °C at a surrounding temperature of 31 °C and by 1.0 °C at 38 °C. The sweat production varied from 0.7 to 2.1 l h⁻¹; 16% to 45% of the sweat remained in the clothing layers. During the exercises in the training buildings, a mean temperature of 48 °C was measured between the fire fighters' clothing and workwear. These conditions led to an increase of the relative humidity in all the jackets up to 100%. When the fire fighters came out of the fire, the humidity remained at this level in the PVC-coated jackets, while it was in some cases strongly reduced in breathable jackets.
NASA Astrophysics Data System (ADS)
Kiefer, Michael T.; Zhong, Shiyuan; Heilman, Warren E.; Charney, Joseph J.; Bian, Xindi
2018-03-01
An improved understanding of atmospheric perturbations within and above a forest during a wildland fire has relevance to many aspects of wildland fires including fire spread, smoke transport and dispersion, and tree mortality. In this study, the ARPS-CANOPY model, a version of the Advanced Regional Prediction System (ARPS) model with a canopy parameterization, is utilized in a series of idealized numerical experiments to investigate the influence of vertical canopy structure on the atmospheric response to a stationary sensible heat flux at the ground ("fire heat flux"), broadly consistent in magnitude with the sensible heat flux from a low-intensity surface fire. Five vertical canopy structures are combined with five fire heat flux magnitudes to yield a matrix of 25 simulations. Analyses of the fire-heat-flux-perturbed u component of the wind, vertical velocity, kinetic energy, and temperature show that the spatial pattern and magnitude of the perturbations are sensitive to vertical canopy structure. Both vertical velocity and kinetic energy exhibit an increasing trend with increasing fire heat flux that is stronger for cases with some amount of overstory vegetation than cases with exclusively understory vegetation. A weaker trend in cases with exclusively understory vegetation indicates a damping of the atmospheric response to the sensible heat from a surface fire when vegetation is most concentrated near the surface. More generally, the results presented in this study suggest that canopy morphology should be considered when applying the results of a fire-atmosphere interaction study conducted in one type of forest to other forests with different canopy structures.
Probability Quantization for Multiplication-Free Binary Arithmetic Coding
NASA Technical Reports Server (NTRS)
Cheung, K. -M.
1995-01-01
A method has been developed to improve on Witten's binary arithmetic coding procedure of tracking a high value and a low value. The new method approximates the probability of the less probable symbol, which improves the worst-case coding efficiency.
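The abstract only sketches the idea, so a toy illustration may help: if the probability of the less probable symbol (LPS) is approximated by a power of two, the interval-scaling multiplication in a binary arithmetic coder reduces to a bit shift. The sketch below is a minimal, hypothetical Python rendering of that general idea; the function names, the quantization search, and the omission of renormalization are assumptions, not the report's actual procedure.

# Minimal sketch of a multiplication-free binary arithmetic coding step.
# Assumption: the LPS probability p is replaced by 2**-k, so range*p
# becomes a right shift. Renormalization and carry handling are omitted.

def quantize_lps(p):
    # Choose k in 1..15 so that 2**-k best approximates p (0 < p <= 0.5).
    return min(range(1, 16), key=lambda k: abs(p - 2.0 ** -k))

def encode_step(low, rng, is_lps, k):
    # Split the current interval [low, low+rng) without any multiplication.
    lps = max(rng >> k, 1)      # LPS subinterval: rng * 2**-k via a shift
    mps = rng - lps
    if is_lps:
        return low + mps, lps   # LPS mapped to the top of the interval
    return low, mps             # MPS keeps the bottom part

low, rng = 0, 1 << 16
k = quantize_lps(0.2)           # 0.2 is approximated by 2**-2
for bit in [0, 0, 1, 0]:        # 1 = less probable symbol
    low, rng = encode_step(low, rng, bit == 1, k)
print(low, rng)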
Carbon monoxide screen for signalized intersections : COSIM, version 4.0 - technical documentation.
DOT National Transportation Integrated Search
2013-06-01
Illinois Carbon Monoxide Screen for Intersection Modeling (COSIM) Version 3.0 is a Windows-based computer program currently used by the Illinois Department of Transportation (IDOT) to estimate worst-case carbon monoxide (CO) concentrations near s...
Global climate change: The quantifiable sustainability challenge
Population growth and the pressures spawned by increasing demands for energy and resource-intensive goods, foods and services are driving unsustainable growth in greenhouse gas (GHG) emissions. Recent GHG emission trends are consistent with worst-case scenarios of the previous de...
NASA Astrophysics Data System (ADS)
Sripramai, Keerati; Oikawa, Yasushi; Watanabe, Hiroshi; Katada, Toshitaka
Generally, in order to improve regional fire fighting validity, the indispensable strategies are not only a reinforcement of governmental fire fighting ability, but also a strengthening of the cooperative relationship between governmental and non-governmental fire fighting abilities. For practical purposes, however, the effective strategy differs depending on the actual situation in the subject area. In this study, we therefore grasp the actual state and background of the problems that need to be solved to improve regional fire fighting validity in Bangkok as a case study, and examine the appropriate solution focusing on the relationship between official and voluntary fire fighting. Through practical activities such as interviews, investigations, and making a regional fire fighting validity map, it became clear that the problems of an uncooperative relationship and the lack of trust between stakeholders should be solved first and foremost.
14 CFR 125.161 - Fire-extinguishing systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 125.161 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... protection against destruction of the airplane in case of fire is provided by the use of fireproof materials... be provided to serve all designated fire zones. (b) Materials in the fire-extinguishing system must...
14 CFR 125.161 - Fire-extinguishing systems.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 125.161 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... protection against destruction of the airplane in case of fire is provided by the use of fireproof materials... be provided to serve all designated fire zones. (b) Materials in the fire-extinguishing system must...
NASA Astrophysics Data System (ADS)
Hudjimartsu, S. A.; Djatna, T.; Ambarwari, A.; Apriliantono
2017-01-01
Forest fires in Indonesia occur frequently in the dry season, and almost all of them are caused by human activity. The impacts of forest fires include the loss of biodiversity, pollution hazards, and harm to the economies of surrounding communities. Preventing fires requires suitable methods, one of which is spatial-temporal clustering. Spatial-temporal clustering groups the data so that the resulting clusters can be used as initial information for fire prevention. To analyze the fires, hotspot data are used as an early indicator of fire spots. Hotspot data, which consist of spatial and temporal dimensions, can be processed using Spatial Temporal Clustering with the Kulldorff Scan Statistic (KSS). The results of this research demonstrate the effectiveness of the KSS method for clustering spatial hotspots in a case study within Riau Province, producing two types of clusters: the most-likely cluster and secondary clusters. These clusters can be used as early fire-warning information.
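As a rough illustration of the scan-statistic idea behind KSS, the following is a purely spatial, simplified sketch: circular windows are scanned over the hotspot locations and scored with Kulldorff's Poisson log-likelihood ratio. All names and the uniform-baseline assumption are mine; the actual method also scans the temporal dimension with cylindrical windows.

import numpy as np

def poisson_llr(c, e, total):
    # Kulldorff's Poisson log-likelihood ratio for a candidate cluster
    # with c observed and e expected hotspots out of `total` points.
    if c <= e or c == total:
        return 0.0
    return c * np.log(c / e) + (total - c) * np.log((total - c) / (total - e))

def most_likely_cluster(points, study_area, radii):
    # Scan circles centered on each hotspot; expected counts assume
    # spatial uniformity over the study area.
    total, best = len(points), (0.0, None)
    for cx, cy in points:
        d = np.hypot(points[:, 0] - cx, points[:, 1] - cy)
        for r in radii:
            c = int((d <= r).sum())
            e = total * (np.pi * r * r) / study_area
            llr = poisson_llr(c, e, total)
            if llr > best[0]:
                best = (llr, (cx, cy, r))
    return best  # significance is judged via Monte Carlo replications in practice

pts = np.array([[0.1, 0.2], [0.15, 0.25], [0.12, 0.22], [0.8, 0.9], [0.5, 0.1]])
print(most_likely_cluster(pts, study_area=1.0, radii=[0.05, 0.1, 0.2]))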
14 CFR 121.263 - Fire-extinguishing systems.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 121.263 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... the airplane in case of fire is provided by the use of fireproof materials in the nacelle and other... designated fire zones. (b) Materials in the fire-extinguishing system must not react chemically with the...
14 CFR 121.263 - Fire-extinguishing systems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 121.263 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... the airplane in case of fire is provided by the use of fireproof materials in the nacelle and other... designated fire zones. (b) Materials in the fire-extinguishing system must not react chemically with the...
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter starts a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.
Selective robust optimization: A new intensity-modulated proton therapy optimization strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yupeng; Niemela, Perttu; Siljamaki, Sami
2015-08-15
Purpose: To develop a new robust optimization strategy for intensity-modulated proton therapy as an important step in translating robust proton treatment planning from research to clinical applications. Methods: In selective robust optimization, a worst-case-based robust optimization algorithm is extended, and terms of the objective function are selectively computed from either the worst-case dose or the nominal dose. Two lung cancer cases and one head and neck cancer case were used to demonstrate the practical significance of the proposed robust planning strategy. The lung cancer cases had minimal tumor motion of less than 5 mm and, for the demonstration of the methodology, are assumed to be static. Results: Selective robust optimization achieved robust clinical target volume (CTV) coverage and at the same time increased nominal planning target volume coverage to 95.8%, compared to the 84.6% coverage achieved with CTV-based robust optimization in one of the lung cases. In the other lung case, the maximum dose in selective robust optimization was lowered from a dose of 131.3% in the CTV-based robust optimization to 113.6%. Selective robust optimization provided robust CTV coverage in the head and neck case, and at the same time improved control over the isodose distribution so that clinical requirements may be readily met. Conclusions: Selective robust optimization may provide the flexibility and capability necessary for meeting various clinical requirements in addition to achieving the required plan robustness in practical proton treatment planning settings.
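To make the "selective" idea concrete, here is a minimal toy sketch of my own construction, not the authors' implementation: target-coverage terms are evaluated on the worst-case dose across error scenarios, while normal-tissue terms are evaluated on the nominal dose only. The influence matrices, penalty forms, and index sets are all illustrative assumptions.

import numpy as np

def selective_robust_objective(x, dose_mats, target_idx, oar_idx, d_presc, d_limit):
    # dose_mats[0] is the nominal influence matrix; the rest model setup/range errors.
    doses = np.stack([D @ x for D in dose_mats])        # (scenario, voxel)
    worst_target = doses[:, target_idx].min(axis=0)     # coldest CTV dose over scenarios
    nominal = doses[0]
    f_ctv = np.mean(np.maximum(d_presc - worst_target, 0.0) ** 2)   # robust underdose penalty
    f_oar = np.mean(np.maximum(nominal[oar_idx] - d_limit, 0.0) ** 2)  # nominal-only penalty
    return f_ctv + f_oar

rng = np.random.default_rng(1)
dose_mats = [rng.uniform(0, 1, (6, 4)) for _ in range(3)]   # 6 voxels, 4 beamlets, 3 scenarios
x = np.ones(4)
print(selective_robust_objective(x, dose_mats, np.arange(3), np.arange(3, 6), 2.0, 1.5))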
Suresh, R
2017-08-01
Pertinent marks on fired cartridge cases, such as those from the firing pin, breech face, extractor, and ejector, are used for firearm identification. A non-standard semiautomatic pistol and four .22 rimfire cartridges (head stamp KF) were used for a known-source comparison study. Two test-fired cartridge cases were examined under a stereomicroscope. The characteristic marks were captured by digital camera, and a comparative analysis of the striation marks was done using different tools available in Microsoft Word (Windows 8) on a computer system. The similarities of the striation marks thus obtained are highly convincing for identifying the firearm. In this paper, an effort has been made to study and compare the striation marks of two fired cartridge cases using a stereomicroscope, a digital camera and a computer system; a comparison microscope was not used in this study. The method described here is simple, cost effective, and portable for field study, and can be carried in a crime scene vehicle to facilitate immediate on-the-spot examination. The findings may be highly helpful to the forensic community, law enforcement agencies and students.
NASA Astrophysics Data System (ADS)
Stenzel, S.; Baumann-Stanzer, K.
2009-04-01
In the case of an accidental release of hazardous gases into the atmosphere, emergency responders need a reliable and fast tool to assess the possible consequences and apply the optimal countermeasures. For hazard prediction and simulation of the hazard zones, a number of air dispersion models are available. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for displaying the results; they are easy to use and can operate fast and effectively during stress situations. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, as well as for real-time risk assessment and management. There are also possibilities for direct coupling of the models to automatic meteorological stations, in order to avoid uncertainties in the model output due to insufficient or incorrect meteorological data. Another key problem in coping with an accidental toxic release is the relatively wide spectrum of regulations and threshold values, like IDLH, ERPG, AEGL, MAK etc., and the different criteria for their application. Since the particular emergency responders and organizations require unequal regulations and values for their purposes, it is quite difficult to predict the individual hazard areas. Quite a number of research studies and investigations cope with this problem; in any case, the final decision is up to the authorities. The research project RETOMOD (reference scenarios calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Vienna fire brigade, OMV Refining & Marketing GmbH and Synex Ries & Greßlehner GmbH. RETOMOD was funded by the KIRAS safety research program at the Austrian Ministry of Transport, Innovation and Technology (www.kiras.at). The main tasks of this project were: 1. Sensitivity study and optimization of the meteorological input for modeling of the hazard areas (human exposure) during accidental toxic releases. 2. Comparison of several model packages (based on reference scenarios) in order to estimate their utility for the fire brigades. This presentation introduces the models used in the project and presents the results of task 2. The results of task 1 are presented by Baumann-Stanzer and Stenzel in this session. For the purpose of this study, the following models were tested and compared: ALOHA (Areal Locations of Hazardous Atmospheres, EPA), MEMPLEX (Keudel av-Technik GmbH), Breeze (Trinity Consulting), SAFER Systems, SAM (Engineering office Lohmeyer) and COMPAS. A set of reference scenarios for chlorine, ammonia, butane and petrol was processed in order to reliably predict and estimate the human exposure during such an event. The models simulated the accidental release of the gases mentioned above and estimated the potential toxic areas. Since the input requirements differ from model to model, and the outputs are based on different criteria for toxic areas and exposure, a high degree of caution in the interpretation of the model results is needed.
Dimitroulopoulou, C; Lucica, E; Johnson, A; Ashmore, M R; Sakellaris, I; Stranger, M; Goelen, E
2015-12-01
Consumer products are frequently and regularly used in the domestic environment. Realistic estimates of product use are required for exposure modelling and health risk assessment. This paper provides significant data that can be used as input for such modelling studies. A European survey was conducted, within the framework of the DG Sanco-funded EPHECT project, on the household use of 15 consumer products. These products are all-purpose cleaners, kitchen cleaners, floor cleaners, glass and window cleaners, bathroom cleaners, furniture and floor polish products, combustible air fresheners, spray air fresheners, electric air fresheners, passive air fresheners, coating products for leather and textiles, hair styling products, spray deodorants and perfumes. The analysis of the results from the household survey (1st phase) focused on identifying consumer behaviour patterns (selection criteria, frequency of use, quantities, period of use and ventilation conditions during product use). This can provide valuable input to modelling studies, as this information is not reported in the open literature. These results were further analysed (2nd phase) to provide the basis for the development of 'most representative worst-case scenarios' for the use of the 15 products by home-based population groups (housekeepers and retired people) in four geographical regions of Europe. These scenarios will be used for the exposure and health risk assessment within the EPHECT project. To the best of our knowledge, this is the first time that daily worst-case scenarios concerning the use of a wide range of 15 consumer products across Europe are presented in the published scientific literature.
Analysis of Separation Corridors for Visiting Vehicles from the International Space Station
NASA Technical Reports Server (NTRS)
Zaczek, Mariusz P.; Schrock, Rita R.; Schrock, Mark B.; Lowman, Bryan C.
2011-01-01
The International Space Station (ISS) is a very dynamic vehicle with many operational constraints that affect its performance, operations, and vehicle lifetime. Most constraints are designed to alleviate various safety concerns that are a result of dynamic activities between the ISS and various Visiting Vehicles (VVs). One such constraint that has been in place for Russian Vehicle (RV) operations is the limitation placed on Solar Array (SA) positioning in order to prevent collisions during separation and subsequent relative motion of VVs. An unintended consequence of the SA constraint has been the impact on the operational flexibility of the ISS resulting from the reduced power generation capability as well as from a reduction in the operational lifetime of various SA components. The purpose of this paper is to discuss the technique and the analysis that were applied in order to relax the SA constraints for RV undockings, thereby both improving the ISS's operational flexibility and extending its lifetime for many years to come. This analysis focused on the effects of the dynamic motion that occurs both prior to and following RV separations. The analysis involved a parametric approach in the conservative application of various initial conditions and assumptions. These included the use of the worst-case minimum and maximum vehicle configurations, worst-case initial attitudes and attitude rates, and the worst-case docking port separation dynamics. Separations were calculated for multiple ISS docking ports, at varied deviations from the nominal undocking attitudes, and included the use of two separate attitude control schemes: continuous free-drift and a post-separation attitude hold. The analysis required numerical propagation of both the separation motion and the vehicle attitudes using 3-degree-of-freedom (DOF) relative motion equations coupled with rigid body rotational dynamics to generate a large set of separation trajectories.
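For intuition, separation motion relative to a vehicle in a near-circular orbit is commonly propagated with the linearized Clohessy-Wiltshire equations. The sketch below is a simplification under that assumption; the analysis described above additionally couples rigid-body attitude dynamics and ISS-specific geometry, and all numbers here are illustrative.

import math

def cw_position(x0, y0, z0, vx0, vy0, vz0, n, t):
    # Closed-form Clohessy-Wiltshire solution (x radial, y along-track, z cross-track).
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0 + (1 / n) * (4 * s - 3 * n * t) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

mu, a = 3.986004418e14, 6.778e6      # Earth GM; ~400 km circular orbit radius (m)
n = math.sqrt(mu / a ** 3)           # mean motion, rad/s
# Separation at 0.1 m/s along-track, sampled over 30 minutes (illustrative numbers)
for t in (0, 600, 1200, 1800):
    print(t, cw_position(0, 0, 0, 0, 0.1, 0, n, t))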
Mallinckrodt, C H; Lin, Q; Molenberghs, M
2013-01-01
The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = 0.013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework of using a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions, supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework.
Ecological risk estimation of organophosphorus pesticides in riverine ecosystems.
Wee, Sze Yee; Aris, Ahmad Zaharin
2017-12-01
Pesticides are of great concern because of their existence in ecosystems at trace concentrations. Worldwide pesticide use and its ecological impacts (i.e., altered environmental distribution and toxicity of pesticides) have increased over time. Exposure and toxicity studies are vital for reducing the extent of pesticide exposure and risk to the environment and humans. Regional regulatory actions may be less relevant in some regions because the contamination and distribution of pesticides vary across regions and countries. The risk quotient (RQ) method was applied to assess the potential risk of organophosphorus pesticides (OPPs), primarily focusing on riverine ecosystems. Using the available ecotoxicity data, aquatic risks from OPPs (diazinon and chlorpyrifos) in the surface water of the Langat River, Selangor, Malaysia were evaluated based on general (RQm) and worst-case (RQex) scenarios. Since the ecotoxicity of quinalphos has not been well established, quinalphos was excluded from the risk assessment. The calculated RQs indicate a medium risk (RQm = 0.17 and RQex = 0.66; 0.1 ≤ RQ < 1) for overall diazinon exposure. The overall chlorpyrifos exposure was observed to be at high risk (RQ ≥ 1), with RQm and RQex at 1.44 and 4.83, respectively. RQs > 1 (high risk) were observed for both the general and worst cases of chlorpyrifos, but only for the worst cases of diazinon, at all sites from the downstream to the upstream regions. Thus, chlorpyrifos posed a higher risk than diazinon along the Langat River, suggesting that organisms and humans could be exposed to potentially high levels of OPPs.
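The RQ arithmetic itself is simple; here is a hedged sketch. The risk bands follow the abstract (medium for 0.1 ≤ RQ < 1, high for RQ ≥ 1), while deriving the PNEC from a toxicity endpoint divided by an assessment factor is a common convention assumed here, not a detail taken from this study.

def risk_quotient(env_conc_ug_l, toxicity_endpoint_ug_l, assessment_factor=100.0):
    # RQ = MEC / PNEC, with PNEC derived from an ecotoxicity endpoint
    # divided by an assessment factor (the factor 100 is an assumption).
    pnec = toxicity_endpoint_ug_l / assessment_factor
    return env_conc_ug_l / pnec

def risk_band(rq):
    # Bands as used in the abstract: medium when 0.1 <= RQ < 1, high when RQ >= 1.
    if rq >= 1.0:
        return "high"
    if rq >= 0.1:
        return "medium"
    return "low"

# RQm uses the mean measured concentration, RQex the maximum (worst case).
for label, rq in [("diazinon RQm", 0.17), ("chlorpyrifos RQex", 4.83)]:
    print(label, risk_band(rq))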
Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach.
Zakov, Shay; Tsur, Dekel; Ziv-Ukelson, Michal
2011-08-18
RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. In more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a similar dynamic programming approach as that presented by Nussinov and Jacobson in the late 70's, which typically yields cubic worst case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. The algorithms presented in this paper improve the theoretical asymptotic worst case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones which are currently known for reducing the asymptotic running time bounds of the standard algorithms.
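For readers unfamiliar with the baseline these speedups target, the following is a minimal base-pair-maximization variant of the Nussinov dynamic program, showing the characteristic cubic loop structure the abstract refers to. It is an illustrative sketch (no minimum hairpin-loop length is enforced), not the paper's algorithm.

# O(n^3) Nussinov-style DP: maximize the number of nested base pairs.
PAIRS = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C"), ("G", "U"), ("U", "G")}

def nussinov(seq):
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # j left unpaired
            for k in range(i, j):                # j paired with k -> cubic time overall
                if (seq[k], seq[j]) in PAIRS:
                    left = dp[i][k - 1] if k > i else 0
                    inner = dp[k + 1][j - 1] if k + 1 <= j - 1 else 0
                    best = max(best, left + 1 + inner)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))  # a small hairpin-like example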
NASA Astrophysics Data System (ADS)
Wang, Tian; Cui, Xiaoxin; Ni, Yewen; Liao, Kai; Liao, Nan; Yu, Dunshan; Cui, Xiaole
2017-04-01
With shrinking transistor feature size, the fin-type field-effect transistor (FinFET) has become the most promising option in low-power circuit design due to its superior capability to suppress leakage. To support the VLSI digital system flow based on logic synthesis, we have designed an optimized high-performance low-power FinFET standard cell library based on employing the mixed FBB/RBB technique in the existing stacked structure of each cell. This paper presents the reliability evaluation of the optimized cells under process and operating-environment variations based on Monte Carlo analysis. The variations are modelled with Gaussian distributions of the device parameters, and 10000 sweeps are conducted in the simulation to obtain the statistical properties of the worst-case delay and input-dependent leakage for each cell. For comparison, a set of non-optimal cells that adopt the same topology without employing the mixed biasing technique is also generated. Experimental results show that the optimized cells achieve standard deviation reductions of up to 39.1% and 30.7% in worst-case delay and input-dependent leakage, respectively, while the shrinking of the normalized deviation in worst-case delay and input-dependent leakage can be up to 98.37% and 24.13%, respectively, which demonstrates that our optimized cells are less sensitive to variability and exhibit more reliability. Project supported by the National Natural Science Foundation of China (No. 61306040), the State Key Development Program for Basic Research of China (No. 2015CB057201), the Beijing Natural Science Foundation (No. 4152020), and the Natural Science Foundation of Guangdong Province, China (No. 2015A030313147).
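The statistical procedure is straightforward to emulate. The toy Monte Carlo sketch below samples Gaussian device-parameter variations and reports the spread of a delay metric; the parameter values and the alpha-power delay model are illustrative assumptions, not the paper's extracted FinFET models, and "normalized deviation" is read here as sigma/mu.

import numpy as np

rng = np.random.default_rng(0)
N = 10000                                 # sweeps, as in the paper

# Assumed Gaussian process variations (illustrative values)
vth = rng.normal(0.30, 0.02, N)           # threshold voltage, V
leff = rng.normal(14e-9, 0.7e-9, N)       # effective channel length, m

# Toy alpha-power-law gate delay: t ~ L / (Vdd - Vth)^alpha
vdd, alpha = 0.8, 1.3
delay = leff / (vdd - vth) ** alpha

mu, sigma = delay.mean(), delay.std()
print("std dev:", sigma)
print("normalized deviation (sigma/mu):", sigma / mu)
print("worst case over sweeps:", delay.max())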
Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach
2011-01-01
Background RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. In more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a similar dynamic programming approach as that presented by Nussinov and Jacobson in the late 70's, which typically yields cubic worst case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. Results We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. Conclusions The algorithms presented in this paper improve the theoretical asymptotic worst case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones which are currently known for reducing the asymptotic running time bounds of the standard algorithms. PMID:21851589
Walser, Tobias; Juraske, Ronnie; Demou, Evangelia; Hellweg, Stefanie
2014-01-01
A pronounced presence of toluene from rotogravure-printed matter has been frequently observed indoors. However, its consequences for human health over the life cycle of magazines are poorly known. Therefore, we quantified human-health risks in indoor environments with Risk Assessment (RA) and impacts relative to the total impact of toxic releases occurring in the life cycle of a magazine with Life Cycle Assessment (LCA). We used a one-box indoor model to estimate toluene concentrations in printing facilities, newsstands, and residences in best, average, and worst-case scenarios. The modeled concentrations are in the range of the values measured in on-site campaigns. Toluene concentrations can approach or even surpass the occupational legal thresholds in printing facilities in realistic worst-case scenarios. The concentrations in homes can surpass the US EPA reference dose (69 μg/kg/day) in worst-case scenarios, but are still at least 1 order of magnitude lower than in press rooms or newsstands. However, toluene inhaled at home becomes the dominant contribution to the total potential human-toxicity impacts of toluene from printed matter when assessed with LCA, using the USEtox method complemented with indoor characterization factors for toluene. The significant contribution (44%) of toluene exposure in production, retail, and use in households to the total life cycle impact of a magazine in the category of human toxicity demonstrates that the indoor compartment requires particular attention in LCA. While RA works with threshold levels, LCA assumes that every toxic emission causes an incremental change to the total impact. Here, the combination of the two paradigms provides valuable information on the life cycle stages of printed matter.
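A one-box indoor model of the kind mentioned reduces, at steady state, to dividing the emission rate by the ventilation flow. The sketch below uses illustrative numbers; the emission rate, room volume, and air-exchange rate are assumptions, not the study's inputs.

def one_box_steady_state(emission_mg_per_h, volume_m3, air_changes_per_h):
    # Steady-state well-mixed concentration: C = G / (V * ACH), in mg/m3.
    return emission_mg_per_h / (volume_m3 * air_changes_per_h)

# Hypothetical residence with a fresh rotogravure magazine as the source:
c = one_box_steady_state(emission_mg_per_h=5.0, volume_m3=50.0, air_changes_per_h=0.5)
print(f"indoor toluene ~ {c * 1000:.0f} ug/m3")   # 5/(50*0.5) = 0.2 mg/m3 = 200 ug/m3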
Boehmler, Erick M.; Degnan, James R.
1997-01-01
year discharges. In addition, the incipient roadway-overtopping discharge is determined and analyzed as another potential worst-case scour scenario. Total scour at a highway crossing comprises three components: 1) long-term streambed degradation; 2) contraction scour (due to accelerated flow caused by a reduction in flow area at a bridge); and 3) local scour (caused by accelerated flow around piers and abutments). Total scour is the sum of the three components. Equations are available to compute depths for contraction and local scour, and a summary of the results of these computations follows. Contraction scour for all modelled flows ranged from 1.2 to 1.8 feet. The worst-case contraction scour occurred at the incipient overtopping discharge, which is less than the 500-year discharge. Abutment scour ranged from 17.7 to 23.7 feet. The worst-case abutment scour occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in tables 1 and 2. A cross-section of the scour computed at the bridge is presented in figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. It is generally accepted that the Froehlich equation (abutment scour) gives “excessively conservative estimates of scour depths” (Richardson and others, 1995, p. 47). Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
A fire-affected community: the case of the West Melton fire, December 2003
Laura Kelly; Pamela J. Jakes; E.R. Langer
2008-01-01
A study of a fire-affected community was conducted to gain an understanding of the social ramifications of a wildfire event and to analyse how one community responded to this type of disaster. The research focused on the West Melton community, located just west of Christchurch, in the Selwyn District. This is the first in a series of case studies planned by the Scion...
Geoffrey H. Donovan
2006-01-01
Federal land management agencies in the United States are increasingly relying on contract crews as opposed to agency fire crews. Despite this increasing reliance on contractors, there have been no studies to determine what the optimal mix of contract and agency fire crews should be. A mathematical model is presented to address this question and is applied to a case...
Hayman Fire case study: Summary [RMRS-GTR-114
Russell T. Graham
2003-01-01
Historically, wildfires burned Western forests, creating and maintaining a variety of forest compositions and structures (Agee 1993). Prior to European settlement, lightning, along with Native Americans, ignited fires routinely across many forested landscapes. After Euro-American settlement, fires continued to be quite common, with fires ignited by settlers, railroads, and...
30 CFR 77.1101 - Escape and evacuation; plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Fire Protection § 77.1101 Escape and evacuation; plan. (a) Before September 30, 1971, each operator of... event of a fire. (b) All employees shall be instructed on current escape and evacuation plans, fire alarm signals, and applicable procedures to be followed in case of fire. (c) Plans for escape and...
30 CFR 77.1101 - Escape and evacuation; plan.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Fire Protection § 77.1101 Escape and evacuation; plan. (a) Before September 30, 1971, each operator of... event of a fire. (b) All employees shall be instructed on current escape and evacuation plans, fire alarm signals, and applicable procedures to be followed in case of fire. (c) Plans for escape and...
30 CFR 77.1101 - Escape and evacuation; plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Fire Protection § 77.1101 Escape and evacuation; plan. (a) Before September 30, 1971, each operator of... event of a fire. (b) All employees shall be instructed on current escape and evacuation plans, fire alarm signals, and applicable procedures to be followed in case of fire. (c) Plans for escape and...
30 CFR 77.1101 - Escape and evacuation; plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Fire Protection § 77.1101 Escape and evacuation; plan. (a) Before September 30, 1971, each operator of... event of a fire. (b) All employees shall be instructed on current escape and evacuation plans, fire alarm signals, and applicable procedures to be followed in case of fire. (c) Plans for escape and...
Effects of fire damage on the structural properties of steel bridge elements.
DOT National Transportation Integrated Search
2011-04-30
It is well known that fire can cause severe damage to steel bridges. There are documented cases where fire has directly led to the collapse or significant sagging of a steel bridge. However, when the damage is less severe, the effects of the fire, if...
This paper examines the use of Moderate Resolution Imaging Spectroradiometer (MODIS) observed active fire data (pixel counts) to refine the National Emissions Inventory (NEI) fire emission estimates for major wildfire events. This study was motivated by the extremely limited info...
49 CFR 176.69 - General stowage requirements for hazardous materials.
Code of Federal Regulations, 2010 CFR
2010-10-01
... equipped with a fixed fire extinguishing and fire detection system, the freight containers or barges need... by paragraph (a) of this section if fire fighting equipment capable of reaching and piercing the..., their removal from a potentially dangerous situation, and the removal of packages in case of fire. (b...
MINEMOTION3D: A new set of Programs for Predicting Ground Motion From Explosions in Complex 3D Media
NASA Astrophysics Data System (ADS)
Tibuleac, I. M.; Bonner, J. L.; Orrey, J. L.; Yang, X.
2004-12-01
Predicting ground motion from complicated mining explosions is important for mines developing new blasting programs in regions where vibrations must be kept below certain levels. Additionally, predicting ground motion from mining explosions in complex 3D media is important for moment estimation in nuclear test treaty monitoring. Both problems have been addressed in the development of a new series of numerical prediction programs called MINEMOTION3D, including 1) Generalized Fourier Methods to generate Green's functions in 3D media for a moment tensor source implementation and 2) MineSeis3D, a program that simulates seismograms for delay-fired mining explosions assuming a linear relationship between signals from small individual shots. To test the programs, local recordings (5-23 km) of three production shots at a mine in northern Minnesota were compared to synthetic waveforms in 3D media. A non-zero value of the moment tensor component M12 was considered, to introduce a horizontal spall component into the waveform synthesis when the Green's functions were generated for each model. Methods using seismic noise cross-correlation for improved inter-element subsurface structure estimation were also evaluated. Comparison of the observed and synthetic waveforms shows promising results. The shape and arrival times of the normalized synthetic and observed waveforms are similar for most of the stations. The synthetic and observed waveform amplitude fit is best for the vertical components in the mean 3D model and worst for the transverse components. The observed effect of spall on the waveform spectra was weak in the case of fragmentation delay-fired commercial explosions. Commercial applications of the code could provide data needed for designing explosions that do not exceed the ground vibration requirements posed by the U.S. Department of the Interior, Office of Surface Mining.
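The stated linear relationship between a delay-fired shot and its constituent single shots amounts to superposing time-shifted, scaled copies of a single-charge seismogram. The numpy sketch below is my illustration of that superposition assumption, not MineSeis3D's actual code.

import numpy as np

def delay_fired_seismogram(single_shot, delays_s, scales, fs):
    # Linear superposition: shift the single-shot waveform by each firing
    # delay (sampled at fs Hz), scale it, and sum.
    out = np.zeros(len(single_shot) + int(round(max(delays_s) * fs)))
    for delay, scale in zip(delays_s, scales):
        i = int(round(delay * fs))
        out[i:i + len(single_shot)] += scale * single_shot
    return out

fs = 100.0                                           # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
single = np.sin(2 * np.pi * 8 * t) * np.exp(-5 * t)  # toy single-charge wavelet
row = delay_fired_seismogram(single, delays_s=[0.0, 0.025, 0.05, 0.075],
                             scales=[1, 1, 1, 1], fs=fs)
print(len(row), row.max())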
Evaluating the risk from depleted uranium after the Boeing 747-258F crash in Amsterdam, 1992.
Uijt de Haag, P A; Smetsers, R C; Witlox, H W; Krüs, H W; Eisenga, A H
2000-08-28
On 4 October 1992, a large cargo plane crashed into an apartment building in the Bijlmermeer quarter of Amsterdam. In the years following the accident, an increasing number of people started reporting health complaints, which they attributed to exposure to dangerous substances after the crash. Since the aircraft had been carrying depleted uranium as counterbalance weights and about 150 kg uranium had been found missing after clearance of the crash site, exposure to uranium oxide particles was pointed out as the possible cause of their health complaints. Six years after the accident, a risk analysis was therefore carried out to investigate whether the health complaints could be attributed to exposure to uranium oxide set free during the accident. The scientific challenge was to come up with reliable results, knowing that - considering the late date - virtually no data were available to validate any calculated result. The source term of uranium was estimated using both generic and specific data. Various dispersion models were applied in combination with the local setting and the meteorological conditions at the time of the accident to estimate the exposure of bystanders during the fire caused by the crash. Emphasis was given to analysing the input parameters, inter-comparing the various models and comparing model results with the scarce information available. Uranium oxide formed in the fire has a low solubility, making the chemical toxicity to humans less important than the radiotoxicity. Best-estimate results indicated that bystanders may have been exposed to a radiation dose of less than 1 microSv, whereas a worst-case approach indicated an upper limit of less than 1 mSv. This value is considerably less than the radiation dose for which acute effects are to be expected. It is therefore considered to be improbable that the missing uranium had indeed led to the health complaints reported.
Fire Safety for the Oral and Maxillofacial Surgeon and Surgical Staff.
Di Pasquale, LisaMarie; Ferneini, Elie M
2017-05-01
Fire in the operating room is a life-threatening emergency that demands quick, efficient intervention. Because the circumstances surrounding fires are generally well understood, virtually every operating room fire is preventable. Before every operating room case, thorough preprocedure "time outs" should address each team member's awareness of specific fire risks and agreement regarding fire concerns and emergency actions. Fire prevention centers on the 3 constituent parts of the fire triad necessary for fire formation. Regular fire drills should guide policies and procedures to prevent surgical fires. Delivering optimal patient care in emergent situations requires surgical team training and practicing emergency roles and specific actions.
... reaction can vary from mild to severe. In rare cases, the person with the rash needs to be treated in the hospital. The worst symptoms are often seen during days 4 to 7 after coming in contact with the plant. The rash may last for 1 to 3 ...
Closed Environment Module - Modularization and extension of the Virtual Habitat
NASA Astrophysics Data System (ADS)
Plötner, Peter; Czupalla, Markus; Zhukov, Anton
2013-12-01
The Virtual Habitat (V-HAB) is a Life Support System (LSS) simulation created to perform dynamic simulations of LSSs for future human spaceflight missions. It allows the testing of LSS robustness by means of computer simulations, e.g. of worst-case scenarios.
49 CFR 238.431 - Brake system.
Code of Federal Regulations, 2011 CFR
2011-10-01
... train is operating under worst-case adhesion conditions. (b) The brake system shall be designed to allow... a brake rate consistent with prevailing adhesion, passenger safety, and brake system thermal... adhesion control system designed to automatically adjust the braking force on each wheel to prevent sliding...
40 CFR 300.135 - Response operations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PLANNING, AND COMMUNITY RIGHT-TO-KNOW PROGRAMS NATIONAL OIL AND HAZARDOUS SUBSTANCES POLLUTION CONTINGENCY... discharge is a worst case discharge as discussed in § 300.324; the pathways to human and environmental exposure; the potential impact on human health, welfare, and safety and the environment; whether the...
Management of reliability and maintainability; a disciplined approach to fleet readiness
NASA Technical Reports Server (NTRS)
Willoughby, W. J., Jr.
1981-01-01
Material acquisition fundamentals are reviewed, including mission profile definition, stress analysis, derating criteria, circuit reliability, failure modes, and worst-case analysis. Military system reliability is examined with emphasis on the sparing of equipment. The Navy's organizational strategy for 1980 is presented.
Empirical Modeling Of Single-Event Upset
NASA Technical Reports Server (NTRS)
Zoutendyk, John A.; Smith, Lawrence S.; Soli, George A.; Thieberger, Peter; Smith, Stephen L.; Atwood, Gregory E.
1988-01-01
This experimental study presents examples of empirical modeling of single-event upset in negatively-doped source/drain metal-oxide-semiconductor static random-access memory cells. The data support the adoption of a simplified worst-case model in which the cross section for SEU by an ion above the threshold energy equals the area of the memory cell.
Kennedy, Reese D; Cheavegatti-Gianotto, Adriana; de Oliveira, Wladecir S; Lirette, Ronald P; Hjelle, Jerry J
2018-01-01
Insect-protected sugarcane that expresses Cry1Ab has been developed in Brazil. Analysis of trade information has shown that effectively all the sugarcane-derived Brazilian exports are raw or refined sugar and ethanol. The fact that raw and refined sugar are highly purified food ingredients, with no detectable transgenic protein, provides an interesting case study for a generalized safety assessment approach. In this study, both the theoretical protein intakes and the safety assessments of the Cry1Ab, Cry1Ac, NPTII, and Bar proteins used in insect-protected biotechnology crops were examined. The potential consumption of these proteins was examined using local market research data on average added-sugar intakes in eight diverse and representative Brazilian raw and refined sugar export markets (Brazil, Canada, China, Indonesia, India, Japan, Russia, and the USA). The average sugar intakes, which ranged from 5.1 g of added sugar/person/day (India) to 126 g/person/day (USA), were used to calculate possible human exposure. The theoretical protein intake estimates were carried out under two scenarios: the "Worst-case" scenario assumed that 1 μg of newly-expressed protein is detected per g of raw or refined sugar, and the "Reasonable-case" scenario assumed 1 ng of protein per g of sugar. The "Worst-case" scenario was based on the results of detailed studies of sugarcane processing in Brazil, which showed that refined sugar contains less than 1 μg of total plant protein per g of refined sugar. The "Reasonable-case" scenario was based on the assumption that the expression levels of newly-expressed proteins in the stalk were less than 0.1% of total stalk protein. Using these calculated protein intake values from the consumption of sugar, along with the accepted NOAEL levels of the four representative proteins, we concluded that safety margins for the "Worst-case" scenario ranged from 6.9 × 10⁵ to 5.9 × 10⁷ and for the "Reasonable-case" scenario from 6.9 × 10⁸ to 5.9 × 10¹⁰. These safety margins are very high due to the extremely low possible exposures and the high NOAELs for these non-toxic proteins. This generalized approach to the safety assessment of highly purified food ingredients like sugar illustrates that sugar processed from Brazilian GM varieties is safe for consumption in representative markets globally.
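The margin arithmetic can be reproduced in a few lines. In this sketch the NOAEL and body weight are hypothetical placeholders (the abstract does not give them), while the sugar intake and protein concentrations follow the scenarios above.

def safety_margin(noael_mg_kg_day, sugar_g_day, protein_ug_per_g_sugar, body_wt_kg=60.0):
    # Intake (mg/kg/day) = sugar intake * assumed protein concentration / body weight.
    intake_mg_kg_day = sugar_g_day * protein_ug_per_g_sugar * 1e-3 / body_wt_kg
    return noael_mg_kg_day / intake_mg_kg_day

# Hypothetical NOAEL of 2000 mg/kg/day and 60 kg body weight (assumptions).
worst = safety_margin(2000.0, 126.0, 1.0)        # "Worst-case": 1 ug protein/g sugar
reasonable = safety_margin(2000.0, 126.0, 1e-3)  # "Reasonable-case": 1 ng protein/g sugar
print(f"worst-case margin ~ {worst:.1e}, reasonable-case ~ {reasonable:.1e}")

With these placeholder values the worst-case margin comes out near 10⁶, the same order of magnitude as the lower end of the range reported above.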
Vapor Hydrogen Peroxide as Alternative to Dry Heat Microbial Reduction
NASA Technical Reports Server (NTRS)
Cash, Howard A.; Kern, Roger G.; Chung, Shirley Y.; Koukol, Robert C.; Barengoltz, Jack B.
2006-01-01
The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide (VHP) sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPG 8020.12C as a low-temperature technique complementary to the dry heat sterilization process. A series of experiments was conducted in vacuum to determine VHP process parameters that provided significant reductions in spore viability while allowing survival of sufficient spores for statistically significant enumeration. With this knowledge of D values, sensible margins can be applied in a planetary protection specification. The outcome of this study was an optimization of test sterilizer process conditions: VHP concentration, process duration, a process temperature range for which the worst-case D value may be imposed, a process humidity range for which the worst-case D value may be imposed, and robustness to selected spacecraft material substrates.
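D values summarize first-order spore inactivation: each D minutes of exposure cuts the surviving population tenfold. A small sketch follows; the spore load and D value are hypothetical, not the study's measured VHP parameters.

def surviving_spores(n0, exposure_min, d_value_min):
    # Log-linear inactivation: N(t) = N0 * 10^(-t/D).
    return n0 * 10.0 ** (-exposure_min / d_value_min)

# Hypothetical: 1e6 spores, D = 30 min under the chosen VHP conditions.
for t in (30, 60, 120):
    print(t, "min ->", surviving_spores(1e6, t, 30.0), "spores")

With these placeholder numbers, a 120 min exposure is a 4-log reduction leaving about 100 survivors, i.e. still enough spores for statistically significant enumeration, which is the experimental design point mentioned above.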
Mahfouz, Zaher; Verloock, Leen; Joseph, Wout; Tanghe, Emmeric; Gati, Azeddine; Wiart, Joe; Lautru, David; Hanna, Victor Fouad; Martens, Luc
2013-12-01
The influence of temporal daily exposure to global system for mobile communications (GSM) and universal mobile telecommunications system with high-speed downlink packet access (UMTS-HSDPA) signals is investigated using spectrum analyser measurements in two countries, France and Belgium. Temporal variations and traffic distributions are investigated. Three different methods to estimate maximal electric-field exposure are compared. The maximal realistic (99 %) and the maximal theoretical extrapolation factors used to extrapolate the measured broadcast control channel (BCCH) for GSM and the common pilot channel (CPICH) for UMTS are presented and compared for the first time in the two countries. Similar conclusions are found in the two countries for both urban and rural areas: worst-case exposure assessment overestimates realistic maximal exposure by up to 5.7 dB for the considered example. In France, the values are the highest because of the higher population density. The results for the maximal realistic extrapolation factor on weekdays are similar to those for weekend days.
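The extrapolation logic is worth spelling out: the BCCH (GSM) and CPICH (UMTS) are transmitted at constant power, so a worst-case field is obtained by scaling the measured pilot field up to full-power operation, and since fields add in power the scaling is a square root. The sketch below is a generic rendering with assumed parameters (carrier count, pilot power fraction), not the paper's exact factors.

import math

def gsm_worst_case_field(e_bcch, n_carriers):
    # Worst case: all carriers transmit at the (constant) BCCH power.
    return e_bcch * math.sqrt(n_carriers)

def umts_worst_case_field(e_cpich, cpich_power_fraction=0.1):
    # CPICH carrying ~10% of maximum Node B power is an assumption here.
    return e_cpich / math.sqrt(cpich_power_fraction)

def factor_db(e_max, e_meas):
    # Extrapolation factor expressed in dB on field strength.
    return 20.0 * math.log10(e_max / e_meas)

e = 0.5                                   # measured BCCH field, V/m (illustrative)
e_max = gsm_worst_case_field(e, n_carriers=4)
print(f"theoretical extrapolation factor: {factor_db(e_max, e):.1f} dB")  # 6.0 dB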
Full band all-sky search for periodic gravitational waves in the O1 LIGO data
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Afrough, M.; Agarwal, B.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Allen, B.; Allen, G.; Allocca, A.; Altin, P. A.; Amato, A.; Ananyeva, A.; Anderson, S. B.; Anderson, W. G.; Angelova, S. V.; Antier, S.; Appert, S.; Arai, K.; Araya, M. C.; Areeda, J. S.; Arnaud, N.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Atallah, D. V.; Aufmuth, P.; Aulbert, C.; AultONeal, K.; Austin, C.; Avila-Alvarez, A.; Babak, S.; Bacon, P.; Bader, M. K. M.; Bae, S.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Banagiri, S.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barkett, K.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Bawaj, M.; Bayley, J. C.; Bazzan, M.; Bécsy, B.; Beer, C.; Bejger, M.; Belahcene, I.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Bero, J. J.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Billman, C. R.; Birch, J.; Birney, R.; Birnholtz, O.; Biscans, S.; Biscoveanu, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackman, J.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bode, N.; Boer, M.; Bogaert, G.; Bohe, A.; Bondu, F.; Bonilla, E.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bossie, K.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Broida, J. E.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Bustillo, J. Calderón; Callister, T. A.; Calloni, E.; Camp, J. B.; Canepa, M.; Canizares, P.; Cannon, K. C.; Cao, H.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Carney, M. F.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerdá-Durán, P.; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chase, E.; Chassande-Mottin, E.; Chatterjee, D.; Cheeseboro, B. D.; Chen, H. Y.; Chen, X.; Chen, Y.; Cheng, H.-P.; Chia, H. Y.; Chincarini, A.; Chiummo, A.; Chmiel, T.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, A. J. K.; Chua, S.; Chung, A. K. W.; Chung, S.; Ciani, G.; Ciecielag, P.; Ciolfi, R.; Cirelli, C. E.; Cirone, A.; Clara, F.; Clark, J. A.; Clearwater, P.; Cleva, F.; Cocchieri, C.; Coccia, E.; Cohadon, P.-F.; Cohen, D.; Colla, A.; Collette, C. G.; Cominsky, L. R.; Constancio, M.; Conti, L.; Cooper, S. J.; Corban, P.; Corbitt, T. R.; Cordero-Carrión, I.; Corley, K. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, E. T.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Covas, P. B.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Creighton, J. D. E.; Creighton, T. D.; Cripe, J.; Crowder, S. G.; Cullen, T. J.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Dálya, G.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dasgupta, A.; Da Silva Costa, C. F.; Dattilo, V.; Dave, I.; Davier, M.; Davis, D.; Daw, E. 
J.; Day, B.; De, S.; DeBra, D.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Demos, N.; Denker, T.; Dent, T.; De Pietri, R.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; De Rossi, C.; DeSalvo, R.; de Varona, O.; Devenson, J.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Renzo, F.; Doctor, Z.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dorosh, O.; Dorrington, I.; Douglas, R.; Dovale Álvarez, M.; Downes, T. P.; Drago, M.; Dreissigacker, C.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dupej, P.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Eisenstein, R. A.; Essick, R. C.; Estevez, D.; Etienne, Z. B.; Etzel, T.; Evans, M.; Evans, T. M.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Farinon, S.; Farr, B.; Farr, W. M.; Fauchon-Jones, E. J.; Favata, M.; Fays, M.; Fee, C.; Fehrmann, H.; Feicht, J.; Fejer, M. M.; Fernandez-Galiana, A.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Finstad, D.; Fiori, I.; Fiorucci, D.; Fishbach, M.; Fisher, R. P.; Fitz-Axen, M.; Flaminio, R.; Fletcher, M.; Fong, H.; Font, J. A.; Forsyth, P. W. F.; Forsyth, S. S.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fries, E. M.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H.; Gadre, B. U.; Gaebel, S. M.; Gair, J. R.; Gammaitoni, L.; Ganija, M. R.; Gaonkar, S. G.; Garcia-Quiros, C.; Garufi, F.; Gateley, B.; Gaudio, S.; Gaur, G.; Gayathri, V.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; George, D.; George, J.; Gergely, L.; Germain, V.; Ghonge, S.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glover, L.; Goetz, E.; Goetz, R.; Gomes, S.; Goncharov, B.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Gretarsson, E. M.; Groot, P.; Grote, H.; Grunewald, S.; Gruning, P.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Halim, O.; Hall, B. R.; Hall, E. D.; Hamilton, E. Z.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hannuksela, O. A.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Haster, C.-J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinderer, T.; Hoak, D.; Hofman, D.; Holt, K.; Holz, D. E.; Hopkins, P.; Horst, C.; Hough, J.; Houston, E. A.; Howell, E. J.; Hreibi, A.; Hu, Y. M.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Inta, R.; Intini, G.; Isa, H. N.; Isac, J.-M.; Isi, M.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Junker, J.; Kalaghatgi, C. V.; Kalogera, V.; Kamai, B.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kapadia, S. J.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katolik, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kawabe, K.; Kéfélian, F.; Keitel, D.; Kemball, A. J.; Kennedy, R.; Kent, C.; Key, J. S.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chunglee; Kim, J. C.; Kim, K.; Kim, W.; Kim, W. 
S.; Kim, Y.-M.; Kimbrell, S. J.; King, E. J.; King, P. J.; Kinley-Hanlon, M.; Kirchhoff, R.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Knowles, T. D.; Koch, P.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Krämer, C.; Kringel, V.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, P.; Kumar, R.; Kumar, S.; Kuo, L.; Kutynia, A.; Kwang, S.; Lackey, B. D.; Lai, K. H.; Landry, M.; Lang, R. N.; Lange, J.; Lantz, B.; Lanza, R. K.; Lartaux-Vollard, A.; Lasky, P. D.; Laxen, M.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, H. W.; Lee, K.; Lehmann, J.; Lenon, A.; Leonardi, M.; Leroy, N.; Letendre, N.; Levin, Y.; Li, T. G. F.; Linker, S. D.; Littenberg, T. B.; Liu, J.; Lo, R. K. L.; Lockerbie, N. A.; London, L. T.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lovelace, G.; Lück, H.; Lumaca, D.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Macas, R.; Macfoy, S.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña Hernandez, I.; Magaña-Sandoval, F.; Magaña Zertuche, L.; Magee, R. M.; Majorana, E.; Maksimovic, I.; Man, N.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markakis, C.; Markosyan, A. S.; Markowitz, A.; Maros, E.; Marquina, A.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Mason, K.; Massera, E.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matas, A.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McCuller, L.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McNeill, L.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Mehmet, M.; Meidam, J.; Mejuto-Villa, E.; Melatos, A.; Mendell, G.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, B. B.; Miller, J.; Millhouse, M.; Milovich-Goff, M. C.; Minazzoli, O.; Minenkov, Y.; Ming, J.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moffa, D.; Moggi, A.; Mogushi, K.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Muñiz, E. A.; Muratore, M.; Murray, P. G.; Napier, K.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Neilson, J.; Nelemans, G.; Nelson, T. J. N.; Nery, M.; Neunzert, A.; Nevin, L.; Newport, J. M.; Newton, G.; Ng, K. Y.; Nguyen, T. T.; Nichols, D.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Noack, A.; Nocera, F.; Nolting, D.; North, C.; Nuttall, L. K.; Oberling, J.; O'Dea, G. D.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Okada, M. A.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; Ormiston, R.; Ortega, L. F.; O'Shaughnessy, R.; Ossokine, S.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; Pace, A. E.; Page, J.; Page, M. A.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, Howard; Pan, Huang-Wei; Pang, B.; Pang, P. T. H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Parida, A.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patil, M.; Patricelli, B.; Pearlstone, B. 
L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perez, C. J.; Perreca, A.; Perri, L. M.; Pfeiffer, H. P.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pirello, M.; Pisarski, A.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Pratt, J. W. W.; Pratten, G.; Predoi, V.; Prestegard, T.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L. G.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rajbhandari, B.; Rakhmanov, M.; Ramirez, K. E.; Ramos-Buades, A.; Rapagnani, P.; Raymond, V.; Razzano, M.; Read, J.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Ren, W.; Reyes, S. D.; Ricci, F.; Ricker, P. M.; Rieger, S.; Riles, K.; Rizzo, M.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romel, C. L.; Romie, J. H.; Rosińska, D.; Ross, M. P.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Rutins, G.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L. M.; Sanchez, E. J.; Sanchez, L. E.; Sanchis-Gual, N.; Sandberg, V.; Sanders, J. R.; Sassolas, B.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Scheel, M.; Scheuer, J.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schulte, B. W.; Schutz, B. F.; Schwalbe, S. G.; Scott, J.; Scott, S. M.; Seidel, E.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Shaddock, D. A.; Shaffer, T. J.; Shah, A. A.; Shahriar, M. S.; Shaner, M. B.; Shao, L.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Singer, L. P.; Singh, A.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, B.; Smith, J. R.; Smith, R. J. E.; Somala, S.; Son, E. J.; Sonnenberg, J. A.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Spencer, A. P.; Srivastava, A. K.; Staats, K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stevenson, S. P.; Stone, R.; Stops, D. J.; Strain, K. A.; Stratta, G.; Strigin, S. E.; Strunk, A.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sunil, S.; Suresh, J.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Tait, S. C.; Talbot, C.; Talukder, D.; Tanner, D. B.; Tao, D.; Tápai, M.; Taracchini, A.; Tasson, J. D.; Taylor, J. A.; Taylor, R.; Tewari, S. V.; Theeg, T.; Thies, F.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tonelli, M.; Tornasi, Z.; Torres-Forné, A.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trinastic, J.; Tringali, M. C.; Trozzo, L.; Tsang, K. W.; Tse, M.; Tso, R.; Tsukada, L.; Tsuna, D.; Tuyenbayev, D.; Ueno, K.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Varma, V.; Vass, S.; Vasúth, M.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Venugopalan, G.; Verkindt, D.; Vetrano, F.; Viceré, A.; Viets, A. D.; Vinciguerra, S.; Vine, D. 
J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walet, R.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, J. Z.; Wang, W. H.; Wang, Y. F.; Ward, R. L.; Warner, J.; Was, M.; Watchi, J.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Wessel, E. K.; Weßels, P.; Westerweck, J.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Whittle, C.; Wilken, D.; Williams, D.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Woehler, J.; Wofford, J.; Wong, W. K.; Worden, J.; Wright, J. L.; Wu, D. S.; Wysocki, D. M.; Xiao, S.; Yamamoto, H.; Yancey, C. C.; Yang, L.; Yap, M. J.; Yazback, M.; Yu, Hang; Yu, Haocun; Yvert, M.; Zadroźny, A.; Zanolin, M.; Zelenova, T.; Zendri, J.-P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, T.; Zhang, Y.-H.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, S. J.; Zhu, X. J.; Zucker, M. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration
2018-05-01
We report on a new all-sky search for periodic gravitational waves in the frequency band 475-2000 Hz and with a frequency time derivative in the range of [-1.0, +0.1] × 10^-8 Hz/s. Potential signals could be produced by a nearby spinning and slightly nonaxisymmetric isolated neutron star in our Galaxy. This search uses the data from Advanced LIGO's first observational run, O1. No gravitational-wave signals were observed, and upper limits were placed on their strengths. For completeness, results from the separately published low-frequency search (20-475 Hz) are included as well. Our lowest upper limit on the worst-case (linearly polarized) strain amplitude h0 is ~4 × 10^-25 near 170 Hz, while at the high end of our frequency range we achieve a worst-case upper limit of 1.3 × 10^-24. For a circularly polarized source (most favorable orientation), the smallest upper limit obtained is ~1.5 × 10^-25.
Quantum systems as embarrassed colleagues: what do tax evasion and state tomography have in common?
NASA Astrophysics Data System (ADS)
Ferrie, Chris; Blume-Kohout, Robin
2011-03-01
Quantum state estimation (a.k.a. ``tomography'') plays a key role in designing quantum information processors. As a problem, it resembles probability estimation - e.g. for classical coins or dice - but with some subtle and important discrepancies. We demonstrate an improved classical analogue that captures many of these differences: the ``noisy coin.'' Observations on noisy coins are unreliable - much like soliciting sensitive information such as one's tax preparation habits. So, like a quantum system, it cannot be sampled directly. Unlike standard coins or dice, whose worst-case estimation risk scales as 1/N for all states, noisy coins (and quantum states) have a worst-case risk that scales as 1/√N and is overwhelmingly dominated by nearly-pure states. The resulting optimal estimation strategies for noisy coins are surprising and counterintuitive. We demonstrate some important consequences for quantum state estimation - in particular, that adaptive tomography can recover the 1/N risk scaling of classical probability estimation.
Derivation and experimental verification of clock synchronization theory
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.
1994-01-01
The objective of this work is to validate mathematically derived clock synchronization theories and their associated algorithms through experiment. Two theories are considered, the Interactive Convergence Clock Synchronization Algorithm and the Mid-Point Algorithm. Special clock circuitry was designed and built so that several operating conditions and failure modes (including malicious failures) could be tested. Both theories are shown to predict conservative upper bounds (i.e., measured values of clock skew were always less than the theory prediction). Insight gained during experimentation led to alternative derivations of the theories. These new theories accurately predict the clock system's behavior. It is found that a 100% penalty is paid to tolerate worst-case failures. It is also shown that under optimal conditions (with minimum error and no failures) the clock skew can be as much as 3 clock ticks. Clock skew grows to 6 clock ticks when failures are present. Finally, it is concluded that one cannot rely solely on test procedures or theoretical analysis to predict worst-case conditions.
Experimental validation of clock synchronization algorithms
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Graham, R. Lynn
1992-01-01
The objective of this work is to validate mathematically derived clock synchronization theories and their associated algorithms through experiment. Two theories are considered, the Interactive Convergence Clock Synchronization Algorithm and the Midpoint Algorithm. Special clock circuitry was designed and built so that several operating conditions and failure modes (including malicious failures) could be tested. Both theories are shown to predict conservative upper bounds (i.e., measured values of clock skew were always less than the theory prediction). Insight gained during experimentation led to alternative derivations of the theories. These new theories accurately predict the behavior of the clock system. It is found that a 100 percent penalty is paid to tolerate worst-case failures. It is also shown that under optimal conditions (with minimum error and no failures) the clock skew can be as much as three clock ticks. Clock skew grows to six clock ticks when failures are present. Finally, it is concluded that one cannot rely solely on test procedures or theoretical analysis to predict worst-case conditions.
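As a concrete illustration of one of the two algorithms validated in these experiments, here is a minimal sketch of a single Interactive Convergence resynchronization round; the threshold value and the clock readings are invented for the example.

```python
# Minimal sketch of one Interactive Convergence resynchronization round
# (Lamport & Melliar-Smith): average the measured skews to all clocks,
# zeroing any reading whose magnitude exceeds DELTA (suspected faulty clock).
DELTA = 5.0  # maximum believable skew, in clock ticks (illustrative value)

def icv_correction(skews):
    """skews[i] = measured skew of clock i relative to this node (own entry 0).
    Returns the correction this node applies to its own clock."""
    clipped = [s if abs(s) <= DELTA else 0.0 for s in skews]
    return sum(clipped) / len(clipped)  # egocentric mean

# 4 clocks: self, two well-behaved clocks, one malicious reading of 40 ticks.
print(icv_correction([0.0, 1.2, -0.8, 40.0]))  # the 40.0 is discarded -> 0.1
```

Zeroing implausible readings rather than excluding them keeps the divisor fixed, which is what lets the algorithm bound the skew even when a faulty clock reports different values to different nodes.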
Direct simulation Monte Carlo prediction of on-orbit contaminant deposit levels for HALOE
NASA Technical Reports Server (NTRS)
Woronowicz, Michael S.; Rault, Didier F. G.
1994-01-01
A three-dimensional version of the direct simulation Monte Carlo method is adapted to assess the contamination environment surrounding a highly detailed model of the Upper Atmosphere Research Satellite. Emphasis is placed on simulating a realistic, worst-case set of flow field and surface conditions and geometric orientations for the satellite in order to estimate an upper limit for the cumulative level of volatile organic molecular deposits at the aperture of the Halogen Occultation Experiment. A detailed description of the adaptation of this solution method to the study of the satellite's environment is also presented. Results pertaining to the satellite's environment are presented regarding contaminant cloud structure, cloud composition, and statistics of simulated molecules impinging on the target surface, along with data related to code performance. Using procedures developed in standard contamination analyses, along with many worst-case assumptions, the cumulative upper-limit level of volatile organic deposits on HALOE's aperture over the instrument's 35-month nominal data collection period is estimated at about 13,350 Å.
Correct consideration of the index of refraction using blackbody radiation.
Hartmann, Jurgen
2006-09-04
The correct consideration of the index of refraction when using blackbody radiators as standard sources for optical radiation is derived and discussed. It is shown that simply using the index of refraction of air at laboratory conditions is not sufficient. A combination of the index of refraction of the media used inside the blackbody radiator and of the optical path between blackbody and detector has to be used instead. A worst-case approximation of the error introduced when neglecting these effects is presented, showing that the error is below 0.1% for wavelengths above 200 nm. Nevertheless, for the determination of the spectral radiance for the purpose of radiation temperature measurements, the correct consideration of the refractive index is mandatory. The worst-case estimation reveals that the introduced temperature error at a blackbody temperature of 3000 degrees C can be as high as 400 mK at a wavelength of 650 nm, and even higher at longer wavelengths.
Thermal-hydraulic analysis of N Reactor graphite and shield cooling system performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Low, J.O.; Schmitt, B.E.
1988-02-01
A series of bounding (worst-case) calculations was performed using a detailed hydrodynamic RELAP5 model of the N Reactor graphite and shield cooling system (GSCS). These calculations were specifically aimed at answering issues raised by the Westinghouse Independent Safety Review (WISR) committee. These questions address the operability of the GSCS during a worst-case degraded-core accident that requires the GSCS to mitigate the consequences of the accident. An accident scenario previously developed was designated as the hydrogen-mitigation design-basis accident (HMDBA). Previous HMDBA heat transfer analysis, using the TRUMP-BD code, was used to define the thermal boundary conditions to which the GSCS may be exposed. These TRUMP/HMDBA analysis results were used to define the bounding operating conditions of the GSCS during the course of an HMDBA transient. Nominal and degraded GSCS scenarios were investigated using RELAP5 within or at the bounds of the HMDBA transient. 10 refs., 42 figs., 10 tabs.
Zero-moment point determination of worst-case manoeuvres leading to vehicle wheel lift
NASA Astrophysics Data System (ADS)
Lapapong, S.; Brown, A. A.; Swanson, K. S.; Brennan, S. N.
2012-01-01
This paper proposes a method to evaluate vehicle rollover propensity based on a frequency-domain representation of the zero-moment point (ZMP). Unlike other rollover metrics such as the static stability factor, which is based on the steady-state behaviour, and the load transfer ratio, which requires the calculation of tyre forces, the ZMP is based on a simplified kinematic model of the vehicle and the analysis of the contact point of the vehicle relative to the edge of the support polygon. Previous work has validated the use of the ZMP experimentally in its ability to predict wheel lift in the time domain. This work explores the use of the ZMP in the frequency domain to allow a chassis designer to understand how operating conditions and vehicle parameters affect rollover propensity. The ZMP analysis is then extended to calculate worst-case sinusoidal manoeuvres that lead to untripped wheel lift, and the analysis is tested across several vehicle configurations and compared with that of the standard Toyota J manoeuvre.
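A minimal roll-plane sketch of the ZMP wheel-lift criterion described above follows; the vehicle parameters are illustrative, and vertical CG acceleration is assumed negligible.

```python
# Roll-plane sketch of the ZMP wheel-lift check: lift is flagged when the ZMP
# leaves the support polygon (the track width between the tyre contact points).
# Assumes negligible vertical CG acceleration; parameters are illustrative.
G = 9.81  # m/s^2

def zmp_lateral_offset(lat_accel, cg_height):
    """Lateral ZMP offset (m) for lateral CG acceleration lat_accel (m/s^2)."""
    return cg_height * lat_accel / G

def wheel_lift(lat_accel, cg_height, track_width):
    return abs(zmp_lateral_offset(lat_accel, cg_height)) > track_width / 2.0

# SUV-like illustrative parameters: CG height 0.75 m, track width 1.6 m.
print(wheel_lift(8.0, 0.75, 1.6))   # offset ~0.61 m < 0.80 m -> False
print(wheel_lift(12.0, 0.75, 1.6))  # offset ~0.92 m > 0.80 m -> True
```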
Burns, Ronda L.; Severance, Timothy
1997-01-01
Contraction scour for all modelled flows ranged from 15.8 to 22.5 ft. The worst-case contraction scour occurred at the 500-year discharge. Abutment scour ranged from 6.7 to 11.1 ft. The worst-case abutment scour also occurred at the 500-year discharge. Additional information on scour depths and depths to armoring is included in the section titled “Scour Results”. Scoured-streambed elevations, based on the calculated scour depths, are presented in Tables 1 and 2. A cross-section of the scour computed at the bridge is presented in Figure 8. Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. Usually, computed scour depths are evaluated in combination with other information including (but not limited to) historical performance during flood events, the geomorphic stability assessment, existing scour protection measures, and the results of the hydraulic analyses. Therefore, scour depths adopted by VTAOT may differ from the computed values documented herein.
Monro, Donald M; Rakshit, Soumyadip; Zhang, Dexin
2007-04-01
This paper presents a novel iris coding method based on differences of discrete cosine transform (DCT) coefficients of overlapped angular patches from normalized iris images. The feature extraction capabilities of the DCT are optimized on the two largest publicly available iris image data sets: 2,156 images of 308 eyes from the CASIA database and 2,955 images of 150 eyes from the Bath database. On this data, we achieve 100 percent Correct Recognition Rate (CRR) and perfect Receiver-Operating Characteristic (ROC) curves with no registered false accepts or rejects. Individual feature bit and patch position parameters are optimized for matching through a product-of-sum approach to Hamming distance calculation. For verification, a variable threshold is applied to the distance metric and the False Acceptance Rate (FAR) and False Rejection Rate (FRR) are recorded. A new worst-case metric is proposed for predicting practical system performance in the absence of matching failures, and the worst-case theoretical Equal Error Rate (EER) is predicted to be as low as 2.59 × 10^-4 on the available data sets.
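For context, thresholded verification works roughly as in the sketch below, which uses the generic fractional Hamming distance between iris codes; the paper's product-of-sum weighting over feature bits and patch positions is not reproduced here, and the threshold is an invented operating point.

```python
import numpy as np

# Generic fractional Hamming distance between binary iris codes, thresholded
# to accept/reject. (The paper's product-of-sum weighting is not reproduced.)
def fractional_hamming(a: np.ndarray, b: np.ndarray) -> float:
    return np.count_nonzero(a != b) / a.size

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048)
same_eye = enrolled.copy()
same_eye[rng.choice(2048, 100, replace=False)] ^= 1   # ~5% of bits flipped
other_eye = rng.integers(0, 2, 2048)                  # unrelated code, ~50% differ

THRESHOLD = 0.35  # invented operating point on the FAR/FRR trade-off
for name, probe in [("same eye", same_eye), ("other eye", other_eye)]:
    d = fractional_hamming(enrolled, probe)
    print(f"{name}: distance={d:.3f} -> {'accept' if d < THRESHOLD else 'reject'}")
```

Moving THRESHOLD trades FAR against FRR, which is exactly the variable-threshold sweep the abstract describes for generating the ROC curve.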
A CMOS matrix for extracting MOSFET parameters before and after irradiation
NASA Technical Reports Server (NTRS)
Blaes, B. R.; Buehler, M. G.; Lin, Y.-S.; Hicks, K. A.
1988-01-01
An addressable matrix of 16 n- and 16 p-MOSFETs was designed to extract the dc MOSFET parameters for all dc gate bias conditions before and after irradiation. The matrix contains four sets of MOSFETs, each with four different geometries that can be biased independently. Thus the worst-case bias scenarios can be determined. The MOSFET matrix was fabricated at a silicon foundry using a radiation-soft CMOS p-well LOCOS process. Co-60 irradiation results for the n-MOSFETs showed a threshold-voltage shift of -3 mV/krad(Si), whereas the p-MOSFETs showed a shift of 21 mV/krad(Si). The worst-case threshold-voltage shift occurred for the n-MOSFETs, with a gate bias of 5 V during the anneal. For the p-MOSFETs, biasing did not affect the shift in the threshold voltage. A parasitic MOSFET dominated the leakage of the n-MOSFET biased with 5 V on the gate during irradiation. Co-60 test results for other parameters are also presented.
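The reported shift rates translate directly into end-of-mission threshold shifts; a back-of-the-envelope sketch follows, where the accumulated dose is a hypothetical example, not a value from the study.

```python
# Back-of-the-envelope: threshold-voltage shift = rate x accumulated dose.
# Rates are from the abstract; the 100 krad(Si) dose is a hypothetical example.
N_RATE_V_PER_KRAD = -3e-3   # n-MOSFET shift rate
P_RATE_V_PER_KRAD = 21e-3   # p-MOSFET shift rate
dose_krad = 100.0

print(f"n-MOSFET dVth ~ {N_RATE_V_PER_KRAD * dose_krad:+.2f} V")  # -0.30 V
print(f"p-MOSFET dVth ~ {P_RATE_V_PER_KRAD * dose_krad:+.2f} V")  # +2.10 V
```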
Mad cows and computer models: the U.S. response to BSE.
Ackerman, Frank; Johnecheck, Wendy A
2008-01-01
The proportion of slaughtered cattle tested for BSE is much smaller in the U.S. than in Europe and Japan, leaving the U.S. heavily dependent on statistical models to estimate both the current prevalence and the spread of BSE. We examine the models relied on by USDA, finding that the prevalence model provides only a rough estimate, due to limited data availability. Reassuring forecasts from the model of the spread of BSE depend on the arbitrary constraint that worst-case values are assumed by only one of 17 key parameters at a time. In three of the six published scenarios with multiple worst-case parameter values, there is at least a 25% probability that BSE will spread rapidly. In public policy terms, reliance on potentially flawed models can be seen as a gamble that no serious BSE outbreak will occur. Statistical modeling at this level of abstraction, with its myriad, compound uncertainties, is no substitute for precautionary policies to protect public health against the threat of epidemics such as BSE.
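The critique of one-at-a-time worst-casing is essentially combinatorial: varying one of 17 parameters at a time explores only 17 scenarios, while joint worst cases grow much faster. A tiny counting sketch:

```python
from math import comb

# One-at-a-time worst-casing over 17 parameters explores only 17 scenarios;
# pairs, triples, and the full joint space grow combinatorially.
n = 17
print(n, comb(n, 2), comb(n, 3), 2**n)  # 17, 136, 680, 131072
```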
Modelling the long-term evolution of worst-case Arctic oil spills.
Blanken, Hauke; Tremblay, Louis Bruno; Gaskin, Susan; Slavin, Alexander
2017-03-15
We present worst-case assessments of contamination in sea ice and surface waters resulting from hypothetical well blowout oil spills at ten sites in the Arctic Ocean basin. Spill extents are estimated by considering Eulerian passive tracers in the surface ocean of the MITgcm (a hydrostatic, coupled ice-ocean model). Oil in sea ice, and contamination resulting from the melting of oiled ice, is tracked using an offline Lagrangian scheme. Spills are initialized on November 1st of each year from 1980 to 2010 and tracked for one year. An average spill was transported 1100 km and potentially affected 1.1 million km². The direction and magnitude of simulated oil trajectories are consistent with known large-scale current and sea ice circulation patterns, and trajectories frequently cross international boundaries. The simulated trajectories of oil in sea ice match observed ice drift trajectories well. During the winter, oil transport by drifting sea ice is more significant than transport with surface currents.
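A toy version of the offline Lagrangian step is sketched below, with a synthetic velocity field standing in for the MITgcm surface currents and ice drift; only the bookkeeping, not the physics, is meant to be illustrative.

```python
import numpy as np

# Toy offline Lagrangian tracker: advect oil parcels with a velocity field
# sampled at the parcel positions. The velocity field here is synthetic; in
# the study it comes from MITgcm surface currents and sea ice drift.
def advect(positions, velocity_fn, dt_s, n_steps):
    for _ in range(n_steps):
        positions = positions + velocity_fn(positions) * dt_s
    return positions

def toy_velocity(p):
    # ~5 cm/s eastward plus a weak meandering northward component
    return np.stack([np.full(len(p), 0.05), 0.02 * np.sin(p[:, 0] / 1e5)], axis=1)

start = np.zeros((10, 2))                                         # release point
end = advect(start, toy_velocity, dt_s=3600.0, n_steps=24 * 365)  # one year
print(np.linalg.norm(end - start, axis=1).mean() / 1000.0, "km in one year")
```

Even this crude 5 cm/s drift carries a parcel on the order of a thousand kilometres in a year, the same order as the 1100 km average transport reported above.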
George T. Cvetkovich; Patricia L. Winter
2008-01-01
This report presents results from a study of San Bernardino National Forest community residents' experiences with and perceptions of fire, fire management, and the Forest Service. Using self-administered surveys and focus group discussions, we found that participants had personal experiences with fire, were concerned about fire, and felt knowledgeable about effective...
Brian R. Sturtevant; Brian R. Miranda; Jian Yang; Hong S. He; Eric J. Gustafson; Robert M. Scheller
2009-01-01
Public forests are surrounded by land over which agency managers have no control, and whose owners expect the public forest to be a "good neighbor." Fire risk abatement on multi-owner landscapes containing flammable but fire-dependent ecosystems epitomizes the complexities of managing public lands. We report a case study that applies a landscape disturbance...
Ana Carolina Monmany; William Gould; Maria Jose Andrade-Nunez; Grizelle Gonzalez; Maya Quinones
2017-01-01
Global estimates of fire frequency indicate that over 70% of active fires occur in the tropics, and the size and frequency of fires are increasing every year. The majority of fires in the tropics are an unintended consequence of current land-use practices that promote the establishment of grass and shrubland communities, which are more flammable and more adapted to...
Garrison Project - Lake Sakakawea Oil and Gas Management Plan, North Dakota
2012-11-01
When the air gun is fired, pulses of acoustic energy are produced causing the shock waves needed for data collection (Peterson, 2004). • Seismic...The proposed casing program shall include the size, weight, grade, and length of casing proposed, type of thread and coupling, and setting depth of...suppression of fires on public lands caused by its employees, contractors or subcontractors. During conditions of extreme fire danger, surface use
Propulsion and Energetics Panel Working Group 11 on Aircraft Fire Safety. Volume 2. Main Report
1979-11-01
which make burning metal particles a potent ignition source and extinguishment of bulk metal fires a difficult task. In the latter case, the difficulty...aircraft to fires induced by uncontained engine failures and internal engine metal fires. With respect to the uncontained engine failure, current engine
Chapter 1: Fire and nonnative invasive plants-introduction
Jane Kapler Smith; Kristin Zouhar; Steve Sutherland; Matthew L. Brooks
2008-01-01
Fire is a process integral to the functioning of most temperate wildland ecosystems. Lightning-caused and anthropogenic fires have influenced the vegetation of North America profoundly for millennia (Brown and Smith 2000; Pyne 1982b). In some cases, fire has been used to manipulate the species composition and structure of ecosystems to meet management objectives,...
The hidden consequences of fire suppression
Carol Miller
2012-01-01
Wilderness managers need a way to quantify and monitor the effects of suppressing lightning-caused wildfires, which can alter natural fire regimes, vegetation, and habitat. Using computerized models of fire spread, weather, and fuels, it is now possible to quantify many of the hidden consequences of fire suppression. Case study watersheds in Yosemite and Sequoia-Kings...
Fire weather technology for fire agrometeorology operations
Francis Fujioka
2008-01-01
Even as the magnitude of wildfire problems increases globally, United Nations agencies are acting to mitigate the risk of wildfire disasters to members. Fire management organizations worldwide may vary considerably in operational scope, depending on the number and type of resources an organization manages. In any case, good fire weather information is vital. This paper...
Fiber-modified polyurethane foam for ballistic protection
NASA Technical Reports Server (NTRS)
Fish, R. H.; Parker, J. A.; Rosser, R. W.
1975-01-01
Closed-cell, semirigid, fiber-loaded, self-extinguishing polyurethane foam material fills voids around fuel cells in aircraft. Material prevents leakage of fuel and spreading of fire in case of ballistic incendiary impact. It also protects fuel cell in case of exterior fire.
Aerothermodynamic Environments Definition for the Mars Science Laboratory Entry Capsule
NASA Technical Reports Server (NTRS)
Edquist, Karl T.; Dyakonov, Artem A.; Wright, Michael J.; Tang, Chun Y.
2007-01-01
An overview of the aerothermodynamic environments definition status is presented for the Mars Science Laboratory entry vehicle. The environments are based on Navier-Stokes flowfield simulations on a candidate aeroshell geometry and worst-case entry heating trajectories. Uncertainties for the flowfield predictions are based primarily on available ground data since Mars flight data are scarce. The forebody aerothermodynamics analysis focuses on boundary layer transition and turbulent heating augmentation. Turbulent transition is expected prior to peak heating, a first for Mars entry, resulting in augmented heat flux and shear stress at the same heatshield location. Afterbody computations are also shown with and without interference effects of reaction control system thruster plumes. Including uncertainties, analysis predicts that the heatshield may experience peaks of 225 W/sq cm for turbulent heat flux, 0.32 atm for stagnation pressure, and 400 Pa for turbulent shear stress. The afterbody heat flux without thruster plume interference is predicted to be 7 W/sq cm on the backshell and 10 W/sq cm on the parachute cover. If the reaction control jets are fired near peak dynamic pressure, the heat flux at localized areas could reach as high as 76 W/sq cm on the backshell and 38 W/sq cm on the parachute cover, including uncertainties. The final flight environments used for hardware design will be updated for any changes in the aeroshell configuration, heating design trajectories, or uncertainties.
Multiple Microcomputer Control Algorithm.
1979-09-01
discrete and semaphore supervisor calls can be used with tasks in separate processors, in which case they are maintained in shared memory. Operations on ...the source or destination operand specifier of each mode in most cases. However, four of the 16 general register addressing modes and one of the 8 pro...instruction time is based on the specified usage factors and the best case and worst case execution times for the instruc...
Investigation of the Human Response to Upper Torso Retraction with Weighted Helmets
2013-09-01
coverage of each test. The Kodak system is capable of recording high-speed motion up to a rate of 1000 frames per second. For this study, the video...the measured center-of-gravity (CG) of the worst-case test helmet fell outside the current limits and no injuries were observed, it can be stated...Figure 7. T-test Cases 1-9 (0 lb Added Helmet Weight...
Welling, L; Boers, M; Mackie, D P; Patka, P; Bierens, J J L M; Luitse, J S K; Kreis, R W
2006-01-01
The optimum response to the different stages of a major burns incident is still not established. The fire in a café in Volendam on New Year's Eve 2000 was the worst incident in recent Dutch history and resulted in mass burn casualties. The fire has been the subject of several investigations concerned with organisational and medical aspects. Based on the findings of these investigations, a multidisciplinary research group started a consensus study. The aim of this study was to further identify areas of improvement in the care after mass burns incidents. The consensus process comprised three postal rounds (Delphi Method) and a consensus conference (modified nominal group technique). The multidisciplinary panel consisted of 26 Dutch-speaking experts working in influential positions within the sphere of disaster management and healthcare. In response to the postal questionnaires, consensus was reached for 66 per cent of the statements. Six topics were subsequently discussed during the consensus conference; three topics were discussed within the plenary session and three during subgroup meetings. During the conference, consensus was reached for seven statements (one subject generated two statements). In total, the panel agreed on 21 statements. These covered the following topics: registration and evaluation of disaster care, capacity planning for disasters, prehospital care of victims of burns disasters, treatment and transportation priorities, distribution of casualties (including interhospital transports), diagnosis and treatment, and education and training. In disaster medicine, the paper shows how a consensus process is a suitable tool to identify areas of improvement of care after mass burns incidents.
29 CFR 1926.34 - Means of egress.
Code of Federal Regulations, 2011 CFR
2011-07-01
... visible signs in all cases where the exit or way to reach it is not immediately visible to the occupants... and effective provisions are made to remove occupants in case of fire or other emergency. (b) Exit... obstructions or impediments to full instant use in the case of fire or other emergency. [58 FR 35083, June 30...
29 CFR 1926.34 - Means of egress.
Code of Federal Regulations, 2010 CFR
2010-07-01
... visible signs in all cases where the exit or way to reach it is not immediately visible to the occupants... and effective provisions are made to remove occupants in case of fire or other emergency. (b) Exit... obstructions or impediments to full instant use in the case of fire or other emergency. [58 FR 35083, June 30...
1980-08-01
the sequence threshold does not utilize the DC level information and the time thresholding adaptively adjusts for DC level. This characteristic...lowest 256/8 = 32 elements. The above observation can be mathematically proven to also relate the fact that the lowest (NT/W) elements can, at worst case
ERIC Educational Resources Information Center
Fitzgerald, Patricia L.
1998-01-01
Although only 5% of the population has severe food allergies, school business officials must be prepared for the worst-case scenario. Banning foods and segregating allergic children are harmful practices. Education and sensible behavior are the best medicine when food allergies and intolerances are involved. Resources are listed. (MLH)
Shuttle ECLSS ammonia delivery capability
NASA Technical Reports Server (NTRS)
1976-01-01
The possible effects of extreme temperatures, through excessive requirements on the ammonia flow rates needed for entry cooling, on mission plans for the space shuttles were investigated. An analysis of worst-case conditions was performed and indicates that adequate flow rates are available. No mission impact is therefore anticipated.
Assessing the value of increased model resolution in forecasting fire danger
Jeanne Hoadley; Miriam Rorig; Ken Westrick; Larry Bradshaw; Sue Ferguson; Scott Goodrick; Paul Werth
2003-01-01
The fire season of 2000 was used as a case study to assess the value of increasing mesoscale model resolution for fire weather and fire danger forecasting. With a domain centered on Western Montana and Northern Idaho, MM5 simulations were run at 36, 12, and 4-km resolutions for a 30 day period at the height of the fire season. Verification analyses for meteorological...
Jason Forthofer; Bret Butler
2007-01-01
A computational fluid dynamics (CFD) model and a mass-consistent model were used to simulate winds and their effect on simulated fire spread over a simple, low hill. The results suggest that the CFD wind field could significantly change simulated fire spread compared to traditional uniform winds. The CFD fire spread case may match reality better because the winds used in the fire...
H. Grant Pearce
2007-01-01
On March 24, 1998, a crew of eight rural firefighters were burned over while attempting to suppress a backburning sector of the Bucklands Crossing Fire in North Otago, New Zealand. The fire demonstrates how factors typical of the New Zealand fire environment (steep slopes, highly flammable shrub fuels, and a strong foehn wind effect) combined to produce extreme fire...
Fire prevention in Delaware: a case study of fire and life safety initiatives.
Frattaroli, Shannon; Gielen, Andrea C; Piver-Renna, Jennifer; Pollack, Keshia M; Ta, Van M
2011-01-01
Injuries resulting from residential house fires are a significant public health issue. The fire service is engaged in fire prevention activities aimed at preventing fire-related morbidity and mortality. The fire service in Delaware is regarded by some leaders in the field as a model for fire and life safety education (FLSE). We identified 3 questions to guide this research. What is the culture and context of fire prevention in Delaware? What prevention programs and policies constitute Delaware's fire prevention efforts? What can be learned from select model programs regarding their impact, sustainability, strengths, limitations, and general applicability? A discussion of the lessons learned from Delaware's experience with FLSE initiatives concludes the article. We used a single case study design and collected and analyzed data from in-depth interviews, documents, and participant observation notes to address the research questions. Data were collected in Delaware. Interviewees included a purposeful sample of members of the Delaware fire service. Descriptions of the context in which fire prevention occurs, the initiatives underway, and the factors associated with successfully supporting fire prevention in the state. Data from 16 key informant interviews, relevant documents, and direct observations of FLSE events revealed a fire service rooted in tradition, dedication, and community. A compilation of state and local FLSE initiatives illustrates the diversity of FLSE in Delaware. Thematic analysis of the data emphasizes the importance of a strategic, comprehensive, and coordinated approach to realizing success in Delaware's approach to FLSE. The fire service is an important part of the public health infrastructure. While their role as first responders is evident, their contributions to prevention are also significant. This research suggests ways to support fire service prevention efforts and more fully integrate their FLSE work into the public health infrastructure.
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges. That study suggested that the intelligence process, integrating forensic data, could be a valid framework for systematic follow-up and analysis, provided it is adapted to the specificities of repetitive deliberate fires. In the current manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles relying on specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of 9 known series. These results are very promising and lead the way to a systematic implementation of this methodology in an intelligence framework, whilst demonstrating the need for and benefit of increasing the collection of forensic-specific information to strengthen the value of links between cases.
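The linkage step can be pictured as scoring pairwise similarity between case profiles and flagging pairs above a threshold; in the sketch below the profile fields and the threshold are illustrative stand-ins, not the specific elements used in the paper.

```python
from itertools import combinations

# Illustrative profile-based linkage: flag case pairs whose profiles agree on
# most elements. Fields and threshold are stand-ins, not the paper's elements.
cases = [
    {"id": 1, "weekday": "Fri", "hours": "night", "target": "vehicle", "accelerant": True},
    {"id": 2, "weekday": "Fri", "hours": "night", "target": "vehicle", "accelerant": True},
    {"id": 3, "weekday": "Tue", "hours": "day",   "target": "bin",     "accelerant": False},
]

def similarity(a, b):
    keys = [k for k in a if k != "id"]
    return sum(a[k] == b[k] for k in keys) / len(keys)

THRESHOLD = 0.75  # link cases sharing most profile elements
for a, b in combinations(cases, 2):
    if similarity(a, b) >= THRESHOLD:
        print(f"possible series: cases {a['id']} and {b['id']}")
```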
Firefighter Shift Schedules Affect Sleep Quality.
Billings, Joel; Focht, Will
2016-03-01
The aim of this study was to investigate the prevalence and severity of firefighter sleep quality across department shift schedules. Sleep quality was assessed using a Pittsburgh Sleep Quality Index in a sample of 109 male career firefighters from six fire departments in three Southwestern US states. The three shift schedules studied were 24on/48off, 48on/96off, and Kelly. Seventy-three percent of firefighters report poor sleep quality. The 24on/48off shift schedule is associated with the best sleep quality and Kelly is associated with the worst sleep quality. Firefighters working second jobs report significantly poorer sleep quality than those who do not. Shift schedules that disrupt normal circadian rhythms more result in poorer sleep quality, which can lead to less effective emergency response and increased risk to firefighter health and safety.
Ecotoxicity of waste water from industrial fires fighting
NASA Astrophysics Data System (ADS)
Dobes, P.; Danihelka, P.; Janickova, S.; Marek, J.; Bernatikova, S.; Suchankova, J.; Baudisova, B.; Sikorova, L.; Soldan, P.
2012-04-01
As shown by several case studies, waste waters from the extinguishing of industrial fires involving hazardous chemicals can be a serious threat, primarily to the surrounding environmental compartments (e.g. surface water, underground water, soil) and secondarily to human beings, animals and plants. The negative impacts of fire waters on the environment have attracted public attention since the chemical accident at Sandoz (Schweizerhalle) in November 1986, and this process continues. Last October, a special seminar on this topic was organized by UNECE in Bonn. The mode of interaction of fire waters with the environment and the potential transport mechanisms are still under discussion. However, in many cases waste water polluted by extinguishing foam (always with high COD values) and by flammable or toxic dangerous substances such as heavy metals, pesticides or POPs is released to surface water or soil without proper decontamination, which can lead to an environmental accident. For a better understanding of this type of hazard and better coordination of fire brigades and other responders, the ecotoxicity of such waste water should be evaluated both in laboratory tests and in water samples collected during real industrial fires. Case studies, theoretical analysis of the problem and toxicity tests on laboratory model samples (e.g. on bacteria, mustard seeds, daphnia and fish) will provide the additional necessary information. Preliminary analysis of waters from industrial fires (a polymer material storage and a galvanic plating facility) in the Czech Republic has already confirmed high toxicity. In the first case the toxicity may be attributed to the decomposition of burned material and extinguishing foams; in the latter case it can be related to cyanides in the original electroplating baths. At the beginning of 2012, a two-year R&D project focused on reducing the risk of extinguishing waste water for the environment was approved by the Technology Agency of the Czech Republic.
Johnson, Craig S; Kassir, Andrew; Marx, Daryl S; Soliman, Mark K
2018-05-30
Applications for surgical staplers continue to grow, due to the increase in minimally invasive surgical approaches, and range from vessel ligation to tissue transection and anastomoses. Complications associated with stapled tissue, such as bleeding or leaks, continue to be a concern for surgeons, as both can be associated with prolonged operative times and can contribute to postoperative morbidity and mortality. The goal of this retrospective study was to evaluate the performance of the da Vinci® Xi EndoWrist® Stapler 45 with SmartClamp™ technology during robotic-assisted right colectomy with intracorporeal anastomosis. We reviewed 113 consecutive cases from four medical centers. Preclinical diagnoses were inflammatory bowel disease (IBD) (n = 5), benign bowel disease (n = 77), and malignant bowel disease (n = 31). No anastomotic leaks occurred; one event of anastomotic bleeding (0.88%) resolved without surgical intervention. Overall, there were 643 clamp attempts (5.7 attempts per case) and 570 fires (5.0 fires per case). SmartClamp™ occurrences happened in approximately one out of three cases, with the highest proportion of occurrences in the IBD group (2.0 occurrences per case). The most commonly fired reload was blue (1.5 mm closed height), with 4.1 blue reloads fired per case overall. No incomplete fires occurred during the procedures. The study data demonstrate the performance of the da Vinci® Xi EndoWrist® Stapler 45 as used in right colon resection with intracorporeal anastomosis. The collection and analysis of these data provide surgeons with information related to stapler firings that was not previously available; as such, this analysis may lead to deductions that are useful for intraoperative decision-making and clinical outcomes.
Birth wind and fire: raising awareness to operating room fires during delivery.
Wolf, Omer; Weissman, Oren; Harats, Moti; Farber, Nimrod; Stavrou, Demetris; Tessone, Ariel; Zilinsky, Isaac; Winkler, Eyal; Haik, Josef
2013-09-01
We researched whether the obstetric operating room (OR) qualified as a fire-risk environment so as to take preventive measures accordingly. We analyzed a series of iatrogenic burns inflicted during birth by collecting clinical data and comparing it with known OR fire risk factors and with other factors that repeated in all cases in search of unique characteristics of the obstetric OR. All three cases shared in common the same type of oxygen-rich open ventilation system, alcohol-based prepping solution, and the hastiness of cesarean delivery while spontaneous vaginal delivery was already in progress. The obstetric OR is, as suspected, a fire-prone zone in more ways than the regular OR. Therefore, preventive measures should be undertaken and awareness for the possibility for such occurrences should be raised.
46 CFR 122.524 - Fire fighting drills and training.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 4 2010-10-01 2010-10-01 false Fire fighting drills and training. 122.524 Section 122... Preparations for Emergencies § 122.524 Fire fighting drills and training. (a) The master shall conduct sufficient fire drills to make sure that each crew member is familiar with his or her duties in case of a...
46 CFR 122.524 - Fire fighting drills and training.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false Fire fighting drills and training. 122.524 Section 122... Preparations for Emergencies § 122.524 Fire fighting drills and training. (a) The master shall conduct sufficient fire drills to make sure that each crew member is familiar with his or her duties in case of a...
46 CFR 122.524 - Fire fighting drills and training.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 4 2013-10-01 2013-10-01 false Fire fighting drills and training. 122.524 Section 122... Preparations for Emergencies § 122.524 Fire fighting drills and training. (a) The master shall conduct sufficient fire drills to make sure that each crew member is familiar with his or her duties in case of a...
46 CFR 122.524 - Fire fighting drills and training.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 4 2011-10-01 2011-10-01 false Fire fighting drills and training. 122.524 Section 122... Preparations for Emergencies § 122.524 Fire fighting drills and training. (a) The master shall conduct sufficient fire drills to make sure that each crew member is familiar with his or her duties in case of a...
46 CFR 122.524 - Fire fighting drills and training.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 4 2014-10-01 2014-10-01 false Fire fighting drills and training. 122.524 Section 122... Preparations for Emergencies § 122.524 Fire fighting drills and training. (a) The master shall conduct sufficient fire drills to make sure that each crew member is familiar with his or her duties in case of a...
The spatially varying influence of humans on fire probability in North America
Marc-Andre Parisien; Carol Miller; Sean A. Parks; Evan R. DeLancey; Francois-Nicolas Robinne; Mike D. Flannigan
2016-01-01
Humans affect fire regimes by providing ignition sources in some cases, suppressing wildfires in others, and altering natural vegetation in ways that may either promote or limit fire. In North America, several studies have evaluated the effects of society on fire activity; however, most studies have been regional or subcontinental in scope and used different...
A game-theoretic model is proposed for the generalization of a discrete-fire silent duel to a silent duel with continuous firing. This zero-sum two-person game is solved in the symmetric case. It is shown that pure optimal strategies exist and hence also solve a noisy duel with continuous firing. A solution for the general non-symmetric duel is conjectured. (Author)
The Role of Social Influence on How Residence Hall Inhabitants Respond to Fire Alarms
ERIC Educational Resources Information Center
Leytem, Michael; Stark, Emily
2016-01-01
College residence halls pose a threat for a catastrophic event in the case of fire, but little research has examined potential influences on students' responses to fire alarms, particularly the role of social influence in affecting their behaviors. In the current study, residence hall inhabitants reported their knowledge about fire safety, their…
Surgical Fires in Otolaryngology: A Systematic and Narrative Review.
Day, Andrew T; Rivera, Erika; Farlow, Janice L; Gourin, Christine G; Nussenbaum, Brian
2018-04-01
Objective To bring attention to the epidemiology, prevention, management, and consequences of surgical fires in otolaryngology by reviewing the literature. Data Sources PubMed, EMBASE, Web of Science, and Scopus. Review Methods Comprehensive search terms were developed, and searches were performed from data source inception through August 2016. A total of 4506 articles were identified; 2351 duplicates were removed; and 2155 titles and abstracts were independently reviewed. Reference review was also performed. Eligible manuscripts described surgical fires involving patients undergoing otolaryngologic procedures. Results Seventy-two articles describing 87 otolaryngologic surgical fire cases were identified. These occurred during oral cavity or oropharyngeal procedures (11%), endoscopic laryngotracheal procedures (25%), tracheostomies (36%), "other" general anesthesia procedures (3%), and monitored anesthesia care or local procedures (24%). Oxidizing agents consisted of oxygen alone (n = 63 of 81, 78%), oxygen and nitrous oxide (n = 17 of 81, 21%), and room air (n = 1 of 81, 1%). The fractional inspired oxygen delivered was >30% in 97% of surgical fires in non-nitrous oxide general anesthesia cases (n = 35 of 36). Laser-safe tubes were used in only 12% of endoscopic laryngotracheal cases with endotracheal tube descriptions (n = 2 of 17). Eighty-six percent of patients experienced acute complications (n = 76 of 87), including 1 intraoperative death, and 22% of patients (n = 17 of 77) experienced long-term complications. Conclusion Surgical fires in otolaryngology persist despite aggressive multi-institutional efforts to curb their incidence. Guideline recommendations to minimize the concentration of delivered oxygen and use laser-safe tubes when indicated were not observed in many cases. Improved institutional fire safety practices are needed nationally and internationally.
Witch Wildland Fire, California
NASA Technical Reports Server (NTRS)
2007-01-01
The October wildfires that plagued southern California were some of the worst on record. One of these, the Witch Wildland fire, burned 198,000 acres north of San Diego, destroying 1125 homes, commercial structures, and outbuildings. Over 3,000 firefighters finally contained the fire two weeks after it started on October 21. Now begins the huge task of planning and implementing mitigation measures to replant and reseed the burned areas. This ASTER image depicts the area after the fire, on November 6; vegetation is green, burned areas are dark red, and urban areas are blue. On the burn severity index image, calculated using infrared and visible bands, red areas are the most severely burned, followed by green and blue. This information can help the US Forest Service to plan post-fire activities. With its 14 spectral bands from the visible to the thermal infrared wavelength region, and its high spatial resolution of 15 to 90 meters (about 50 to 300 feet), ASTER images Earth to map and monitor the changing surface of our planet. ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra spacecraft. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products. The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance. The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate. Size: 37.5 by 45 kilometers (23.1 by 27.8 miles) Location: 33 degrees North latitude, 116.9 degrees West longitude Orientation: North at top Image Data: ASTER Bands 6, 3, and 1 Original Data Resolution: ASTER 15 meters (49.2 feet) Dates Acquired: November 6, 2007
Space View of the 1991 Gulf War Kuwaiti Oil Fires
NASA Astrophysics Data System (ADS)
Torres, O.; Bhartia, P. K.; Larko, D.
2014-12-01
During the 1991 Persian Gulf War, over 700 oil wells in Kuwait were set ablaze by the withdrawing Iraqi army with the apparent intent of hindering satellite reconnaissance and intelligence gathering by the coalition of forces repelling Iraq from occupied Kuwait. The oil fires, which burned for an estimated 10 months, created a huge smoke plume whose spatial extent at times went beyond the Persian Gulf region, moving across the Sahara Desert and reaching as far west as the North Atlantic Ocean. The Nimbus-7 Total Ozone Mapping Spectrometer (TOMS), in operation from October 1978 to May 1993, measured the near-UV radiances that in the mid-1990s became the input to the calculation of the well-known Absorbing Aerosol Index, which represented a major breakthrough in satellite-based aerosol remote sensing. Thus, unbeknownst to the world, the N7-TOMS sensor was collecting in 1991 an unprecedented daily record of what can be considered the worst environmental catastrophe affecting the atmosphere since the beginning of the era of space-based remote sensing in the 1970s. An overview of the temporal and spatial extent of the synoptic-scale 1991 Gulf War smoke plume as seen by the Nimbus-7 TOMS Absorbing Aerosol Index will be presented.
A risk-based approach to flammable gas detector spacing.
Defriend, Stephen; Dejmek, Mark; Porter, Leisa; Deshotels, Bob; Natvig, Bernt
2008-11-15
Flammable gas detectors allow an operating company to address leaks before they become serious, by automatically alarming and by initiating isolation and safe venting. Without effective gas detection, there is very limited defense against a flammable gas leak developing into a fire or explosion that could cause loss of life or escalate to cascading failures of nearby vessels, piping, and equipment. While it is commonly recognized that some gas detectors are needed in a process plant containing flammable gas or volatile liquids, there is usually a question of how many are needed. The areas that need protection can be determined by dispersion modeling from potential leak sites. Within the areas that must be protected, the spacing of detectors (or alternatively, number of detectors) should be based on risk. Detector design can be characterized by spacing criteria, which is convenient for design - or alternatively by number of detectors, which is convenient for cost reporting. The factors that influence the risk are site-specific, including process conditions, chemical composition, number of potential leak sites, piping design standards, arrangement of plant equipment and structures, design of isolation and depressurization systems, and frequency of detector testing. Site-specific factors such as those just mentioned affect the size of flammable gas cloud that must be detected (within a specified probability) by the gas detection system. A probability of detection must be specified that gives a design with a tolerable risk of fires and explosions. To determine the optimum spacing of detectors, it is important to consider the probability that a detector will fail at some time and be inoperative until replaced or repaired. A cost-effective approach is based on the combined risk from a representative selection of leakage scenarios, rather than a worst-case evaluation. This means that probability and severity of leak consequences must be evaluated together. In marine and offshore facilities, it is conventional to use computational fluid dynamics (CFD) modeling to determine the size of a flammable cloud that would result from a specific leak scenario. Simpler modeling methods can be used, but the results are not very accurate in the region near the release, especially where flow obstructions are present. The results from CFD analyses on several leak scenarios can be plotted to determine the size of a flammable cloud that could result in an explosion that would generate overpressure exceeding the strength of the mechanical design of the plant. A cloud of this size has the potential to produce a blast pressure or flying debris capable of causing a fatality or subsequent damage to vessels or piping containing hazardous material. In cases where the leak results in a fire, rather than explosion, CFD or other modeling methods can estimate the size of a leak that would cause a fire resulting in subsequent damage to the facility, or would prevent the safe escape of personnel. The gas detector system must be capable of detecting a gas release or vapor cloud, and initiating action to prevent the leak from reaching a size that could cause injury or severe damage upon ignition.
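The risk-based spacing logic described in this abstract can be sketched numerically. The following Python fragment is an illustrative toy model only, not the authors' method: the scenario frequencies, cloud sizes, availability figure, risk target, and the geometric coverage formula are all invented for demonstration.

```python
import math

# Illustrative leak scenarios: (annual frequency, flammable cloud radius in m).
# All values are invented for demonstration, not taken from the cited study.
SCENARIOS = [
    (1e-2, 2.0),   # frequent small leak
    (1e-3, 6.0),   # occasional moderate leak
    (1e-4, 15.0),  # rare large leak
]

DETECTOR_AVAILABILITY = 0.95  # probability a given detector is operative
TARGET_FREQUENCY = 1e-3       # tolerable annual frequency of undetected leaks

def detection_probability(cloud_radius, spacing):
    """Toy geometric model: chance that a randomly placed circular cloud
    overlaps at least one detector on a square grid, derated by availability."""
    geometric = min(1.0, math.pi * cloud_radius ** 2 / spacing ** 2)
    return geometric * DETECTOR_AVAILABILITY

def undetected_frequency(spacing):
    # Combined risk over representative scenarios, not a single worst case.
    return sum(f * (1.0 - detection_probability(r, spacing))
               for f, r in SCENARIOS)

# Scan from wide (cheap) to narrow (expensive) grids; keep the widest
# spacing that still meets the risk target.
for spacing in range(30, 1, -1):
    if undetected_frequency(spacing) <= TARGET_FREQUENCY:
        print(f"widest acceptable detector spacing: {spacing} m")
        break
```

The loop stops at the widest grid that keeps the frequency of undetected leaks under the target; note that detector availability caps the achievable detection probability, so the target must be attainable at any spacing.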
NASA Astrophysics Data System (ADS)
Jiang, W.; Wang, F.; Meng, Q.; Li, Z.; Liu, B.; Zheng, X.
2018-04-01
This paper presents a new standardized data format named Fire Markup Language (FireML), which extends the OGC Geography Markup Language (GML), to describe fire hazard models. The proposed FireML standardizes the input and output documents of a fire model so that it can communicate effectively with different disaster management systems and ensure good interoperability. To demonstrate the usage of FireML and verify its feasibility, a forest fire spread model compatible with FireML is described, and a 3D GIS disaster management system is developed to simulate the dynamic process of forest fire spread from the defined FireML documents. The proposed approach may inform standardization work on other disaster models.
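The abstract does not reproduce the FireML schema itself. As a rough illustration of what a GML-based fire document might look like, the sketch below builds one with Python's standard library; every element name, namespace URI, and value here is hypothetical, not taken from the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical FireML fragment. Element names, namespace URIs and values
# are invented for illustration; the real schema is defined by the paper.
FIRE = "http://example.org/fireml"
GML = "http://www.opengis.net/gml"
ET.register_namespace("fire", FIRE)
ET.register_namespace("gml", GML)

root = ET.Element(f"{{{FIRE}}}FireEvent")
ignition = ET.SubElement(root, f"{{{FIRE}}}ignitionPoint")
pos = ET.SubElement(ignition, f"{{{GML}}}pos")
pos.text = "33.0 -116.9"  # made-up ignition coordinates (lat lon)
spread = ET.SubElement(root, f"{{{FIRE}}}spreadRate", units="m/min")
spread.text = "12.5"

print(ET.tostring(root, encoding="unicode"))
```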
40 CFR 266.106 - Standards to control metals emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...
40 CFR 266.106 - Standards to control metals emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... HAZARDOUS WASTE MANAGEMENT FACILITIES Hazardous Waste Burned in Boilers and Industrial Furnaces § 266.106... implemented by limiting feed rates of the individual metals to levels during the trial burn (for new... screening limit for the worst-case stack. (d) Tier III and Adjusted Tier I site-specific risk assessment...
49 CFR 238.431 - Brake system.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 4 2010-10-01 2010-10-01 false Brake system. 238.431 Section 238.431... Equipment § 238.431 Brake system. (a) A passenger train's brake system shall be capable of stopping the... train is operating under worst-case adhesion conditions. (b) The brake system shall be designed to allow...
The off-site consequence analysis (OCA) evaluates the potential for worst-case and alternative accidental release scenarios to harm the public and the environment around the facility. Public disclosure would likely reduce the number and severity of incidents.
33 CFR 155.1230 - Response plan development and evaluation criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... VESSELS Response plan requirements for vessels carrying animal fats and vegetable oils as a primary cargo... carry animal fats or vegetable oils as a primary cargo must provide information in their plan that identifies— (1) Procedures and strategies for responding to a worst case discharge of animal fats or...
33 CFR 155.1230 - Response plan development and evaluation criteria.
Code of Federal Regulations, 2010 CFR
2010-07-01
... VESSELS Response plan requirements for vessels carrying animal fats and vegetable oils as a primary cargo... carry animal fats or vegetable oils as a primary cargo must provide information in their plan that identifies— (1) Procedures and strategies for responding to a worst case discharge of animal fats or...
33 CFR 154.1029 - Worst case discharge.
Code of Federal Regulations, 2010 CFR
2010-07-01
... facility. The discharge from each pipe is calculated as follows: The maximum time to discover the release from the pipe in hours, plus the maximum time to shut down flow from the pipe in hours (based on... vessel regardless of the presence of secondary containment; plus (2) The discharge from all piping...
33 CFR 154.1029 - Worst case discharge.
Code of Federal Regulations, 2011 CFR
2011-07-01
... facility. The discharge from each pipe is calculated as follows: The maximum time to discover the release from the pipe in hours, plus the maximum time to shut down flow from the pipe in hours (based on... vessel regardless of the presence of secondary containment; plus (2) The discharge from all piping...
33 CFR 154.1029 - Worst case discharge.
Code of Federal Regulations, 2012 CFR
2012-07-01
... facility. The discharge from each pipe is calculated as follows: The maximum time to discover the release from the pipe in hours, plus the maximum time to shut down flow from the pipe in hours (based on... vessel regardless of the presence of secondary containment; plus (2) The discharge from all piping...
33 CFR 154.1029 - Worst case discharge.
Code of Federal Regulations, 2013 CFR
2013-07-01
... facility. The discharge from each pipe is calculated as follows: The maximum time to discover the release from the pipe in hours, plus the maximum time to shut down flow from the pipe in hours (based on... vessel regardless of the presence of secondary containment; plus (2) The discharge from all piping...
33 CFR 154.1029 - Worst case discharge.
Code of Federal Regulations, 2014 CFR
2014-07-01
... facility. The discharge from each pipe is calculated as follows: The maximum time to discover the release from the pipe in hours, plus the maximum time to shut down flow from the pipe in hours (based on... vessel regardless of the presence of secondary containment; plus (2) The discharge from all piping...
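The truncated regulation text in the records above describes a per-pipe calculation: the maximum time to discover the release plus the maximum time to shut down flow, with the elided middle of the snippet usually read as multiplying that time by the pipe's maximum flow rate and adding the line drainage volume. A minimal sketch of that arithmetic follows; the figures are invented, and the rule's separate largest-vessel term (counted regardless of secondary containment) is omitted here.

```python
# Per-pipe worst case discharge, paraphrasing the truncated formula above:
# (discovery time + shutdown time) x maximum flow rate, plus line drainage.
# All figures are invented; the largest-vessel term of the rule is omitted.
def pipe_worst_case_discharge(discover_hr, shutdown_hr,
                              max_flow_bbl_hr, drainage_bbl):
    return (discover_hr + shutdown_hr) * max_flow_bbl_hr + drainage_bbl

pipes = [
    # (discover hr, shutdown hr, max flow bbl/hr, line drainage bbl)
    (0.5, 0.25, 4000.0, 150.0),
    (1.0, 0.50, 2500.0, 80.0),
]
total = sum(pipe_worst_case_discharge(*p) for p in pipes)
print(f"worst case discharge from all piping: {total:,.0f} bbl")
```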
33 CFR 155.1230 - Response plan development and evaluation criteria.
Code of Federal Regulations, 2013 CFR
2013-07-01
... VESSELS Response plan requirements for vessels carrying animal fats and vegetable oils as a primary cargo... carry animal fats or vegetable oils as a primary cargo must provide information in their plan that identifies— (1) Procedures and strategies for responding to a worst case discharge of animal fats or...
33 CFR 155.1230 - Response plan development and evaluation criteria.
Code of Federal Regulations, 2014 CFR
2014-07-01
... VESSELS Response plan requirements for vessels carrying animal fats and vegetable oils as a primary cargo... carry animal fats or vegetable oils as a primary cargo must provide information in their plan that identifies— (1) Procedures and strategies for responding to a worst case discharge of animal fats or...
33 CFR 155.1230 - Response plan development and evaluation criteria.
Code of Federal Regulations, 2012 CFR
2012-07-01
... VESSELS Response plan requirements for vessels carrying animal fats and vegetable oils as a primary cargo... carry animal fats or vegetable oils as a primary cargo must provide information in their plan that identifies— (1) Procedures and strategies for responding to a worst case discharge of animal fats or...
Competitive Strategies and Financial Performance of Small Colleges
ERIC Educational Resources Information Center
Barron, Thomas A., Jr.
2017-01-01
Many institutions of higher education are facing significant financial challenges, resulting in diminished economic viability and, in the worst cases, the threat of closure (Moody's Investor Services, 2015). The study was designed to explore the effectiveness of competitive strategies for small colleges in terms of financial performance. Five…
40 CFR 63.11980 - What are the test methods and calculation procedures for process wastewater?
Code of Federal Regulations, 2013 CFR
2013-07-01
... calculation procedures for process wastewater? 63.11980 Section 63.11980 Protection of Environment... § 63.11980 What are the test methods and calculation procedures for process wastewater? (a) Performance... performance tests during worst-case operating conditions for the PVCPU when the process wastewater treatment...
40 CFR 63.11980 - What are the test methods and calculation procedures for process wastewater?
Code of Federal Regulations, 2012 CFR
2012-07-01
... calculation procedures for process wastewater? 63.11980 Section 63.11980 Protection of Environment... § 63.11980 What are the test methods and calculation procedures for process wastewater? (a) Performance... performance tests during worst-case operating conditions for the PVCPU when the process wastewater treatment...
40 CFR 63.11980 - What are the test methods and calculation procedures for process wastewater?
Code of Federal Regulations, 2014 CFR
2014-07-01
... calculation procedures for process wastewater? 63.11980 Section 63.11980 Protection of Environment... § 63.11980 What are the test methods and calculation procedures for process wastewater? (a) Performance... performance tests during worst-case operating conditions for the PVCPU when the process wastewater treatment...
30 CFR 254.21 - How must I format my response plan?
Code of Federal Regulations, 2010 CFR
2010-07-01
... divide your response plan for OCS facilities into the sections specified in paragraph (b) and explained in the other sections of this subpart. The plan must have an easily found marker identifying each.... (ii) Contractual agreements. (iii) Worst case discharge scenario. (iv) Dispersant use plan. (v) In...
Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.
ERIC Educational Resources Information Center
Butcher, Samuel S.; And Others
1985-01-01
Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
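The article's own model is not reproduced in this record, but the standard screening tool for such problems is the well-mixed one-box model, sketched here as an assumption rather than the authors' exact formulation; the emission rate, ventilation rate, and room volume are invented.

```python
import math

def room_concentration(g, q, v, t):
    """Well-mixed one-box model: C(t) = (G/Q) * (1 - exp(-Q*t/V)).
    g: emission rate (mg/min), q: ventilation (m3/min), v: volume (m3)."""
    return (g / q) * (1.0 - math.exp(-q * t / v))

# Invented screening inputs: 50 mg/min source, 20 m3/min ventilation, 300 m3 lab.
G, Q, V = 50.0, 20.0, 300.0
print(f"after 30 min: {room_concentration(G, Q, V, 30):.2f} mg/m3")
print(f"worst-case steady state (t -> inf): {G / Q:.2f} mg/m3")
```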
A Didactic Analysis of Functional Queues
ERIC Educational Resources Information Center
Rinderknecht, Christian
2011-01-01
When first introduced to the analysis of algorithms, students are taught how to assess the best and worst cases, whereas the mean and amortized costs are considered advanced topics, usually saved for graduates. When presenting the latter, aggregate analysis is explained first because it is the most intuitive kind of amortized analysis, often…
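A concrete example of the gap between worst-case and amortized cost is the classic two-list ("banker's") functional queue, which the abstract's topic suggests but does not spell out. The Python sketch below is illustrative, not taken from the article.

```python
# Two-list ("banker's") functional queue. enqueue is O(1); dequeue is O(1)
# except when the front tuple is empty and the back tuple must be reversed,
# an O(n) worst case that amortizes to O(1) over any operation sequence.
def empty():
    return ((), ())  # (front, back) as immutable tuples

def enqueue(q, x):
    front, back = q
    return (front, (x,) + back)  # newest element sits at the head of back

def dequeue(q):  # assumes q is non-empty
    front, back = q
    if not front:                      # the rare expensive case:
        front = tuple(reversed(back))  # move everything to the front, O(n)
        back = ()
    return front[0], (front[1:], back)

q = empty()
for i in range(3):
    q = enqueue(q, i)
x, q = dequeue(q)
print(x)  # 0 -- FIFO order preserved
```

Each element is moved from back to front at most once, so a sequence of n operations does O(n) total work even though a single dequeue can cost O(n) in the worst case.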
Keyboard before Head Tracking Depresses User Success in Remote Camera Control
NASA Astrophysics Data System (ADS)
Zhu, Dingyun; Gedeon, Tom; Taylor, Ken
In remote mining, operators of complex machinery have more tasks or devices to control than they have hands. For example, operating a rock breaker requires two-handed joystick control to position and fire the jackhammer, leaving camera control either to automation or to the operator switching between controls. We modelled such a teleoperated setting by performing experiments using a simple physical game analogue: a half-size table soccer game with two handles. The complex camera angles of the mining application were modelled by obscuring the direct view of the play area and using a Pan-Tilt-Zoom (PTZ) camera. The camera was controlled either by keyboard or by head tracking, using two different sets of head gestures called "head motion" and "head flicking" for turning camera motion on/off. Our results show that head motion control provided performance comparable to the keyboard, while head flicking was significantly worse. In addition, the sequence in which the three control methods were used is highly significant. It appears that using the keyboard first depresses subsequent success with the head tracking methods, with significantly better results when one of the head tracking methods was used first. Analysis of the qualitative survey data supports the finding that the worst-performing method was disliked by participants. Surprisingly, using that worst method first significantly enhanced performance with the other two control methods.
Carstens, Keri; Anderson, Jennifer; Bachman, Pamela; De Schrijver, Adinda; Dively, Galen; Federici, Brian; Hamer, Mick; Gielkens, Marco; Jensen, Peter; Lamp, William; Rauschen, Stefan; Ridley, Geoff; Romeis, Jörg; Waggoner, Annabel
2012-08-01
Environmental risk assessments (ERA) support regulatory decisions for the commercial cultivation of genetically modified (GM) crops. The ERA for terrestrial agroecosystems is well-developed, whereas guidance for ERA of GM crops in aquatic ecosystems is not as well-defined. The purpose of this document is to demonstrate how comprehensive problem formulation can be used to develop a conceptual model and to identify potential exposure pathways, using Bacillus thuringiensis (Bt) maize as a case study. Within problem formulation, the insecticidal trait, the crop, the receiving environment, and protection goals were characterized, and a conceptual model was developed to identify routes through which aquatic organisms may be exposed to insecticidal proteins in maize tissue. Following a tiered approach for exposure assessment, worst-case exposures were estimated using standardized models, and factors mitigating exposure were described. Based on exposure estimates, shredders were identified as the functional group most likely to be exposed to insecticidal proteins. However, even using worst-case assumptions, the exposure of shredders to Bt maize was low and studies supporting the current risk assessments were deemed adequate. Determining if early tier toxicity studies are necessary to inform the risk assessment for a specific GM crop should be done on a case by case basis, and should be guided by thorough problem formulation and exposure assessment. The processes used to develop the Bt maize case study are intended to serve as a model for performing risk assessments on future traits and crops.
ITS Institutional and Legal Issues Program : Review of the SaFIRES Operational Test
DOT National Transportation Integrated Search
1995-06-30
The SaFIRES operational test was chosen by the FHWA to be the subject of a case study. Several case studies were performed under the Intelligent Transportation Systems ITS Institutional and Legal Issues Program, which was developed in response to the...
2000-08-01
forefoot with the foot in the neutral position, and (b) similar to (a) but with heel landing. Although the authors reported no absolute strain values...diameter of sensors (or, in the case of a rectangular sensor, width as measured along pin axis). Worst case: strike line from inside edges of sensors...potoroo it is just prior to "toe strike". The locomotion of the potoroo is described as digitigrade, unlike humans, who walk in a plantigrade manner
Space Based Intelligence, Surveillance, and Reconnaissance Contribution to Global Strike in 2035
2012-02-15
include using high altitude air platforms and airships as a short-term solution, and small satellites with an Operationally Responsive Space (ORS) launch...irreversible threats, along with a worst case scenario. Section IV provides greater detail of the high altitude air platform, airship , and commercial space...Resultantly, the U.S. could use high altitude air platforms, airships , and cyber to complement its space systems in case of denial, degradation, or
Communication and implementation of GIS data in fire management: a case study
Kenneth G. Boykin; Douglas I. Boykin; Rusty Stovall; Ryan Whitaker
2008-01-01
Remotely sensed data and Geographical Information Systems (GIS) can be an effective tool in fire management. For the inclusion of these tools, fire management and research personnel must be effective in communication regarding needs and limitations of the data and implementing that data at various scales. A number of personnel can be involved within fire management...
Mechanical mid-story reduction treatments for forest fuel management
B. Rummer; K. Outcalt; D. Brockway
2002-01-01
There are many forest stands where exclusion of fire or lack of management has led to dense understories and fuel accumulation. Generally, the least expensive treatment is to introduce a regime of prescribed fire as a surrogate for natural forest fire processes in these stands. However, in some cases prescribed fire is not an option. For example, heavy fuel loadings may...
Fire safety of ground-based space facilities at the spaceport "Vostochny"
NASA Astrophysics Data System (ADS)
Artamonov, Vladimir S.; Gordienko, Denis M.; Melikhov, Anatoly S.
2017-06-01
The facilities of the spaceport "Vostochny" and the innovative fire safety technologies to be implemented there are considered. The planned approaches to and prospects for ensuring fire safety at the facilities of the spaceport "Vostochny" are presented herein, based on a study of emergencies that have resulted in fires and explosions at facilities supporting space vehicle operations.
Seasonal Fluctuation in Moisture Content of Pine Foliage
Von J. Johnson
1966-01-01
Green or living fuels, particularly pine crowns, are commonly consumed by forest fires burning in hot, windy weather. In some cases the pine crown fire has been known to burn ahead of the surface-burning fire for some distance before dropping to the ground.
46 CFR 28.825 - Excess fire detection and protection equipment.
Code of Federal Regulations, 2011 CFR
2011-10-01
...” “CARBON DIOXIDE FIRE SYSTEM” or “FOAM FIRE SYSTEM”, as the case may be; (v) Instructions for the operation... be locked, a key to the space or enclosure shall be in a break-glass-type box conspicuously located...
NASA Technical Reports Server (NTRS)
Botteri, Benito P.
1987-01-01
During the past 15 years, very significant progress has been made toward enhancing aircraft fire safety in both normal and hostile (combat) operational environments. Most of the major aspects of the aircraft fire safety problem are touched upon here. The technology of aircraft fire protection, although not directly applicable in all cases to spacecraft fire scenarios, nevertheless does provide a solid foundation to build upon. This is particularly true of the extensive research and testing pertaining to aircraft interior fire safety and to onboard inert gas generation systems, both of which are still active areas of investigation.
Fire safety: A case study of technology transfer
NASA Technical Reports Server (NTRS)
Heins, C. F.
1975-01-01
Two basic ways in which NASA-generated technology is being used by the fire safety community are described. First, improved products and systems that embody NASA technical advances are entering the marketplace. Second, NASA test data and technical information related to fire safety are being used by persons concerned with reducing the hazards of fire through improved design information and standards. The development of commercial fire safety products and systems typically requires adaptation and integration of aerospace technologies that may not have originated in NASA fire safety applications.
ERIC Educational Resources Information Center
Goldstein, Philip J.
2009-01-01
The phrase "worst since the Great Depression" has seemingly punctuated every economic report. The United States is experiencing the worst housing market, the worst unemployment level, and the worst drop in gross domestic product since the Great Depression. Although the steady drumbeat of bad news may have made everyone nearly numb, one…
Doberentz, E; Genneper, L; Böker, D; Lignitz, E; Madea, B
2014-11-01
The expression of heat shock proteins (hsp) increases under various types of endogenous and exogenous cellular stress, for example thermal stress. Immunohistochemical staining with hsp antibodies can visualize these stress proteins. Fifty-three cases of death due to heat and a control group of 100 deaths without any antemortem thermal stress were examined regarding hsp27 and hsp70 expression in myocardial, pulmonary, and renal tissues. The results revealed a correlation between hsp expression, survival time, and cause of death. In cases of death due to fire, the expression of hsp is more extensive than in the control group, especially in pulmonary and renal tissues. The immunohistochemical investigation of hsp expression can support the proof of vitality in cases of death related to fire.
Remote Sensing of Chaparral Fire Potential: Case Study in Topanga Canyon, California.
Remote sensing techniques, especially the use of color infrared aerial photography, provide a useful tool for fire hazard analysis, including interpretive information about fuel volumes, physiognomic plant groupings, the relationships of buildings to both natural and planted vegetation, and fire vulnerability of roofing materials. In addition, the behavior of the September, 1970 Wright Fire in the Topanga study area suggested the validity of the fire potential analysis which had been made prior to that conflagration.
An unusual case of random fire-setting behavior associated with lacunar stroke.
Bosshart, Herbert; Capek, Sonia
2011-06-15
A case of a 47-year-old man with a sudden onset of a bizarre and random fire-setting behavior is reported. The man, who had been arrested on felony arson charges, complained of difficulties concentrating and of recent memory impairment. Axial T1-weighted magnetic resonance imaging showed a low intensity lacunar lesion in the genu and anterior limb of the left internal capsule. A neuropsychological test battery revealed lower than normal scores for executive functions, attention and memory, consistent with frontal lobe dysfunction. The recent onset of fire-setting behavior and the chronic nature of the lacunar lesion, together with an unremarkable performance on tests measuring executive functions two years prior, suggested a causal relationship between this organic brain lesion and the fire-setting behavior. The present case describes a rare and as yet unreported association between random impulse-driven fire-setting behavior and damage to the left internal capsule and suggests a disconnection of frontal lobe structures as a possible pathogenic mechanism. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Exogenous lipoid pneumonia – a case report of a fire-eater
Pielaszkiewicz-Wydra, Magdalena; Homola-Piekarska, Bożena; Szcześniak, Ewa; Ciołek-Zdun, Monika; Fall, Andrzej
2012-01-01
Summary Background: Exogenous lipoid pneumonia is an uncommon condition caused by inhalation or aspiration of a fatty substance. It usually presents as chronic respiratory illness mimicking interstitial lung diseases. Acute exogenous lipoid pneumonia is uncommon and typically is caused by an episode of aspiration of a large quantity of a petroleum-based product. Radiological findings vary and may imitate many other diseases. Case Report: We present a rare case of acute exogenous lipoid pneumonia in a fire-eater who aspirated liquid paraffin during his flame-blowing show (fire-eater’s lung). He was admitted to the hospital with productive cough, fever, hemoptysis, chest pain and dyspnea. Diagnosis was made on the basis of history of exposure to fatty substance, characteristic findings in CT examination and presence of lipid-laden macrophages in bronchoalveolar lavage fluid. Conclusions: Acute exogenous lipoid pneumonia is a very rare disease that typically occurs in fire-eaters and is called a fire-eater’s lung. The diagnosis is made on the basis of typical history and radiological, as well as histopathological findings. PMID:23269939
Climate and Wildfire in Mountains of the Western United States
NASA Astrophysics Data System (ADS)
Alfaro, E.; Westerling, A. L.; Cayan, D. R.
2004-12-01
Since the mid-1980s, there has been a dramatic increase in the area burned in wildfires in mountain forests of the western United States, with mean annual area burned nearly three and a half times higher compared to the preceding one and a half decades.(1) Concomitant increases in variability in annual area burned and in fire suppression costs pose a serious challenge for land management in the mountainous West. The variance in annual area burned since 1987 is nineteen times its previous level. Since managers must be prepared for the worst possible scenarios in every fire season, increased uncertainty about the scale of the western fire season each year imposes high costs on public agencies. Annual real suppression costs in western forests have more than doubled for the Forest Service since 1987, while the variance in annual suppression costs is over four times higher. Although federal agencies' fire suppression budgets have increased recently, they are still close to what would be spent in an "average" year that seldom occurs, while costs tend to fluctuate between low and high extremes. Modeling area burned and suppression costs as a function of climate variability alone, Westerling (2004, unpublished work) found that the probability of the Forest Service's suppression expenses exceeding the current annual suppression budget has exceeded 50% since 1987, a substantial increase from the one-in-three chance over the preceding 40 years. Recent progress in our understanding of the links between climate and wildfire, and in our ability to forecast some aspects of both climate and wildfire season severity a season or more in advance, offers some hope that these costs might be ameliorated through the integration of climate information into fire and fuels management. In addition to the effects of climate variability on wildfire, long-term biomass accumulations in some western ecosystems have fueled an increasing incidence of large, stand-replacing wildfires where such fires were previously rare. These severe large fires can result in erosion and changes in vegetation type, with consequences for water quality, stream flow, future biological productivity of the affected areas, and habitat loss for endangered species. Apart from their deleterious ecological consequences, severe fires can also dramatically affect amenity values for public lands and for homeowners living in the wildland-urban interface. In the National Fire Plan, land management agencies have committed to reducing fuels on millions of hectares of public lands. The primary means are mechanical removal, prescribed fire and wildland fire use. The Forest Service estimates they will need to spend hundreds of millions of dollars per year to meet their fuel reduction targets, while efforts in recent years have not kept up with the current rate of biomass increase. Use of climate information for targeting resources and scheduling prescribed burns could increase the efficiency of these efforts. In this study we review the fire history since 1970 for western mountain forests, and demonstrate apparent links between regional climate variability and decadal-scale changes in annual area burned. This analysis explores how wildfire size and frequency have varied over the past thirty-five years by elevation and latitude, and how climate indices such as precipitation, temperature, drought indices and the timing of spring runoff vary in importance for fire season severity by elevation in forests around the western United States.
NASA Astrophysics Data System (ADS)
Lee, A.; Jung, N. S.; Mokhtari Oranj, L.; Lee, H. S.
2018-06-01
The leakage of radioactive materials generated at particle accelerator facilities is one of the important issues from the viewpoint of radiation safety. In this study, fire and flooding at particle accelerator facilities were considered as non-radiation disasters that can result in the leakage of radioactive materials. To analyse the expected effects of each disaster, a case study of fire- and flood-affected particle accelerator facilities was carried out, investigating the properties of the materials present in the accelerator tunnel and estimating their activation. Five major materials in the tunnel were investigated: dust, insulators, concrete, metals and paints. The activation levels of the materials of concern were calculated using several Monte Carlo codes (MCNPX 2.7+SP-FISPACT 2007, FLUKA 2011.4c and PHITS 2.64+DCHAIN-SP 2001). The impact weight to the environment was estimated for different beam particles (electron, proton, carbon and uranium) and different beam energies (100, 430, 600 and 1000 MeV/nucleon). Considering the leakage paths of radioactive materials due to fire and flooding, the activation levels of the selected materials and the impacts on the environment were evaluated. In the case of flooding, dust, concrete and metal were found to be the materials of most concern. In the case of fire, dust, insulators and paint were the major concerns. As expected, the influence of normal fire and flooding at electron accelerator facilities would be relatively low in both cases.
2016-09-01
iterations in that time for the student practitioners to work through. When possible, case studies will be selected from actual counter-radicalizations...justify participation in the learning organization. Those cases will be evaluated on a case-by-case basis and the need to expand the CVE mission...interested within the learning organization. The National Fire Academy Executive Fire Officer Program applied research pre-course is an example of
The lionfish Pterois sp. invasion: Has the worst-case scenario come to pass?
Côté, I M; Smith, N S
2018-03-01
This review revisits the traits thought to have contributed to the success of Indo-Pacific lionfish Pterois sp. as an invader in the western Atlantic Ocean and the worst-case scenario about their potential ecological effects in light of the more than 150 studies conducted in the past 5 years. Fast somatic growth, resistance to parasites, effective anti-predator defences and an ability to circumvent predator recognition mechanisms by prey have probably contributed to rapid population increases of lionfish in the invaded range. However, evidence that lionfish are strong competitors is still ambiguous, in part because demonstrating competition is challenging. Geographic spread has likely been facilitated by the remarkable capacity of lionfish for prolonged fasting in combination with other broad physiological tolerances. Lionfish have had a large detrimental effect on native reef-fish populations in the northern part of the invaded range, but similar effects have yet to be seen in the southern Caribbean. Most other envisaged direct and indirect consequences of lionfish predation and competition, even those that might have been expected to occur rapidly, such as shifts in benthic composition, have yet to be realized. Lionfish populations in some of the first areas invaded have started to decline, perhaps as a result of resource depletion or ongoing fishing and culling, so there is hope that these areas have already experienced the worst of the invasion. In closing, we place lionfish in a broader context and argue that it can serve as a new model to test some fundamental questions in invasion ecology. © 2018 The Fisheries Society of the British Isles.
Kiatpongsan, Sorapop; Kim, Jane J.
2014-01-01
Background Current prophylactic vaccines against human papillomavirus (HPV) target two of the most oncogenic types, HPV-16 and -18, which contribute to roughly 70% of cervical cancers worldwide. Second-generation HPV vaccines include a 9-valent vaccine, which targets five additional oncogenic HPV types (i.e., 31, 33, 45, 52, and 58) that contribute to another 15–30% of cervical cancer cases. The objective of this study was to determine a range of vaccine costs for which the 9-valent vaccine would be cost-effective in comparison to the current vaccines in two less developed countries (i.e., Kenya and Uganda). Methods and Findings The analysis was performed using a natural history disease simulation model of HPV and cervical cancer. The mathematical model simulates individual women from an early age and tracks health events and resource use as they transition through clinically-relevant health states over their lifetime. Epidemiological data on HPV prevalence and cancer incidence were used to adapt the model to Kenya and Uganda. Health benefit, or effectiveness, from HPV vaccination was measured in terms of life expectancy, and costs were measured in international dollars (I$). The incremental cost of the 9-valent vaccine included the added cost of the vaccine counterbalanced by costs averted from additional cancer cases prevented. All future costs and health benefits were discounted at an annual rate of 3% in the base case analysis. We conducted sensitivity analyses to investigate how infection with multiple HPV types, unidentifiable HPV types in cancer cases, and cross-protection against non-vaccine types could affect the potential cost range of the 9-valent vaccine. In the base case analysis in Kenya, we found that vaccination with the 9-valent vaccine was very cost-effective (i.e., had an incremental cost-effectiveness ratio below per-capita GDP), compared to the current vaccines provided the added cost of the 9-valent vaccine did not exceed I$9.7 per vaccinated girl. To be considered very cost-effective, the added cost per vaccinated girl could go up to I$5.2 and I$16.2 in the worst-case and best-case scenarios, respectively. At a willingness-to-pay threshold of three times per-capita GDP where the 9-valent vaccine would be considered cost-effective, the thresholds of added costs associated with the 9-valent vaccine were I$27.3, I$14.5 and I$45.3 per vaccinated girl for the base case, worst-case and best-case scenarios, respectively. In Uganda, vaccination with the 9-valent vaccine was very cost-effective when the added cost of the 9-valent vaccine did not exceed I$8.3 per vaccinated girl. To be considered very cost-effective, the added cost per vaccinated girl could go up to I$4.5 and I$13.7 in the worst-case and best-case scenarios, respectively. At a willingness-to-pay threshold of three times per-capita GDP, the thresholds of added costs associated with the 9-valent vaccine were I$23.4, I$12.6 and I$38.4 per vaccinated girl for the base case, worst-case and best-case scenarios, respectively. Conclusions This study provides a threshold range of incremental costs associated with the 9-valent HPV vaccine that would make it a cost-effective intervention in comparison to currently available HPV vaccines in Kenya and Uganda. These prices represent a 71% and 61% increase over the price offered to the GAVI Alliance ($5 per dose) for the currently available 2- and 4-valent vaccines in Kenya and Uganda, respectively. 
Despite evidence of cost-effectiveness, critical challenges around affordability and feasibility of HPV vaccination and other competing needs in low-resource settings such as Kenya and Uganda remain. PMID:25198104
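The threshold logic in this abstract reduces to a simple inequality: the added cost per vaccinated girl is (very) cost-effective while the incremental cost-effectiveness ratio stays below the willingness-to-pay per life-year. The sketch below illustrates that arithmetic only; the GDP and life-expectancy figures are invented and do not come from the study.

```python
# Cost-effectiveness threshold: the added cost is acceptable while
# ICER = added_cost / added_benefit <= willingness to pay (WTP).
# Both figures below are invented for illustration.
def max_added_cost(wtp_per_life_year, added_life_years):
    return wtp_per_life_year * added_life_years

GDP_PER_CAPITA = 2000.0   # hypothetical I$ per-capita GDP
ADDED_LY = 0.005          # hypothetical life-years gained per vaccinated girl

print("very cost-effective up to I$", max_added_cost(GDP_PER_CAPITA, ADDED_LY))
print("cost-effective up to I$", max_added_cost(3 * GDP_PER_CAPITA, ADDED_LY))
```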
Fire as a galvanizing and fragmenting influence on communities: the case of the Rodeo-Chediski fire
Matthew S. Carroll; Patricia J. Cohn; David N. Seesholtz; Lorie L. Higgins
2005-01-01
Large wildfires that burn through the "forest-residential intermix" are complex events with a variety of social impacts. This study looks at three northern Arizona community clusters directly affected by the 2002 Rodeo-Chediski fire. Our analysis suggests that the fire event led to both the emergence of cohesion and conflict in the study area. Community...
Max A. Moritz; Dennis C. Odion
2006-01-01
Fire is often integral to forest ecology and can affect forest disease dynamics. Sudden oak death has spread across a large, fire-prone portion of California, killing large numbers of oaks and tanoaks and infecting most associated woody plants. Building on our earlier study of fire-disease dynamics, we examined spatial patterns of confirmed infections in relation to...
Use of air tankers pays off ... a case study
Clive M. Countryman
1969-01-01
Fire suppression costs in the 1967 Timber Canyon Fire, in Southern California, were increased by about $39,000 over what they would have been had air tankers not been used. But because aircraft were called in to help put out the fire, fire damages were reduced by $501,375, yielding a "profit" of $461,574. Data on weather, fuels, and topography made it possible...
M. R. Kaufmann; L. S. Huckaby; P. Gleason
2000-01-01
An unlogged forest landscape in the Colorado Front Range provides insight into historical characteristics of ponderosa pine/Douglas-fir landscapes where the past fire regime was mixed severity with mean fire intervals of 50 years or more. Natural fire and tree recruitment patterns resulted in considerable spatial and temporal heterogeneity, whereas nearby forest...
Whitty, Jennifer A; Oliveira Gonçalves, Ana Sofia
2018-06-01
The aim of this study was to compare the acceptability, validity and concordance of discrete choice experiment (DCE) and best-worst scaling (BWS) stated preference approaches in health. A systematic search of EMBASE, Medline, AMED, PubMed, CINAHL, Cochrane Library and EconLit databases was undertaken in October to December 2016 without date restriction. Studies were included if they were published in English, presented empirical data related to the administration or findings of traditional format DCE and object-, profile- or multiprofile-case BWS, and were related to health. Study quality was assessed using the PREFS checklist. Fourteen articles describing 12 studies were included, comparing DCE with profile-case BWS (9 studies), DCE and multiprofile-case BWS (1 study), and profile- and multiprofile-case BWS (2 studies). Although limited and inconsistent, the balance of evidence suggests that preferences derived from DCE and profile-case BWS may not be concordant, regardless of the decision context. Preferences estimated from DCE and multiprofile-case BWS may be concordant (single study). Profile- and multiprofile-case BWS appear more statistically efficient than DCE, but no evidence is available to suggest they have a greater response efficiency. Little evidence suggests superior validity for one format over another. Participant acceptability may favour DCE, which had a lower self-reported task difficulty and was preferred over profile-case BWS in a priority setting but not necessarily in other decision contexts. DCE and profile-case BWS may be of equal validity but give different preference estimates regardless of the health context; thus, they may be measuring different constructs. Therefore, choice between methods is likely to be based on normative considerations related to coherence with theoretical frameworks and on pragmatic considerations related to ease of data collection.
Yu Wei; Matthew P. Thompson; Jessica R. Haas; Gregory K. Dillon; Christopher D. O’Connor
2018-01-01
This study introduces a large fire containment strategy that builds upon recent advances in spatial fire planning, notably the concept of potential wildland fire operation delineations (PODs). Multiple PODs can be clustered together to form a "box" that is referred to as the "response POD" (or rPOD). Fire lines would be built along the boundary of an rPOD to contain a...
Koljonen, Virve; Mäkisalo, Heikki
2013-01-01
This article reviews the recent literature on operating room fires. Most of the reported cases have occurred from a spark from an ignition source in an oxygen-enriched atmosphere. Fire requires the presence of three components, all of which are ample in the operating room: heat (an ignition source), fuel (flammable materials or gases), and an oxidizer.
Hu, L H; Peng, W; Huo, R
2008-01-15
In a tunnel fire, the toxic gases and smoke particles released are the most lethal hazards. It is important to supply fresh air from the upwind side to provide a clean and safe environment upstream of the fire source for evacuation. Thus, the critical longitudinal wind velocity for arresting fire-induced upwind gas and smoke dispersion is a key criterion for tunnel safety design. Former studies, and thus the models built for estimating the critical wind velocity, all assume, somewhat arbitrarily, that the fire takes place at the centre of the tunnel. However, in many real road tunnel cases, the fire originates near the sidewall. The critical velocity of a near-wall fire should differ from that of a free-standing central fire due to their different plume entrainment processes. Theoretical analysis and CFD simulation were performed in this paper to estimate the critical velocity for a fire near the sidewall. Results showed that a fire near the sidewall requires a larger critical velocity to arrest upwind gas and smoke dispersion than a fire at the centre. The ratio of the critical velocity of a near-wall fire to that of a central fire was ideally estimated to be 1.26 by theoretical analysis. CFD modelling showed that the ratio decreased toward unity as the fire size increased; it was about 1.18 for a small 500 kW fire, close to and slightly below the theoretical estimate of 1.26. However, the former models, including those of Thomas (1958, 1968), Danziger and Kennedy (1982), Oka and Atkinson (1995), Wu and Bakar (2000) and Kunsch (1999, 2002), underestimated the critical velocity needed for a fire near the tunnel sidewall.
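A plausible reading of the theoretical ratio of 1.26 (an assumption on our part, not the paper's stated derivation) is an image-source argument: a fire against the sidewall entrains roughly half as much air as a free-standing fire, so it behaves like a central fire of twice the heat release rate, and with critical velocity scaling as the cube root of heat release in the sub-critical regime the ratio is 2^(1/3) ≈ 1.26.

```python
# Image-source reading of the reported 1.26 (our assumption, not the
# paper's derivation): a near-wall fire entrains about half the air, so it
# acts like a free fire of twice the heat release; with Vc ~ Q^(1/3) in the
# sub-critical regime the near-wall/central ratio is 2^(1/3).
ratio = 2.0 ** (1.0 / 3.0)
print(f"near-wall to central critical velocity ratio: {ratio:.2f}")  # 1.26

def scaled_critical_velocity(q_kw, vc_ref_m_s=1.0, q_ref_kw=500.0):
    """Illustrative cube-root scaling from an invented reference point."""
    return vc_ref_m_s * (q_kw / q_ref_kw) ** (1.0 / 3.0)
```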
78 FR 53494 - Dam Safety Modifications at Cherokee, Fort Loudoun, Tellico, and Watts Bar Dams
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-29
... fundamental part of this mission was the construction and operation of an integrated system of dams and... by the Federal Emergency Management Agency, TVA prepares for the worst case flooding event in order... appropriate best management practices during all phases of construction and maintenance associated with the...
NASA Technical Reports Server (NTRS)
Guman, W. J. (Editor)
1971-01-01
Thermal vacuum design-support thruster tests indicated no problems under the worst-case conditions of sink temperature and spin rate. The reliability of the system was calculated to be 0.92 for a five-year mission; excluding the main energy storage capacitor, it is 0.98.
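If the subsystems are taken to combine in series (an assumption; the reliability model is not given in this record), the implied reliability of the main energy storage capacitor follows by division:

```latex
% Series reliability model (an assumption; the record does not state one):
%   R_system = R_capacitor x R_rest
R_{\text{capacitor}} = \frac{R_{\text{system}}}{R_{\text{rest}}}
                     = \frac{0.92}{0.98} \approx 0.94
```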
40 CFR 300.320 - General pattern of response.
Code of Federal Regulations, 2010 CFR
2010-07-01
...., substantial threat to the public health or welfare of the United States, worst case discharge) of the... private party efforts, and where the discharge does not pose a substantial threat to the public health or... 40 Protection of Environment 27 2010-07-01 2010-07-01 false General pattern of response. 300.320...
Small Wars 2.0: A Working Paper on Land Force Planning After Iraq and Afghanistan
2011-02-01
official examination of future ground combat demands that look genetically distinct from those undertaken in the name of the WoT. The concept of...under the worst-case rubric but for very different reasons. The latter are small wars. However, that by no means aptly describes their size
A Comparison of Learning Technologies for Teaching Spacecraft Software Development
ERIC Educational Resources Information Center
Straub, Jeremy
2014-01-01
The development of software for spacecraft represents a particular challenge and is, in many ways, a worst case scenario from a design perspective. Spacecraft software must be "bulletproof" and operate for extended periods of time without user intervention. If the software fails, it cannot be manually serviced. Software failure may…
Providing Exemplars in the Learning Environment: The Case For and Against
ERIC Educational Resources Information Center
Newlyn, David
2013-01-01
Contemporary education has moved towards the requirement of express articulation of assessment criteria and standards in an attempt to provide legitimacy in the measurement of student performance/achievement. Exemplars are provided examples of best or worst practice in the educational environment, which are designed to assist students to increase…
Ageing of Insensitive DNAN Based Melt-Cast Explosives
2014-08-01
diurnal cycle (representative of the MEAO climate). Analysis of the ingredient composition, sensitiveness, mechanical and thermal properties was...first test condition was chosen to provide a worst-case scenario. Analysis of the ingredient composition, theoretical maximum density, sensitiveness...ARX-4027 ingredient analysis...ARX-4028 ingredient analysis
Power Analysis for Anticipated Non-Response in Randomized Block Designs
ERIC Educational Resources Information Center
Pustejovsky, James E.
2011-01-01
Recent guidance on the treatment of missing data in experiments advocates the use of sensitivity analysis and worst-case bounds analysis for addressing non-ignorable missing data mechanisms; moreover, plans for the analysis of missing data should be specified prior to data collection (Puma et al., 2009). While these authors recommend only that…
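One standard form of the worst-case bounds analysis mentioned here is Manski-style bounding of a bounded outcome: missing values are replaced by the extremes of the outcome range. The Python sketch below is a generic illustration with invented data, not the cited authors' procedure.

```python
# Manski-style worst-case bounds for the mean of a {0,1} outcome with
# non-ignorable missingness: impute all missing values as 0, then as 1.
# The data below are invented for illustration.
observed = [1, 0, 1, 1, 0]  # outcomes for the 5 respondents
n_missing = 3               # non-respondents with unknown outcomes

n = len(observed) + n_missing
lower = sum(observed) / n                 # every missing outcome is 0
upper = (sum(observed) + n_missing) / n   # every missing outcome is 1
print(f"worst-case bounds on the mean: [{lower:.3f}, {upper:.3f}]")
```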
33 CFR 154.1120 - Operating restrictions and interim operating authorization.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Facility Operating in Prince William Sound, Alaska § 154.1120 Operating restrictions and interim operating authorization. (a) The owner or operator of a TAPAA facility may not operate in Prince William Sound, Alaska... practicable, a worst case discharge or a discharge of 200,000 barrels of oil, whichever is greater, in Prince...
Facilitating Interdisciplinary Work: Using Quality Assessment to Create Common Ground
ERIC Educational Resources Information Center
Oberg, Gunilla
2009-01-01
Newcomers often underestimate the challenges of interdisciplinary work and, as a rule, do not spend sufficient time to allow them to overcome differences and create common ground, which in turn leads to frustration, unresolved conflicts, and, in the worst case scenario, discontinued work. The key to successful collaboration is to facilitate the…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... notice is provided in accordance with the Council on Environmental Quality's regulations (40 CFR parts... interconnected, fabric-lined, sand-filled HESCO containers in order to safely pass predicted worst-case..., but will not necessarily be limited to, the potential impacts on water quality, aquatic and...
ERIC Educational Resources Information Center
Tercek, Patricia M.
This practicum study examined kindergarten teachers' perspectives regarding mixed-age groupings that included kindergarten students. The study focused on pedagogical reasons for using mixed-age grouping, ingredients necessary for successful implementation of a multiage program that includes kindergartners, and the perceived effects of a multiage…