Hydrogeology and Hydrologic Landscape Regions of Nevada
Maurer, Douglas K.; Lopes, Thomas J.; Medina, Rose L.; Smith, J. LaRue
2004-01-01
In 1999, the U.S. Environmental Protection Agency initiated a rule to protect ground water in areas other than source-water protection areas. These other sensitive ground water areas (OSGWAs) are aquifers that are not currently used, but could eventually be used, as a source of drinking water. The OSGWA program specifically addresses existing wells that are used for underground injection of motor vehicle waste. If the injection well is in a ground-water protection area or an OSGWA, well owners must either close the well or apply for a permit. The Nevada Division of Environmental Protection will evaluate site-specific information and determine if the aquifer associated with a permit application is susceptible to contamination. A basic part of evaluating OSGWAs is characterizing the hydrogeology of aquifer systems, including the lithology, hydrologic properties, soil permeability, and faulting, which partly control the susceptibility of ground water to contamination. Detailed studies that evaluate ground-water susceptibility are not practical in a largely unpopulated State like Nevada. However, existing and new information could be extrapolated to other areas of the State if there is an objective framework to transfer the information. The concept of hydrologic landscape regions, which identify areas with similar hydrologic characteristics, provides this framework. This report describes the hydrogeology and hydrologic landscape regions of Nevada. Consolidated rocks that form mountain ranges and unconsolidated sediments that fill the basins between the ranges are grouped into hydrogeologic units having similar lithology and assumed to have similar hydrologic properties. Consolidated rocks and unconsolidated sediments are the two major hydrogeologic units and make up 51 and 49 percent of the State, respectively. Consolidated rocks are subdivided into eight hydrogeologic units. In approximate order of decreasing horizontal hydraulic conductivity, consolidated-rock hydrogeologic units consist of: (1) carbonate rocks; (2) basaltic, (3) rhyolitic, and (4) andesitic volcanic flows of Quaternary to Tertiary age; (5) volcanic breccias, tuffs, and volcanic rocks older than Tertiary age; (6) intrusive and metamorphic rocks; (7) consolidated and semi-consolidated tuffaceous rocks and sediments; and (8) clastic rocks consisting of sandstone and siltstone. Unconsolidated sediments are subdivided into four hydrogeologic units on the basis of flow regime, topographic slope, and mapped stream channels. The four units are (1) alluvial slopes, (2) valley floors, (3) fluvial deposits, and (4) playas. Soil permeability was grouped into five descriptive categories ranging from very high to very low, which generally correspond to mapped geomorphic features such as playas and alluvial slopes. In general, soil permeability is low to moderate in northern, northeastern, and eastern Nevada and high to very high in western, southwestern, and southern Nevada. Within a particular basin, soil permeability decreases downslope from the bedrock contact. The type of parent rock, climate, and streamflow velocities are factors that likely cause these spatial patterns. Faults in unconsolidated sediments usually are barriers to ground-water flow. In consolidated rocks, permeability and ground-water flow are reduced in directions normal to the fault zone and increased in directions parallel to the fault zone. With time, mineral precipitation may seal fractures in consolidated rocks, reducing the permeability.
However, continued movement along the fault may form new fractures, resulting in a fault alternating from a zone of preferred flow to a flow barrier during geologic time. The effect of faults on ground-water flow at a particular location is difficult to determine without a site-specific investigation. Hydrologic landscape regions were delineated by overlaying a grid of 100-foot (30-meter) cells over the State, estimating the value of five variables for each cell, an…
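To make the gridded-overlay method concrete, here is a minimal sketch of grouping grid cells by similar hydrologic variables. The five variables, the k-means grouping, and all data values are illustrative assumptions; the report's actual variables and classification procedure are defined in its methods section.

```python
import numpy as np

# Sketch of the hydrologic-landscape-region workflow: overlay a grid on
# the State, estimate variables per cell, then group cells with similar
# values. The variable count matches the abstract (five per cell); the
# k-means grouping and all data here are illustrative assumptions.

rng = np.random.default_rng(0)
n_cells, k = 10_000, 8            # stand-ins for the statewide 30-m grid

X = rng.normal(size=(n_cells, 5))             # five variables per cell
X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize

centers = X[rng.choice(n_cells, k, replace=False)]
for _ in range(20):                           # tiny k-means loop
    labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=-1), axis=1)
    centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

print(np.bincount(labels, minlength=k))       # cell count per region
```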
NASA Astrophysics Data System (ADS)
Fazzito, Sabrina Y.; Rapalini, Augusto E.; Cortés, José M.; Terrizzano, Carla M.
2017-03-01
Palaeomagnetic data from poorly consolidated to non-consolidated late Cenozoic sediments along the central segment of the active El Tigre Fault (Central-Western Precordillera of the San Juan Province, Argentina) demonstrate broad cumulative deformation up to 450 m from the fault trace and reveal clockwise and anticlockwise vertical-axis rotations of variable magnitude. This deformation has affected Miocene to late Pleistocene samples to different degrees and indicates a complex kinematic pattern. Several inherited linear structures in the shear zone that are oblique to the El Tigre Fault may have acted as block-boundary faults. Displacement along these faults may have resulted in a complex pattern of rotations. The maximum magnitude of rotation is a function of the age of the sediments sampled, with the largest values corresponding to middle Miocene-lower Pliocene deposits and minimum values obtained from late Pleistocene deposits. The kinematic study is complemented by low-field anisotropy of magnetic susceptibility data, which suggest a local strain regime with a N-S stretching direction, subparallel to the strike of the main fault.
Is There a Tectonic Component On The Subsidence Process In Morelia, Mexico?
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Arciniega-Ceballos, A.; Diaz-Molina, O.; Garduno-Monroy, V.; Avila-Olivera, J.; Hernández-Madrigal, V.; Hernández-Quintero, E.
2009-12-01
Subsidence and faulting have affected cities in central Mexico for decades. This process causes substantial damage to urban infrastructure, housing, and large buildings, and is an important factor to be considered when planning urban development, land-use zoning, and hazard mitigation strategies. In Mexico, studies using InSAR- and GPS-based observations have shown that high-subsidence areas are usually associated with the presence of thick lacustrine and fluvial deposits. In most cases the subsidence is closely associated with intense groundwater extraction that results in sediment consolidation. However, recent studies in the colonial city of Morelia in central Mexico show a different scenario, where groundwater extraction cannot solely explain the observed surface deformation. Our results indicate that a more complex interplay between sediment consolidation and tectonic forces is responsible for the subsidence and fault distribution within the city. The city of Morelia has experienced fault development recognized since the 1980s. This situation has led to the recognition of 9 NE-SW-trending faults that cut across most of its urbanized area. Displacement maps derived from differential InSAR analysis show that the La Colina fault marks the highest-subsiding area in Morelia, with maximum rates over -35 mm/yr. However, lithological mapping and field reconnaissance clearly show basalts cropping out in this area of high surface deformation. The subsurface characterization of the La Colina fault was carried out along 27 Ground Penetrating Radar (GPR) sections and 6 seismic tomography profiles. Assuming constant, linear past behavior of the subsidence as observed by InSAR techniques, and based on the interpretation of the fault dislocation imaged by the shallow GPR and seismic tomography, it is suggested that the La Colina fault may have been active for the past 220-340 years and clearly pre-dates the intense water-well extraction of the past century. These conditions suggest the existence of a tectonic component superimposed on the soil consolidation and its related subsidence. Therefore, these results suggest that the fault system observed within the city of Morelia may be an active segment of the Morelia-Acambay tectonic fault system.
NASA Astrophysics Data System (ADS)
Winner, A.; Saffer, D. M.; Valdez, R. D.
2014-12-01
Sediment permeability and consolidation behavior are key parameters governing the drainage state, and thus the potential for excess pore-fluid pressure, in subduction zones. Elevated pore pressure, in turn, is one important control on the strength and sliding behavior of faults. Along many subduction margins, evidence of elevated, near-lithostatic, in situ pore pressure comes from high seismic reflectivity, low P-wave velocity (Vp), and high Vp/Vs ratios. This inference is broadly supported by numerical modeling studies that indicate elevated pore pressures are likely given high rates of burial and tectonic loading, combined with the low permeability of marine mudstones. Here, we report on a series of high-stress consolidation experiments on sediment core samples from the incoming Cocos plate obtained as part of Integrated Ocean Drilling Program (IODP) Expedition 344. Our experiments were designed to measure the consolidation behavior, permeability, and P-wave velocity of the incoming sediments over a range of confining stresses from 0.5 to 90 MPa. We explore a range of loading paths, including isostatic loading (σ1=σ2=σ3), K0 consolidation, in which the ratio σ3/σ1 is maintained at ~0.6, and triaxial loading paths designed to maintain a near critical-state failure condition. In our tests, load is increased in a series of steps. After equilibration at each step, we conduct constant-head permeability tests and measure P-wave velocities in a "time of flight" mode. Initial results from isostatic loading tests on hemipelagic mudstone samples from 34 mbsf document consolidation and permeability-porosity trends in which porosity decreases from 69% to 54% as stress is increased from 0.5 MPa to 15 MPa, and permeability decreases from 8.1 × 10⁻¹⁸ m² at 1 MPa to 1.1 × 10⁻¹⁹ m² at 15 MPa. P-wave velocity increases by 486-568 m/s over this effective stress range. Ultimately, data from our experiments will provide a robust basis for quantifying fluid content and pressure from seismic velocity and fault-plane reflectivity at this margin, and provide data to parameterize forward models of fluid flow and consolidation.
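A small sketch of how the reported stress-permeability trend can be summarized; fitting a power law through the two quoted data points is an assumption for illustration, not the authors' analysis.

```python
import numpy as np

# Sketch: summarize the reported permeability-stress trend with a power
# law k = a * sigma^b fit through the two data points quoted in the
# abstract. The power-law form is an assumption for illustration.

sigma = np.array([1.0, 15.0])            # effective stress, MPa
k = np.array([8.1e-18, 1.1e-19])         # permeability, m^2

b, log_a = np.polyfit(np.log10(sigma), np.log10(k), 1)
a = 10.0 ** log_a
print(f"k(sigma) ~ {a:.2e} * sigma^{b:.2f}   [m^2; sigma in MPa]")
print(f"interpolated k at 5 MPa: {a * 5.0 ** b:.2e} m^2")
```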
Corporate Delivery of a Global Smart Buildings Program
Fernandes, Samuel; Granderson, Jessica; Singla, Rupam; ...
2017-11-22
Buildings account for about 40 percent of the total energy consumption in the U.S. and emit approximately one third of greenhouse gas emissions. But they also offer tremendous potential for achieving significant greenhouse gas reductions with the right savings strategies. With an increasing amount of data from buildings and advanced computational and analytical abilities, buildings can be made “smart” to optimize energy consumption and occupant comfort. Smart buildings are often characterized as having a high degree of data and system integration, connectivity and control, as well as the advanced use of data analytics. These “smarts” can enable up to 10–20% savings in a building, and help ensure that they persist over time. In 2009, Microsoft Corporation launched the Energy-Smart Buildings (ESB) program with a vision to improve building operations services, security and accessibility in services, and new tenant applications and services that improve productivity and optimize energy use. The ESB program focused on fault diagnostics, advanced analytics and new organizational processes and practices to support their operational integration. In addition to the ESB program, Microsoft undertook capital improvement projects that made effective use of a utility incentive program and lab consolidations over the same duration. The ESB program began with a pilot at Microsoft's Puget Sound campus that identified significant savings of up to 6–10% in the 13 pilot buildings. The success of the pilot led to a global deployment of the program. Between 2009 and 2015, there was a 23.7% reduction in annual electricity consumption (kWh) at the Puget Sound campus, with 18.5% of that resulting from the ESB program and lab consolidations. This article provides the results of research conducted to assess the best-practice strategies that Microsoft implemented to achieve these savings, including the fault diagnostic routines that are the foundation of the ESB program and organizational change management practices. It also presents the process that was adopted to scale the ESB program globally. We conclude with recommendations for how these successes can be generalized and replicated by other corporate enterprises.
NASA Astrophysics Data System (ADS)
Tsibanos, V.; Wang, G.
2017-12-01
The Long Point Fault in Houston, Texas, is a complex system of normal faults that causes significant damage to urban infrastructure on both private and public property. This case study focuses on the 20-km-long fault, using high-accuracy, continuously operating Global Positioning System (GPS) stations to delineate fault movement over five years (2012-2017). The Long Point Fault is the longest active fault in the greater Houston area; it damages roads, buried pipes, concrete structures, and buildings, and creates a financial burden for the city of Houston and the residents who live in close vicinity to the fault trace. To monitor fault displacement along the surface, 11 permanent and continuously operating GPS stations were installed: 6 on the hanging wall and 5 on the footwall. This study is an overview of the GPS observations from 2013 to 2017. GPS positions were processed with both relative (double differencing) and absolute Precise Point Positioning (PPP) techniques. The PPP solutions, referred to the IGS08 reference frame, were transformed to the Stable Houston Reference Frame (SHRF16). Our results show no considerable horizontal displacement across the fault, but do show uneven vertical displacement, attributed to regional subsidence in the range of 5-10 mm/yr. This subsidence can be associated with compaction of silty clays in the Chicot and Evangeline aquifers, whose water depths are approximately 50 m and 80 m below the land surface (bls). These levels are below the regional pre-consolidation head, which is about 30 to 40 m bls. Recent research indicates subsidence will continue to occur until the aquifer levels reach the pre-consolidation head. With further GPS observations, both the Long Point Fault and regional land subsidence can be monitored, providing important geological data to the Houston community.
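As an illustration of the rate estimation underlying such studies, the sketch below fits a linear vertical rate to a synthetic daily GPS height series by least squares; the data and noise level are invented, and this is far simpler than the SHRF16 processing described above.

```python
import numpy as np

# Sketch: least-squares estimate of a vertical land-motion rate from a
# daily GPS height series. Data are synthetic (-7 mm/yr plus noise);
# real reference-frame processing is far more involved.

t = np.arange(0.0, 5.0, 1.0 / 365.25)               # time, years
rng = np.random.default_rng(1)
h = -7.0 * t + rng.normal(scale=3.0, size=t.size)   # height, mm

A = np.column_stack([t, np.ones_like(t)])
(rate, offset), *_ = np.linalg.lstsq(A, h, rcond=None)
print(f"estimated vertical rate: {rate:+.1f} mm/yr")
```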
1998-04-01
…selected is statistically based on the total number of faults and the failure rate distribution in the system under test. The fault set is also… implemented the BPM and system-level emulation consolidation logic, as well as statistics counters for cache misses and various bus transactions.
Sandstone-filled normal faults: A case study from central California
NASA Astrophysics Data System (ADS)
Palladino, Giuseppe; Alsop, G. Ian; Grippa, Antonio; Zvirtes, Gustavo; Phillip, Ruy Paulo; Hurst, Andrew
2018-05-01
Despite the potential of sandstone-filled normal faults to significantly influence fluid transmissivity within reservoirs and the shallow crust, they have to date been largely overlooked. Fluidized sand, forcefully intruded along normal fault zones, markedly enhances the transmissivity of faults and, in general, the connectivity between otherwise unconnected reservoirs. Here, we provide a detailed outcrop description and interpretation of sandstone-filled normal faults from different stratigraphic units in central California. Such faults commonly show limited fault throw, cm- to dm-wide apertures, poorly developed fault zones, and full or partial sand infill. Based on these features and inferences regarding their origin, we propose a general classification that defines two main types of sandstone-filled normal faults. Type 1 faults form as a consequence of the hydraulic failure of the host strata above a poorly consolidated sandstone following a significant, rapid increase of pore-fluid overpressure. Type 2 sandstone-filled normal faults form as a result of regional tectonic deformation. These structures may play a significant role in the connectivity of siliciclastic reservoirs, and may therefore be crucial not just for investigation of basin evolution but also in hydrocarbon exploration.
Research on Fault Characteristics and Line Protections Within a Large-scale Photovoltaic Power Plant
NASA Astrophysics Data System (ADS)
Zhang, Chi; Zeng, Jie; Zhao, Wei; Zhong, Guobin; Xu, Qi; Luo, Pandian; Gu, Chenjie; Liu, Bohan
2017-05-01
Centralized photovoltaic (PV) systems have different fault characteristics from distributed PV systems because of their different system structures and controls. This makes the fault analysis and protection methods used in distribution networks with distributed PV unsuitable for a centralized PV power plant. Therefore, a consolidated expression for the fault current within a PV power plant under different controls was derived, taking into account the fault response of the PV array. Then, supported by the fault-current analysis and on-site testing data, overcurrent relay (OCR) performance was evaluated in the collection system of an 850 MW PV power plant. The evaluation reveals that OCRs at the downstream side of overhead lines may malfunction. To address this, a new relay scheme was proposed using directional distance elements. A detailed PV system model was built in PSCAD/EMTDC and verified against the on-site testing data. Simulation results indicate that the proposed relay scheme effectively solves the problems under various fault scenarios and PV plant output levels.
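The relay behavior at issue can be illustrated with the common IEC standard-inverse overcurrent characteristic; the pickup and time-multiplier settings below are assumptions, not the plant's, but they show why an inverter-limited fault current barely above pickup trips slowly or not at all.

```python
# Sketch of inverse-time overcurrent relay (OCR) logic using the IEC
# standard-inverse curve, t = TMS * 0.14 / (M**0.02 - 1), where M is the
# fault current in multiples of pickup. Settings are illustrative
# assumptions; centralized PV plants feed limited fault current, so M
# stays small and trip times grow long, which is how OCRs malfunction.

def trip_time(i_fault: float, i_pickup: float, tms: float = 0.1):
    m = i_fault / i_pickup
    if m <= 1.0:
        return None                  # below pickup: relay never operates
    return tms * 0.14 / (m ** 0.02 - 1.0)

for mult in (0.9, 1.05, 1.5, 5.0):   # fault current as multiple of pickup
    print(f"M = {mult}: trip time = {trip_time(mult, 1.0)}")
```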
NASA Astrophysics Data System (ADS)
Saffer, Demian M.
2003-05-01
At subduction zones, pore pressure affects fault strength, deformation style, structural development, and potentially the updip limit of seismogenic faulting behavior through its control on effective stress and consolidation state. Despite its importance for a wide range of subduction zone processes, few detailed measurements or estimates of pore pressure at subduction zones exist. In this paper, I combine logging-while-drilling (LWD) data, downhole physical properties data, and laboratory consolidation tests from the Costa Rican, Nankai, and Barbados subduction zones, to document the development and downsection variability of effective stress and pore pressure within underthrust sediments as they are progressively loaded by subduction. At Costa Rica, my results suggest that the lower portion of the underthrust section remains nearly undrained, whereas the upper portion is partially drained. An inferred minimum in effective stress developed within the section ~1.5 km landward of the trench is consistent with core and seismic observations of faulting, and illustrates the important effects of heterogeneous drainage on structural development. Inferred pore pressures at the Nankai and northern Barbados subduction zones indicate nearly undrained conditions throughout the studied intervals, and are consistent with existing direct measurements and consolidation test results. Slower dewatering at Nankai and Barbados than at Costa Rica can be attributed to higher permeability and larger compressibility of near-surface sediments underthrust at Costa Rica. Results for the three margins indicate that the pore pressure ratio (λ) in poorly drained underthrust sediments should increase systematically with distance landward of the trench, and may vary with depth.
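For reference, a minimal sketch of one common definition of the pore pressure ratio, normalizing excess pore pressure by the lithostatic-minus-hydrostatic stress difference; the densities and example pressure are assumed values, and the paper's exact formulation may differ.

```python
# Sketch: one common definition of the modified pore pressure ratio,
#   lambda* = (Pf - Ph) / (Sv - Ph),
# where Pf is pore pressure, Ph hydrostatic pressure, and Sv lithostatic
# (total vertical) stress; 0 = hydrostatic, 1 = lithostatic. Densities
# and the example pressure below are assumed values.

RHO_W, RHO_B, G = 1024.0, 2200.0, 9.81   # seawater, bulk sediment, gravity

def lambda_star(pf_mpa: float, depth_m: float, water_depth_m: float) -> float:
    ph = RHO_W * G * (water_depth_m + depth_m) / 1e6             # MPa
    sv = (RHO_W * G * water_depth_m + RHO_B * G * depth_m) / 1e6  # MPa
    return (pf_mpa - ph) / (sv - ph)

# e.g., 500 m below seafloor under 4000 m of water, Pf = 48 MPa
print(f"lambda* = {lambda_star(48.0, 500.0, 4000.0):.2f}")
```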
24 CFR 91.10 - Consolidated program year.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 24, Housing and Urban Development; Office of the Secretary, Department of Housing and Urban Development; Consolidated Submissions for Community Planning and Development Programs; General; § 91.10 Consolidated program year…
Micromechanics of sea ice gouge in shear zones
NASA Astrophysics Data System (ADS)
Sammonds, Peter; Scourfield, Sally; Lishman, Ben
2015-04-01
The deformation of sea ice is a key control on Arctic Ocean dynamics. Shear displacement on all scales is an important deformation process in the sea-ice cover. Shear is a dominant mechanism from basin-scale shear lineaments, through floe-floe interaction and block sliding in ice ridges, down to micro-scale mechanics. Shear deformation depends not only on the speed of movement of the ice surfaces but also on the degree to which the surfaces have bonded during thermal consolidation and compaction. Recent observations made during fieldwork in the Barents Sea show that shear produces a gouge similar to a fault gouge in a shear zone in the crust, with a range of gouge sizes exhibited. The consolidation of these fragments has a profound influence on the shear strength and the rate of the processes involved. We review experimental results in sea-ice mechanics from mid-scale experiments conducted in the Hamburg model ship ice tank, simulating sea-ice floe motion and interaction; compare these with laboratory experiments on ice friction done in direct shear; and upscale to field measurements of sea-ice friction and gouge deformation made during experiments off Svalbard. We find that consolidation, fragmentation, and bridging play important roles in the overall dynamics, and we fit the model of Sammis and Ben-Zion, developed for understanding the micro-mechanics of rock fault gouge, to the sea-ice problem.
ERIC Educational Resources Information Center
Ashby, Cornelia M.
2005-01-01
Under the Federal Family Education Loan Program (FFELP) and the Federal Direct Loan Program (FDLP), the government guarantees and makes consolidation loans to help borrowers manage their student loan debt. By combining loans into one and extending repayment, monthly repayments are reduced. Unlike other student loans, consolidation loans carry a…
Links between sediment consolidation and Cascadia megathrust slip behaviour
NASA Astrophysics Data System (ADS)
Han, Shuoshuo; Bangs, Nathan L.; Carbotte, Suzanne M.; Saffer, Demian M.; Gibson, James C.
2017-12-01
At sediment-rich subduction zones, megathrust slip behaviour and forearc deformation are tightly linked to the physical properties and in situ stresses within underthrust and accreted sediments. Yet the role of sediment consolidation at the onset of subduction in controlling the downdip evolution and along-strike variation in megathrust fault properties and accretionary wedge structure is poorly known. Here we use controlled-source seismic data combined with ocean drilling data to constrain the sediment consolidation and in situ stress state near the deformation front of the Cascadia subduction zone. Offshore Washington where the megathrust is inferred to be strongly locked, we find over-consolidated sediments near the deformation front that are incorporated into a strong outer wedge, with little sediment subducted. These conditions are favourable for strain accumulation on the megathrust and potential earthquake rupture close to the trench. In contrast, offshore Central Oregon, a thick under-consolidated sediment sequence is subducting, and is probably associated with elevated pore fluid pressures on the megathrust in a region where reduced locking is inferred. Our results suggest that the consolidation state of the sediments near the deformation front is a key factor contributing to megathrust slip behaviour and its along-strike variation, and it may also have a significant role in the deformation style of the accretionary wedge.
This final rule establishes consolidated permit program requirements governing the Hazardous Waste Management program under the Resource Conservation and Recovery Act (RCRA) and other related programs.
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC.
In this report GAO recommends that the Secretary of Education assess the advantages of consolidation loans for borrowers and the government in light of program costs and identify options for reducing federal costs. Options could include targeting the program to borrowers at risk of default and extending existing consolidation alternatives to more…
ERIC Educational Resources Information Center
US General Accounting Office, 2004
2004-01-01
This statement focuses on issues related to consolidation loans and their cost implications for taxpayers and borrowers. Consolidation loans, available under the Department of Education's (Education's) two major student loan programs, the Federal Family Education Loan Program (FFELP) and the William D. Ford Direct Loan Program (FDLP), help borrowers…
McBride, J.H.; Stephenson, W.J.; Williams, R.A.; Odum, J.K.; Worley, D.M.; South, J.V.; Brinkerhoff, A.R.; Keach, R.W.; Okojie-Ayoro, A. O.
2010-01-01
Integrated vibroseis compressional and experimental hammer-source, shear-wave, seismic reflection profiles across the Provo segment of the Wasatch fault zone in Utah reveal near-surface and shallow bedrock structures caused by geologically recent deformation. Combining information from the seismic surveys, geologic mapping, terrain analysis, and previous seismic first-arrival modeling provides a well-constrained cross section of the upper ~500 m of the subsurface. Faults are mapped from the surface, through shallow, poorly consolidated deltaic sediments, and cutting through a rigid bedrock surface. The new seismic data are used to test hypotheses on changing fault orientation with depth, the number of subsidiary faults within the fault zone and the width of the fault zone, and the utility of integrating separate elastic methods to provide information on a complex structural zone. Although previous surface mapping has indicated only a few faults, the seismic section shows a wider and more complex deformation zone with both synthetic and antithetic normal faults. Our study demonstrates the usefulness of a combined shallow and deeper penetrating geophysical survey, integrated with detailed geologic mapping to constrain subsurface fault structure. Due to the complexity of the fault zone, accurate seismic velocity information is essential and was obtained from a first-break tomography model. The new constraints on fault geometry can be used to refine estimates of vertical versus lateral tectonic movements and to improve seismic hazard assessment along the Wasatch fault through an urban area. We suggest that earthquake-hazard assessments made without seismic reflection imaging may be biased by the previous mapping of too few faults. © 2010 Geological Society of America.
NASA Astrophysics Data System (ADS)
Hamahashi, Mari; Screaton, Elizabeth; Tanikawa, Wataru; Hashimoto, Yoshitaka; Martin, Kylara; Saito, Saneatsu; Kimura, Gaku
2017-07-01
Subduction of the buoyant Cocos Ridge offshore the Osa Peninsula, Costa Rica, substantially affects the upper-plate structure through a variety of processes, including outer forearc uplift, erosion, and focused fluid flow. To investigate the nature of a major seismic reflector (MSR) developed between slope sediments (late Pliocene-late Pleistocene silty clay) and underlying higher-velocity upper-plate materials (late Pliocene-early Pleistocene clayey siltstone), we infer possible mechanisms of sediment removal by examining the consolidation state, microstructure, and zeolite assemblages of sediments recovered from Integrated Ocean Drilling Program Expedition 344 Site U1380. Formation of Ca-type zeolites, laumontite and heulandite, inferred to form in the presence of Ca-rich fluids, has caused porosity reduction. We adjust measured porosity values for these pore-filling zeolites and evaluate the new porosity profile to estimate how much material was removed at the MSR. Based on the composite porosity-depth curve, we infer the past burial depth of the sediments directly below the MSR. The corrected and uncorrected porosity-depth curves yield values of 800 ± 70 m and 900 ± 70 m, respectively. We argue that deposition and removal of this entire estimated thickness in 0.49 Ma would require unrealistically large sedimentation rates, and suggest that normal faulting at the MSR must contribute. The porosity offset could be explained with a maximum of 250 ± 70 m of normal-fault throw, or 350 ± 70 m if the porosity were not corrected. The porosity correction significantly reduces the amount of sediment removal needed for the combination of mass movement and normal faulting that characterize the slope in this margin.
Guide to Direct Consolidation Loans.
ERIC Educational Resources Information Center
Department of Education, Washington, DC.
Intended for financial aid counselors, this document provides guidelines to the Federal Direct Consolidation Loan Program for borrowers who are in school, as well as those in repayment, or in default. An introductory section explains the basics of the consolidated loan program, loan categories, and interest rates. Next, standards for borrower…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-14
... 0938-AP87 Medicare Program; Prospective Payment System and Consolidated Billing for Skilled Nursing... Payment System and Consolidated Billing for Skilled Nursing Facilities for FY 2011.'' DATES: Effective... illustrate the skilled nursing facility (SNF) prospective payment system (PPS) payment rate computations for...
Pawl, Jean D; Anderson, Lori S
Consolidation of resources, programs, and even universities is a measure that university systems consider for economic reasons. The transformation and restructuring of two diverse nursing programs utilized an organizational change tool to guide the consolidation efforts. Insights on how to use an organizational change model, and lessons learned, are shared for higher-education units that may face consolidation. The ADKAR Change Management Model, one of many organizational change resources, was advantageous in consolidating two diverse nursing programs when two universities were mandated to become one. Change is inevitable, yet when faced with transition and transformation, thoughtful, strong, committed leaders who communicate openly and transparently are an absolute requirement for sustained change. To guide the process, the ADKAR Change Management Model is an insightful and worthwhile resource.
Geohydrology and water chemistry in the Rialto-Colton Basin, San Bernardino County, California
Woolfenden, Linda R.; Kadhim, Dina
1997-01-01
The 40-square-mile Rialto-Colton ground-water basin is in western San Bernardino County, California, about 60 miles east of Los Angeles. This basin was chosen for storage of imported water because of the good quality of native ground water, the known capacity for additional ground-water storage in the basin, and the availability of imported water. Because the movement and mixing of imported water needed to be determined, the San Bernardino Valley Municipal Water District entered into a cooperative program with the U.S. Geological Survey in 1991 to study the geohydrology and water chemistry in the Rialto-Colton basin. Ground-water flow and chemistry were investigated using existing data, borehole-geophysical and lithologic logs from newly drilled test holes, measurement of water levels, and chemical analyses of water samples. The Rialto-Colton basin is bounded on the northwest and southeast by the San Gabriel Mountains and the Badlands, respectively. The San Jacinto Fault and Barrier E form the northeastern boundary, and the Rialto-Colton Fault forms the southwestern boundary. Except in the southeastern part of the basin, the San Jacinto and Rialto-Colton Faults act as ground-water barriers that impede ground-water flow into and out of the basin. Barrier E generally does not impede ground-water flow into the basin. The ground-water system consists primarily of gravel, sand, silt, and clay. The maximum thickness is greater than 1,000 feet. The ground-water system is divided into four water-bearing units: river-channel deposits, and upper, middle, and lower water-bearing units. Relatively impermeable consolidated deposits underlie the lower water-bearing unit and form the lower boundary of the ground-water system. Ground water moves from east to west in the river-channel deposits and upper water-bearing unit in the southeastern part of the basin, and from northwest to southeast in the middle and lower water-bearing units. Two major internal faults, Barrier J and an unnamed fault, affect ground-water movement. Ground water moves across Barrier J in the unfaulted part of the ground-water system. The unnamed fault is a partial barrier to ground-water movement in the middle water-bearing unit and an effective barrier in the lower water-bearing unit. Imported water flows laterally across the unnamed fault above the saturated zone. Major sources of recharge to the ground-water system are underflow; precipitation that collects in small streams that drain the San Gabriel Mountains and the Badlands or runs off the mountain front as sheet flow, and sub-surface inflow; imported water; seepage loss from the Santa Ana River and Warm Creek; infiltration of rainfall; and irrigation return flow. The main component of discharge is pumpage. Long-term water levels in production wells reflect precipitation cycles. During a 1947-77 dry period, water levels in three wells declined almost continuously, by as much as 100 feet in one well. Water levels in a well north of Barrier J are not affected by stresses on the ground-water system south of the barrier, indicating that these two parts of the ground-water system are not well connected. Water levels in cluster wells east of the unnamed fault north and south of the Linden Ponds artificial-recharge site rose as much as 70 feet during 1992-95. The rise in water levels in wells near the recharge ponds was observed within 2 months after the beginning of recharge. Water levels in most wells west of the unnamed fault changed very little during 1992-95.
Water-chemistry data indicate that chemical characteristics vary within the ground-water system and that dissolved-solids concentrations are generally higher in the river-channel deposits, upper water-bearing unit, and the consolidated deposits than in the middle and lower water-bearing units. The chemical characteristics in water from the middle water-bearing unit were similar for most wells sampled west of the unnamed fault. In water from well…
ERIC Educational Resources Information Center
Blanchette, Cornelia M.
This report evaluates Department of Education opportunities to consolidate overlapping education programs, to find cost savings, and to strengthen its "gatekeeping" over schools' participation in student financial aid programs. It notes that, besides already proposed program consolidation, other programs that could be streamlined include…
Enhancing the LVRT Capability of PMSG-Based Wind Turbines Based on R-SFCL
NASA Astrophysics Data System (ADS)
Xu, Lin; Lin, Ruixing; Ding, Lijie; Huang, Chunjun
2018-03-01
A novel low-voltage ride-through (LVRT) scheme for PMSG-based wind turbines, based on the Resistive Superconducting Fault Current Limiter (R-SFCL), is proposed in this paper. The scheme mainly consists of an R-SFCL in series between the transformer and the Grid Side Converter (GSC), and its basic modelling is discussed in detail. The proposed LVRT scheme is implemented to interact with a PMSG model in PSCAD/EMTDC under a three-phase short-circuit fault condition, showing that the R-SFCL-based scheme can improve transient performance and LVRT capability, strengthening the grid connection of wind turbines.
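A minimal sketch of the quench behavior that lets a resistive SFCL limit fault current, using the common exponential resistance-rise approximation; R_max, the time constant, and the source parameters are illustrative assumptions, not the paper's PSCAD model settings.

```python
import math

# Sketch of a resistive SFCL during a fault: near-zero resistance while
# superconducting, then an exponential rise toward R_max once the
# critical current is exceeded, R(t) = R_max * (1 - exp(-t / tau)).
# All parameter values are illustrative assumptions.

def r_sfcl(t_after_quench: float, r_max: float = 4.0,
           tau: float = 0.002) -> float:
    if t_after_quench <= 0.0:
        return 0.0                   # still superconducting
    return r_max * (1.0 - math.exp(-t_after_quench / tau))

v_phase, z_grid = 690.0, 0.05        # source voltage (V) and impedance (ohm)
for t in (0.0, 0.001, 0.005, 0.02):  # seconds after quench onset
    i = v_phase / (z_grid + r_sfcl(t))
    print(f"t = {t:.3f} s: fault current ~ {i:,.0f} A")
```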
NASA Astrophysics Data System (ADS)
Dang, Jiaxiang; Zhou, Yongsheng; He, Changrong; Ma, Shengli
2018-06-01
There are two co-seismic bedrock surface ruptures from the Mw 7.9 Wenchuan earthquake in the northern and central parts of the Beichuan-Yingxiu fault, Sichuan Province, southwest China. In this study, we report on the macrostructure of the fault rocks and results from X-ray powder diffraction analysis of minerals from rocks in the fault zone. The most recent fault gouge (the gouge produced by the most recent co-seismic fault movement) in all the studied outcrops is dark or grayish-black, totally unconsolidated and ultrafine-grained. Older fault gouges in the same outcrops are grayish or yellowish and weakly consolidated. X-ray powder diffraction analysis results show that mineral assemblages in both the old fault gouge and the new fault gouge are more complicated than the mineral assemblages in the bedrock as the fault gouge is rich in clay minerals. The fault gouge inherited its major rock-forming minerals from the parent rocks, but the clay minerals in the fault gouge were generated in the fault zone and are therefore authigenic and synkinematic. In profiles across the fault, clay mineral abundances increase as one traverses from the bedrock to the breccia to the old gouge and from the old gouge to the new gouge. Quartz and illite are found in all collected gouge samples. The dominant clay minerals in the new fault gouge are illite and smectite along the northern part of the surface rupture and illite/smectite mixed-layer clay in the middle part of the rupture. Illite/smectite mixed-layer clay found in the middle part of the rupture indicates that fault slip was accompanied by K-rich fluid circulation. The existence of siderite, anhydrite, and barite in the northern part of the rupture suggests that fault slip at this locality was accompanied by acidic fluids containing ions of Fe, Ca, and Ba.
ERIC Educational Resources Information Center
Ashby, Cornelia M.
2004-01-01
This study investigated: (1) differences between the Federal Family Education Loan Program (FFELP) and William D. Ford Federal Direct Loan Program (FDLP) consolidation loans and borrowers; (2) the extent to which borrowers with student loans under one program obtain consolidation loans under the other; and (3) how FFELP and FDLP borrower and loan…
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.
This hearing is a continuation of a bipartisan effort to consolidate, reform, and revitalize federally funded job training programs. Testimony includes statements of U.S. senators and individuals representing the following: National Association of State Job Training Coordinating Council and Human Resource Investment Council; American Federation of…
McElroy, Lisa M; Khorzad, Rebeca; Rowe, Theresa A; Abecassis, Zachary A; Apley, Daniel W; Barnard, Cynthia; Holl, Jane L
The purpose of this study was to use fault tree analysis to evaluate the adequacy of quality reporting programs in identifying root causes of postoperative bloodstream infection (BSI). A systematic review of the literature was used to construct a fault tree to evaluate 3 postoperative BSI reporting programs: the National Surgical Quality Improvement Program (NSQIP), the Centers for Medicare and Medicaid Services (CMS), and The Joint Commission (JC). The literature review revealed 699 eligible publications, 90 of which were used to create the fault tree containing 105 faults. Of the identified faults, 14 are currently mandated for reporting to NSQIP, 5 to CMS, and 3 to JC; only 4 are required by 2 or more programs. The fault tree identifies numerous contributing faults to postoperative BSI and reveals substantial variation in the requirements and ability of national quality data reporting programs to capture these potential faults. Efforts to prevent postoperative BSI require more comprehensive data collection to identify the root causes and develop high-reliability improvement strategies.
34 CFR 76.131 - How does an insular area apply for a consolidated grant?
Code of Federal Regulations, 2010 CFR
2010-07-01
... § 76.125(c) under which the consolidated grant funds will be used and administered; (3) Describes the goals, objectives, activities, and the means of evaluating program outcomes for the programs for which the Insular Area will use the funds received under the consolidated grant during the fiscal year for...
NASA Astrophysics Data System (ADS)
Hudson, M. R.; Minor, S. A.; Caine, J. S.
2015-12-01
Permanent strain in sediments associated with shallow fault zones can be difficult to characterize. Anisotropy of magnetic susceptibility (AMS) data were obtained from 120 samples at 6 sites to assess the nature of fault-related AMS fabrics for 4 faults cutting Miocene-Pliocene basin-fill sediments of the Rio Grande rift of north-central New Mexico. The San Ysidro (3 sites), Sand Hill, and West Paradise faults within the northern Albuquerque basin have normal offset, whereas an unnamed fault near Buckman in the western Española basin has oblique strike-slip offset. Previous studies have shown that detrital magnetite controls magnetic susceptibility in rift sandstones, and in a 50-m-long hanging-wall traverse of the San Ysidro fault, non-gouge samples have typical sedimentary AMS fabrics with Kmax and Kint axes (defining magnetic foliation) scattered within bedding. For the 5 normal-fault sites, samples from fault cores or adjacent mixed zones that lie within 1 m of the principal slip surface developed common deformation fabrics with (1) magnetic foliation inclined in the same azimuth as, but more shallowly dipping than, the fault plane, and (2) magnetic lineation plunging down foliation dip with nearly the same trend as the fault striae, though more closely for sand than for clay gouge samples. These relations suggest that the sampled fault materials deformed by particulate flow, with alignment of magnetite grains in the plane of maximum shortening. For a 2-m-long traverse at the Buckman site, horizontal sedimentary AMS foliation persists to within 15 cm of the fault slip surface, wherein foliation in sand and clay gouge rotates toward the steeply dipping fault plane in a sense consistent with sinistral offset. Collectively, these data suggest that permanent deformation fabrics were localized within 1 m of fault surfaces and that AMS fabrics from gouge samples can provide kinematic information for faults in unconsolidated sediments, which may lack associated slickenlines.
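The AMS quantities named above (Kmax, Kint, Kmin, magnetic foliation and lineation) come from the eigen-decomposition of the measured susceptibility tensor; a minimal sketch with a made-up tensor:

```python
import numpy as np

# Sketch: AMS principal axes from the eigen-decomposition of a symmetric
# susceptibility tensor. Kmax is the magnetic lineation; Kmin is the
# pole to the magnetic foliation (the Kmax-Kint plane). The tensor is a
# made-up example, not a measured sample from this study.

K = np.array([[1.02, 0.01, 0.00],
              [0.01, 1.00, 0.02],
              [0.00, 0.02, 0.97]])

vals, vecs = np.linalg.eigh(K)       # eigenvalues in ascending order
kmin, kint, kmax = vals
print("Kmax axis (lineation):     ", vecs[:, 2])
print("Kmin axis (foliation pole):", vecs[:, 0])
print(f"lineation  L = Kmax/Kint = {kmax / kint:.4f}")
print(f"foliation  F = Kint/Kmin = {kint / kmin:.4f}")
```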
Izbicki, John A.; Teague, Nicholas F.; Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.
2015-01-01
Perchlorate from military, industrial, and legacy agricultural sources is present within an alluvial aquifer in the Rialto-Colton groundwater subbasin, 80 km east of Los Angeles, California (USA). The area is extensively faulted, with water-level differences exceeding 60 m across parts of the Rialto-Colton Fault separating the Rialto-Colton and Chino groundwater subbasins. Coupled well-bore flow and depth-dependent water-quality data show decreases in well yield and changes in water chemistry and isotopic composition, reflecting changing aquifer properties and groundwater recharge sources with depth. Perchlorate movement through some wells under unpumped conditions from shallower to deeper layers underlying mapped plumes was as high as 13 kg/year. Water-level maps suggest potential groundwater movement across the Rialto-Colton Fault through an overlying perched aquifer. Upward flow through a well in the Chino subbasin near the Rialto-Colton Fault suggests potential groundwater movement across the fault through permeable layers within partly consolidated deposits at depth. Although potentially important locally, movement of groundwater from the Rialto-Colton subbasin has not resulted in widespread occurrence of perchlorate within the Chino subbasin. Nitrate and perchlorate concentrations at the water table, associated with legacy agricultural fertilizer use, may be underestimated by data from long-screened wells that mix water from different depths within the aquifer.
NASA Astrophysics Data System (ADS)
Flemings, P. B.; Song, I.; Saffer, D. M.
2012-04-01
Integrated Ocean Drilling Program (IODP) Expedition 308 was dedicated to the study of fluid flow, overpressure, and slope stability in the Ursa Basin, on the continental slope of the Gulf of Mexico. In this location, turbidite channel levees deposited a wedge-shaped body; the deposition rate in the thick part of the wedge exceeded 12 mm/yr. This rapid deposition of fine-grained sediments generated the excess pore pressure observed near the seafloor. IODP drilling focused on three Sites, U1322, U1323, and U1324, along the steepest slope (2°) on the eastern section of the Ursa Canyon levee deposits. In this study, we conducted a suite of deformation experiments on samples from Site U1324 to understand the stress-strain behavior and stress history of the recovered core material. Our samples were taken from depths of 30-160 meters below seafloor (mbsf) and are composed of ~40% silt and ~60% clay, with porosities ranging from ~42-55%. We first conducted uniaxial consolidation tests to determine pre-consolidation stresses and define deformation behavior due to simulated vertical loading. In a subset of tests, we subjected the samples to undrained shearing following consolidation, to define the friction angle and relationships between stress state and deformation. We find that the lateral effective stress during uniaxial compression is 56-64% of the vertical effective stress (avg. K0=0.6). Pre-consolidation stresses suggest that pore pressure is hydrostatic to 50 mbsf and overpressured below this, with excess pressures up to 70% of the hydrostatic effective vertical stress (λ*=0.7) at 160 mbsf. The coefficient of consolidation (cv) in these experiments is ~2.2 × 10⁻⁸ m²/s. Undrained shear tests define a failure envelope with a residual friction angle (φ) of 23° and zero cohesion. In our shearing tests, we observed no pore-pressure change during initial (primarily elastic) shear deformation, but note a monotonic increase in pore pressure during the later plastic shear deformation, possibly due to re-organization of sediment grains. Our consolidated undrained tests suggest that the slope in the study area should remain stable during sedimentation, despite the high overpressure (λ*=0.7). However, this stress condition could be affected by gravitational and seepage forces that cause horizontal extension along the slope. In that case, a reduction in horizontal confining stress would render the slope sediments unstable (drive them to active failure) as defined by the Coulomb criterion. If shear strain during slope failure leads to plastic deformation of the sediments, the accompanying pore-pressure increase would further decrease the factor of safety (FS) for landslides. For slope failure (i.e., FS=1.0), the overpressure ratio λ* would need to reach 0.92 for the given slope (2°). However, active normal faulting takes place at lower values of λ* (0.2-0.8). Our analysis suggests that instability of the slope is more likely to arise from steeply dipping normal faults (at 45°+φ/2) than from landslides slipping on a plane parallel to such a gentle seafloor slope.
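The quoted λ* = 0.92 failure threshold follows from a cohesionless infinite-slope Coulomb balance; a short sketch, assuming that standard form (it is not spelled out in the abstract), reproduces the quoted numbers:

```python
import math

# Sketch: cohesionless infinite-slope Coulomb balance with overpressure,
#   FS = (1 - lambda*) * tan(phi) / tan(alpha).
# Assuming this standard form reproduces the abstract's 0.92 threshold.

phi = math.radians(23.0)    # residual friction angle from the shear tests
alpha = math.radians(2.0)   # seafloor slope

lam_at_failure = 1.0 - math.tan(alpha) / math.tan(phi)
print(f"lambda* needed for FS = 1: {lam_at_failure:.2f}")   # ~0.92

fs_observed = (1.0 - 0.7) * math.tan(phi) / math.tan(alpha)
print(f"FS at inferred lambda* = 0.7: {fs_observed:.1f}")   # FS > 1, stable
```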
Geologic map of the Washougal quadrangle, Clark County, Washington, and Multnomah County, Oregon
Evarts, Russell C.; O'Connor, Jim E.; Tolan, Terry L.
2013-01-01
The Washougal 7.5’ quadrangle spans the boundary between the Portland Basin and the Columbia River Gorge, approximately 30 km east of Portland, Oregon. The map area contains the westernmost portion of the Columbia River Gorge National Scenic area as well as the rapidly growing areas surrounding the Clark County, Washington, cities of Camas and Washougal. The Columbia River transects the map area, and two major tributaries, the Washougal River in Washington and the Sandy River in Oregon, also flow through the quadrangle. The Columbia, Washougal, and Sandy Rivers have all cut deep valleys through hilly uplands, exposing Oligocene volcanic bedrock in the north part of the map area and lava flows of the Miocene Columbia River Basalt Group in the western Columbia River Gorge. Elsewhere in the map area, these older rocks are buried beneath weakly consolidated to well-consolidated Neogene and younger basin-fill sedimentary rocks and Quaternary volcanic and sedimentary deposits. The Portland Basin is part of the Coastal Lowland that separates the Cascade Range from the Oregon Coast Range. The basin has been interpreted as a pull-apart basin located in the releasing stepover between two en echelon, northwest-striking, right-lateral fault zones. These fault zones are thought to reflect regional transpression, transtension, and dextral shear within the forearc in response to oblique subduction of the Pacific plate along the Cascadia Subduction Zone. The southwestern margin of the Portland Basin is a well-defined topographic break along the base of the Tualatin Mountains, an asymmetric anticlinal ridge that is bounded on its northeast flank by the Portland Hills Fault Zone, which is probably an active structure. The nature of the corresponding northeastern margin of the basin is less clear, but a series of poorly defined and partially buried dextral extensional structures has been hypothesized from topography, microseismicity, potential-field anomalies, and reconnaissance geologic mapping. This map is a contribution to a program designed to improve the geologic database for the Portland Basin region of the Pacific Northwest urban corridor, the densely populated Cascadia forearc region of western Washington and Oregon. Updated, more detailed information on the bedrock and surficial geology of the basin and its surrounding area will facilitate improved assessments of seismic risk, and resource availability in this rapidly growing region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, A.J.; Tjokrosapoetro, S.; Charlton, T.R.
In Timor, eastern Indonesia, where the northern margin of the Australian continent is colliding with the Banda Arc, Australian continental-margin sediments are being incorporated into an imbricate wedge, which passes northward into a foreland fold-and-thrust belt. Field mapping in Timor has shown that scaly clays, containing irregularly shaped or phacoidal blocks (up to several meters long) composed of a wide range of lithologies derived from local stratigraphic units, occur in three environments: along wrench faults, as crosscutting shale diapirs, and associated with mud volcanoes. A model is proposed linking these phenomena. Shales become overpressured as a result of overthrusting; this overpressure is released along vertical wrench faults, which cut through the overthrust units; overpressured shales containing blocks of consolidated units rise along the fault zones as shale diapirs; and escaping water, oil, and gas construct mud volcanoes at the surface. 6 figures, 1 table.
Status and Progress of a Fault Current Limiting Hts Cable to BE Installed in the con EDISON Grid
NASA Astrophysics Data System (ADS)
Maguire, J.; Folts, D.; Yuan, J.; Henderson, N.; Lindsay, D.; Knoll, D.; Rey, C.; Duckworth, R.; Gouge, M.; Wolff, Z.; Kurtz, S.
2010-04-01
In the last decade, significant advances in the performance of second generation (2G) high temperature superconducting wire have made it suitable for commercially viable applications such as electric power cables and fault current limiters. Currently, the U.S. Department of Homeland Security is co-funding the design, development and demonstration of an inherently fault current limiting HTS cable under the Hydra project with American Superconductor and Consolidated Edison. The cable will be approximately 300 m long and is being designed to carry 96 MVA at a distribution level voltage of 13.8 kV. The underground cable will be installed and energized in New York City. The project is led by American Superconductor teamed with Con Edison, Ultera (Southwire and nkt cables joint venture), and Air Liquide. This paper describes the general goals, design criteria, status and progress of the project. Fault current limiting has already been demonstrated in 3 m prototype cables, and test results on a 25 m three-phase cable will be presented. An overview of the concept of a fault current limiting cable and the system advantages of this unique type of cable will be described.
ASSOCIATION BETWEEN PAYMENT REFORM AND PROVIDER CONSOLIDATION
Neprash, Hannah T.; Chernew, Michael E.; McWilliams, J. Michael
2017-01-01
Provider consolidation has been associated with higher health care prices and spending. Prevailing wisdom assumes that payment reform will accelerate consolidation, especially between physicians and hospitals and among physician groups, as providers position themselves to bear financial risk for the full continuum of patient care. Drawing from a number of data sources from 2008 onward, we examined the relationship between Medicare’s Accountable Care Organization (ACO) programs and provider consolidation. According to multiple measures, consolidation was underway in 2008–2010, before the Affordable Care Act (ACA) established the ACO programs. While the number of hospital mergers and specialty-oriented physician group size increased after the ACA, we found minimal evidence associating consolidation with ACO penetration at a market level or with ACO participation by physicians within markets. We conclude that payment reform has been associated with little acceleration in consolidation apart from trends already underway, but with some evidence of potential defensive consolidation in response to new payment models. PMID:28167725
24 CFR 401.401 - Consolidated Restructuring Plans.
Code of Federal Regulations, 2010 CFR
2010-04-01
... PROGRAM (MARK-TO-MARKET) Restructuring Plan § 401.401 Consolidated Restructuring Plans. A PAE may request HUD to approve a Consolidated Restructuring Plan that presents an overall strategy for more than one... resources, HUD will not approve any Consolidated Restructuring Plans that have a detrimental effect on...
NASA Astrophysics Data System (ADS)
Song, I.; Elphick, S.; Main, I.; Ngwenya, B.
2003-04-01
We present hydraulic and mechanical characteristics of a calcilutite (calcitic mud) sample from an outcrop 4 km south of the Aigion fault zone, on the southern shore of the Gulf of Corinth, Greece. This fine-grained sediment may provide a top seal for fluid pressure and is also representative of limestone gouge materials; hence its properties are important for modelling the hydro-mechanical response of the Aigion fault zone. An X-ray diffraction analysis revealed that the sample consists mostly of calcite (82%), with quartz (10%) and minor clay minerals. An unconsolidated sample was remoulded into a core shape (38 mm diameter by 45 mm length) under slight compaction and then placed in the centre of an oedometer cell, covered by two porous steel fluid-distribution discs on the top and bottom of the sample. The sample was subjected in turn to constant vertical stresses of 16.2, 18.9, 21.6, 24.3, and 27.0 MPa. The vertical load at each level was held constant for 24 hours to measure the compaction/consolidation under passive drained conditions, and then the permeability was measured for the following 24 hours at constant flow rate. Axial deformation was measured by two LVDTs at diagonally opposite positions on the sample. At the end of the test, we measured the sample dimensions and its wet and dry weights, obtaining a void ratio of 0.58 and a porosity of 0.37. The axial strain measurements show a consolidation curve with a decelerating strain rate that can be approximated by a power-law function. The permeability is negatively and linearly correlated with the stress, and ranges from 0.9-1.5 × 10⁻¹⁷ m². When fluid is first pumped into the sample at a constant rate, we observed a transient decelerating increase in pore pressure due to swelling in the sample. Conversely, on release of the axial stress, a transient reduction in pore pressure was observed, in turn sucking fluid back into the sample. These transient responses to sudden changes in effective stress imply that such fine-grained calcitic mud-like materials may play a crucial role in the time-dependent triggering of fault movement in the Aigion region, especially where the material has been smeared along the fault surface by repeated movement.
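A sketch of summarizing the reported negative linear permeability-stress correlation as a straight-line fit; pairing the quoted permeability extremes with the stress extremes is an assumption made purely for illustration.

```python
import numpy as np

# Sketch: the reported negative linear permeability-stress correlation
# expressed as k = k0 + c * sigma. Pairing the quoted permeability range
# (0.9-1.5e-17 m^2) with the stress extremes is an assumption.

sigma = np.array([16.2, 18.9, 21.6, 24.3, 27.0])      # vertical stress, MPa
k = np.linspace(1.5e-17, 0.9e-17, sigma.size)         # permeability, m^2

c, k0 = np.polyfit(sigma, k, 1)
print(f"k ~ {k0:.2e} + ({c:.2e}) * sigma   [m^2; sigma in MPa]")
```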
Randomized controlled trial of a dose consolidation program.
Delate, Thomas; Fairman, Kathleen A; Carey, Shelly M; Motheral, Brenda R
2004-01-01
To evaluate the effectiveness and financial impact of a drug dose consolidation (optimization) program using letter intervention. This pilot program in a large, mid-Atlantic health plan utilized a randomized controlled trial research design. A review of adjudicated pharmacy claims records was performed monthly for 3 consecutive months from November 2002 through February 2003 to identify inefficient (i.e., more than once-daily) regimens for any one of 68 dosage strengths of 37 single-source maintenance drugs with once-daily dosing recommendations. Prescribers who had prescribed one or more inefficient regimens were identified and randomized to one of the 2 intervention arms or a control arm. Prescribers in both intervention arms were sent personalized letters with information on their patients' inefficient regimens and suggested dose consolidation options. Patients of prescribers in one intervention arm received a complementary, patient-oriented letter. Pharmacy claims for patients in all arms were examined at 180 days after the date of the letter mailing for conversion to an efficient (once-daily) regimen. Financial modeling analysis calculated net savings as changes in pharmacy expenditures minus administrative costs. A total of 2,614 inefficient regimens, representing 6.7% of claims for the targeted medications, were identified. The rate of consolidation to a suggested dosing option was lower for the Physician Letter arm (7.3%) than for the Physician/Member Letter arm (10.2%) (P = 0.046). Both intervention arms had higher consolidation rates than the Control arm (3.9%) (P = 0.018 and P = 0.000, respectively). Approximately 30% of the regimens in each study arm were never refilled after being targeted. Financial modeling indicated that a dose consolidation intervention could save $0.03 to $0.07 per member per month (PMPM) in 2003 dollars with full medication compliance, but only $0.02 to $0.03 PMPM when savings were calculated with realistic, partial compliance rates. Subanalyses performed at the drug therapy class level revealed few opportunities to justify implementing a dose consolidation program. After taking into consideration program administrative costs, high rates of refill discontinuation, and dose consolidation that occurs naturally without intervention, the results indicated that a letter-based dose consolidation program did not appreciably decrease pharmacy expenditures.
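The financial model described above reduces to simple arithmetic: net savings equal the change in pharmacy expenditures minus administrative costs, expressed per member per month (PMPM). A minimal sketch with hypothetical inputs:

    # Net savings = gross pharmacy savings - administrative costs, PMPM.
    # All inputs are illustrative placeholders, not figures from the study.
    members = 500_000
    admin_cost_per_month = 5_000.0      # assumed program administration cost
    gross_savings_pmpm = 0.03           # low end of the reported full-compliance range

    net_pmpm = gross_savings_pmpm - admin_cost_per_month / members
    print(f"net savings: ${net_pmpm:.3f} PMPM")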
Expert System Detects Power-Distribution Faults
NASA Technical Reports Server (NTRS)
Walters, Jerry L.; Quinn, Todd M.
1994-01-01
Autonomous Power Expert (APEX) computer program is prototype expert-system program detecting faults in electrical-power-distribution system. Assists human operators in diagnosing faults and deciding what adjustments or repairs needed for immediate recovery from faults or for maintenance to correct initially nonthreatening conditions that could develop into faults. Written in Lisp.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Boerschlein, David P.
1993-01-01
Fault-Tree Compiler (FTC) program is software tool used to calculate probability of top event in fault tree. Gates of five different types allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language easy to understand and use. In addition, program supports hierarchical fault-tree definition feature, which simplifies tree-description process and reduces execution time. Set of programs created forming basis for reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions available upon request.
School Consolidation: Is Bigger Better? Part II. Options in Education, Program #90.
ERIC Educational Resources Information Center
George Washington Univ., Washington, DC. Inst. for Educational Leadership.
This publication is the complete transcript of a weekly radio program devoted to contemporary issues in American education. This particular program is the second of two that focus on the topic of school consolidation. In separate segments of the program, Wendy Blair and John Merrow of National Public Radio discuss declining school enrollment and…
Expanding an Honors Program in the Midst of Institution Consolidation
ERIC Educational Resources Information Center
Jacobs, Bonita C.
2015-01-01
Institutions of higher learning have been facing budget constrictions throughout the country, leading to consolidations and cutbacks. Administrators often have to make hard choices about what programs to eliminate or cut back, but one program that is not on the table at the University of North Georgia is the honors program. The university is…
ERIC Educational Resources Information Center
Chamberlain, Ed
This report recommends that the Neglected and Delinquent (ND) Program of the Columbus (Ohio) Public Schools, funded by the Education Consolidation and Improvement Act Chapter 1, be continued in the 1988-89 school year because the program provides a needed service to pupils in exceptional circumstances. The ND Program is designed to provide…
The evolution of Orbiter depot support, with applications to future space vehicles
NASA Technical Reports Server (NTRS)
Mcclain, Michael L.
1990-01-01
The reasons for depot consolidation and the processes established to implement the Orbiter depot are presented. The Space Shuttle Orbiter depot support is presently being consolidated due to equipment suppliers leaving the program, escalating depot support costs, and increasing repair turnaround times. Details of the depot support program for orbiter hardware and selected pieces of support equipment are discussed. The benefits gained from this consolidation and the lessons learned are then applied to future reusable space vehicles to provide program managers a forward look at the need for efficient depot support.
State Student Financial Aid. Report and Recommendations.
ERIC Educational Resources Information Center
Florida State Postsecondary Education Planning Commission, Tallahassee.
This report presents the results of a review of all state student financial aid programs in Florida and presents recommendations concerning program consolidation. The review was designed to address a variety of aid-related issues, including unexpended financial aid resources, program consolidation, budget request and aid distribution procedures,…
41 CFR 101-6.212-5 - Consolidated or joint hearings.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., provide for the conduct of consolidated or joint hearings, and for the application to such hearings of... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Consolidated or joint...-Nondiscrimination in Programs Receiving Federal Financial Assistance § 101-6.212-5 Consolidated or joint hearings...
Neprash, Hannah T; Chernew, Michael E; McWilliams, J Michael
2017-02-01
Provider consolidation has been associated with higher health care prices and spending. The prevailing wisdom is that payment reform will accelerate consolidation, especially between physicians and hospitals and among physician groups, as providers position themselves to bear financial risk for the full continuum of patient care. Drawing on data from a number of sources from 2008 onward, we examined the relationship between Medicare's accountable care organization (ACO) programs and provider consolidation. We found that consolidation was under way in the period 2008-10, before the Affordable Care Act (ACA) established the ACO programs. While the number of hospital mergers and the size of specialty-oriented physician groups increased after the ACA was passed, we found minimal evidence that consolidation was associated with ACO penetration at the market level or with physicians' participation in ACOs within markets. We conclude that payment reform has been associated with little acceleration in consolidation in addition to trends already under way, but there is evidence of potential defensive consolidation in response to new payment models.
Measurement of fault latency in a digital avionic miniprocessor
NASA Technical Reports Server (NTRS)
Mcgough, J. G.; Swern, F. L.
1981-01-01
The results of fault injection experiments utilizing a gate-level emulation of the central processor unit of the Bendix BDX-930 digital computer are presented. The failure detection coverage of comparison-monitoring and a typical avionics CPU self-test program was determined. The specific tasks and experiments included: (1) inject randomly selected gate-level and pin-level faults and emulate six software programs using comparison-monitoring to detect the faults; (2) based upon the derived empirical data develop and validate a model of fault latency that will forecast a software program's detecting ability; (3) given a typical avionics self-test program, inject randomly selected faults at both the gate-level and pin-level and determine the proportion of faults detected; (4) determine why faults were undetected; (5) recommend how the emulation can be extended to multiprocessor systems such as SIFT; and (6) determine the proportion of faults detected by a uniprocessor BIT (built-in-test) irrespective of self-test.
Eastern rim of the Chesapeake Bay impact crater: Morphology, stratigraphy, and structure
Poag, C.W.
2005-01-01
This study reexamines seven reprocessed (increased vertical exaggeration) seismic reflection profiles that cross the eastern rim of the Chesapeake Bay impact crater. The eastern rim is expressed as an arcuate ridge that borders the crater in a fashion typical of the "raised" rim documented in many well-preserved complex impact craters. The inner boundary of the eastern rim (rim wall) is formed by a series of crater-facing, steep scarps, 15-60 m high. In combination, these rim-wall scarps represent the footwalls of a system of crater-encircling normal faults, which are downthrown toward the crater. Outboard of the rim wall are several additional normal-fault blocks, whose bounding faults trend approximately parallel to the rim wall. The tops of the outboard fault blocks form two distinct, parallel, flat or gently sloping terraces. The innermost terrace (Terrace 1) can be identified on each profile, but Terrace 2 is only sporadically present. The terraced fault blocks are composed mainly of nonmarine, poorly to moderately consolidated, siliciclastic sediments belonging to the Lower Cretaceous Potomac Formation. Though the ridge-forming geometry of the eastern rim gives the appearance of a raised compressional feature, no compelling evidence of compressive forces is evident in the profiles studied. The structural mode, instead, is that of extension, with the clear dominance of normal faulting as the extensional mechanism.
Thermomagnetic properties of peat-soil layers from Sag pond near Lembang Fault, West Java, Indonesia
NASA Astrophysics Data System (ADS)
Iryanti, Mimin; Wibowo, Dimas Maulana; Bijaksana, Satria
2015-09-01
A sag pond is a body of water that forms near a fault system where water flow is blocked by the fault. A sag pond is a special type of environment for peat formation, as peat layers were deposited while the fault moved in episodic fashion. Depending on the history of the fault, peat layers are often interrupted by soil layers. In this study, a core of peat-soil layers from a sag pond in Karyawangi Village near the Lembang Fault was obtained and analyzed for its magnetic properties. The 5 m core was obtained using a hand auger. Individual samples were taken every cm and measured for their magnetic susceptibility. In general, there are three distinct magnetic susceptibility layers that are associated with peat and soil layers. The upper 1 m is an unconsolidated mud layer with relatively high magnetic susceptibility. Between 1 and 2.81 m there is a consolidated mud layer, and the lowest part (2.82-5 m) is basically a peat layer. Six samples were then measured for their thermomagnetic properties by measuring their susceptibility during heating and cooling from room temperature to 700°C. The thermomagnetic profiles provide Curie temperatures for various magnetic minerals in the cores. It was found that the upper part (unconsolidated mud) contains predominantly iron-oxides, such as magnetite, while the lowest part (peat layer) contains a significant amount of iron-sulphides, presumably greigite.
GUI Type Fault Diagnostic Program for a Turboshaft Engine Using Fuzzy and Neural Networks
NASA Astrophysics Data System (ADS)
Kong, Changduk; Koo, Youngju
2011-04-01
The helicopter, operated in severe flight environmental conditions, must have a very reliable propulsion system. On-line condition monitoring and fault detection of the engine can promote reliability and availability of the helicopter propulsion system. A hybrid health monitoring program using Fuzzy Logic and Neural Network algorithms is proposed. In this hybrid method, the Fuzzy Logic easily identifies the faulted components from changes in engine measurement parameters, and the Neural Networks can accurately quantify the identified faults. In order to use the fault diagnostic system effectively, a GUI (Graphical User Interface) type program is newly proposed. This program is composed of the real-time monitoring part, the engine condition monitoring part and the fault diagnostic part. The real-time monitoring part can display measured parameters of the study turboshaft engine, such as power turbine inlet temperature, exhaust gas temperature, fuel flow, torque and gas generator speed. The engine condition monitoring part can evaluate the engine condition through comparison between the monitored performance parameters and the base performance parameters computed by the base performance analysis program using look-up tables. The fault diagnostic part can identify and quantify single faults and multiple faults from the monitored parameters using the hybrid method.
School Consolidation: Is Bigger Better? Part I. Options in Education, Program #89.
ERIC Educational Resources Information Center
George Washington Univ., Washington, DC. Inst. for Educational Leadership.
This publication is the complete transcript of a weekly radio program devoted to contemporary issues in American education. This particular program is the first of two that focus on the topic of school consolidation. In separate segments of the program, a teacher and students drill in a one-room school in Indiana; a long-time rural school teacher…
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Martensen, Anna L.
1992-01-01
FTC, Fault-Tree Compiler program, is reliability-analysis software tool used to calculate probability of top event of fault tree. Five different types of gates allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language of FTC easy to understand and use. Program supports hierarchical fault-tree-definition feature, simplifying tree description and reducing execution time. Solution technique implemented in FORTRAN, and user interface in Pascal. Written to run on DEC VAX computer operating under VMS operating system.
Student Loans Driving You Crazy? A Borrower's Guide to Direct Consolidation Loans.
ERIC Educational Resources Information Center
Office of Federal Student Aid (ED), Washington, DC.
This booklet describes the Direct Consolidation Loan program students can use to combine one or more student loans into a new loan. Things to consider before seeking a consolidation loan are outlined. Direct consolidation loans offer a number of advantages; they are free, result in one lender and one monthly payment, and offer flexible repayment…
ERIC Educational Resources Information Center
Cole, Renee E.; Horacek, Tanya
2009-01-01
Objective: To describe the use of a consolidated version of the PRECEDE-PROCEED participatory program planning model to collaboratively design an intuitive eating program with Fort Drum military spouses tailored to their readiness to reject the dieting mentality and make healthful lifestyle modifications. Design: A consolidated version of…
Automatic translation of digraph to fault-tree models
NASA Technical Reports Server (NTRS)
Iverson, David L.
1992-01-01
The author presents a technique for converting digraph models, including those models containing cycles, to a fault-tree format. A computer program which automatically performs this translation using an object-oriented representation of the models has been developed. The fault-trees resulting from translations can be used for fault-tree analysis and diagnosis. Programs to calculate fault-tree and digraph cut sets and perform diagnosis with fault-tree models have also been developed. The digraph to fault-tree translation system has been successfully tested on several digraphs of varying size and complexity. Details of some representative translation problems are presented. Most of the computation performed by the program is dedicated to finding minimal cut sets for digraph nodes in order to break cycles in the digraph. Fault-trees produced by the translator have been successfully used with NASA's Fault-Tree Diagnosis System (FTDS) to produce automated diagnostic systems.
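The core translation idea can be shown compactly: walk the digraph from the target node toward its causes, emit OR gates, and break cycles by refusing to revisit a node already on the current path. The Python sketch below is a toy version of that idea, not the NASA translator; the failure digraph and node names are hypothetical.

    # Toy digraph-to-fault-tree translation with cycle breaking.
    # 'causes' maps each node to its direct causes (hypothetical example).
    causes = {
        "loss_of_thrust": ["engine_fail", "fuel_starved"],
        "fuel_starved": ["valve_stuck", "loss_of_thrust"],  # cycle, for illustration
        "engine_fail": [],
        "valve_stuck": [],
    }

    def to_fault_tree(node, path=frozenset()):
        """A node fails if its own basic event occurs OR any of its causes fails."""
        if node in path:
            return None                  # node already on this path: break the cycle
        subtrees = [to_fault_tree(c, path | {node}) for c in causes[node]]
        subtrees = [s for s in subtrees if s is not None]
        if not subtrees:
            return ("BASIC", node)
        return ("OR", ("BASIC", node), *subtrees)

    import pprint
    pprint.pprint(to_fault_tree("loss_of_thrust"))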
1996-12-27
consolidated financial statements for FY 1996. The Office of Civilian Health and Medical Program of the Uniformed Services (OCHAMPUS), part of the FY 1995...12.3 billion Defense Health Program, is one of the entities that DoD will include in its FY 1996 consolidated financial statements. The OCHAMPUS
Runtime Speculative Software-Only Fault Tolerance
2012-06-01
reliability of RSFT, an in-depth analysis on its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing
24 CFR 92.608 - Consolidated plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Consolidated plan. 92.608 Section 92.608 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM American Dream Downpayment Initiative § 92.608 Consolidated...
24 CFR 92.608 - Consolidated plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Consolidated plan. 92.608 Section 92.608 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM American Dream Downpayment Initiative § 92.608 Consolidated...
24 CFR 92.608 - Consolidated plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Consolidated plan. 92.608 Section 92.608 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM American Dream Downpayment Initiative § 92.608 Consolidated...
24 CFR 92.608 - Consolidated plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Consolidated plan. 92.608 Section 92.608 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM American Dream Downpayment Initiative § 92.608 Consolidated...
24 CFR 92.608 - Consolidated plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Consolidated plan. 92.608 Section 92.608 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development HOME INVESTMENT PARTNERSHIPS PROGRAM American Dream Downpayment Initiative § 92.608 Consolidated...
24 CFR 91.315 - Strategic plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS State Governments; Contents of Consolidated Plan § 91.315 Strategic plan. (a) General. For the categories described in paragraphs (b), (c), (d), (e), and (f) of this section, the consolidated plan must do the following: (1) Indicate the general...
Slicken 1.0: Program for calculating the orientation of shear on reactivated faults
NASA Astrophysics Data System (ADS)
Xu, Hong; Xu, Shunshan; Nieto-Samaniego, Ángel F.; Alaniz-Álvarez, Susana A.
2017-07-01
The slip vector on a fault is an important parameter in the study of the movement history of a fault and its faulting mechanism. Although there exist many graphical programs to represent the shear stress (or slickenline) orientations on faults, programs to quantitatively calculate the orientation of fault slip based on a given stress field are scarce. In consequence, we developed Slicken 1.0, a software tool to rapidly calculate the orientation of maximum shear stress on any fault plane. For this direct method of calculating the resolved shear stress on a planar surface, the input data are the unit vector normal to the involved plane, the unit vectors of the three principal stress axes, and the stress ratio. The advantage of this program is that the vertical or horizontal principal stresses are not necessarily required. Due to its nimble design using Java SE 8.0, it runs on most operating systems with the corresponding Java VM. The program will be practical for geoscience students, geologists, and engineers, and will help address a deficiency in field, structural, and engineering geology.
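The direct calculation the abstract describes can be sketched in a few lines of Python: build a reduced stress tensor from the three principal axes and the stress ratio, take the traction on the fault plane, and strip off the normal component to obtain the maximum shear (slip) direction. This is an illustration, not Slicken's source; it assumes the common reduced-tensor convention sigma1 = 1, sigma2 = R, sigma3 = 0 with R = (sigma2 - sigma3)/(sigma1 - sigma3).

    import numpy as np

    def shear_direction(n, s1_axis, s2_axis, s3_axis, R):
        """Unit direction of maximum shear stress on a plane with unit normal n."""
        axes = np.column_stack([s1_axis, s2_axis, s3_axis])
        sigma = axes @ np.diag([1.0, R, 0.0]) @ axes.T   # reduced stress tensor
        t = sigma @ n                                    # traction on the plane
        shear = t - (t @ n) * n                          # remove the normal component
        return shear / np.linalg.norm(shear)

    # Example: plane dipping 30 degrees, principal axes along geographic axes.
    n = np.array([0.0, np.sin(np.radians(30)), np.cos(np.radians(30))])
    d = shear_direction(n, np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
                        np.array([0.0, 0.0, 1.0]), R=0.5)
    print("maximum shear direction:", np.round(d, 3))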
NASA Astrophysics Data System (ADS)
Chorowicz, Jean; Benissa, Mahmoud
2016-11-01
The N75°E-trending Qarqaf arch in NW Libya separates the Ghadamis and Murzuq basins. We have updated existing geological maps by remote sensing analysis and fieldwork in order to describe the tectonic style of the Palaeozoic units. We have identified the Bir Aishah anticline, the Wadi Ash Shabiyat graben, and arrays of sedimentary and/or quartz-vein dykes that relate to extension fractures or open faults, some of them filled by ongoing sedimentation. We show that continuous brittle syn-depositional deformation occurred throughout the Palaeozoic and with time progressively focused into major faults. The Qarqaf arch is a Palaeozoic right-lateral fault zone comprising main conjugate dextral N60°E and sinistral N90°E fault families. It also comprises ∼ N-striking extensional faults with related drag or fault-propagation folds. The Palaeozoic tectonic style is that of rift basins connected by a major transfer fault zone. The arch is thus a consequence of a strike-slip mechanism. To account for distinct folds affecting the Carboniferous strata, we argue that partly consolidated silty Devonian and Carboniferous deposits slid en masse in places, at the end of their deposition, over tilting Devonian layers. Our model is an alternative to the currently accepted concept of a major Variscan compressional orogen in this area. The regional so-called 'Variscan' disconformity is actually the Triassic early Neo-Tethyan event. These general concepts have potential impact on basin modelling of subsidence, uplift, thermal history and hydrocarbon migration. Any new structural geology study in this area is important for oil exploration.
24 CFR 91.235 - Special case; abbreviated consolidated plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Special case; abbreviated consolidated plan. 91.235 Section 91.235 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS...
24 CFR 91.235 - Special case; abbreviated consolidated plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Special case; abbreviated consolidated plan. 91.235 Section 91.235 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS...
24 CFR 91.235 - Special case; abbreviated consolidated plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Special case; abbreviated consolidated plan. 91.235 Section 91.235 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS...
24 CFR 91.235 - Special case; abbreviated consolidated plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Special case; abbreviated consolidated plan. 91.235 Section 91.235 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS...
24 CFR 91.235 - Special case; abbreviated consolidated plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Special case; abbreviated consolidated plan. 91.235 Section 91.235 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS...
75 FR 75666 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-06
... loans that he or she wishes to consolidate, if there is insufficient space on the Application and.... Ford Federal Direct Loan (Direct Loan) Program Federal Direct Consolidation Loan Application and... Federal Direct Consolidation Loan Application and Promissory Note serves as the means by which a borrower...
Flexure and faulting of sedimentary host rocks during growth of igneous domes, Henry Mountains, Utah
Jackson, M.D.; Pollard, D.D.
1990-01-01
A sequence of sedimentary rocks about 4 km thick was bent, stretched and uplifted during the growth of three igneous domes in the southern Henry Mountains. Mount Holmes, Mount Ellsworth and Mount Hillers are all about 12 km in diameter, but the amplitudes of their domes are about 1.2, 1.85 and 3.0 km, respectively. These mountains record successive stages in the inflation of near-surface diorite intrusions that are probably laccolithic in origin. The host rocks deformed along networks of outcrop-scale faults, or deformation bands, marked by crushed grains, consolidation of the porous sandstone and small displacements of sedimentary beds. Zones of deformation bands oriented parallel to the beds and formation contacts subdivided the overburden into thin mechanical layers that slipped over one another during doming. Measurements of outcrop-scale fault populations at the three mountains reveal a network of faults that strikes at high angles to sedimentary beds which themselves strike tangentially about the domes. These faults have normal and reverse components of slip that accommodated bending and stretching strains within the strata. An early stage of this deformation is displayed at Mount Holmes, where states of stress computed from three fault samples correlate with the theoretical distribution of stresses resulting from bending of thin, circular, elastic plates. Field observations and analysis of frictional driving stresses acting on horizontal planes above an opening-mode dislocation, as well as the paleostress analysis of faulting, indicate that bedding-plane slip and layer flexure were important components of the early deformation. As the amplitude of doming increased, radial and circumferential stretching of the strata and rotation of the older faults in the steepening limbs of the domes increased the complexity of the fault patterns. Steeply-dipping, map-scale faults with dip-slip displacements indicate a late-stage jostling of major blocks over the central magma chamber. Radial dikes pierced the dome and accommodated some of the circumferential stretching.
BERG2 Micro-computer Estimation of Freeze and Thaw Depths and Thaw Consolidation (PDF file)
DOT National Transportation Integrated Search
1989-06-01
The BERG2 microcomputer program uses a methodology similar to the Modified Berggren method (Aldrich and Paynter, 1953) to estimate the freeze and thaw depths in layered soil systems. The program also provides an estimate of the thaw consolidation in ic...
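For orientation, the Stefan-type relation underlying the Modified Berggren method can be written X = lambda * sqrt(2kF/L), where lambda is Berggren's dimensionless correction coefficient (lambda = 1 recovers the Stefan equation). The sketch below uses illustrative SI values, not BERG2's inputs or coefficients.

    import math

    def frost_depth(k, F_deg_sec, L, lam=1.0):
        """k: thermal conductivity (W/m.K); F_deg_sec: surface freezing index
        (degC * seconds); L: volumetric latent heat (J/m^3); lam: correction."""
        return lam * math.sqrt(2.0 * k * F_deg_sec / L)

    F = 800.0 * 86400.0      # a freezing index of 800 degC-days, in degC-seconds
    print(f"X = {frost_depth(k=1.5, F_deg_sec=F, L=1.5e8, lam=0.8):.2f} m")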
Sacramento City Unified School District Chapter 1/State Compensatory Education Handbook Series.
ERIC Educational Resources Information Center
Sacramento City Unified School District, CA.
Four handbooks developed by the Consolidated Programs Department of the Sacramento City Unified School District (California) provide a means by which the multitude of federal, state, and district rules and regulations pertaining to compensatory education can be understood. The "Consolidated Programs Office Management Procedures" handbook…
7 CFR 1782.15 - Mergers and consolidations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 12 2010-01-01 2010-01-01 false Mergers and consolidations. 1782.15 Section 1782.15... AGRICULTURE (CONTINUED) SERVICING OF WATER AND WASTE PROGRAMS § 1782.15 Mergers and consolidations. Mergers... transaction under consideration and the unique facts involved in each transaction. Mergers occur when two or...
40 CFR 78.8 - Consolidation and severance of appeals proceedings.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Consolidation and severance of appeals proceedings. 78.8 Section 78.8 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPEAL PROCEDURES § 78.8 Consolidation and severance of appeals proceedings. (a) The...
NASA Astrophysics Data System (ADS)
Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.
2009-12-01
The Nuclear Waste Management Organization of Japan (NUMO), once candidate municipalities volunteer, will start a three-stage program for selecting a HLW and TRU waste repository site. It is recognized from experiences in various site characterization programs around the world that the hydrologic properties of faults are among the most important parameters in the early stage of the program. It is expected that numerous faults of interest exist in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all these faults in a limited time and budget. This raises a problem for repository design and safety assessment: we may have to accept unrealistic or overly conservative results by applying a single model or parameter set to all the faults in the area. We therefore seek to develop an efficient and practical methodology to characterize the hydrologic properties of faults. This project is a five-year program started in 2007, and comprises the basic methodology development through literature study and its verification through field investigations. The literature study tries to classify faults by correlating their geological features with hydraulic properties, to search for the most efficient technology for fault characterization, and to develop a work flow diagram. The field investigation starts from selection of a site and fault(s), followed by existing site data analyses, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations, and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic hydrologic characterization methodology for faults. A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with widths of high-permeability zones around a fault zone was proposed through a survey of available documents from the site characterization programs. The field investigation started in 2008, by selecting the Wildcat Fault that cuts across the Lawrence Berkeley National Laboratory (LBNL) site as the target. Analyses of site-specific data, surface geophysics, geological mapping and trenching have confirmed the approximate location and characteristics of the fault (see Session H48, Onishi, et al). The plan for the remaining years includes borehole investigations at LBNL, and another series of investigations in the northern part of the Wildcat Fault.
On TTEthernet for Integrated Fault-Tolerant Spacecraft Networks
NASA Technical Reports Server (NTRS)
Loveless, Andrew
2015-01-01
There has recently been a push for adopting integrated modular avionics (IMA) principles in designing spacecraft architectures. This consolidation of multiple vehicle functions to shared computing platforms can significantly reduce spacecraft cost, weight, and design complexity. Ethernet technology is attractive for inclusion in more integrated avionic systems due to its high speed, flexibility, and the availability of inexpensive commercial off-the-shelf (COTS) components. Furthermore, Ethernet can be augmented with a variety of quality of service (QoS) enhancements that enable its use for transmitting critical data. TTEthernet introduces a decentralized clock synchronization paradigm enabling the use of time-triggered Ethernet messaging appropriate for hard real-time applications. TTEthernet can also provide two forms of event-driven communication, therefore accommodating the full spectrum of traffic criticality levels required in IMA architectures. This paper explores the application of TTEthernet technology to future IMA spacecraft architectures as part of the Avionics and Software (A&S) project chartered by NASA's Advanced Exploration Systems (AES) program.
The influence of faults in basin-fill deposits on land subsidence, Las Vegas Valley, Nevada, USA
NASA Astrophysics Data System (ADS)
Burbey, Thomas
2002-07-01
The role of horizontal deformation caused by pumping of confined-aquifer systems is recognized as contributing to the development of earth fissures in semiarid regions, including Las Vegas Valley, Nevada. In spite of stabilizing water levels, new earth fissures continue to develop while existing ones continue to lengthen and widen near basin-fill faults. A three-dimensional granular displacement model based on Biot's consolidation theory (Biot, MA, 1941, General theory of three-dimensional consolidation. Jour. Applied Physics 12:155-164) has been used to evaluate the nature of displacement in the vicinity of two vertical faults. The fault was simulated as (1) a low-permeability barrier to horizontal flow, (2) a gap or structural break in the medium, but where groundwater flow is not obstructed, and (3) a combination of conditions (1) and (2). Results indicate that the low-permeability barrier greatly enhances horizontal displacement. The fault plane also represents a location of significant differential vertical subsidence. Large computed strains in the vicinity of the fault may suggest high potential for failure and the development of earth fissures when the fault is assumed to have low permeability. Results using a combination of the two boundaries suggest that potential fissure development may be great at or near the fault plane and that horizontal deformation is likely to play a key role in this development.
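For a feel of the consolidation theory invoked here, the one-dimensional Terzaghi limit of Biot's equations (du/dt = cv * d2u/dz2 for excess pore pressure u) can be stepped with explicit finite differences. The sketch below is a generic illustration with assumed parameters, not the three-dimensional granular displacement model used in the study.

    import numpy as np

    cv = 1e-7                 # coefficient of consolidation (m^2/s), assumed
    H, nz = 10.0, 51          # layer thickness (m) and number of grid points
    dz = H / (nz - 1)
    dt = 0.4 * dz ** 2 / cv   # satisfies the explicit stability limit (< 0.5)
    u = np.full(nz, 100.0)    # initial excess pore pressure (kPa)

    steps = 2000
    for _ in range(steps):
        u[0] = 0.0            # drained boundary at the surface
        u[-1] = u[-2]         # no-flow boundary at the base
        u[1:-1] += cv * dt / dz ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])

    print(f"mid-depth excess pressure after {steps * dt / 86400:.0f} days: "
          f"{u[nz // 2]:.1f} kPa")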
Consolidation of Customer Orders into Truckloads at a Large Manufacturer
1997-08-01
Consolidation of Customer Orders into Truckloads at a Large Manufacturer; G. G. Brown and D. Ronen; The Journal of the Operational Research Society.
Modeling of a latent fault detector in a digital system
NASA Technical Reports Server (NTRS)
Nagel, P. M.
1978-01-01
Methods of modeling the detection time or latency period of a hardware fault in a digital system are proposed that explain how a computer detects faults in a computational mode. The objectives were to study how software reacts to a fault, to account for as many variables as possible affecting detection, and to forecast a given program's detecting ability prior to computation. A series of experiments were conducted on a small emulated microprocessor with fault injection capability. Results indicate that the detecting capability of a program largely depends on the instruction subset used during computation and the frequency of its use, and has little direct dependence on such variables as fault mode, number set, degree of branching, and program length. A model is discussed which employs an analogy with balls in an urn to explain the rate at which subsequent repetitions of an instruction or instruction set detect a given fault.
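The urn analogy implies a geometric detection-latency model: if a fraction p of executed instructions is sensitive to a given fault, the probability of detection within k repetitions is 1 - (1 - p)^k and the mean latency is 1/p. A small simulation of this model, with p an assumed value:

    import random

    def simulate_latency(p, trials=100_000):
        """Mean number of draws (instruction executions) until detection."""
        total = 0
        for _ in range(trials):
            k = 1
            while random.random() >= p:   # draw until a sensitive instruction runs
                k += 1
            total += k
        return total / trials

    p = 0.05   # assumed fraction of instructions whose execution exposes the fault
    print(f"mean detection latency ~ {simulate_latency(p):.1f} repetitions "
          f"(theory: {1 / p:.1f})")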
The Fault Tree Compiler (FTC): Program and mathematics
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Martensen, Anna L.
1989-01-01
The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high-level input language is easy to understand and use when describing the system tree. In addition, the use of the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise (within the limits of double-precision floating-point arithmetic) to a user-specified number of digits of accuracy. The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation (DEC) VAX computer with the VMS operating system.
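Assuming independent basic events, the five gate types reduce to elementary probability formulas. The Python sketch below is a minimal illustration of such a bottom-up evaluation, not the FTC's solution technique; it treats EXCLUSIVE OR as "exactly one input occurs".

    from itertools import combinations
    from math import prod

    def p_gate(kind, ps, m=None):
        """Output probability of one gate, given independent input probabilities."""
        if kind == "AND":
            return prod(ps)
        if kind == "OR":
            return 1.0 - prod(1.0 - p for p in ps)
        if kind == "XOR":                    # exactly one input event occurs
            return sum(p * prod(1.0 - q for j, q in enumerate(ps) if j != i)
                       for i, p in enumerate(ps))
        if kind == "INVERT":
            return 1.0 - ps[0]
        if kind == "M_OF_N":                 # at least m of the n inputs occur
            n = len(ps)
            return sum(prod(ps[i] if i in subset else 1.0 - ps[i] for i in range(n))
                       for k in range(m, n + 1)
                       for subset in combinations(range(n), k))
        raise ValueError(kind)

    # Top = OR(AND(a, b), 2-of-3(c, d, e)) with illustrative failure probabilities.
    a, b, c, d, e = 1e-3, 2e-3, 5e-3, 5e-3, 5e-3
    top = p_gate("OR", [p_gate("AND", [a, b]), p_gate("M_OF_N", [c, d, e], m=2)])
    print(f"top-event probability: {top:.3e}")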
34 CFR 76.132 - What assurances must be in a consolidated grant application?
Code of Federal Regulations, 2010 CFR
2010-07-01
... use administrative practices that will insure that non-Federal funds will not be supplanted by Federal funds made available under the authority of the programs in the consolidated grant; (2) Comply with the... statutes and implementing regulations for the programs under which funds are to be used and administered...
Code of Federal Regulations, 2014 CFR
2014-04-01
... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... ASSISTANCE: HOUSING CHOICE VOUCHER PROGRAM Funding and PHA Application for Funding § 982.102 Allocation of budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This...
Code of Federal Regulations, 2010 CFR
2010-04-01
... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... ASSISTANCE: HOUSING CHOICE VOUCHER PROGRAM Funding and PHA Application for Funding § 982.102 Allocation of budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This...
Code of Federal Regulations, 2011 CFR
2011-04-01
... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... ASSISTANCE: HOUSING CHOICE VOUCHER PROGRAM Funding and PHA Application for Funding § 982.102 Allocation of budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This...
Code of Federal Regulations, 2013 CFR
2013-04-01
... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... ASSISTANCE: HOUSING CHOICE VOUCHER PROGRAM Funding and PHA Application for Funding § 982.102 Allocation of budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This...
Code of Federal Regulations, 2012 CFR
2012-04-01
... renewal of expiring consolidated ACC funding increments. 982.102 Section 982.102 Housing and Urban... ASSISTANCE: HOUSING CHOICE VOUCHER PROGRAM Funding and PHA Application for Funding § 982.102 Allocation of budget authority for renewal of expiring consolidated ACC funding increments. (a) Applicability. This...
System and method for bearing fault detection using stator current noise cancellation
Zhou, Wei; Lu, Bin; Habetler, Thomas G.; Harley, Ronald G.; Theisen, Peter J.
2010-08-17
A system and method for detecting incipient mechanical motor faults by way of current noise cancellation is disclosed. The system includes a controller configured to detect indicia of incipient mechanical motor faults. The controller further includes a processor programmed to receive a baseline set of current data from an operating motor and define a noise component in the baseline set of current data. The processor is also programmed to repeatedly receive real-time operating current data from the operating motor and remove the noise component from the operating current data in real-time to isolate any fault components present in the operating current data. The processor is then programmed to generate a fault index for the operating current data based on any isolated fault components.
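The noise-cancellation scheme can be approximated in a few lines: estimate a baseline current spectrum from a healthy motor, subtract it from the operating spectrum, and take the normalized residual energy as a fault index. The sketch below uses synthetic signals and an assumed fault sideband frequency; it illustrates the idea, not the patented method.

    import numpy as np
    from scipy.signal import welch

    fs = 10_000
    t = np.arange(0, 2.0, 1 / fs)
    # Baseline stator current: supply fundamental plus broadband noise.
    baseline = np.sin(2 * np.pi * 60 * t) + 0.05 * np.random.randn(t.size)
    # Operating current: same components plus a small, assumed fault sideband.
    operating = baseline + 0.02 * np.sin(2 * np.pi * 137 * t)

    f, p_base = welch(baseline, fs=fs, nperseg=4096)
    _, p_oper = welch(operating, fs=fs, nperseg=4096)

    residual = np.clip(p_oper - p_base, 0.0, None)   # cancel the noise component
    fault_index = residual.sum() / p_base.sum()      # normalized residual energy
    print(f"fault index: {fault_index:.4f}")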
Integrated Approach To Design And Analysis Of Systems
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1993-01-01
Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.
NASA Technical Reports Server (NTRS)
Rogers, William H.; Schutte, Paul C.
1993-01-01
Advanced fault management aiding concepts for commercial pilots are being developed in a research program at NASA Langley Research Center. One aim of this program is to re-evaluate current design principles for display of fault information to the flight crew: (1) from a cognitive engineering perspective and (2) in light of the availability of new types of information generated by advanced fault management aids. The study described in this paper specifically addresses principles for organizing fault information for display to pilots based on their mental models of fault management.
Proceedings of the workshop on urban freight consolidation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-06-01
The Urban Freight Consolidation Workshop discusses the desirability and practicality of establishing programs to consolidate the pickup and delivery of small shipments in highly concentrated urban areas. After presentation of an overview paper, Institutional Issues in Urban Freight Consolidation, by Ernest R. Cadotte and Robert A. Robicheaux, the following papers were given: Consolidation and Distribution--The Broad Picture, John T. Norris; Transportation Facilitation Center Concept, Irwin Blatner; The Regulatory Issues of Small-Shipment Consolidation, A. Daniel O'Neal; Chicago's Perspective of Urban-Freight Consolidation, Charles W. Lustig; Freight Consolidation in New York City, Samuel D. Kahan; Baltimore's Perspective of Urban-Freight Consolidation, Siegbert Schacknies; Small-Shipper Perspective, Richard A. Whitty; The Perspective of a ''Big Shipper,'' William K. Smith; A Receiver's Viewpoint of Consolidation, William P. McDaniel; For-Hire Motor-Carrier Perspective of Urban-Freight Consolidation, John L. Reith; Private-Carrier Perspective of Urban-Freight Consolidation, H. E. Manker; Union Perspective, M. R. Nensel; Urban-Freight Distribution Myopia, Carl S. Rappaport; Freight-Service Expectations, Performance, and Tradeoffs in Urban Areas: A Survey, Robert A. Robicheaux and Ernest R. Cadotte; and Freight Consolidation--Can It Be Successfully Implemented, James F. Robeson. (MCW)
34 CFR 76.130 - How are consolidated grants made?
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements of §§ 76.125 through 76.137 and each program under which the grant funds are to be used and... grant if the Secretary determines that the Insular Area failed to meet the program objectives stated in... advisory council. (d) Although Pub. L. 95-134 authorizes the Secretary to consolidate grant funds that the...
NASA Technical Reports Server (NTRS)
Harper, Richard
1989-01-01
In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.
45 CFR 97.12 - Which grants may be consolidated?
Code of Federal Regulations, 2014 CFR
2014-10-01
....12 Public Welfare Department of Health and Human Services GENERAL ADMINISTRATION CONSOLIDATION OF...) Preventive Health and Health Services, 42 U.S.C. 300w-300w-10. 1 1 Certain Public Health Service programs for... consolidated grant. (2) Alcohol and Drug Abuse and Mental Health Services, 42 U.S.C. 300x-300x-9. 2 2 See...
45 CFR 97.12 - Which grants may be consolidated?
Code of Federal Regulations, 2013 CFR
2013-10-01
....12 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CONSOLIDATION OF...) Preventive Health and Health Services, 42 U.S.C. 300w-300w-10. 1 1 Certain Public Health Service programs for... consolidated grant. (2) Alcohol and Drug Abuse and Mental Health Services, 42 U.S.C. 300x-300x-9. 2 2 See...
45 CFR 97.12 - Which grants may be consolidated?
Code of Federal Regulations, 2010 CFR
2010-10-01
... consolidated grant. (2) Alcohol and Drug Abuse and Mental Health Services, 42 U.S.C. 300x-300x-9. 2 2 See....12 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CONSOLIDATION OF...) Preventive Health and Health Services, 42 U.S.C. 300w-300w-10. 1 1 Certain Public Health Service programs for...
NASA Astrophysics Data System (ADS)
Wu, Schuman
1989-12-01
In a low-temperature environment, the thin-section scale rock-deformation mode is primarily a function of confining pressure and total strain at geological strain rates. A deformation mode diagram is constructed from published experimental data by plotting the deformation mode on a graph of total strain versus the confining pressure. Four deformation modes are shown on the diagram: extensional fracturing, mesoscopic faulting, incipient faulting, and uniform flow. By determining the total strain and the deformation mode of a naturally deformed sample, the confining pressure and hence the depth at which the rock was deformed can be evaluated. The method is applied to normal faults exposed on the gently dipping southeast limb of the Birmingham anticlinorium in the Red Mountain expressway cut in Birmingham, Alabama. Samples of the Ordovician Chickamauga Limestone within and adjacent to the faults contain brittle structures, including mesoscopic faults and veins, and ductile deformation features including calcite twins, intergranular and transgranular pressure solution, and deformed burrows. During compaction, a vertical shortening of about 45 to 80% in shale is indicated by deformed burrows and relative compaction of shale to burrows, and of about 6% in limestone by stylolites. The normal faults formed after the Ordovician rocks were consolidated because the faults and associated veins truncate the deformed burrows and stylolites, which truncate the calcite cement. A total strain of 2.0% was caused by mesoscopic faults during normal faulting. A later homogeneous deformation, indicated by the calcite twins in veins, cement and fossil fragments, has its major principal shortening strain in the dip direction at a low angle (about 22°) to bedding. The strain magnitude is about 2.6%. By locating the observed data on the deformation mode diagram, it is found that the normal faulting characterized by brittle deformation occurred under low confining pressure (< 18 MPa) at shallow depth (< 800 m), and the homogeneous horizontal compression characterized by uniform flow occurred under higher confining pressure (at least 60 MPa) at greater depth (> 2.5 km).
NASA Astrophysics Data System (ADS)
Zuluaga, Luisa F.; Fossen, Haakon; Rotevatn, Atle
2014-11-01
Monoclinal fault propagation folds are a common type of structure in orogenic foreland settings, particularly on the Colorado Plateau. We have studied a portion of the San Rafael monocline, Utah, assumed to have formed through pure thrust- or reverse-slip (blind) fault movement, and mapped a particular sequence of subseismic cataclastic deformation structures (deformation bands) that can be related in terms of geometry, density and orientation to the dip of the forelimb or fold interlimb angle. In simple terms, deformation bands parallel to bedding are the first structures to form, increasing exponentially in number as the forelimb gets steeper. At about 30° rotation of the forelimb, bands forming ladder structures start to cross-cut bedding, consolidating themselves into a well-defined and regularly spaced network of deformation band zones that rotate with the layering during further deformation. In summary, we demonstrate a close relationship between limb dip and deformation band density that can be used to predict the distribution and orientation of such subseismic structures in subsurface reservoirs of similar type. Furthermore, given the fact that these cataclastic deformation bands compartmentalize fluid flow, this relationship can be used to predict or model fluid flow across and along comparable fault-propagation folds.
Interim reliability evaluation program, Browns Ferry fault trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, M.E.
1981-01-01
An abbreviated fault tree method is used to evaluate and model Browns Ferry systems in the Interim Reliability Evaluation programs, simplifying the recording and displaying of events, yet maintaining the system of identifying faults. The level of investigation is not changed. The analytical thought process inherent in the conventional method is not compromised. But the abbreviated method takes less time, and the fault modes are much more visible.
User's guide to programming fault injection and data acquisition in the SIFT environment
NASA Technical Reports Server (NTRS)
Elks, Carl R.; Green, David F.; Palumbo, Daniel L.
1987-01-01
Described are the features, command language, and functional design of the SIFT (Software Implemented Fault Tolerance) fault injection and data acquisition interface software. The document is also intended to assist and guide the SIFT user in defining, developing, and executing SIFT fault injection experiments and the subsequent collection and reduction of that fault injection data. It is also intended to be used in conjunction with the SIFT User's Guide (NASA Technical Memorandum 86289) for reference to SIFT system commands, procedures and functions, and overall guidance in SIFT system programming.
NASA Astrophysics Data System (ADS)
Sun, Z.; Ding, W.; Zhao, X.; Qiu, N.; Lin, J.; Li, C.
2017-12-01
In International Ocean Discovery Program (IODP) Expedition 349, four sites were drilled and cored successfully in the South China Sea (SCS). Three of them are close to the central spreading ridge (Sites U1431, U1433 and U1434), and one (Site U1435) is located on an outer rise, providing significant information on the spreading history of the SCS. In order to constrain the spreading history more accurately with the core results, we analyzed the identifiable macrostructures (over 300 fractures, veins and slickensides) from all the consolidated samples of these four drill sites. We then made a retrograde reconstruction of the SCS spreading history with the constraints of the estimated fractures and veins, post-spreading volcanism, seismic interpretation, as well as free-air gravity, magnetic anomaly, and topography analysis. Our study indicates that the spreading of the SCS experienced at least one ridge jump event and two events of ridge orientation and spreading direction adjustment, which kept the magnetic anomaly orientation, ridge position, and fracture zone directions changing in the South China Sea. During the last spreading stage, the spreading direction was north-south but lasted for only a short time period. The oceanic crust is wider in the eastern SCS and tapers out toward the west. Due to the subduction of the SCS beneath the Philippine Sea plate, the seafloor began to develop new fractures: the NWW- to EW-trending R' shear faults and the NE-trending P faults became the dominant faults and controlled the eruption of post-drift volcanism.
Determining on-fault earthquake magnitude distributions from integer programming
NASA Astrophysics Data System (ADS)
Geist, Eric L.; Parsons, Tom
2018-02-01
Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
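To make the formulation concrete, here is a toy sketch in Python of the same kind of binary integer program; it is ours, not the authors' code, and every number (catalog span, slip scaling, fault lengths, slip-rate targets) is invented. SciPy's milp routine stands in for the general mixed-integer solver mentioned above.

    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    T = 4000.0                                  # synthetic catalog span, years
    target = np.array([1.2, 0.5, 0.15])         # target slip rates, mm/yr
    lo, hi = np.maximum(target - 0.5, 0), target + 0.5  # slip-rate uncertainty bounds
    fault_len = np.array([120.0, 60.0, 25.0])   # fault lengths, km
    rup_len = np.array([80.0, 40.0, 15.0, 10.0, 5.0, 3.0])  # rupture lengths, km
    slip = 0.05 * rup_len * 1e3                 # per-event slip, mm (toy scaling)

    nE, nF = len(slip), len(target)
    n = nE * nF + nF                            # binary x[e,f] plus misfit t[f]
    idx = lambda e, f: e * nF + f

    c = np.zeros(n)
    c[nE * nF:] = 1.0                           # minimize total slip-rate misfit
    cons = []
    for e in range(nE):                         # each event placed on exactly one fault
        row = np.zeros(n)
        row[[idx(e, f) for f in range(nF)]] = 1.0
        cons.append(LinearConstraint(row, 1, 1))
    for f in range(nF):
        rate = np.zeros(n)
        for e in range(nE):
            rate[idx(e, f)] = slip[e] / T       # mm/yr contributed by event e
            if rup_len[e] > fault_len[f]:       # implicit length constraint
                pin = np.zeros(n)
                pin[idx(e, f)] = 1.0
                cons.append(LinearConstraint(pin, 0, 0))
        cons.append(LinearConstraint(rate, lo[f], hi[f]))  # explicit slip-rate bounds
        for s in (1.0, -1.0):                   # t[f] >= |rate @ x - target[f]|
            row = s * rate.copy()
            row[nE * nF + f] = -1.0
            cons.append(LinearConstraint(row, -np.inf, s * target[f]))

    integrality = np.r_[np.ones(nE * nF), np.zeros(nF)]
    ub = np.r_[np.ones(nE * nF), np.full(nF, np.inf)]
    res = milp(c, constraints=cons, integrality=integrality, bounds=Bounds(0, ub))
    print(res.x[:nE * nF].reshape(nE, nF))      # optimal event-to-fault placement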
NASA Astrophysics Data System (ADS)
Pascal, Christophe
2004-04-01
Stress inversion programs are nowadays frequently used in tectonic analysis. The purpose of this family of programs is to reconstruct the stress tensor characteristics from fault slip data acquired in the field or derived from earthquake focal mechanisms (i.e. inverse methods). Until now, little attention has been paid to direct methods (i.e. to determine fault slip directions from an inferred stress tensor). During the 1990s, the fast increase in resolution in 3D seismic reflection techniques made it possible to determine the geometry of subsurface faults with a satisfactory accuracy but not to determine precisely their kinematics. This recent improvement allows the use of direct methods. A computer program, namely SORTAN, is introduced. The program is highly portable on Unix platforms, straightforward to install and user-friendly. The computation is based on classical stress-fault slip relationships and allows for fast treatment of a set of faults and graphical presentation of the results (i.e. slip directions). In addition, the SORTAN program permits one to test the sensitivity of the results to input uncertainties. It is a complementary tool to classical stress inversion methods and can be used to check the mechanical consistency and the limits of structural interpretations based upon 3D seismic reflection surveys.
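Direct methods of this kind typically rest on the Wallace-Bott assumption: the predicted slip direction is parallel to the resolved shear traction that the stress tensor exerts on the fault plane. The following minimal Python sketch (ours, not SORTAN; the stress tensor and fault geometry are invented, and sign conventions affect only the slip sense) shows that computation.

    import numpy as np

    def predicted_slip_direction(normal, stress):
        """Wallace-Bott assumption: slip parallels the resolved shear
        traction on the fault plane."""
        n = np.asarray(normal, float)
        n = n / np.linalg.norm(n)
        t = stress @ n                      # traction vector on the plane
        t_shear = t - (t @ n) * n           # remove the fault-normal component
        return t_shear / np.linalg.norm(t_shear)

    # Invented example: axes x=east, y=north, z=up; compression positive,
    # sigma1 vertical (a normal-faulting regime), arbitrary stress units.
    stress = np.diag([20.0, 10.0, 30.0])
    dip = np.radians(60.0)                  # plane striking north, dipping east
    normal = np.array([-np.sin(dip), 0.0, np.cos(dip)])
    print(predicted_slip_direction(normal, stress))  # pure dip-slip expected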
NASA Technical Reports Server (NTRS)
Martensen, Anna L.; Butler, Ricky W.
1987-01-01
The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high-level input language is easy to understand and use when describing the system tree. In addition, the use of the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise to five digits (within the limits of double-precision floating-point arithmetic). The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation VAX with the VMS operating system.
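As a rough illustration of what such a tool computes, the sketch below evaluates the exact top-event probability of a small fault tree with the same five gate types by brute-force enumeration over independent basic events. It is illustrative only; the compiler's actual solution technique is far more efficient, and the tree and probabilities here are invented.

    import itertools

    def evaluate(node, state):
        """node: basic-event name (str) or gate tuple like ('OR', a, b)."""
        if isinstance(node, str):
            return state[node]
        op, *args = node
        vals = [evaluate(a, state) for a in args]
        if op == 'AND': return all(vals)
        if op == 'OR':  return any(vals)
        if op == 'XOR': return sum(vals) == 1      # exclusive or
        if op == 'NOT': return not vals[0]         # invert gate
        if op[0] == 'M':                           # ('M2', a, b, c) = 2-of-3
            return sum(vals) >= int(op[1:])
        raise ValueError(op)

    def top_event_probability(basic_probs, tree):
        names = sorted(basic_probs)
        total = 0.0
        for states in itertools.product([False, True], repeat=len(names)):
            state = dict(zip(names, states))
            p = 1.0
            for name in names:                     # probability of this state
                p *= basic_probs[name] if state[name] else 1 - basic_probs[name]
            if evaluate(tree, state):              # state triggers the top event
                total += p
        return total

    probs = {'A': 1e-3, 'B': 2e-3, 'C': 5e-4}
    print(top_event_probability(probs, ('OR', 'A', ('M2', 'A', 'B', 'C'))))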
Fenix, A Fault Tolerant Programming Framework for MPI Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamell, Marc; Teranishi, Keita; Valenzuela, Eric
2016-10-05
Fenix provides APIs to allow the users to add fault tolerance capability to MPI-based parallel programs in a transparent manner. Fenix-enabled programs can run through process failures during program execution using a pool of spare processes accommodated by Fenix.
System and method for motor fault detection using stator current noise cancellation
Zhou, Wei; Lu, Bin; Nowak, Michael P.; Dimino, Steven A.
2010-12-07
A system and method for detecting incipient mechanical motor faults by way of current noise cancellation is disclosed. The system includes a controller configured to detect indicia of incipient mechanical motor faults. The controller further includes a processor programmed to receive a baseline set of current data from an operating motor and define a noise component in the baseline set of current data. The processor is also programmed to acquire at least one additional set of real-time operating current data from the motor during operation, redefine the noise component present in each additional set of real-time operating current data, and remove the noise component from the operating current data in real-time to isolate any fault components present in the operating current data. The processor is then programmed to generate a fault index for the operating current data based on any isolated fault components.
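The idea lends itself to a compact signal-processing sketch. The following toy example is ours, not the patented algorithm; sampling rate, frequencies, and amplitudes are invented. It learns a baseline spectral signature from healthy-motor current and subtracts it from operating data so that only fault-related components remain.

    import numpy as np

    fs = 10_000                        # sampling rate, Hz (invented)
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(0)

    def spectrum(x):
        # windowed amplitude spectrum
        return np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x)

    # baseline: healthy-motor stator current (60 Hz supply plus noise)
    baseline = np.sin(2 * np.pi * 60 * t) + 0.05 * rng.standard_normal(len(t))
    noise_signature = spectrum(baseline)

    # operating data: same content plus a small fault-related sideband
    operating = baseline + 0.02 * np.sin(2 * np.pi * 97 * t)
    residual = spectrum(operating) - noise_signature   # noise cancelled

    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    fault_index = residual.max()                       # scalar fault index
    print(f"fault index {fault_index:.4f} at {freqs[residual.argmax()]:.0f} Hz")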
Consolidation of Federal Aid Programs for Education: A Case for Block Grant Funding.
ERIC Educational Resources Information Center
Main, Robert G.
The need for a new approach to federal support of education by reducing the number of narrow categorical aid programs is developed through a case study of the 1976 Ford Administration proposal for a consolidated block grant of 24 separate authorities. The merits of block grant funding are examined both in terms of the administration-sponsored bill…
Fracture structures of active Nojima fault, Japan, revealed by borehole televiewer imaging
NASA Astrophysics Data System (ADS)
Nishiwaki, T.; Lin, A.
2017-12-01
Most large intraplate earthquakes occur as slip on mature active faults, so any investigation of the seismic faulting process and assessment of seismic hazards requires an understanding of the nature of active fault damage zones as seismogenic sources. In this study, we focus on the fracture structures of the Nojima Fault (NF), which triggered the 1995 Kobe Mw 7.2 earthquake, using ultrasonic borehole televiewer (BHTV) images of a borehole wall. The borehole used in this study was drilled through the NF to a depth of 1000 m by the Drilling into Fault Damage Zone (DFDZ) science project in 2016 (Lin, 2016; Miyawaki et al., 2016). At depths of <230 m, the borehole penetrates weakly consolidated sandstone and conglomerate of the Plio-Pleistocene Osaka Group and mudstone and sandstone of the Miocene Kobe Group. The basement rock at depths of >230 m consists of pre-Neogene granitic rock. Based on observations of cores and analysis of the BHTV images, the main fault plane was identified at a depth of 529.3 m, with a 15-cm-thick fault gouge zone and a damage zone 100 m wide developed on both sides of the main fault plane. Analysis of the BHTV images shows that the fractures are concentrated in two groups: one striking N45°E (Group 1), parallel to the general trend of the NF, and another striking N70°E (Group 2), oblique to the fault at an angle of 20°. It is well known that Riedel shear structures are common within strike-slip fault zones. Previous studies show that the NF is a right-lateral strike-slip fault with a minor thrust component, and that the fault damage zone is characterized by Riedel shear structures dominated by Y shears (main faults), R shears and P foliations (Lin, 2001). We interpret the fractures of Group 1 as Y Riedel shears and those of Group 2 as R shears. Such Riedel shear structures indicate that the NF is a right-lateral strike-slip fault activated under a regional stress field oriented close to east-west, coincident with that inferred from geophysical observations (Tsukahara et al., 2001), seismic inversion results (Katao, 1997) and geological structures (Lin, 2001). References: Katao et al., 1997. J. Phys. Earth, 45, 105. Lin, 2001. J. Struct. Geol., 23, 1167. Lin, 2016. AGU Fall Meeting. Miyawaki and Uchida, 2016. AGU Fall Meeting. Tsukahara et al., 2001. Isl. Arc, 10, 261.
Resolving the fault systems with the magnetotelluric method in the western Ilan plain of NE Taiwan
NASA Astrophysics Data System (ADS)
Chang, P. Y.; Chen, C. S.
2017-12-01
In this study we use magnetotelluric (MT) surveys to delineate the basement topography of the western part of the Ilan plain. The triangular plain is located on the extension of the Okinawa Trough and is thought to be a subsiding basin bounded by the Hsueshan Range in the north and the Central Range in the south. The basement of the basin is composed of Tertiary metamorphic rocks such as argillites and slates. The recent extension of the Okinawa Trough started at approximately 0.1 Ma and involved ENE-WSW-trending normal faults that may extend into the Ilan plain area. However, high sedimentation rates as well as frequent human activity have resulted in unconsolidated sediments over 100 meters thick, making it difficult to observe the surface traces of the active faults in the area. Hence we deployed about 70 MT stations across the southwestern tip of the triangular plain. We used the inverted resistivity images to resolve the subsurface faults and the relief variations of the basement, since the saturated sediments are relatively conductive and the consolidated rocks are resistive. The inverted MT images reveal a series of N-S trending horsts and grabens in addition to the ENE-WSW normal fault systems. The ENE-WSW trending faults dip mainly toward the north in our study area at the western tip of the Ilan plain. The preliminary results suggest that a younger N-S trending normal fault system may have modified the relief of the basement after the activation of the ENE-WSW normal faults. The MT resistivity images provide new information for future reviews of the tectonic interpretation of the region.
NASA Astrophysics Data System (ADS)
Yang, Z.; Juanes, R.
2015-12-01
The geomechanical processes associated with subsurface fluid injection/extraction are of central importance for many industrial operations related to energy and water resources. However, the mechanisms controlling the stability and slip motion of a preexisting geologic fault remain poorly understood and are critical for the assessment of seismic risk. In this work, we develop a coupled hydro-geomechanical model to investigate the effect of injection-induced pressure perturbation on the slip behavior of a sealing fault. The model couples single-phase flow in the pores and mechanics of the solid phase. Granular packs (see example in Fig. 1a) are numerically generated where the grains can be either bonded or not, depending on the degree of cementation. A pore network is extracted for each granular pack, with pore body volumes and pore throat conductivities calculated rigorously based on the geometry of the local pore space. The pore fluid pressure is solved via an explicit scheme, taking into account the effect of deformation of the solid matrix. The mechanics part of the model is solved using the discrete element method (DEM). We first test the validity of the model against the classical one-dimensional consolidation problem, for which an analytical solution exists. We then demonstrate the ability of the coupled model to reproduce rock deformation behavior measured in triaxial laboratory tests under the influence of pore pressure. We proceed to study fault stability in the presence of a pressure discontinuity across the impermeable fault, which is implemented as a plane whose intersected pore throats are deactivated, obstructing fluid flow (Fig. 1b, c). We focus on the onset of shear failure along preexisting faults. We discuss the fault stability criterion in light of the numerical results obtained from the DEM simulations coupled with pore fluid flow. The implications for how faults should be treated in a large-scale continuum model are also presented.
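For reference, the classical one-dimensional (Terzaghi) consolidation benchmark mentioned above has the standard textbook closed form, reproduced below as a Python sketch suitable for checking such a coupled model; the variable names are ours.

    import numpy as np

    def terzaghi_u(z, t, H, cv, u0, n_terms=200):
        """Excess pore pressure u(z, t) in a doubly drained layer of
        thickness 2H; cv is the coefficient of consolidation."""
        Tv = cv * t / H**2                        # dimensionless time factor
        u = np.zeros_like(np.asarray(z, float))
        for m in range(n_terms):
            M = np.pi * (2 * m + 1) / 2
            u += (2 * u0 / M) * np.sin(M * z / H) * np.exp(-M**2 * Tv)
        return u

    z = np.linspace(0, 1.0, 5)                    # depths within half-layer, m
    for t in (0.01, 0.1, 1.0):                    # times in H^2/cv units if cv=1
        print(t, np.round(terzaghi_u(z, t, H=1.0, cv=1.0, u0=1.0), 3))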
NASA Astrophysics Data System (ADS)
Morgan, J. K.
2014-12-01
Particle-based numerical simulations allow detailed investigations of small-scale processes and mechanisms associated with fault initiation and slip, which emerge naturally in such models. This study investigates the evolving mechanical conditions and associated micro-mechanisms during transient slip on a weak decollement propagating beneath a growing contractional wedge (e.g., accretionary prism, fold and thrust belt). The models serve as analogs of the seismic cycle, although lacking full earthquake dynamics. Nonetheless, the mechanical evolution of both decollement and upper plate can be monitored, and correlated with the particle-scale physical and contact properties, providing insights into changes that accompany such stick-slip behavior. In this study, particle assemblages, consolidated under gravity and bonded to impart cohesion, are pushed at a constant velocity above a weak, unbonded decollement surface. Forward propagation of decollement slip occurs in discrete pulses, modulated by heterogeneous stress conditions (e.g., roughness, contact bridging) along the fault. Passage of decollement slip resets the stress along this horizon, producing distinct patterns: shear stress is enhanced in front of the slipped decollement due to local contact bridging and fault locking; shear stress minima occur immediately above the tip, denoting local stress release and contact reorganization following slip; more mature portions of the fault exhibit intermediate shear stress, reflecting more stable contact force distributions and magnitudes. This pattern of shear stress pre-conditions the decollement for future slip events, which must overcome the high stresses at the fault tip. Long-term slip along the basal decollement induces upper plate contraction. When upper plate stresses reach critical strength conditions, new thrust faults break through the upper plate, relieving stresses and accommodating horizontal shortening. Decollement activity retreats back to the newly formed thrust fault. The cessation of upper plate fault slip causes gradual increases in upper plate stresses, rebuilding shear stresses along the decollement and enabling renewed pulses of decollement slip. Thus, upper plate deformation occurs out of phase with decollement propagation.
Final Project Report. Scalable fault tolerance runtime technology for petascale computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Sadayappan, P
With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of the scientific problems that drive the demand for high-end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program, as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault aware. To achieve fault tolerance, development of underlying runtime and OS technologies able to scale to the petascale level is needed. This project has evaluated a range of runtime techniques for fault tolerance for advanced programming models.
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Cigna, F.; Osmanoglu, B.; Dixon, T.; Wdowinski, S.
2011-12-01
Subsidence and faulting have affected Mexico City for more than a century, and the process is becoming widespread throughout larger urban areas in central Mexico. This process causes substantial damage to urban infrastructure and housing and will certainly become a major factor to be considered when planning urban development, land use zoning and hazard mitigation strategies in the coming decades. Subsidence is usually associated with aggressive groundwater extraction rates and a general decrease of aquifer static level that promotes soil consolidation, deformation and, ultimately, surface faulting. However, local stratigraphic and structural conditions also play an important role in the development and extension of faults. In all studied cases the stratigraphy of the uppermost sediment strata and the structure of the underlying volcanic rocks impose a much different subsidence pattern, which is most suitable for imaging through satellite geodetic techniques. We present examples from several cities in central Mexico: a) Mexico-Chalco, where very high rates of subsidence, up to 370 mm/yr, are observed within a lacustrine environment surrounded by Pliocene-Quaternary volcanic structures; b) Aguascalientes, where rates up to 90 mm/yr have been observed in the past decade, controlled by a stair-stepped N-S trending graben that induces nucleation of faults along the edges of contrasting sediment package thicknesses; and c) Morelia, which presents subsidence rates as high as 80 mm/yr, with differential deformation observed across major basin-bounding E-W trending faults and higher subsidence rates on their hanging walls, where the thickest sequences of compressible Quaternary sediments crop out. Our subsidence and faulting study in urban areas of central Mexico is based on a horizontal gradient analysis using displacement maps from Persistent Scatterer InSAR that allows definition of areas with high vulnerability to surface faulting. Correlation of the surface subsidence pattern from satellite geodesy with surface faults shows that these hazardous areas are best delineated not by subsidence magnitude rates alone but by a combined magnitude and horizontal subsidence gradient analysis. This approach is used as the basis for the generation of subsidence-induced surface-faulting hazard maps for the studied urban areas.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-03
... Billing for Skilled Nursing Facilities for FY 2014; Correction AGENCY: Centers for Medicare & Medicaid...; Prospective Payment System and Consolidated Billing for Skilled Nursing Facilities for FY 2014.'' DATES: These...
Fault management for data systems
NASA Technical Reports Server (NTRS)
Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann
1993-01-01
Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
Program listing for fault tree analysis of JPL technical report 32-1542
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
The computer program listing for the MAIN program and those subroutines unique to the fault tree analysis are described. Some subroutines are used for analyzing the reliability block diagram. The program is written in FORTRAN 5 and runs on a UNIVAC 1108.
KINKFOLD—an AutoLISP program for construction of geological cross-sections using borehole image data
NASA Astrophysics Data System (ADS)
Özkaya, Sait Ismail
2002-04-01
KINKFOLD is an AutoLISP program designed to construct geological cross-sections from borehole image or dip meter logs. The program uses the kink-fold method for cross-section construction. Beds are folded around hinge lines as angle bisectors so that bedding thickness remains unchanged. KINKFOLD may be used to model a wide variety of parallel fold structures, including overturned and faulted folds, and folds truncated by unconformities. The program accepts data from vertical or inclined boreholes. The KINKFOLD program cannot be used to model fault drag, growth folds, inversion structures or disharmonic folds where the bed thickness changes either because of deformation or deposition. Faulted structures and similar folds can be modelled by KINKFOLD by omitting dip measurements within fault drag zones and near axial planes of similar folds.
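In cross-section, the kink-fold construction reduces to placing axial traces that bisect the angle between adjacent bedding attitudes, which is what keeps orthogonal bed thickness constant across each hinge. A minimal 2-D Python sketch of that geometry follows; it is ours, not KINKFOLD's AutoLISP code, and which of the two conjugate bisectors applies depends on fold vergence.

    import numpy as np

    def trace_vector(dip_deg):
        """Unit vector along a bedding trace in the section (x horizontal, z up);
        dip measured in degrees, positive downward in the section direction."""
        d = np.radians(dip_deg)
        return np.array([np.cos(d), -np.sin(d)])

    def axial_trace_dip(dip1_deg, dip2_deg):
        """Dip of the hinge (axial) trace bisecting the two bedding traces.
        The conjugate bisector (this value + 90 degrees) is the alternative
        axial plane; the choice depends on fold vergence."""
        bis = trace_vector(dip1_deg) + trace_vector(dip2_deg)
        return np.degrees(np.arctan2(-bis[1], bis[0]))

    # A horizontal limb against a 60-degree limb gives a 30-degree axial
    # trace: equal 30-degree angles to both bedding traces.
    print(axial_trace_dip(0.0, 60.0))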
ERIC Educational Resources Information Center
Hassett, Kevin A.; Shapiro, Robert J.
2004-01-01
The federal government's student loan programs have been very successful, with two-thirds of all students or their families relying on loans provided or subsidized by the federal government. One of these student loan programs allows student borrowers to consolidate their previous loans into a single loan at a subsidized fixed rate based on the…
Map and database of Quaternary faults in Venezuela and its offshore regions
Audemard, F.A.; Machette, M.N.; Cox, J.W.; Dart, R.L.; Haller, K.M.
2000-01-01
As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Disaster Reduction. The project is sponsored by the International Lithosphere Program and funded by the USGS’s National Earthquake Hazards Reduction Program. The primary elements of the project are general supervision and interpretation of geologic/tectonic information, data compilation and entry for the fault catalog, database design and management, and digitization and manipulation of data in ARC/INFO. For the compilation of data, we engaged experts in Quaternary faulting, neotectonics, paleoseismology, and seismology.
Using Dynamic Sensitivity Analysis to Assess Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey; Morell, Larry; Miller, Keith
1990-01-01
This paper discusses sensitivity analysis and its relationship to random black box testing. Sensitivity analysis estimates the impact that a programming fault at a particular location would have on the program's input/output behavior. Locations that are relatively "insensitive" to faults can render random black box testing unlikely to uncover programming faults. Therefore, sensitivity analysis gives new insight when interpreting random black box testing results. Although sensitivity analysis is computationally intensive, it requires no oracle and no human intervention.
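The estimate can be illustrated with a toy experiment: plant a fault at one location, drive the program with random inputs, and count how often the output changes. The program, the planted fault, and all numbers below are invented for illustration.

    import random

    def program(x, faulty=False):
        y = x * x
        if faulty:
            y = y + 1                    # planted fault at this location
        return 1 if y > 100 else 0       # downstream logic can mask the fault

    def sensitivity(n_trials=100_000, input_range=(-20, 20)):
        changed = 0
        for _ in range(n_trials):
            x = random.uniform(*input_range)
            if program(x) != program(x, faulty=True):
                changed += 1
        return changed / n_trials        # fraction of inputs exposing the fault

    # A low score means random black box testing is unlikely to expose a
    # fault here, even though the location executes on every run.
    print(f"estimated sensitivity: {sensitivity():.4f}")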
Ontology-Based Method for Fault Diagnosis of Loaders.
Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei
2018-02-28
This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) An ontology-based fault diagnosis model is proposed to achieve the integrating, sharing and reusing of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case retrieval, case matching and case updating); and (3) to cover the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through analyzing a case study.
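The control flow of case-based reasoning with a rule-based fallback can be sketched compactly. The Python below is our illustration only: the symptoms, rules, and retrieval threshold are invented, and plain dictionaries stand in for the OWL ontology and SWRL rules.

    # Hand-written rules standing in for SWRL rules: if all conditions are
    # observed, conclude (fault cause, fault location). Contents invented.
    RULES = [
        ({"low_hydraulic_pressure", "abnormal_pump_noise"},
         ("hydraulic pump wear", "hydraulic pump")),
        ({"engine_overheat", "coolant_loss"},
         ("radiator leak", "cooling system")),
    ]

    def diagnose(symptoms, cases):
        # CBR first: reuse the stored case whose symptoms overlap most
        best = max(cases, key=lambda c: len(c["symptoms"] & symptoms),
                   default=None)
        if best and len(best["symptoms"] & symptoms) >= 2:
            return best["cause"], best["location"]
        # RBR fallback: first rule whose conditions are all observed
        for conditions, conclusion in RULES:
            if conditions <= symptoms:
                return conclusion
        return None

    cases = [{"symptoms": {"engine_overheat", "coolant_loss"},
              "cause": "radiator leak", "location": "cooling system"}]
    print(diagnose({"low_hydraulic_pressure", "abnormal_pump_noise"}, cases))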
Gill, H.E.
1969-01-01
This report gives a general summary of the availability and use of ground water and describes the occurrence of ground water in five major geohydrologic provinces lying in the eight administrative regions of Ghana. The identification and delineation of the geohydrologic provinces are based on their distinctive characteristics with respect to the occurrence and availability of ground water. The Precambrian province occupies the southern, western, and northern parts of Ghana and is underlain largely by intrusive crystalline and metasedimentary rocks. The Voltaian province includes that part of the Voltaian sedimentary basin in central Ghana and is underlain chiefly by consolidated sandstone, mudstone, and shale. Narrow discontinuous bands of consolidated Devonian and Jurassic sedimentary rocks near the coast constitute the Coastal Block Fault province. The Coastal Plain province includes semiconsolidated to unconsolidated sediments of Cretaceous to Holocene age that underlie coastal plain areas in southwestern and southeastern Ghana. The Alluvial province includes the Quaternary alluvial deposits in the principal river valleys and on the delta of the Volta River. Because of the widespread distribution of crystalline and consolidated sedimentary rocks of low permeability in the Precambrian, Voltaian, and Coastal Block Fault provinces, it is difficult to develop large or even adequate groundwater supplies in much of Ghana. On the other hand, small (1 to 50 gallons per minute) supplies of water of usable quality are available from carefully sited boreholes in most parts of the country. Also, moderate (50 to 200 gpm) supplies of water are currently (1964) obtained from small-diameter screened boreholes tapping sand and limestone aquifers in the Coastal Plain province in southwestern and southeastern Ghana, but larger supplies could be obtained through properly constructed boreholes. In the Alluvial province, unconsolidated deposits in the larger stream valleys that are now largely undeveloped offer desirable locations for shallow vertical or horizontal wells, which can induce infiltration from streams and yield moderate to large water supplies. The principal factors that limit development of ground-water supplies in Ghana are (1) the prevailing low permeability and water-yielding potential of the crystalline and consolidated sedimentary rocks that underlie most of the country, (2) highly mineralized ground water, which appears to be widely distributed in the northern part of the Voltaian province, and (3) potential problems of salt-water encroachment in the Coastal Plain province in the Western Region and in the Keta area. On the other hand, weathering has increased porosity and has thus substantially increased the water-yielding potential of the crystalline and consolidated sedimentary rocks in much of central and northern Ghana. Also, with proper construction and development, much larger yields than those now (1964) prevalent could be obtained from boreholes tapping sand and limestone aquifers in the Coastal Plain province.
MIRAP, microcomputer reliability analysis program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jehee, J.N.T.
1989-01-01
A program for a microcomputer is outlined that can determine minimal cut sets from a specified fault tree logic. The speed and memory limitations of the microcomputers on which the program is implemented (Atari ST and IBM) are addressed by reducing the fault tree's size and by storing the cut set data on disk. Extensive, well-proven fault tree restructuring techniques, such as the identification of sibling events and of independent gate events, reduce the fault tree's size but do not alter its logic. New methods are used for the Boolean reduction of the fault tree logic. Special criteria for combining events in the 'AND' and 'OR' logic avoid the creation of many subsuming cut sets which would all cancel out due to existing cut sets. Figures and tables illustrate these methods. 4 refs., 5 tabs.
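Top-down generation of minimal cut sets can be sketched in a few lines of Python. This toy version is ours and does not reproduce MIRAP's restructuring or disk-based storage techniques; it simply expands gates recursively and discards subsumed cut sets.

    def cut_sets(node, gates):
        """node: basic-event or gate name; gates: {name: (op, children)}."""
        if node not in gates:
            return [frozenset([node])]           # basic event
        op, children = gates[node]
        child_sets = [cut_sets(c, gates) for c in children]
        if op == 'OR':                           # union of children's cut sets
            result = [s for sets in child_sets for s in sets]
        else:                                    # 'AND': cross-product of sets
            result = [frozenset()]
            for sets in child_sets:
                result = [a | b for a in result for b in sets]
        # minimality: drop any cut set that subsumes a smaller one
        minimal = []
        for s in sorted(set(result), key=len):
            if not any(m <= s for m in minimal):
                minimal.append(s)
        return minimal

    gates = {'TOP': ('OR', ['G1', 'C']),
             'G1':  ('AND', ['A', 'G2']),
             'G2':  ('OR', ['B', 'C'])}
    print(cut_sets('TOP', gates))   # minimal cut sets: {C} and {A, B}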
Evolution and Submarine Landslide Potential of Monterey Canyon Head, Offshore Central California
NASA Astrophysics Data System (ADS)
Maier, K. L.; Johnson, S. Y.; Hart, P. E.; Hartwell, S. R.
2016-12-01
Monterey Canyon, offshore central California, incises the shelf from near the shoreline to 30 km seaward, where axial water depths approach 2,000 m. It is one of the world's most studied submarine canyons, yet debate continues concerning its age, formation, and associated geologic hazards. To address these issues, the USGS, with partial support from the California Seafloor Mapping Program, collected hundreds of kilometers of high-resolution, mini-sparker, single-channel (2009 and 2011 surveys) and multichannel (2015 survey) seismic-reflection profiles near the canyon head. The seismic data were combined with multibeam bathymetry to generate a geologic map of the proximal canyon, which delineates numerous faults and compound submarine landslide headwall scarps (covering up to 4 km2) along canyon walls. Seismic-reflection data reveal a massive (~100 km2 lateral extent) paleochannel cut-and-fill complex underlying the proximal canyon. These subsurface cut-and-fill deposits span both sides of the relatively narrow modern canyon head, crop out in canyon walls, and incise into Purisima Formation (late Miocene and Pliocene) bedrock to depths of up to 0.3 s two-way travel time (~240 m) below the modern shelf. We propose that the paleochannel complex represents previous locations of a migrating canyon head, and attribute its origin to multiple alternating cycles of fluvial and submarine canyon erosion and deposition linked to fluctuating sea levels. Thus, the canyon head imaged in modern bathymetry is a relatively young feature, perhaps forming in the last 20,000 years of sea-level rise. The paleocanyon deposits are significantly less consolidated than bedrock in deeper canyon walls and, therefore, are probably more prone to submarine landsliding. Nearby mapped faults occur within the active, distributed San Andreas fault system, and earthquake-generated strong ground motions are likely triggers for past and future submarine landslides and potential associated tsunamis.
NASA Astrophysics Data System (ADS)
Yim, Keun Soo
This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of program states that included dynamically allocated memory (to be spatially comprehensive). In GPUs, we used fault injection studies to demonstrate the importance of detecting silent data corruption (SDC) errors that are mainly due to the lack of fine-grained protections and the massive use of fault-insensitive data. This dissertation also presents transparent fault tolerance frameworks and techniques that are directly applicable to hybrid computers built using only commercial off-the-shelf hardware components. This dissertation shows that by developing understanding of the failure characteristics and error propagation paths of target programs, we were able to create fault tolerance frameworks and techniques that can quickly detect and recover from hardware faults with low performance and hardware overheads.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.
This congressional report contains testimony pertaining to reauthorization of the Vocational Rehabilitation Act, which was drafted to authorize funds for programs covered by the act and consolidate, coordinate, and improve employment, training, literacy, and vocational rehabilitation programs in the United States. Statements were provided by three…
NASA Astrophysics Data System (ADS)
Kroenke, Samantha E.
In June 2009, a 2.2 square mile 3-D high resolution seismic reflection survey was shot in southeastern Illinois in the Phillipstown Consolidated oilfield. A well was drilled in the 3-D survey area to tie the seismic to the geological data with a synthetic seismogram from the sonic log. The objectives of the 3-D seismic survey were three-fold: (1) to image and interpret faulting of the Herald-Phillipstown Fault (HPF) using drillhole-based geological and seismic cross-sections and structural contour maps created from the drillhole data and seismic reflection data, (2) to test the effectiveness of imaging the faults by selected seismic attributes, and (3) to compare spectral decomposition amplitude maps with an isochron map and an isopach map of a selected geologic interval (VTG interval). Drillhole and seismic reflection data show that various formation offsets increase near the main Herald-Phillipstown fault, and that the fault and its large-offset subsidiary faults penetrate the Precambrian crystalline basement. A broad, northeast-trending, 10,000-feet-wide graben is consistently observed in the drillhole data. Both shallow and deep formations in the geological cross-sections reveal small horst and graben features within the broad graben, possibly created in response to fault reactivations. The HPF faults have been interpreted as originally Precambrian-age high-angle normal faults reactivated with various amounts and types of offset. Evidence for strike-slip movement is also clear on several faults. Changes in the seismic attribute values in the selected interval and along various time slices throughout the whole dataset correlate with the Herald-Phillipstown faults. Overall, seismic attributes could provide a means of mapping large-offset faults in areas with limited or absent drillhole data. Results of the spectral decomposition suggest that if the interval velocity is known for a particular formation or interval, high-resolution 3-D seismic reflection surveys could utilize these amplitudes as an alternative seismic interpretation method for estimating formation thicknesses. A VTG isopach map was compared with an isochron map and a spectral decomposition amplitude map. The results reveal that the isochron map strongly correlates with the isopach map as well as the spectral decomposition map. It was also found that thicker areas in the isopach map correlated with higher amplitude values in the spectral decomposition amplitude map. Offsets along the faults appear sharper in the amplitude and isochron maps than in the isopach map, possibly as a result of increased spatial sampling.
40 CFR 164.32 - Consolidation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Consolidation. 164.32 Section 164.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS RULES OF PRACTICE... REGISTER, CANCELLATIONS OF REGISTRATIONS, CHANGES OF CLASSIFICATIONS, SUSPENSIONS OF REGISTRATIONS AND...
40 CFR 164.32 - Consolidation.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Consolidation. 164.32 Section 164.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS RULES OF PRACTICE... REGISTER, CANCELLATIONS OF REGISTRATIONS, CHANGES OF CLASSIFICATIONS, SUSPENSIONS OF REGISTRATIONS AND...
DOT National Transportation Integrated Search
2012-08-01
Concrete specimens were fabricated for shrinkage, creep, and abrasion resistance : testing. Variations of self-consolidating concrete (SCC) and conventional concrete were : all tested. The results were compared to previous similar testing programs an...
Diagnostics Tools Identify Faults Prior to Failure
NASA Technical Reports Server (NTRS)
2013-01-01
Through the SBIR program, Rochester, New York-based Impact Technologies LLC collaborated with Ames Research Center to commercialize the Center's Hybrid Diagnostic Engine, or HyDE, software. The fault detecting program is now incorporated into a software suite that identifies potential faults early in the design phase of systems ranging from printers to vehicles and robots, saving time and money.
Fluid Pressure in the Shallow Plate Interface at the Nankai Trough Subduction Zone
NASA Astrophysics Data System (ADS)
Tobin, H. J.; Saffer, D.
2003-12-01
The factors controlling the occurrence, magnitude, and other characteristics of great earthquakes are a fundamental outstanding question in fault physics. Pore fluid pressure is perhaps the most critical yet poorly known parameter governing the strength and seismogenic character of plate boundary faults, but unfortunately it cannot be directly inferred through available geophysical sensing methods. Moreover, true in situ fluid pressure has proven difficult to measure even in boreholes. At the Nankai Trough, several hundred meters of sediment are subducted beneath the frontal portion of the accretionary prism. The up-dip portion of the plate interface is therefore hosted in these fine-grained marine sedimentary rocks. ODP Legs 190 and 196 showed that these rapidly loaded underthrust sediments are significantly overpressured near the deformation front. Here, we attempt to quantitatively infer porosity, pore pressure, and effective normal stress at the plate interface at depths currently inaccessible to drilling. Using seismic reflection interval velocity calibrated to porosity at the boreholes, we quantitatively infer pore pressure to ˜ 20 km down-dip of the deformation front, to a plate interface depth of ˜ 6 km. We have developed a Nankai-specific velocity-porosity transform using ODP cores and logs. We use this function to derive a porosity profile for each of two down-dip seismic sections extracted from a 3-D dataset from the Cape Muroto region. We then calculate pore fluid pressure and effective vertical (fault-normal) stress for the underthrust sediment section using a compaction disequilibrium approach and core-based consolidation test data. Because the pore fluid pressure at the fault interface is likely controlled by that of the top of the underthrust section, this calculation represents a quantitative profile of effective stress and pore pressure at the plate interface. Results show that seismic velocity and porosity increase systematically downdip in the underthrust section, but the increase is suppressed relative to that expected for normally consolidating sediments. The computed pore pressure increases landward from an overpressure ratio (λ*, the ratio of pore pressure in excess of hydrostatic to the lithostatic overburden in excess of hydrostatic) of ˜ 0.6 at the deformation front to ˜ 0.77 where sediments have been subducted 15 km. The results of this preliminary analysis suggest that a 3-dimensional mapping of predicted effective normal stress in the seismic data volume is possible.
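As a worked example of the overpressure ratio under the definition assumed here (all depths, densities, and the target λ* value are invented for illustration):

    g = 9.81
    z = 3000.0                       # depth below seafloor, m
    water_depth = 4500.0             # m
    rho_w, rho_b = 1024.0, 2000.0    # seawater / bulk sediment density, kg/m^3

    Ph = rho_w * g * (water_depth + z)             # hydrostatic pore pressure, Pa
    Sv = rho_w * g * water_depth + rho_b * g * z   # lithostatic (total) stress, Pa
    Pf = Ph + 0.7 * (Sv - Ph)                      # pore pressure for lambda* = 0.7
    print(f"lambda* = {(Pf - Ph) / (Sv - Ph):.2f}, Pf = {Pf / 1e6:.1f} MPa")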
Examples of Nonconservatism in the CARE 3 Program
NASA Technical Reports Server (NTRS)
Dotson, Kelly J.
1988-01-01
This paper presents parameter regions in the CARE 3 (Computer-Aided Reliability Estimation, version 3) computer program where the program overestimates the reliability of a modeled system without warning the user. Five simple models of fault-tolerant computer systems are analyzed, and the parameter regions where reliability is overestimated are given. The source of the error in the reliability estimates for models which incorporate transient fault occurrences was not readily apparent. However, much of the error for models with permanent and intermittent faults can be attributed to the choice of values for the run-time parameters of the program.
Certification of computational results
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.
1993-01-01
A conceptually novel and powerful technique to achieve fault detection and fault tolerance in hardware and software systems is described. When used for software fault detection, this new technique uses time and software redundancy and can be outlined as follows. In the initial phase, a program is run to solve a problem and store the result. In addition, this program leaves behind a trail of data called a certification trail. In the second phase, another program is run which solves the original problem again. This program, however, has access to the certification trail left by the first program. Because of the availability of the certification trail, the second phase can be performed by a less complex program and can execute more quickly. In the final phase, the two results are compared and if they agree the results are accepted as correct; otherwise an error is indicated. An essential aspect of this approach is that the second program must always generate either an error indication or a correct output even when the certification trail it receives from the first program is incorrect. The certification trail approach to fault tolerance is formalized and realizations of it are illustrated by considering algorithms for the following problems: convex hull, sorting, and shortest path. Cases in which the second phase can be run concurrently with the first and act as a monitor are discussed. The certification trail approach is compared to other approaches to fault tolerance.
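For the sorting case, the idea can be sketched in a few lines: the first phase leaves the sorting permutation as its certification trail, and the second phase re-derives and checks the answer in linear time, rejecting any corrupted trail. The Python below is our illustration.

    def phase1_sort(data):
        # sort and leave the permutation behind as the certification trail
        trail = sorted(range(len(data)), key=data.__getitem__)
        return [data[i] for i in trail], trail

    def phase2_check(data, trail):
        # must reject any incorrect trail: verify it is a permutation and
        # that it yields a nondecreasing ordering of the original data
        if sorted(trail) != list(range(len(data))):
            raise ValueError("trail is not a permutation")
        result = [data[i] for i in trail]
        if any(a > b for a, b in zip(result, result[1:])):
            raise ValueError("trail does not certify a sorted order")
        return result                    # O(n) instead of O(n log n)

    data = [5, 1, 4, 2, 3]
    result, trail = phase1_sort(data)
    assert phase2_check(data, trail) == result   # outputs agree: accept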
Injecting Artificial Memory Errors Into a Running Computer Program
NASA Technical Reports Server (NTRS)
Bornstein, Benjamin J.; Granat, Robert A.; Wagstaff, Kiri L.
2008-01-01
Single-event upsets (SEUs) or bitflips are computer memory errors caused by radiation. BITFLIPS (Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs) is a computer program that deliberately injects SEUs into another computer program, while the latter is running, for the purpose of evaluating the fault tolerance of that program. BITFLIPS was written as a plug-in extension of the open-source Valgrind debugging and profiling software. BITFLIPS can inject SEUs into any program that can be run on the Linux operating system, without needing to modify the program's source code. Further, if access to the original program source code is available, BITFLIPS offers fine-grained control over exactly when and which areas of memory (as specified via program variables) will be subjected to SEUs. The rate of injection of SEUs is controlled by specifying either a fault probability or a fault rate based on memory size and radiation exposure time, in units of SEUs per byte per second. BITFLIPS can also log each SEU that it injects and, if program source code is available, report the magnitude of effect of the SEU on a floating-point value or other program variable.
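The rate-based injection described above can be sketched as follows; this Python toy is our stand-in, not BITFLIPS itself, which instruments unmodified binaries through Valgrind.

    import random
    import numpy as np

    def inject_seus(arr, rate, seconds, rng=random.Random(0)):
        """Flip bits in-place in a float64 array at `rate` SEUs per byte
        per second over `seconds`; returns a log of each injected upset."""
        view = arr.view(np.uint64)                 # reinterpret the bits
        n_flips = int(rate * arr.nbytes * seconds) # expected upset count
        log = []
        for _ in range(n_flips):
            i = rng.randrange(arr.size)
            bit = rng.randrange(64)
            before = arr[i]
            view[i] ^= np.uint64(1) << np.uint64(bit)
            log.append((i, bit, before, arr[i]))   # magnitude of effect
        return log

    x = np.ones(1000)
    for i, bit, before, after in inject_seus(x, rate=1e-4, seconds=10.0):
        print(f"x[{i}] bit {bit}: {before!r} -> {after!r}")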
NASA Technical Reports Server (NTRS)
Hatfield, Jack J.; Villarreal, Diana
1990-01-01
The topic of advanced display and control technology is addressed, along with the major objectives of this technology, the current state of the art, major accomplishments, research programs and facilities, future trends, technology issues, space transportation systems applications, and projected technology readiness for those applications. The gaps that may exist between the technology needs of the transportation systems and the research currently under way are identified, and cultural changes that might facilitate the incorporation of these advanced technologies into future space transportation systems are recommended. Some of the objectives are to reduce life-cycle costs, improve reliability and fault tolerance, use standards for the incorporation of advancing technology, and reduce weight, volume, and power. Pilot workload can be reduced and the pilot's situational awareness improved, resulting in improved flight safety and operating efficiency. This could be accomplished through the use of integrated electronic pictorial displays, consolidated controls, artificial intelligence, and human-centered automation tools. The Orbiter Glass Cockpit Display is examined as an example.
78 FR 77574 - Protection System Maintenance Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... protection system component type, except that the maintenance program for all batteries associated with the... Electric System reliability and promoting efficiency through consolidation [of protection system-related... ITC that PRC-005-2 promotes efficiency by consolidating protection system maintenance requirements...
Consolidation of data base for Army generalized missile model
NASA Technical Reports Server (NTRS)
Klenke, D. J.; Hemsch, M. J.
1980-01-01
Data from plume interaction tests, nose mounted canard configuration tests, and high angle of attack tests on the Army Generalized Missile model are consolidated in a computer program which makes them readily accessible for plotting, listing, and evaluation. The program is written in FORTRAN and will run on an ordinary minicomputer. It has the capability of retrieving any coefficient from the existing DATAMAN tapes and displaying it in tabular or plotted form. Comparisons of data taken in several wind tunnels and of data with the predictions of Program MISSILE2 are also presented.
Measurement of fault latency in a digital avionic mini processor, part 2
NASA Technical Reports Server (NTRS)
Mcgough, J.; Swern, F.
1983-01-01
The results of fault injection experiments utilizing a gate-level emulation of the central processor unit of the Bendix BDX-930 digital computer are described. Several earlier programs were reprogrammed, expanding the instruction set to capitalize on the full power of the BDX-930 computer. As a final demonstration of fault coverage, an extensive 3-axis, high-performance flight control computation was added. The stages in the development of a CPU self-test program were demonstrated, emphasizing the relationship between fault coverage, speed, and quantity of instructions.
Software For Fault-Tree Diagnosis Of A System
NASA Technical Reports Server (NTRS)
Iverson, Dave; Patterson-Hine, Ann; Liao, Jack
1993-01-01
Fault Tree Diagnosis System (FTDS) is an automated-diagnostic-system computer program that identifies likely causes of a specified failure on the basis of information represented in system-reliability mathematical models known as fault trees. It is a modified implementation of the failure-cause-identification phase of Narayanan and Viswanadham's methodology for knowledge acquisition and reasoning in analyzing system failures. The knowledge base of if/then rules is replaced with an object-oriented fault-tree representation. This enhancement yields more efficient identification of causes of failures and enables dynamic updating of the knowledge base. FTDS is written in C, C++, and Common LISP.
Procedures for Computing Site Seismicity
1994-02-01
This fault system is composed of the Elsinore and Whittier fault zones, the Agua Caliente fault, and the Earthquake Valley fault.
Moment tensor solutions for the Iberian-Maghreb region during the IberArray deployment (2009-2013)
NASA Astrophysics Data System (ADS)
Martín, R.; Stich, D.; Morales, J.; Mancilla, F.
2015-11-01
We perform regional moment tensor inversion for 84 earthquakes that occurred in the Iberian-Maghreb region during the second and third legs of the IberArray deployment (2009-2013). During this period around 300 broadband seismic stations were operating in the area, reducing the interstation spacing to ~ 50 km over extended areas. We use the established processing sequence of the IAG moment tensor catalogue, increasing it to 309 solutions with this update. The new moment tensor solutions have magnitudes ranging from Mw 3.2 to 6.3 and source depths from 2 to 620 km. Most solutions correspond to Northern Algeria, where a compressive deformation pattern is consolidated. The Betic-Rif sector shows a progression of faulting styles from mainly shear faulting in the east via predominantly extensional faulting in the central sector to reverse and strike-slip faulting in the west. At the SW Iberia margin, the predominance of strike-slip and reverse faulting agrees with the expected transpressive character of the Eurasia-Nubia plate boundary. New strike-slip and oblique reverse solutions in the Trans-Alboran Shear Zone reflect its left-lateral regime. The most significant improvement corresponds to the Atlas Mountains and the surroundings of the Gibraltar Arc, which had scarce previous solutions. Reverse and strike-slip faulting solutions in the Atlas System display the accommodation of plate convergence by shortening in the belt. At the Gibraltar Arc, several new solutions were obtained at lower crustal and subcrustal depths. These mechanisms show substantial heterogeneity, covering the full range of faulting styles with highly variable orientations of principal stress axes, including opposing strike-slip faulting solutions at short distances. The observations are not straightforward to explain by a simple geodynamic scenario and suggest the interplay of different processes, among them plate convergence in old oceanic lithosphere with large brittle thickness at the SW Iberia margin, as well as delamination of thickened continental lithosphere beneath the Betic-Rif arc.
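For reference, the moment magnitudes quoted in such catalogues follow from the scalar moment of the tensor via the standard Hanks and Kanamori (1979) relation; a small Python sketch (the example tensor is invented, not a catalogue solution):

    import numpy as np

    def scalar_moment(M):
        # Frobenius-norm convention: M0 = sqrt(sum(Mij^2) / 2)
        return np.sqrt(np.sum(M * M) / 2.0)

    def moment_magnitude(m0):
        # Hanks & Kanamori (1979), m0 in N*m
        return (2.0 / 3.0) * (np.log10(m0) - 9.1)

    M = 1e15 * np.array([[0.0, 1.0, 0.0],    # pure strike-slip double couple
                         [1.0, 0.0, 0.0],
                         [0.0, 0.0, 0.0]])
    m0 = scalar_moment(M)
    print(f"M0 = {m0:.2e} N*m, Mw = {moment_magnitude(m0):.2f}")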
Origin and late quaternary tectonism of a western Canadian continental shelf trough
NASA Astrophysics Data System (ADS)
Moslow, Thomas F.; Luternauer, John L.; Rohr, Kristin
1991-08-01
Analyses of high resolution and multi-channel seismic profiles from the central continental shelf of western Canada ascribe a late Quaternary glacial origin to large-scale troughs. Along the margins of Moresby Trough, one of three large-scale cross-shelf bathymetric depressions in Queen Charlotte Sound, seismic profiles within Quaternary sediments show a divergence of reflectors, thickening and folding of seismic units, and concavity of reflectors suggestive of drag. Compactional subsidence, growth faulting, and compaction faulting are also observed. Fault traces commonly terminate below the seabed. Deformation of Quaternary sediments due to faulting is plastic in nature and maximum offset of reflectors is 2.5 m. The observed Quaternary deformation appears to be a product of rapid deposition, loading and subsidence of late Quaternary sediment, which is unrelated to seismic activity. In addition, Quaternary faulting was probably activated by post-glacial loading and isostatic rebound of consolidated Tertiary strata along the margins of continental shelf troughs. The presence of mass movement (slump or debris flow) deposits overlying lithified Tertiary strata along the flanks of Moresby Trough provides the only evidence of seismic activity in the study area. The lack of a mud drape over these deposits implies a late Holocene age for the timing of their emplacement. The Quaternary troughs are incised into Tertiary-aged sedimentary fill of the Queen Charlotte basin. Previous workers had interpreted seafloor escarpments paralleling the trough margins to indicate that the location of Moresby Trough was controlled by renewed or continued activity on Tertiary-aged faults. A multi-channel seismic line across Moresby Trough shows that such an escarpment on the seafloor does not correlate to faults either in the Tertiary basin fill or the underlying basement. Tertiary reflectors are continuous underneath Moresby Trough; the seafloor escarpment is an erosional feature and was not created by reactivation of Tertiary structures. Trough erosion and subsequent fill (up to 175 m thick) are entirely of Quaternary age.
34 CFR 685.220 - Consolidation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false Consolidation. 685.220 Section 685.220 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION WILLIAM D. FORD FEDERAL DIRECT LOAN PROGRAM Borrower Provisions § 685.220...
School Consolidation Survey: Iredell County: Mooresville, Statesville, North Carolina.
ERIC Educational Resources Information Center
Engelhardt and Engelhardt, Inc., Purdy Station, NY.
This document explores the feasibility of consolidating three North Carolina school districts into a single administrative unit. Factors analyzed include future population and enrollment growth, existing buildings and school building needs, program offerings, staff qualifications, administrative organization, and financial considerations of…
Field-assisted sintering and phase transition of ZnS-CaLa2S4 composite ceramics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yiyu; Zhang, Lihua; Kisslinger, Kim
2017-07-17
In the present study, zinc sulfide (ZnS) and calcium lanthanum sulfide (CaLa2S4, CLS) composite ceramics were consolidated via field-assisted sintering of 0.5ZnS-0.5CLS (volume ratio) composite powders at 800–1050 °C. Through sintering curve analyses and microstructural observations, it was determined that between 800 and 1000 °C, grain boundary diffusion was the main mechanism controlling grain growth for both the ZnS and CLS phases within the composite ceramics. The consolidated composite ceramics were determined to be composed of sphalerite ZnS, wurtzite ZnS, and thorium phosphate-type CLS. The sphalerite-wurtzite phase transition of ZnS was further demonstrated to be accompanied by the formation of stacking faults and twins in the ceramics. Furthermore, the addition of the CLS phase improved the indentation hardness of the ceramics relative to pure ZnS through the homogeneous dispersion of small ZnS and CLS grains.
NASA Astrophysics Data System (ADS)
Yoneda, J.; Oshima, M.; Kida, M.; Kato, A.; Konno, Y.; Jin, Y.; Waite, W. F.; Jang, J.; Kumar, P.; Tenma, N.
2017-12-01
Pressure coring and analysis technology allows for gas hydrate to be recovered from the deep seabed, transferred to the laboratory and characterized while continuously maintaining gas hydrate stability. For this study, dozens of hydrate-bearing pressure core sediment subsections recovered from the Krishna-Godavari Basin during India's National Gas Hydrate Program Expedition NGHP-02 were tested with Pressure Core Non-destructive Analysis Tools (PNATs) through a collaboration between Japan and India. PNATs, originally developed by AIST as a part of the Japanese National hydrate research program (MH21, funded by METI), conducted permeability, compression and consolidation tests under various effective stress conditions, including the in situ stress state estimated from downhole bulk density measurements. At the in situ effective stress, gas hydrate-bearing sediments had an effective permeability range of 0.01-10 mD even at pore-space hydrate saturations above 60%. Permeability increased by 10 to 100 times after hydrate dissociation at the same effective stress, but these post-dissociation gains were erased when effective stress was increased from in situ values (~1 MPa) to 10 MPa in a simulation of the depressurization method for methane extraction from hydrate. Vertical-to-horizontal permeability anisotropy was also investigated. First-ever multi-stage loading tests and strain-rate alternation compression tests were successfully conducted for evaluating sediment strengthening dependence on the rate and magnitude of effective confining stress changes. In addition, oedometer tests were performed up to 40 MPa of consolidation stress to simulate the depressurization method in ultra-deep sea environments. Consolidation curves measured with and without gas hydrate were investigated over a wide range of effective confining stresses. Compression curves for gas hydrate-bearing sediments were convex downward due to high hydrate saturations. Consolidation tests show that, regardless of the consolidation history with hydrate in place, the consolidation behavior after dissociation will first return to, then follow, the original normal consolidation curve for the hydrate-free host sediment.
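For readers unfamiliar with consolidation curves, the short Python sketch below illustrates a normal consolidation line of the kind used to interpret oedometer results such as those above: void ratio decreases linearly with the logarithm of vertical effective stress. The index properties (e_ref, Cc) and the reference stress are hypothetical values for illustration, not measurements from the NGHP-02 cores.

import math

def void_ratio(sigma_v, e_ref=1.20, sigma_ref=0.1, cc=0.35):
    """Void ratio on a normal consolidation line: e = e_ref - Cc*log10(sigma/sigma_ref)."""
    return e_ref - cc * math.log10(sigma_v / sigma_ref)

for sigma in (0.1, 1.0, 10.0, 40.0):  # MPa, up to the 40 MPa reached in the tests
    print(f"sigma'_v = {sigma:5.1f} MPa -> e = {void_ratio(sigma):.3f}")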
NASA Astrophysics Data System (ADS)
Goto, J.; Moriya, T.; Yoshimura, K.; Tsuchi, H.; Karasaki, K.; Onishi, T.; Ueta, K.; Tanaka, S.; Kiho, K.
2010-12-01
The Nuclear Waste Management Organization of Japan (NUMO), in collaboration with Lawrence Berkeley National Laboratory (LBNL), has carried out a project since 2007 to develop an efficient and practical methodology for characterizing the hydrologic properties of faults, exclusively for the early stage of siting a deep underground repository. A preliminary flowchart of the characterization program and a classification scheme of fault hydrology based on geological features have been proposed. These have been tested through the field characterization program on the Wildcat Fault in Berkeley, California. The Wildcat Fault is a relatively large non-active strike-slip fault that is believed to be a subsidiary of the active Hayward Fault. Our classification scheme assumes contrasting hydrologic features between the linear northern part and the split/spread southern part of the Wildcat Fault. The field characterization program to date has been concentrated in and around the LBNL site on the southern part of the fault. Several lines of electrical and reflection seismic surveys, and subsequent trench investigations, have revealed the approximate distribution and near-surface features of the Wildcat Fault (see also Onishi et al. and Ueta et al.). Three 150-m-deep boreholes, WF-1 to WF-3, have been drilled on a line normal to the trace of the fault in the LBNL site. Two vertical holes were placed to characterize the undisturbed Miocene sedimentary formations on the eastern and western sides of the fault (WF-1 and WF-2, respectively). WF-2 on the western side intersected the rock formation that was expected only in WF-1, as well as several faults of various intensities. Therefore, WF-3, originally planned as an inclined hole to penetrate the fault, was replaced by a vertical hole farther to the west. It again encountered unexpected rocks and faults. Preliminary results of in-situ hydraulic tests suggested that the transmissivity of WF-1 is ten to one hundred times higher than that of WF-2. The monitoring of hydraulic pressure displayed different head distribution patterns between WF-1 and WF-2 (see also Karasaki et al.). Based on these results, three hypotheses on the distribution of the Wildcat Fault were proposed: (a) a vertical fault between WF-1 and WF-2, (b) a more gently dipping fault intersected in WF-2 and WF-3, and (c) a wide zone of faults extending between WF-1 and WF-3. At present, WF-4, an inclined hole intended to penetrate the possible (eastern?) master fault, is being drilled to test these hypotheses. After the WF-4 investigation, hydrologic and geochemical analyses and modeling of the southern part of the fault will be carried out. A simpler field characterization program will also be carried out on the northern part of the fault. Finally, all the results will be synthesized to improve the comprehensive methodology.
Preijers, Frank W M B; van der Velden, Vincent H J; Preijers, Tim; Brooimans, Rik A; Marijt, Erik; Homburg, Christa; van Montfort, Kees; Gratama, Jan W
2016-05-01
In 1985, external quality assurance was initiated in the Netherlands to reduce the between-laboratory variability of leukemia/lymphoma immunophenotyping and to improve diagnostic conclusions. This program consisted of regular distributions of test samples followed by biannual plenary participant meetings in which results were presented and discussed. A scoring system was developed in which the quality of results was rated by systematically reviewing the pre-analytical, analytical, and post-analytical assay stages using three scores, i.e., correct (A), minor fault (B), and major fault (C). Here, we report on 90 consecutive samples distributed to 40-61 participating laboratories between 1998 and 2012. Most samples contained >20% aberrant cells, mainly selected from mature lymphoid malignancies (B or T cell) and acute leukemias (myeloid or lymphoblastic). In 2002, minimally required monoclonal antibody (mAb) panels were introduced, whilst methodological guidelines for all three assay stages were implemented. Retrospectively, we divided the study into subsequent periods of 4 ("initial"), 4 ("learning"), and 7 years ("consolidation") to detect "learning effects." Uni- and multivariate models showed that analytical performance declined since 2002, but that post-analytical performance improved during the entire period. These results emphasized the need to improve technical aspects of the assay, and reflected improved interpretational skills of the participants. A strong effect of participant affiliation in all three assay stages was observed: laboratories in academic and large peripheral hospitals performed significantly better than those in small hospitals. © 2015 International Clinical Cytometry Society.
Reliability database development for use with an object-oriented fault tree evaluation program
NASA Technical Reports Server (NTRS)
Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann
1989-01-01
A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed or are under development to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of the existing nuclear power reliability databases is planned.
PV Systems Reliability Final Technical Report: Ground Fault Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lavrova, Olga; Flicker, Jack David; Johnson, Jay
We have examined ground faults in PhotoVoltaic (PV) arrays and the efficacy of fuses, residual current detection (RCD), current sense monitoring/relays (CSM), isolation/insulation (Riso) monitoring, and Ground Fault Detection and Isolation (GFID), using simulations based on a SPICE (Simulation Program with Integrated Circuit Emphasis) ground-fault circuit model, experimental ground faults installed on real arrays, and theoretical equations.
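As a point of reference for one of the detection methods compared above, the following Python sketch shows the principle behind residual current detection (RCD): the signed conductor currents of a healthy circuit sum to zero, and a residual above a threshold indicates leakage to ground. The current values and the 1 A threshold are illustrative assumptions, not figures from this test program.

def residual_current(positive_a, negative_a):
    """Net current (A); the return conductor carries a signed (negative) current.
    A nonzero sum means some current is returning through ground instead."""
    return positive_a + negative_a

def rcd_trips(positive_a, negative_a, threshold_a=1.0):
    """Trip when the residual magnitude exceeds the detector threshold."""
    return abs(residual_current(positive_a, negative_a)) > threshold_a

print(rcd_trips(8.0, -8.0))   # False: currents balance, no fault
print(rcd_trips(8.0, -6.5))   # True: 1.5 A leaking through a ground fault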
NASA Astrophysics Data System (ADS)
Bitzer, Klaus
1999-05-01
Geological processes that create sedimentary basins or act during their formation can be simulated using the public domain computer code `BASIN'. For a given set of geological initial and boundary conditions the sedimentary basin evolution is calculated in a forward modeling approach. The basin is represented in a two-dimensional vertical cross section with individual layers. The stratigraphic, tectonic, hydrodynamic and thermal evolution is calculated beginning at an initial state, and subsequent changes of basin geometry are calculated from sedimentation rates, compaction and pore fluid mobilization, isostatic compensation, fault movement and subsidence. The sedimentologic, hydraulic and thermal parameters are stored at discrete time steps, allowing the temporal evolution of the basin to be analyzed. Maximum flexibility in terms of geological conditions is achieved by using individual program modules representing geological processes, which can be switched on and off depending on the data available for a specific simulation experiment. The code incorporates a module for clastic and carbonate sedimentation, taking into account the impact of clastic sediment supply on carbonate production. A maximum of four different sediment types, which may be mixed during sedimentation, can be defined. Compaction and fluid flow are coupled through the consolidation equation and the nonlinear form of the equation of state for porosity, allowing nonequilibrium compaction and overpressuring to be calculated. Instead of empirical porosity-effective stress equations, a physically consistent consolidation model is applied which incorporates a porosity-dependent sediment compressibility. Transient solute transport and heat flow are calculated as well, applying calculated fluid flow rates from the hydraulic model. As a measure for hydrocarbon generation, the Time-Temperature Index (TTI) is calculated. Three postprocessing programs are available to provide graphic output in PostScript format: BASINVIEW is used to display the distribution of parameters in the simulated cross-section of the basin for defined time steps. It is used in conjunction with the Ghostview software, which is freeware and available on most computer systems. AIBASIN provides PostScript output for Adobe Illustrator®, taking advantage of the layer concept, which facilitates further graphic manipulation. BASELINE is used to display parameter distribution at a defined well or to visualize the temporal evolution of individual elements located in the simulated sedimentary basin. The modular structure of the BASIN code allows additional processes to be included. A module to simulate reactive transport and diagenetic reactions is planned for future versions. The program has been applied to existing sedimentary basins, and it has also shown a high potential for classroom instruction, making it possible to create hypothetical basins and to interpret basin evolution in terms of sequence stratigraphy or petroleum potential.
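As an illustration of the maturity measure BASIN reports, the following Python sketch evaluates a Time-Temperature Index using a continuous variant of Lopatin's doubling rule (reaction rate doubles every 10 °C, indexed to the 100-110 °C interval). The burial-temperature history is hypothetical, and this is a common formulation rather than necessarily the exact scheme coded in BASIN.

def tti(history):
    """Time-Temperature Index for a history given as (duration_Ma, temperature_C)
    intervals; the rate factor doubles every 10 degrees C above ~105 C."""
    return sum(dt * 2.0 ** ((temp - 105.0) / 10.0) for dt, temp in history)

history = [(20.0, 60.0), (15.0, 95.0), (10.0, 120.0)]  # Ma spent at each temperature
print(f"TTI = {tti(history):.1f}")  # ~37; a TTI of ~15 is often quoted as oil onset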
Discretized Streams: A Fault-Tolerant Model for Scalable Stream Processing
2012-12-14
Matei Zaharia; Tathagata Das; Haoyuan Li; Timothy Hunter; Scott Shenker; Ion...
...current programming models for distributed stream processing are relatively low-level, often leaving the user to worry about consistency of...
Development of an improved method of consolidating fatigue life data
NASA Technical Reports Server (NTRS)
Leis, B. N.; Sampath, S. G.
1978-01-01
A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.
Fault-Tolerant Control For A Robotic Inspection System
NASA Technical Reports Server (NTRS)
Tso, Kam Sing
1995-01-01
Report describes first phase of continuing program of research on fault-tolerant control subsystem of telerobotic visual-inspection system. Goal of program is to develop robotic system for remotely controlled visual inspection of structures in outer space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almasi, Gheorghe; Blumrich, Matthias Augustin; Chen, Dong
Methods and apparatus perform fault isolation in multiple node computing systems using commutative error detection values (for example, checksums) to identify and isolate faulty nodes. When information associated with a reproducible portion of a computer program is injected into a network by a node, a commutative error detection value is calculated. At intervals, node fault detection apparatus associated with the multiple node computer system retrieves commutative error detection values associated with the node and stores them in memory. When the computer program is executed again by the multiple node computer system, new commutative error detection values are created and stored in memory. The node fault detection apparatus identifies faulty nodes by comparing commutative error detection values associated with reproducible portions of the application program generated by a particular node from different runs of the application program. Differences in values indicate a possible faulty node.
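A minimal Python sketch of the comparison scheme described above follows: each node's emitted messages are reduced to an order-independent (commutative) detection value, and values from two runs of a reproducible program are compared to flag candidate faulty nodes. The node names, message streams, and the choice of a CRC sum are illustrative assumptions, not details of the patented apparatus.

import zlib

def commutative_checksum(messages):
    """Order-independent detection value: sum of per-message CRCs mod 2**32.
    Addition is commutative, so nondeterministic message ordering between runs
    does not change the checksum; only changed content does."""
    return sum(zlib.crc32(m) for m in messages) % (2 ** 32)

def faulty_nodes(run_a, run_b):
    """Compare per-node checksums from two runs; differences flag candidates."""
    return [node for node in run_a
            if commutative_checksum(run_a[node]) != commutative_checksum(run_b[node])]

run1 = {"node0": [b"alpha", b"beta"], "node1": [b"gamma"]}
run2 = {"node0": [b"beta", b"alpha"], "node1": [b"gamma", b"corrupt"]}  # node1 diverges
print(faulty_nodes(run1, run2))  # ['node1']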
Effect of Ethanol Blends and Batching Operations on SCC of Carbon Steel
DOT National Transportation Integrated Search
2011-02-08
This is the draft final report of the project on blending and batching (WP#325) of the Consolidated Program on Development of Guidelines for Safe and Reliable Pipeline Transportation of Ethanol Blends. The other two aspects of the consolidated progra...
24 CFR 91.410 - Housing market analysis.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Housing market analysis. 91.410... Development CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS Consortia; Contents of Consolidated Plan § 91.410 Housing market analysis. Housing market analysis must be described in the...
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, John C.
1987-01-01
Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
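For concreteness, here is a minimal Python sketch of the majority voter at the heart of the N-version scheme described above. The three toy "versions" (one deliberately faulty) stand in for independently developed programs; they are placeholders, not the experiment's actual programs.

from collections import Counter

def majority_vote(outputs):
    """Return the strict-majority output of the versions, or raise if none exists."""
    value, count = Counter(outputs).most_common(1)[0]
    if count * 2 <= len(outputs):
        raise RuntimeError("no majority: possible coincident faults")
    return value

versions = [lambda x: x * x, lambda x: x ** 2, lambda x: x * x + 1]  # third is faulty
print(majority_vote([v(9) for v in versions]))  # 81: the faulty version is outvoted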
Experiments in fault tolerant software reliability
NASA Technical Reports Server (NTRS)
Mcallister, David F.; Vouk, Mladen A.
1989-01-01
Twenty functionally equivalent programs were built and tested in a multiversion software experiment. Following unit testing, all programs were subjected to an extensive system test. In the process sixty-one distinct faults were identified among the versions. Less than 12 percent of the faults exhibited varying degrees of positive correlation. The common-cause (or similar) faults spanned as many as 14 components. However, a majority of these faults were trivial, and easily detected by proper unit and/or system testing. Only two of the seven similar faults were difficult faults, and both were caused by specification ambiguities. One of these faults exhibited variable identical-and-wrong response span, i.e., response span which varied with the testing conditions and input data. Techniques that could have been used to avoid the faults are discussed. For example, it was determined that back-to-back testing of 2-tuples could have been used to eliminate about 90 percent of the faults. In addition, four of the seven similar faults could have been detected by using back-to-back testing of 5-tuples. It is believed that most, if not all, similar faults could have been avoided had the specifications been written using more formal notation, had the unit testing phase been subject to more stringent standards and controls, and had better tools for measuring the quality and adequacy of the test data (e.g., coverage) been used.
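The back-to-back testing of 2-tuples credited above with the potential to eliminate about 90 percent of the faults can be sketched in a few lines of Python: every pair of versions runs on the same inputs and any disagreement is logged as a fault symptom. The toy versions and inputs below are placeholders, not artifacts of the experiment.

from itertools import combinations

def back_to_back(versions, test_inputs):
    """Run all 2-tuples of versions on shared inputs; report disagreements."""
    discrepancies = []
    for x in test_inputs:
        for (i, va), (j, vb) in combinations(enumerate(versions), 2):
            if va(x) != vb(x):
                discrepancies.append((i, j, x))  # versions i and j disagree on x
    return discrepancies

versions = [abs, lambda x: x if x >= 0 else -x, lambda x: x]  # third is faulty
print(back_to_back(versions, [-2, 0, 3]))  # [(0, 2, -2), (1, 2, -2)]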
NASA Astrophysics Data System (ADS)
Páez, G. N.; Permuy Vidal, C.; Galina, M.; López, L.; Jovic, S. M.; Guido, D. M.
2018-04-01
This work explores the textural characteristics, morphology, and facies architecture of well-preserved Paleocene hyaloclastic and peperitic breccias associated with subvolcanic intrusions at the El Guanaco gold mine (Northern Chile). The El Guanaco mine volcanic sequence is part of a polymagmatic compound cone-dome volcanic complex grouping several dacitic domes and maar-diatremes, with subordinate subvolcanic intrusions of basaltic, andesitic, and dacitic compositions. The Soledad-Peñafiel Fault System is a first-order regional structure controlling the location and style of volcanism in the region. Three different intrusive bodies (basaltic sills, dacitic cryptodomes, and andesitic cryptodomes) were found to intrude a wet and poorly consolidated pyroclastic sequence representing the upper portions of a maar-diatreme. Consequently, extensive quench fragmentation and fluidization occurred along their contacts, leading to the formation of widespread breccia bodies enclosing a coherent nucleus. Differences in matrix composition allow two main breccia types to be defined: 1) poorly sorted monomictic breccias (intrusive hyaloclastites) and 2) poorly sorted tuff-matrix breccias (peperites). The observed facies architecture is interpreted as the result of the interplay of several factors, including: 1) magma viscosity, 2) the geometry of the intrusives, and 3) variations in the consolidation degree of the host rocks. Additionally, the overall geometry of each intrusive is interpreted to be controlled by the effective viscosity of the magma along with the magma volume available at the time of intrusion. The presence of three compositionally different subvolcanic bodies with intrusive hyaloclastite and peperite envelopes indicates not only that all these intrusions occurred within a short period of time (probably less than 2-3 Ma), but also that the volcaniclastic pile suffered little or no compaction or consolidation during that time. The presence of three compositionally distinct synvolcanic intrusions can be explained either by a zoned magmatic chamber feeding the volcanic complex or, more likely, by the influence of the Soledad-Peñafiel Fault Zone acting as a preferential pathway for different magma compositions/sources to rise to the surface.
Field aided characterization of a sandstone reservoir: Arroyo Grande Oil Field, California, USA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonellini, M.; Aydin, A.
1995-08-01
The Arroyo Grande Oil Field in Central California has been productive since 1905 from the Mio-Pliocene Edna Member of the Pismo Formation. The Edna Member is a massive, poorly consolidated sandstone unit with an average porosity of 0.2 and a permeability of 1000-5000 md; the producing levels are shallow, 100 to 500 m from the ground surface. Excellent surface exposures of the same formation along road cuts across the field and above the reservoir provide an opportunity to study reservoir rocks at the surface and to relate fracture and permeability distribution obtained from cores to folds and faults observed in outcrops. We mapped in outcrops the major structures of the oil field and determined the statistical distribution and orientation of small faults (deformation bands) that have been observed both in cores and outcrop. The relation between deformation bands and major structures has also been characterized with detailed mapping. By using synthetic logs it is possible to determine the log signature of structural heterogeneities such as deformation bands in sandstone; these faults cause a neutron porosity drop of about 1-4% with respect to the host rock. Image analysis has been used to determine the petrophysical properties of the sandstone in outcrop and in cores; permeability is three orders of magnitude lower in faults than in the host rock, and capillary pressure is 1-2 orders of magnitude larger in faults than in the host rock. Faults with tens of meters of offset are associated with a high density of deformation bands (10 to 250 m⁻¹) and with zones of cement precipitation up to 30 m from the fault. By combining well and field data, we propose a structural model for the oil field in which high-angle reverse faults with localized deformation bands control the distribution of the hydrocarbons on the limb of a syncline, thereby explaining the seemingly unexpected direction of slope of the top surface of the reservoir, which was inferred from well data only.
Seismic influence in the Quaternary uplift of the Central Chile coastal margin, preliminary results.
NASA Astrophysics Data System (ADS)
Valdivia, D.; del Valle, F.; Marquardt, C.; Elgueta, S.
2017-12-01
In order to quantify the influence of NW-striking, potentially seismogenic normal faults on the longitudinal variation of the Central Chile coastal margin uplift, we measured Quaternary marine terraces, which record the tectonic uplift of the coastal margin. Movement on margin-oblique normal faults occurs by co-seismic extension during major subduction earthquakes, as on the Pichilemu fault, which generated a 7.0 Mw earthquake after the 2010 8.8 Mw Maule earthquake. The coastal area between 32° and 34° S was selected due to the presence of a well-preserved sequence of 2 to 5 Quaternary marine terraces. In particular, the margin-oblique, NW-trending, SW-dipping Laguna Verde normal fault, south of Valparaiso (33° S), puts contrasting morphologies in contact: to the south, a flat coast with wide marine terraces is carved in both Jurassic plutonic rocks and Neogene semi-consolidated marine sediments; to the north, a steeper scarp with narrower marine terraces, over 120 m above the corresponding ones on the southern coast, is carved in Jurassic plutonic rocks. We have collected over 6 months of microseismic data, providing information on the seismic activity and subsurface geometry of the Laguna Verde fault. We collected ca. 100 systematic measurements of fringes at the base of paleo coastal scarps through field mapping and a 5 m digital elevation model; these fringes mark the maximum sea level during the carving of each terrace. The heights of these fringes range between 0 and 250 m above sea level. We estimate a 0.7 mm/yr slip rate for the Laguna Verde fault based on the height difference between corresponding terraces to the north and south, with an average uplift rate of 0.3 mm/yr for the whole area. NW-striking normal faults, besides representing a potential seismic threat to the nearby population in one of the most densely populated areas of Chile, heavily control the spatial variation of the coastal margin uplift. At Laguna Verde, the uplift rate northward of the fault differs by more than a factor of three.
Re-Envisioning a DNP Program for Quality and Sustainability.
Killien, Marcia; Thompson, Hilaire; Kieckhefer, Gail; Bekemeier, Betty; Kozuki, Yoriko; Perry, Cynthia K
When the University of Washington, School of Nursing determined that its post-BSN-DNP degree program, with multiple specialty tracks and programs of study, was not sustainable, the curriculum was re-envisioned. The revised program is consistent with the American Association of Colleges of Nursing (AACN) Essentials of Doctoral Education for Advanced Nursing Practice and the national Licensure, Accreditation, Certification, and Education (LACE) model. The re-envisioned program was conceptualized as a single degree in which students preparing for any specialty would have the same number of required credits, with the majority of courses (DNP core) required for all students. Two major pathways, 1) advanced practice registered nursing and 2) advanced systems and population health, were identified. The model allows for specialties to be added or discontinued without major disruption to the core curriculum. The consolidated curriculum reduced instructional costs to the school by approximately 26% and reduced tuition costs for the majority of students while making them more equitable. The revised consolidated program is innovative, maintains quality, attracts students, and aligns with resources. This article discusses how we achieved revision and consolidation of a post-BSN DNP program with multiple specialty tracks that is innovative, high quality, sustainable, and replicable by other schools of nursing. Copyright © 2016 Elsevier Inc. All rights reserved.
Clustering: Working Together for Better Schools.
ERIC Educational Resources Information Center
Nachtigal, Paul; Parker, Sylvia D.
With declining enrollments and budget limitations, it becomes more and more difficult for small rural schools to offer state-approved programs (often based on the "bigger is better" model of education). For many already consolidated districts, further consolidation is not a viable solution to the problem. Cooperative arrangements are…
DOT National Transportation Integrated Search
2010-12-01
As part of a national experiment sponsored by the FHWA under the Innovative Bridge Research and Construction (IBRC) program, CDOT used self-consolidating concrete (SCC) to construct abutments, piers, and retaining walls on a bridge replacement pr...
76 FR 81950 - Privacy Act; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... ``Consolidated Data Repository'' (09-90-1000). This system of records is being amended to include records... Repository'' (SORN 09-90-1000). OIG is adding record sources to the system. This system fulfills our..., and investigations of the Medicare and Medicaid programs. SYSTEM NAME: Consolidated Data Repository...
24 CFR 91.315 - Strategic plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Strategic plan. 91.315 Section 91... CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS State Governments; Contents of Consolidated Plan § 91.315 Strategic plan. (a) General. For the categories described in paragraphs (b), (c), (d...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Busbey, A.B.
Seismic Processing Workshop, a program by Parallel Geosciences of Austin, TX, is discussed in this column. The program is a high-speed, interactive seismic processing and computer analysis system for the Apple Macintosh II family of computers. Also reviewed in this column are three products from Wilkerson Associates of Champaign, IL. SubSide is an interactive program for basin subsidence analysis; MacFault and MacThrustRamp are programs for modeling faults.
Multistate Health Plans: Agents for Competition or Consolidation?
Moffit, Robert E; Meredith, Neil R
2015-01-01
We discuss and evaluate the Multi-State Plan (MSP) Program, a provision of the Affordable Care Act that has not been the subject of much debate as yet. The MSP Program provides the Office of Personnel Management with new authority to negotiate and implement multistate insurance plans on all health insurance exchanges within the United States. We raise the concern that the MSP Program may lead to further consolidation of the health insurance industry despite the program's stated goal of increasing competition by means of health insurance exchanges. The MSP Program arguably gives a competitive advantage to large insurers, which already dominate health insurance markets. We also contend that the MSP Program's failure to produce increased competition may motivate a new effort for a public health insurance option. © The Author(s) 2015.
Application of superalloy powder metallurgy for aircraft engines
NASA Technical Reports Server (NTRS)
Dreshfield, R. L.; Miner, R. V., Jr.
1980-01-01
The results of the Materials for Advanced Turbine Engines (MATE) program initiated by NASA are presented. Mechanical properties comparisons are made for superalloy parts produced by as-HIP powder consolidation and by forging of HIP-consolidated billets. The effects of various defects on the mechanical properties of powder parts are shown.
DOT National Transportation Integrated Search
2012-08-01
The main objective of this study was to determine the effect on bond performance of mild reinforcing steel in self-consolidating concrete (SCC). The SCC test program consisted of comparing the bond performance of normal and high strength SCC with...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
... the Assistant Secretary for Community Planning and Development, HUD. ACTION: Notice of Extension of... requirements for Consolidated Planning for Community Planning and Development (CPD) programs described below... INFORMATION CONTACT: Salvatore Sclafani, Office of Community Planning & Development, telephone (202) 402-4364...
ERIC Educational Resources Information Center
Young, Eileen, Ed.
The 20th annual report describes federally-funded compensatory programs for educationally disadvantaged children (migrants, handicapped, neglected, and delinquent) provided in Ohio through Chapter I of the Education Consolidation and Improvement Act and presents statistics for fiscal 1985, participation trends, instructional impact, expenditure…
The geology of the Oceanographer Transform: The ridge-transform intersection
NASA Astrophysics Data System (ADS)
Karson, J. A.; Fox, P. J.; Sloan, H.; Crane, K. T.; Kidd, W. S. F.; Bonatti, E.; Stroup, J. B.; Fornari, D. J.; Elthon, D.; Hamlyn, P.; Casey, J. F.; Gallo, D. G.; Needham, D.; Sartori, R.
1984-06-01
Seven dives in the submersible ALVIN and four deep-towed (ANGUS) camera lowerings have been made at the eastern ridge-transform intersection of the Oceanographer Transform with the axis of the Mid-Atlantic Ridge. These data constrain our understanding of the processes that create and shape the distinctive morphology that is characteristic of slowly-slipping ridge-transform-ridge plate boundaries. Although the geological relationships observed in the rift valley floor in the study area are similar to those reported for the FAMOUS area, we observe a distinct change in the character of the rift valley floor with increasing proximity to the transform. Over a distance of approximately ten kilometers the volcanic constructional terrain becomes increasingly more disrupted by faulting and degraded by mass wasting. Moreover, proximal to the transform boundary, faults with orientations oblique to the trend of the rift valley are recognized. The morphology of the eastern rift valley wall is characterized by inward-facing scarps that are ridge-axis parallel, but the western rift valley wall, adjacent to the active transform zone, is characterized by a complex fault pattern defined by faults exhibiting a wide range of orientations. However, even for transform parallel faults no evidence for strike-slip displacement is observed throughout the study area and evidence for normal (dip-slip) displacement is ubiquitous. Basalts, semi-consolidated sediments (chalks, debris slide deposits) and serpentinized ultramafic rocks are recovered from localities within or proximal to the rift valley. The axis of accretion-principal transform displacement zone intersection is not clearly established, but appears to be located along the E-W trending, southern flank of the deep nodal basin that defines the intersection of the transform valley with the rift floor.
U.S. National Institutes of Health core consolidation-investing in greater efficiency.
Chang, Michael C; Birken, Steven; Grieder, Franziska; Anderson, James
2015-04-01
The U.S. National Institutes of Health (NIH) invests substantial resources in core research facilities (cores) that support research by providing advanced technologies and scientific and technical expertise as a shared resource. In 2010, the NIH issued an initiative to consolidate multiple core facilities into a single, more efficient core. Twenty-six institutions were awarded supplements to consolidate a number of similar core facilities. Although this approach may not work for all core settings, this effort resulted in consolidated cores that were more efficient and of greater benefit to investigators. The improvements in core operations resulted in both increased services and more core users through installation of advanced instrumentation; access to higher levels of management expertise; integration of information management and data systems; and consolidation of billing, purchasing, scheduling, and tracking services. Cost recovery to support core operations also benefitted from the consolidation effort, in some cases severalfold. In conclusion, this program of core consolidation resulted in improvements in the effective operation of core facilities, benefiting both investigators and their supporting institutions.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Neighborhood Development program (see 24 CFR part 594); (15) The “Lead-Based Paint Hazard Reduction Program... following formula grant programs are covered by the consolidated plan: (1) The Community Development Block...
Code of Federal Regulations, 2012 CFR
2012-04-01
... Neighborhood Development program (see 24 CFR part 594); (15) The “Lead-Based Paint Hazard Reduction Program... following formula grant programs are covered by the consolidated plan: (1) The Community Development Block...
Code of Federal Regulations, 2011 CFR
2011-04-01
... Neighborhood Development program (see 24 CFR part 594); (15) The “Lead-Based Paint Hazard Reduction Program... following formula grant programs are covered by the consolidated plan: (1) The Community Development Block...
Code of Federal Regulations, 2014 CFR
2014-04-01
... Neighborhood Development program (see 24 CFR part 594); (15) The “Lead-Based Paint Hazard Reduction Program... following formula grant programs are covered by the consolidated plan: (1) The Community Development Block...
Using certification trails to achieve software fault tolerance
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Masson, Gerald M.
1993-01-01
A conceptually novel and powerful technique to achieve fault tolerance in hardware and software systems is introduced. When used for software fault tolerance, this new technique uses time and software redundancy and can be outlined as follows. In the initial phase, a program is run to solve a problem and store the result. In addition, this program leaves behind a trail of data called a certification trail. In the second phase, another program is run which solves the original problem again. This program, however, has access to the certification trail left by the first program. Because of the availability of the certification trail, the second phase can be performed by a less complex program and can execute more quickly. In the final phase, the two results are compared; if they agree, they are accepted as correct; otherwise an error is indicated. An essential aspect of this approach is that the second program must always generate either an error indication or a correct output even when the certification trail it receives from the first program is incorrect. The certification trail approach to fault tolerance was formalized and illustrated by applying it to the fundamental problem of finding a minimum spanning tree. Cases in which the second phase can be run concurrently with the first and act as a monitor are discussed. The certification trail approach is compared to other approaches to fault tolerance. Because of space limitations, we have omitted examples of our technique applied to the Huffman tree and convex hull problems; these can be found in the full version of this paper.
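To make the three phases concrete, here is a minimal Python sketch of a certification trail, using sorting in place of the paper's minimum-spanning-tree example: the first phase leaves the sorted permutation as its trail, and the cheaper second phase either reproduces the answer or rejects a corrupt trail. The problem choice and function names are illustrative, not from the paper.

def first_phase(data):
    """Solve the problem (sorting) and leave a trail: the sorting permutation."""
    trail = sorted(range(len(data)), key=lambda i: data[i])  # O(n log n)
    return [data[i] for i in trail], trail

def second_phase(data, trail):
    """Re-solve cheaply (O(n)) using the trail, rejecting an incorrect trail."""
    if sorted(trail) != list(range(len(data))):           # trail must be a permutation
        raise RuntimeError("corrupt certification trail")
    result = [data[i] for i in trail]
    if any(a > b for a, b in zip(result, result[1:])):    # output must be nondecreasing
        raise RuntimeError("corrupt certification trail")
    return result

data = [5, 1, 4, 2]
out1, trail = first_phase(data)
assert out1 == second_phase(data, trail)  # the two results agree: accept the output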
Object-oriented fault tree evaluation program for quantitative analyses
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1988-01-01
Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
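The flavor of the object representation can be suggested with a short Python sketch standing in for the Flavors/LISP implementation: events and gates are objects, and the top-event probability is computed by recursive evaluation under an assumption of independent basic events. The example tree and its probabilities are hypothetical.

class BasicEvent:
    """Leaf of the fault tree holding its own failure probability."""
    def __init__(self, prob):
        self.prob = prob
    def probability(self):
        return self.prob

class AndGate:
    """Output event occurs only if all input events occur (product of probabilities)."""
    def __init__(self, children):
        self.children = children
    def probability(self):
        p = 1.0
        for c in self.children:
            p *= c.probability()
        return p

class OrGate:
    """Output event occurs if at least one input event occurs."""
    def __init__(self, children):
        self.children = children
    def probability(self):
        q = 1.0
        for c in self.children:
            q *= 1.0 - c.probability()
        return 1.0 - q

top = OrGate([AndGate([BasicEvent(0.01), BasicEvent(0.02)]), BasicEvent(0.001)])
print(top.probability())  # 0.0011998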
Borgia, A.; Delaney, P.T.; Denlinger, R.P.
2000-01-01
As volcanoes grow, they become ever heavier. Unlike mountains exhumed by erosion of rocks that generally were lithified at depth, volcanoes typically are built of poorly consolidated rocks that may be further weakened by hydrothermal alteration. The substrates upon which volcanoes rest, moreover, are often sediments lithified by no more than the weight of the volcanic overburden. It is not surprising, therefore, that volcanic deformation includes, and in the long term is often dominated by, spreading motions that translate subsidence near volcanic summits to outward horizontal displacements around the flanks and peripheries. We review examples of volcanic spreading and go on to derive approximate expressions for the time volcanoes require to deform by spreading on weak substrates. We also demonstrate that shear stresses that drive low-angle thrust faulting from beneath volcanic constructs have maxima at volcanic peripheries, just where such faults are seen to emerge. Finally, we establish a theoretical basis for experimentally derived scalings that delineate volcanoes that spread from those that do not.
System and method of detecting cavitation in pumps
Lu, Bin; Sharma, Santosh Kumar; Yan, Ting; Dimino, Steven A.
2017-10-03
A system and method for detecting cavitation in pumps for fixed and variable supply frequency applications is disclosed. The system includes a controller having a processor programmed to repeatedly receive real-time operating current data from a motor driving a pump, generate a current frequency spectrum from the current data, and analyze current data within a pair of signature frequency bands of the current frequency spectrum. The processor is further programmed to repeatedly determine fault signatures as a function of the current data within the pair of signature frequency bands, repeatedly determine fault indices based on the fault signatures and a dynamic reference signature, compare the fault indices to a reference index, and identify a cavitation condition in a pump based on a comparison between the reference index and a current fault index.
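The following Python sketch illustrates the spectral logic the patent describes: transform the motor current, sum the energy in a pair of signature frequency bands, and compare the resulting fault index against a reference. The band edges, sampling rate, injected sideband, and trip ratio are all illustrative assumptions, not values from the patent.

import numpy as np

def band_energy(signal, fs, f_lo, f_hi):
    """Energy of the current spectrum between f_lo and f_hi (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[(freqs >= f_lo) & (freqs <= f_hi)].sum()

def cavitation_index(current, fs, bands=((55, 58), (62, 65))):
    """Fault signature: summed energy within the pair of signature bands."""
    return sum(band_energy(current, fs, lo, hi) for lo, hi in bands)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
healthy = np.sin(2 * np.pi * 60 * t)                 # clean 60 Hz supply current
faulty = healthy + 0.2 * np.sin(2 * np.pi * 57 * t)  # cavitation-like sideband
reference = cavitation_index(healthy, fs)
threshold = 3.0 * reference + 1e-6                   # guard against a zero reference
print(cavitation_index(faulty, fs) > threshold)      # True: flag cavitation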
Reliability computation using fault tree analysis
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
NASA Astrophysics Data System (ADS)
Peterson, D. E.; Keranen, K. M.
2017-12-01
Differences in fluid pressure and mechanical properties at megathrust boundaries in subduction zones have been proposed to create varying seismogenic behavior. In Cascadia, where large ruptures are possible but little seismicity occurs presently, new seismic transects across the deformation front (COAST cruise; Holbrook et al., 2012) image an unusually high-wavespeed sedimentary unit directly overlying oceanic crust. Wavespeed increases before sediments reach the deformation front, and the well-laminated unit, consistently of 1 km thickness, can be traced for 50 km beneath the accretionary prism before imaging quality declines. Wavespeed is modeled via iterative prestack time migration (PSTM) imaging and increases from 3.5 km/s on the seaward end of the profile to >5.0 km/s near the deformation front. Landward of the deformation front, wavespeed is low along seaward-dipping thrust faults in the Quaternary accretionary prism, indicative of rapid dewatering along faults. The observed wavespeed of 5.5 km/s just above subducting crust is consistent with porosity <5% (Erickson and Jarrard, 1998), possibly reflecting enhanced consolidation, cementation, and diagenesis as the sediments encounter the deformation front. Beneath the sediment, the compressional wavespeed of uppermost oceanic crust is 3-4 km/s, likely reduced by alteration and/or fluids, lowest within a propagator wake. The propagator wake intersects the plate boundary at an oblique angle and changes the degree of hydration of the oceanic plate as it subducts within our area. Fluid flow out of oceanic crust is likely impeded by the low-porosity basal sediment package except along the focused thrust faults. Decollements are present at the top of oceanic basement, at the top of the high-wavespeed basal unit, and within sedimentary strata at higher levels; the decollement at the top of oceanic crust is active at the toe of the deformation front. The basal sedimentary unit appears to be mechanically strong, similar to observations from offshore Sumatra, where strongly consolidated sediments at the deformation front are interpreted to facilitate megathrust rupture to the trench (Hupers et al., 2017). A uniformly strong plate interface at Cascadia may inhibit microseismicity while building stress that is released in great earthquakes.
NASA Technical Reports Server (NTRS)
Brunelle, J. E.; Eckhardt, D. E., Jr.
1985-01-01
Results are presented of an experiment conducted in the NASA Avionics Integrated Research Laboratory (AIRLAB) to investigate the implementation of fault-tolerant software techniques on fault-tolerant computer architectures, in particular the Software Implemented Fault Tolerance (SIFT) computer. The N-version programming and recovery block techniques were implemented on a portion of the SIFT operating system. The results indicate that, to effectively implement fault-tolerant software design techniques, system requirements will be impacted and suggest that retrofitting fault-tolerant software on existing designs will be inefficient and may require system modification.
FTC - THE FAULT-TREE COMPILER (SUN VERSION)
NASA Technical Reports Server (NTRS)
Butler, R. W.
1994-01-01
FTC, the Fault-Tree Compiler program, is a tool used to calculate the top-event probability for a fault-tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault tree definition feature which simplifies the tree-description process and reduces execution time. A rigorous error bound is derived for the solution technique. This bound enables the program to supply an answer precisely (within the limits of double precision floating point arithmetic) at a user-specified number of digits accuracy. The program also facilitates sensitivity analysis with respect to any specified parameter of the fault tree such as a component failure rate or a specific event probability by allowing the user to vary one failure rate or the failure probability over a range of values and plot the results. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. FTC was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The program is written in PASCAL, ANSI compliant C-language, and FORTRAN 77. The TEMPLATE graphics library is required to obtain graphical output. The standard distribution medium for the VMS version of FTC (LAR-14586) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of FTC (LAR-14922) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. FTC was developed in 1989 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. SunOS is a trademark of Sun Microsystems, Inc.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable, fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault/error-handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also supported.
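As a simple illustration of Monte Carlo reliability estimation with non-constant (Weibull) failure rates, the Python sketch below estimates the unreliability of a hypothetical 2-of-3 system. It shows only the plain sampling idea, without MC-HARP's variance reduction or fault/error-handling models; the system structure, Weibull parameters, and mission time are assumptions.

import random

def weibull_failure_time(scale, shape):
    """Sample a component failure time from a Weibull distribution."""
    return random.weibullvariate(scale, shape)

def system_survives(mission_time, n_components=3, needed=2,
                    scale=5000.0, shape=1.5):
    """2-of-3 system: survives if at least `needed` components outlive the mission."""
    alive = sum(weibull_failure_time(scale, shape) > mission_time
                for _ in range(n_components))
    return alive >= needed

trials = 100_000
unreliability = sum(not system_survives(1000.0) for _ in range(trials)) / trials
print(f"estimated unreliability at t = 1000 h: {unreliability:.4f}")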
The Cost of the Consolidation Option for Student Loans. A CBO Paper
ERIC Educational Resources Information Center
Weinberg, Steven; Moore, Damien
2006-01-01
The federal government's student loan programs for higher education convey substantial financial benefits to borrowers because of their broad availability and favorable terms. Of the various provisions included in a federal student loan contract, the option to consolidate individual loans contributes greatly to a borrower's benefits and the cost…
13 CFR 108.470 - SBA approval of merger, consolidation, or reorganization of NMVC Company.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false SBA approval of merger, consolidation, or reorganization of NMVC Company. 108.470 Section 108.470 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW MARKETS VENTURE CAPITAL ("NMVC") PROGRAM Changes in Ownership, Structure, or...
24 CFR 982.355 - Portability: Administration by receiving PHA.
Code of Federal Regulations, 2011 CFR
2011-04-01
... by the receiving PHA. (1) If funding is available under the consolidated ACC for the receiving PHA... consolidated ACC for the receiving PHA tenant-based program. (2) HUD may require that the receiving PHA absorb... families to the receiving PHA from funds available under the initial PHA ACC. (2) HUD may provide...
24 CFR 982.355 - Portability: Administration by receiving PHA.
Code of Federal Regulations, 2010 CFR
2010-04-01
... by the receiving PHA. (1) If funding is available under the consolidated ACC for the receiving PHA... consolidated ACC for the receiving PHA tenant-based program. (2) HUD may require that the receiving PHA absorb... families to the receiving PHA from funds available under the initial PHA ACC. (2) HUD may provide...
ERIC Educational Resources Information Center
Ohio State Dept. of Education, Columbus.
This report summarizes recent compensatory education program activities in Ohio, which were funded through Chapter 1 of the Education Consolidation and Improvement Act. It presents and discusses statistics for the 1982-83 school year, participation trends, instructional impact, expenditure and staffing patterns, inservice training, parent…
FoxO6 regulates memory consolidation and synaptic function
Salih, Dervis A.M.; Rashid, Asim J.; Colas, Damien; de la Torre-Ubieta, Luis; Zhu, Ruo P.; Morgan, Alexander A.; Santo, Evan E.; Ucar, Duygu; Devarajan, Keerthana; Cole, Christina J.; Madison, Daniel V.; Shamloo, Mehrdad; Butte, Atul J.; Bonni, Azad; Josselyn, Sheena A.; Brunet, Anne
2012-01-01
The FoxO family of transcription factors is known to slow aging downstream from the insulin/IGF (insulin-like growth factor) signaling pathway. The most recently discovered FoxO isoform in mammals, FoxO6, is highly enriched in the adult hippocampus. However, the importance of FoxO factors in cognition is largely unknown. Here we generated mice lacking FoxO6 and found that these mice display normal learning but impaired memory consolidation in contextual fear conditioning and novel object recognition. Using stereotactic injection of viruses into the hippocampus of adult wild-type mice, we found that FoxO6 activity in the adult hippocampus is required for memory consolidation. Genome-wide approaches revealed that FoxO6 regulates a program of genes involved in synaptic function upon learning in the hippocampus. Consistently, FoxO6 deficiency results in decreased dendritic spine density in hippocampal neurons in vitro and in vivo. Thus, FoxO6 may promote memory consolidation by regulating a program coordinating neuronal connectivity in the hippocampus, which could have important implications for physiological and pathological age-dependent decline in memory. PMID:23222102
Detection of faults and software reliability analysis
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
Multiversion or N-version programming was proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. Specific topics addressed are: failure probabilities in N-version systems, consistent comparison in N-version systems, descriptions of the faults found in the Knight and Leveson experiment, analytic models of comparison testing, characteristics of the input regions that trigger faults, fault tolerance through data diversity, and the relationship between failures caused by automatically seeded faults.
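As a toy sketch of the N-version idea discussed above (the three 'versions' here are trivially diverse; the experiment concerns independently developed implementations, where a common specification misunderstanding can defeat the vote):

```python
from collections import Counter

def n_version_execute(versions, x):
    # run every version on the same input and take the majority answer
    results = [v(x) for v in versions]
    answer, count = Counter(results).most_common(1)[0]
    if count > len(versions) // 2:
        return answer
    raise RuntimeError("no majority: versions disagree")

v1 = lambda x: x * x
v2 = lambda x: x ** 2
v3 = lambda x: pow(x, 2)
print(n_version_execute([v1, v2, v3], 7))   # -> 49
```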
Coordinated Fault Tolerance for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack; Bosilca, George; et al.
2013-04-08
Our work to meet our goal of end-to-end fault tolerance has focused on two areas: (1) improving fault tolerance in various software currently available and widely used throughout the HEC domain and (2) using fault information exchange and coordination to achieve holistic, systemwide fault tolerance and understanding how to design and implement interfaces for integrating fault tolerance features for multiple layers of the software stack—from the application, math libraries, and programming language runtime to other common system software such as jobs schedulers, resource managers, and monitoring tools.
Implanted component faults and their effects on gas turbine engine performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacLeod, J.D.; Taylor, V.; Laflamme, J.C.G.
Under the sponsorship of the Canadian Department of National Defence, the Engine Laboratory of the National Research Council of Canada (NRCC) has established a program for the evaluation of the effects of component deterioration on gas turbine engine performance. The effort is aimed at investigating the effects of typical in-service faults on the performance characteristics of each individual engine component. The objective of the program is the development of a generalized fault library, which will be used with fault identification techniques in the field to reduce unscheduled maintenance. To evaluate the effects of implanted faults on the performance of a single-spool engine, such as an Allison T56 turboprop engine, a series of faulted parts was installed. For this paper the following faults were analyzed: (a) first-stage turbine nozzle erosion damage; (b) first-stage turbine rotor blade untwist; (c) compressor seal wear; (d) first- and second-stage compressor blade tip clearance increase. This paper describes the project objectives, the experimental installation, and the results of the fault implantation on engine performance. Performance variations in both engine and component characteristics are discussed. As the performance changes were significant, a rigorous measurement uncertainty analysis is included.
Validation environment for AIPS/ALS: Implementation and results
NASA Technical Reports Server (NTRS)
Segall, Zary; Siewiorek, Daniel; Caplan, Eddie; Chung, Alan; Czeck, Edward; Vrsalovic, Dalibor
1990-01-01
This report presents the work performed in porting the Fault Injection-based Automated Testing (FIAT) and Programming and Instrumentation Environments (PIE) validation tools to the Advanced Information Processing System (AIPS) in the context of the Ada Language System (ALS) application, as well as an initial fault-free validation of the available AIPS system. The PIE components implemented on AIPS provide the monitoring mechanisms required for validation. These mechanisms represent a substantial portion of the FIAT system and are required for the implementation of the FIAT environment on AIPS. Using these components, an initial fault-free validation of the AIPS system was performed. The implementation of the FIAT/PIE system, configured for fault-free validation of the AIPS fault-tolerant computer system, is described. The PIE components were modified to support the Ada language. A special-purpose AIPS/Ada runtime monitoring and data collection capability was implemented. A number of initial Ada programs running on the PIE/AIPS system were implemented. The instrumentation of the Ada programs was accomplished automatically inside the PIE programming environment. PIE's on-line graphical views show vividly and accurately the performance characteristics of Ada programs, the AIPS kernel, and the application's interaction with the AIPS kernel. The data collection mechanisms were written in a high-level language, Ada, and provide a high degree of flexibility for implementation under various system conditions.
Distributing Earthquakes Among California's Faults: A Binary Integer Programming Approach
NASA Astrophysics Data System (ADS)
Geist, E. L.; Parsons, T.
2016-12-01
The statement of the problem is simple: given regional seismicity specified by a Gutenberg-Richter (G-R) relation, how are earthquakes distributed to match observed fault-slip rates? The objective is to determine the magnitude-frequency relation on individual faults. The California statewide G-R b-value and a-value are estimated from historical seismicity, with the a-value accounting for off-fault seismicity. UCERF3 consensus slip rates are used, based on geologic and geodetic data, and include estimates of coupling coefficients. The binary integer programming (BIP) problem is set up such that each earthquake from a synthetic catalog spanning millennia can occur at any location along any fault. The decision vector, therefore, consists of binary variables, with values equal to one indicating the location of each earthquake that results in an optimal match of slip rates, in an L1-norm sense. Rupture area and slip associated with each earthquake are determined from a magnitude-area scaling relation. Uncertainty bounds on the UCERF3 slip rates provide explicit minimum and maximum constraints to the BIP model, with the former more important to feasibility of the problem. There is a maximum magnitude limit associated with each fault, based on fault length, providing an implicit constraint. Solution of integer programming problems with a large number of variables (>10^5 in this study) has been possible only since the late 1990s. In addition to the classic branch-and-bound technique used for these problems, several other algorithms have been developed recently, including pre-solving, sifting, cutting planes, heuristics, and parallelization. An optimal solution is obtained using a state-of-the-art BIP solver for M≥6 earthquakes and California's faults with slip rates > 1 mm/yr. Preliminary results indicate a surprising diversity of on-fault magnitude-frequency relations throughout the state.
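As a toy version of such a BIP formulation (fault names, target rates, and the four-event 'catalog' below are invented, and the open-source PuLP/CBC solver stands in for the state-of-the-art solver used in the study):

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

faults = {"A": 5.0, "B": 1.5}                    # target slip rates, mm/yr
quakes = {1: 120.0, 2: 80.0, 3: 300.0, 4: 50.0}  # slip per event, mm
T = 100.0                                         # catalog span, yr

prob = LpProblem("distribute_quakes", LpMinimize)
x = {(e, f): LpVariable(f"x_{e}_{f}", cat="Binary") for e in quakes for f in faults}
d = {f: LpVariable(f"d_{f}", lowBound=0) for f in faults}  # L1 slip misfit, mm

prob += lpSum(d.values())                 # minimize total |slip - target*T|
for e in quakes:                          # each event occurs on exactly one fault
    prob += lpSum(x[e, f] for f in faults) == 1
for f, rate in faults.items():            # linearized absolute-value constraints
    total = lpSum(x[e, f] * quakes[e] for e in quakes)
    prob += total - rate * T <= d[f]
    prob += rate * T - total <= d[f]

prob.solve()
for (e, f), var in x.items():
    if var.value() == 1:
        print(f"event {e} -> fault {f}")
```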
Mechanical alloying, characterization and consolidation of Ti-Al-Ni alloys
NASA Technical Reports Server (NTRS)
Nash, P.; Higgins, G. T.; Dillinger, N.; Hwang, S. J.; Kim, H.
1989-01-01
Mechanical alloying is being investigated as a processing route for the production of aluminide intermetallics. This program involves powder production and characterization, consolidation and thermal treatments, and determination of microstructure-property relationships. An attritor mill is being used to produce powder in lots of up to 1000 grams, and the processing parameters are being systematically varied to establish the optimum milling conditions. The mill is being instrumented to generate processing data to provide a basis for theoretical modeling. Powder is being characterized using thermal analysis, optical and electron microscopy, and X-ray diffraction. Particle size distributions and powder density are being determined. Consolidation of the powder is being approached in several different ways, including cold isostatic pressing, sintering, extrusion, and hot pressing. The results of the program so far will be presented and future directions discussed.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Charter. (b) Consolidates in a single document provisions of Secretary of Defense Multiple Addressee... EEO Program, the establishment of Special Emphasis Programs (SEPs) entitled the Federal Women's...
Semi-Markov adjunction to the Computer-Aided Markov Evaluator (CAME)
NASA Technical Reports Server (NTRS)
Rosch, Gene; Hutchins, Monica A.; Leong, Frank J.; Babcock, Philip S., IV
1988-01-01
The rule-based Computer-Aided Markov Evaluator (CAME) program was expanded in its ability to incorporate the effect of fault-handling processes into the construction of a reliability model. The fault-handling processes are modeled as semi-Markov events, and CAME constructs an appropriate semi-Markov model. To solve the model, the program outputs it in a form which can be directly solved with the Semi-Markov Unreliability Range Evaluator (SURE) program. As a means of evaluating the alterations made to the CAME program, the program is used to model the reliability of portions of the Integrated Airframe/Propulsion Control System Architecture (IAPSA 2) reference configuration. The reliability predictions are compared with a previous analysis. The results bear out the feasibility of utilizing CAME to generate appropriate semi-Markov models of fault-handling processes.
[The Application of the Fault Tree Analysis Method in Medical Equipment Maintenance].
Liu, Hongbin
2015-11-01
In this paper, the traditional fault tree analysis method is presented, and detailed instructions are given for its application in medical instrument maintenance. Significant changes are made when the traditional fault tree analysis method is introduced into medical instrument maintenance: the logic symbols, logic analysis, and calculations are given up, along with the method's complicated procedures, and only the intuitive and practical fault tree diagram is kept. The fault tree diagram itself also differs: the fault tree is no longer a logical tree but a thinking tree for troubleshooting, the definition of the fault tree's nodes is different, and the composition of the fault tree's branches is also different.
Usage of Fault Detection Isolation & Recovery (FDIR) in Constellation (CxP) Launch Operations
NASA Technical Reports Server (NTRS)
Ferrell, Rob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Spirkovska, Lilly; Hall, David; Brown, Barbara
2010-01-01
This paper will explore the usage of Fault Detection Isolation & Recovery (FDIR) in the Constellation Exploration Program (CxP), in particular Launch Operations at Kennedy Space Center (KSC). NASA's Exploration Technology Development Program (ETDP) is currently funding a project that is developing a prototype FDIR to demonstrate the feasibility of incorporating FDIR into the CxP Ground Operations Launch Control System (LCS). An architecture that supports multiple FDIR tools has been formulated to support integration into the LCS. In addition, tools have been selected that provide fault detection, fault isolation, and anomaly detection, along with integration between Flight and Ground elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-12-09
This report summarizes the authors' review and evaluation of the existing seismic hazards program at Los Alamos National Laboratory (LANL). The report recommends that the original program be augmented with a probabilistic analysis of seismic hazards involving assignment of weighted probabilities of occurrence to all potential sources. This approach yields a more realistic evaluation of the likelihood of large earthquake occurrence, particularly in regions where seismic sources may have recurrence intervals of several thousand years or more. The report reviews the locations and geomorphic expressions of identified fault lines, along with the known displacements of these faults and the last known occurrence of seismic activity. Faults are mapped and categorized by their potential for actual movement. Based on geologic site characterization, recommendations are made for increased seismic monitoring; age-dating studies of faults and geomorphic features; increased use of remote sensing and aerial photography for surface mapping of faults; the development of a landslide susceptibility map; and the development of seismic design standards for all existing and proposed facilities at LANL.
Evaluating and extending user-level fault tolerance in MPI applications
Laguna, Ignacio; Richards, David F.; Gamblin, Todd; ...
2016-01-11
The user-level failure mitigation (ULFM) interface has been proposed to provide fault-tolerant semantics in the Message Passing Interface (MPI). Previous work presented performance evaluations of ULFM; yet questions related to its programmability and applicability, especially to non-trivial, bulk synchronous applications, remain unanswered. In this article, we present our experiences using ULFM in a case study with a large, highly scalable, bulk synchronous molecular dynamics application to shed light on the advantages and difficulties of this interface for programming fault-tolerant MPI applications. We found that, although ULFM is suitable for master-worker applications, it provides few benefits for more common bulk synchronous MPI applications. Furthermore, to address these limitations, we introduce a new, simpler fault-tolerant interface for complex, bulk synchronous MPI programs with better applicability and support than ULFM for application-level recovery mechanisms, such as global rollback.
Algorithm-Based Fault Tolerance for Numerical Subroutines
NASA Technical Reports Server (NTRS)
Tumon, Michael; Granat, Robert; Lou, John
2007-01-01
A software library implements a new methodology of detecting faults in numerical subroutines, thus enabling application programs that contain the subroutines to recover transparently from single-event upsets. The software library in question is fault-detecting middleware that is wrapped around the numerical subroutines. Conventional serial versions (based on LAPACK and FFTW) and a parallel version (based on ScaLAPACK) exist. The source code of the application program that contains the numerical subroutines is not modified, and the middleware is transparent to the user. The methodology used is a type of algorithm-based fault tolerance (ABFT). In ABFT, a checksum is computed before a computation and compared with the checksum of the computational result; an error is declared if the difference between the checksums exceeds some threshold. Novel normalization methods are used in the checksum comparison to ensure correct fault detections independent of algorithm inputs. In tests of this software reported in the peer-reviewed literature, this library was shown to enable detection of 99.9 percent of significant faults while generating no false alarms.
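A minimal sketch of the ABFT checksum test (not this library's actual middleware interface), for a matrix multiply with a norm-normalized comparison so the threshold is independent of input scale:

```python
import numpy as np

def abft_check(A, B, C, rel_tol=1e-10):
    """Verify C = A @ B by computing one checksum two ways."""
    w = np.ones(B.shape[1])              # checksum weight vector
    lhs = A @ (B @ w)                    # checksum taken through the inputs
    rhs = C @ w                          # checksum of the stored result
    err = np.linalg.norm(lhs - rhs) / max(np.linalg.norm(lhs), 1e-300)
    return err <= rel_tol                # normalized, scale-independent test

rng = np.random.default_rng(0)
A, B = rng.random((50, 40)), rng.random((40, 30))
C = A @ B
assert abft_check(A, B, C)               # clean result passes
C[3, 7] += 1.0                           # simulated single-event upset
assert not abft_check(A, B, C)           # corrupted result is detected
```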
Consolidated fuel reprocessing program
NASA Astrophysics Data System (ADS)
1985-04-01
A survey of applications of electrochemical methods in fuel reprocessing was completed. A dummy fuel assembly shroud was cut using the remotely operated laser disassembly equipment. Operations and engineering efforts have continued to correct equipment operating, software, and procedural problems experienced during the previous uranium campaigns. Fuel cycle options were examined for the liquid metal reactor fuel cycle. In high-temperature gas-cooled reactor spent fuel studies, preconceptual designs were completed for the concrete storage cask and the open-field drywell storage concept. These and other tasks operating under the consolidated fuel reprocessing program are examined.
Subsidence Induced Faulting Hazard risk maps in Mexico City and Morelia, central Mexico
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Solano-Rojas, D.; Hernández-Espriu, J.; Cigna, F.; Wdowinski, S.; Osmanoglu, B.; Falorni, G.; Bohane, A.; Colombo, D.
2012-12-01
Subsidence and surface faulting have affected urban areas in Central Mexico for decades, and the process has intensified as a consequence of urban sprawl and economic growth. This process causes substantial damage to urban infrastructure and housing, and in several cities it is becoming a major factor to be considered when planning urban development, land-use zoning, and hazard mitigation strategies in the coming decades. Subsidence is usually associated with aggressive groundwater extraction rates and a general decrease of the aquifer static level that promotes soil consolidation, deformation and, ultimately, surface faulting. However, local stratigraphic and structural conditions also play an important role in the development and extension of faults. Despite its potential for damaging housing and other urban infrastructure, the economic impact of this phenomenon is poorly known, in part because detailed, city-wide subsidence-induced faulting risk maps have not been published before. Nevertheless, modern remote sensing techniques are well suited for this task. We present the results of a risk analysis for subsidence-induced surface faulting in two cities in central Mexico: Morelia and Mexico City. Our analysis in Mexico City and Morelia is based on a risk matrix using the horizontal subsidence gradient from a Persistent Scatterer InSAR (Morelia) and SqueeSAR (Mexico City) analysis and 2010 census population distribution data from Mexico's National Institute of Statistics and Geography. Subsidence-induced surface-faulting vulnerability within these urbanized areas is best determined using both the magnitude and the horizontal gradient of subsidence. Our Morelia analysis (597,000 inhabitants, with localized subsidence rates up to 80 mm/yr) shows that 7% of the urbanized area is under a high to very high risk level, and 14% of its population (11.7% and 2.3%, respectively) lives within these areas. In the case of Mexico City (15,490,000 inhabitants for the Mexico City metropolitan area included within our map, and up to 370 mm/yr subsidence rate), our risk map shows that 13.5% of the urbanized area is under a high to very high risk level, and 26.2% of its population (22.1% and 4.4%, respectively) lives within these areas.
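A hedged sketch of how such a risk classification can be derived from a subsidence raster; the stand-in data, cell size, and gradient breakpoints below are invented, not those of the study:

```python
import numpy as np

subsidence = np.random.rand(100, 100) * 80.0    # mm/yr, stand-in for InSAR output
cell = 30.0                                      # raster cell size, m

gy, gx = np.gradient(subsidence, cell)           # horizontal subsidence gradient
gradient = np.hypot(gx, gy)                      # (mm/yr)/m

# hypothetical breakpoints separating low / moderate / high / very high
risk = np.digitize(gradient, bins=[0.5, 1.0, 2.0])
for level, name in enumerate(["low", "moderate", "high", "very high"]):
    print(f"{name:>9}: {(risk == level).mean() * 100:5.1f}% of cells")
```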
Code of Federal Regulations, 2013 CFR
2013-01-01
... Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SENIOR FARMERS' MARKET NUTRITION PROGRAM (SFMNP) General § 249.3 Administration...' Market Nutrition Program (FMNP), one consolidated State Plan may be submitted for both programs, in...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SENIOR FARMERS' MARKET NUTRITION PROGRAM (SFMNP) General § 249.3 Administration...' Market Nutrition Program (FMNP), one consolidated State Plan may be submitted for both programs, in...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SENIOR FARMERS' MARKET NUTRITION PROGRAM (SFMNP) General § 249.3 Administration...' Market Nutrition Program (FMNP), one consolidated State Plan may be submitted for both programs, in...
Code of Federal Regulations, 2012 CFR
2012-01-01
... Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SENIOR FARMERS' MARKET NUTRITION PROGRAM (SFMNP) General § 249.3 Administration...' Market Nutrition Program (FMNP), one consolidated State Plan may be submitted for both programs, in...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SENIOR FARMERS' MARKET NUTRITION PROGRAM (SFMNP) General § 249.3 Administration...' Market Nutrition Program (FMNP), one consolidated State Plan may be submitted for both programs, in...
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy-lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large-scale program. Common logistics concerns for SLS include the integration of geographically separated discrete programs, multiple prime contractors with distinct and different goals, schedule pressures, and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new and heritage hardware, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness, and cost. This paper will address the analyses and decisions the NASA logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.
Lithogeochemical character of near-surface bedrock in the New England coastal basins
Robinson, Gilpin R.; Ayotte, Joseph D.; Montgomery, Denise L.; DeSimone, Leslie A.
2002-01-01
This geographic information system (GIS) data layer shows the generalized lithologic and geochemical, termed lithogeochemical, character of near-surface bedrock in the New England Coastal Basin (NECB) study area of the U.S. Geological Survey's National Water Quality Assessment (NAWQA) Program. The area encompasses 23,000 square miles in western and central Maine, eastern Massachusetts, most of Rhode Island, eastern New Hampshire and a small part of eastern Connecticut. The NECB study area includes the Kennebec, Androscoggin, Saco, Merrimack, Charles, and Blackstone River Basins, as well as all of Cape Cod. Bedrock units in the NECB study area are classified into lithogeochemical units based on the relative reactivity of their constituent minerals to dissolution and the presence of carbonate or sulfide minerals. The 38 lithogeochemical units are generalized into 7 major groups: (1) carbonate-bearing metasedimentary rocks; (2) primarily noncalcareous, clastic sedimentary rocks with restricted deposition in discrete fault-bounded sedimentary basins of Mississippian or younger age; (3) primarily noncalcareous, clastic sedimentary rocks at or above biotite-grade of regional metamorphism; (4) mafic igneous rocks and their metamorphic equivalents; (5) ultramafic rocks; (6) felsic igneous rocks and their metamorphic equivalents; and (7) unconsolidated and poorly consolidated sediments.
Acoustic and mechanical properties of Nankai accretionary prism core samples
NASA Astrophysics Data System (ADS)
Raimbourg, Hugues; Hamano, Yozo; Saito, Saneatsu; Kinoshita, Masataka; Kopf, Achim
2011-04-01
We studied undeformed sediment and accreted strata recently recovered by Ocean Drilling Program/Integrated Ocean Drilling Program (ODP/IODP) drilling in the Nankai Trough convergent margin to unravel the changes in physical properties from initial deposition to incipient deformation. We derived acoustic (Vp) and mechanical (uniaxial poroelastic compliance, compaction amplitude) properties of samples from various drill sites along the Muroto (ODP 1173) and Kii transects (IODP C0001, C0002, C0006, and C0007) from isotropic loading tests in which confining and pore pressure were independently applied. We quantified the dependence of Vp on both effective (Peff) and confining (Pc) pressure, which can be used to correct atmospheric-pressure measurements of Vp. Experimental Vp values obtained on core samples and extrapolated to in situ conditions are slightly higher than logging-derived velocities, which can be attributed either to velocity dispersion or to the effect of large-scale faults and weak zones on waves with longer wavelength. In the high-porosity (30%-60%) tested sediments, velocities are controlled at first order by porosity and not by lithology, which is in agreement with our static measurements of drained framework incompressibility, much smaller than fluid incompressibility. Rather than framework incompressibility, shear modulus is probably the second-order control on Vp, accounting for most of the difference between actual Vp and the prediction of Wood's (1941) suspension model. We also quantified the mechanical state of Nankai samples in terms of anisotropy, diagenesis, and consolidation. Both acoustic and mechanical parameters reveal similar values in vertical and horizontal directions, attesting to the very low anisotropy of the tested material. When considering the porous samples of the Upper Shikoku Basin sediments (Site 1173) as examples of diagenetically cemented material, several mechanical and acoustic attributes appeared to be reliable experimental indicators of the presence of intergrain cementation. We also detected incipient cementation in samples from IODP Site C0001 (accretionary prism unit). In terms of consolidation, we distinguished two classes of material response (shallow, deformable samples and deep, hardly deformable ones) based on the amount of compaction upon application of a Peff large with respect to the inferred in situ value, with a transition that might be related to a critical porosity.
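For reference, Wood's (1941) suspension model invoked above treats the sediment as a grain-fluid suspension with zero frame shear modulus, so the predicted P-wave speed follows from the Reuss average of the constituent bulk moduli:

```latex
% Wood's suspension model: porosity \phi, fluid (f) and solid (s) constituents
\frac{1}{K_{\mathrm{Wood}}} = \frac{\phi}{K_{f}} + \frac{1-\phi}{K_{s}},
\qquad
\rho = \phi\,\rho_{f} + (1-\phi)\,\rho_{s},
\qquad
V_{p} = \sqrt{\frac{K_{\mathrm{Wood}}}{\rho}}
```

A nonzero shear modulus raises Vp above this prediction, which is the sense in which the abstract identifies shear modulus as the second-order control.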
28 CFR 30.12 - How may a state simplify, consolidate, or substitute federally required state plans?
Code of Federal Regulations, 2010 CFR
2010-07-01
... substitute federally required state plans? 30.12 Section 30.12 Judicial Administration DEPARTMENT OF JUSTICE INTERGOVERNMENTAL REVIEW OF DEPARTMENT OF JUSTICE PROGRAMS AND ACTIVITIES § 30.12 How may a state simplify... with law, a state may decide to try to simplify, consolidate, or substitute federally required state...
Ribbon-cutting officially opens Consolidated Support Operations Center at CCAS
NASA Technical Reports Server (NTRS)
1999-01-01
Cutting the ribbon at a ceremony for the opening of the Consolidated Support Operations Center at ROCC, Cape Canaveral Air Station, are (left to right) William P. Hickman, program manager, Space Gateway Support; Ed Gormel, executive director, JPMO; Barbara White, supervisor, Mission Support; KSC Center Director Roy Bridges, and Lt Col Steve Vuresky, USAF.
34 CFR 200.29 - Consolidation of funds in a schoolwide program.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (3) Special education. (i) The school may consolidate funds received under part B of the IDEA. (ii... IDEA for that fiscal year, divided by the number of children with disabilities in the jurisdiction of... under part B of IDEA or section 8003(d) of the ESEA may use those funds for any activities under its...
49 CFR 17.12 - How may a state simplify, consolidate, or substitute federally required state plans?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 1 2010-10-01 2010-10-01 false How may a state simplify, consolidate, or substitute federally required state plans? 17.12 Section 17.12 Transportation Office of the Secretary of Transportation INTERGOVERNMENTAL REVIEW OF DEPARTMENT OF TRANSPORTATION PROGRAMS AND ACTIVITIES § 17.12 How may a...
CPR for Rural School Districts: Emerging Alternatives in Curriculum, Program and Reorganization.
ERIC Educational Resources Information Center
Thurston, Paul; Clauss, Joanne
Justification for school district consolidation is made on the basis of either reducing costs or increasing educational quality. Some cost reduction may be realized through certain economies of scale in some consolidations, but it is by no means automatic. The Illinois State Board of Education emphasizes the relationship between high school size and…
Chapter 1 in Delaware. Education Consolidation and Improvement Act. Facts and Figures SY 1986-87.
ERIC Educational Resources Information Center
Delaware State Dept. of Public Instruction, Dover.
Over 12 million dollars was allocated to Delaware during the 1986/87 school year to fund compensatory education programs for educationally disadvantaged students under Chapter 1 of the Education Consolidation and Improvement Act. Chapter 1 supplemental services are targeted at low-income children; the children of migrant workers; and handicapped,…
Software for determining the true displacement of faults
NASA Astrophysics Data System (ADS)
Nieto-Fuentes, R.; Nieto-Samaniego, Á. F.; Xu, S.-S.; Alaniz-Álvarez, S. A.
2014-03-01
One of the most important parameters of faults is the true (or net) displacement, which is measured by restoring two originally adjacent points, called “piercing points”, to their original positions. This measurement is typically not possible because piercing points are rarely observed in natural outcrops. Much more common is the measurement of the apparent displacement of a marker. Methods to calculate the true displacement of faults using descriptive geometry, trigonometry or vector algebra are common in the literature, and most of them solve a specific situation from a large number of possible combinations of the fault parameters. Despite their importance and the relatively simple methodology, true displacements are not routinely calculated because doing so is a tedious and tiring task. We believe that the solution is to develop software capable of performing this work. In a previous publication, our research group proposed a method to calculate the true displacement of faults by solving most combinations of fault parameters using simple trigonometric equations. The purpose of this contribution is to present a computer program for calculating the true displacement of faults. The input data are the dip of the fault; the pitch angles of the markers, slickenlines and observation lines; and the marker separation. To avoid the common difficulties involved in switching between operating systems, the software is developed using the Java programming language. The computer program could be used as a tool in education and will also be useful for the calculation of true fault displacement in geological and engineering work. The application resolves cases in which the direction of net slip is known, and the net slip is commonly assumed to be parallel to the slickenlines. This assumption is not always valid and must be used with caution, because the slickenlines are formed during a step of the incremental displacement on the fault surface, whereas the net slip is related to the finite slip.
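As one worked case from the family of parameter combinations such a program must resolve (a sketch under the stated assumptions, not the published algorithm): with slip assumed parallel to the slickenlines, the net slip follows from the strike separation of a planar marker and two pitch angles measured in the fault plane.

```python
import math

def net_slip(strike_separation, marker_pitch_deg, slicken_pitch_deg):
    """Net slip from strike separation; pitches measured from the strike line.

    In fault-plane coordinates the two marker cutoffs are parallel lines
    offset by the slip vector, so intersecting both with the strike line
    gives s = u * sin(p - r) / sin(p), hence u = s * sin(p) / sin(p - r).
    """
    p = math.radians(marker_pitch_deg)   # pitch of the marker cutoff line
    r = math.radians(slicken_pitch_deg)  # pitch of the slickenlines
    if math.isclose(math.sin(p - r), 0.0):
        raise ValueError("marker parallel to slip direction: indeterminate")
    return strike_separation * math.sin(p) / math.sin(p - r)

# e.g. 12 m strike separation, marker pitch 60 deg, slickenline pitch 20 deg
print(f"net slip ~ {net_slip(12.0, 60.0, 20.0):.1f} m")   # ~16.2 m
```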
Consolidation of trauma programs in the era of large health care delivery networks.
Trooskin, S Z; Faucher, M B; Santora, T A; Talucci, R C
1999-03-01
To review the development of an integrated trauma program at two separate campuses brought about by the merger of two medically affiliated hospitals, each with an integrated program and a common trauma administrator, medical director, and educational coordinator. Each campus has an associate trauma medical director for on-site administrative management, a nurse coordinator, and a registrar. The integration resulted in a reduction of 1.5 full-time equivalents and "cost" savings through consolidated use of the helicopter, outreach, prevention, research, and educational programs. Regular "integration meetings," ad hoc committees, and video-linked conferences were used to institute common quality improvement programs, morbidity and mortality discussions, policies, and clinical management protocols. Results included reaccreditation by an outside agency, elimination of duplicated services, and maintenance of pre-merger clinical volume. This integrated trauma program may serve as a model in this era of individual hospitals merging into large health care delivery networks.
Study of a phase-to-ground fault on a 400 kV overhead transmission line
NASA Astrophysics Data System (ADS)
Iagăr, A.; Popa, G. N.; Diniş, C. M.
2018-01-01
Power utilities need to supply their consumers at a high level of power quality. Because faults that occur on High-Voltage and Extra-High-Voltage transmission lines can cause serious damage in underlying transmission and distribution systems, it is important to examine each fault in detail. In this work we studied a phase-to-ground fault (on phase 1) of the 400 kV Mintia-Arad overhead transmission line. The Indactic® 650 fault-analyzing system was used to record the history of the fault. The signals (analog and digital) recorded by the Indactic® 650 were visualized and analyzed with the Focus program. The summary fault report allowed evaluation of the behavior of the control and protection equipment and determination of the cause and location of the fault.
41 CFR 101-26.501-2 - Standardized buying programs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 26-PROCUREMENT SOURCES AND PROGRAM 26.5-GSA Procurement Programs § 101-26.501-2 Standardized buying programs. Wherever... school age passenger. (4) Sedans and station wagons (based on standardized, consolidated requirements...
Geohydrology and water utilization in the Willcox Basin, Graham and Cochise Counties, Arizona
Brown, S.G.; Schumann, Herbert H.
1969-01-01
The Willcox basin is an area of interior drainage in the northern part of Sulphur Springs Valley, Cochise and Graham Counties, Ariz. The basin comprises about 1,500 square miles, of which the valley floor occupies about 950 square miles. The basin probably formed during middle and late Tertiary time, when the area was subjected to large-scale faulting accompanied by the uplift of the mountain ranges that presently border it. During and after faulting, large quantities of alluvium were deposited in the closed basin. The rocks in the basin are divided into two broad groups--the rocks of the mountain blocks, of Precambrian through Tertiary age, and the rocks of the basin, of Tertiary and Quaternary age. The mountain blocks consist of igneous, metamorphic, and sedimentary rocks; the water-bearing characteristics of these rocks depend primarily on their degree of weathering and fracturing. Even in areas where these rocks are fractured and jointed, only small amounts of water have been developed. The rocks of the basin consist of moderately consolidated alluvium, poorly consolidated alluvium, and unconsolidated alluvium. The water-bearing characteristics of the moderately and poorly consolidated alluvium are not well known. The unconsolidated alluvium underlies most of the valley floor and consists of two facies, stream deposits and lake beds associated with the old playa. The lenticular sand and gravel layers interbedded in silt- and clay-size material of the unconsolidated alluvium constitute the principal aquifer in the basin. The other aquifers, which yield less water, consist of beds of poorly to moderately consolidated sand- and gravel-size material; these beds occur in both the poorly consolidated and moderately consolidated alluvium. In the Stewart area the median specific capacity of wells per 100 feet of saturated unconsolidated alluvium was 20 gallons per minute, and in the Kansas Settlement area the specific capacity of wells penetrating the poorly and moderately consolidated alluvium, undifferentiated, was only 7.4 gallons per minute per 100 feet of saturated material penetrated. The aquifer in the Kansas Settlement area is much less permeable but more homogeneous than the aquifer in the Stewart area. The coefficient of transmissibility of the aquifers, which was estimated from the specific-capacity data, ranged from 58,000 to 160,000 gallons per day per foot (see the note following this entry). Prior to extensive ground-water pumpage, the ground-water system probably was in equilibrium, with discharge equaling recharge. At that time, ground water moved toward the playa, where it was discharged by transpiration and evaporation. The estimate of the evapotranspiration in the playa area before large-scale development was about 75,000 acre-feet per year. On the basis of estimates of coefficients of transmissibility of the aquifer and on the basis of the water-table configuration, underflow toward the playa was computed to be about 54,000 acre-feet per year. By 1963, large-scale pumping had caused marked changes in the shape of the piezometric surface; large cones of depression had developed, and ground-water movement was toward the centers of pumping. The cones of depression caused by large-scale pumping have since expanded, and water-level declines have been measured in the recharge areas along the mountain fronts. Ground water has been used for irrigation since 1910. In 1928, about 4,000 acre-feet of ground water was pumped, and by 1963, 180,000 acre-feet per year was being pumped.
An estimated 1,860,000 acre-feet of water has been pumped for irrigation in the Willcox basin through 1963; 680,000 acre-feet from the Stewart area, 990,000 acre-feet from the Kansas Settlement area, and 190,000 acre-feet from the Pearce-Cochise area. In the Sierra Bonita Ranch area and the north playa area, ground-water withdrawal for irrigation through 1963 was small. From the spring of 1952 to the spring of 1964 water-level declines resulting from the
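A note on the estimate mentioned in the entry above: transmissibility (transmissivity) T is commonly approximated from specific capacity Q/s with an empirical multiplier. The form below is illustrative only; the multiplier shown is a widely quoted rule of thumb for confined aquifers, not necessarily the coefficient used in this report.

```latex
% Empirical transmissibility estimate from specific capacity (illustrative)
T \;\approx\; C\,\frac{Q}{s},
\qquad
C \approx 2000 \ \text{for confined aquifers},
\quad
\frac{Q}{s}\ \text{in gpm/ft},\ \ T\ \text{in gpd/ft}
```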
All-to-all sequenced fault detection system
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward
2010-11-02
An apparatus, program product and method enable nodal fault detection by sequencing communications between all system nodes. A master node may coordinate communications between two slave nodes before sequencing to and initiating communications between a new pair of slave nodes. The communications may be analyzed to determine the nodal fault.
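A hedged sketch of the sequencing idea in this abstract (the transport details and the `send` test below are invented; the patented apparatus exercises real interconnect links):

```python
from itertools import combinations

def all_to_all_fault_scan(nodes, send):
    """send(a, b) -> True if b received a's test message intact."""
    suspect = []
    for a, b in combinations(nodes, 2):  # master sequences one pair at a time
        if not (send(a, b) and send(b, a)):
            suspect.append((a, b))
    # a node appearing in many failed pairs is the likely faulty endpoint
    return suspect

links_down = {(1, 3), (3, 1)}            # simulated broken link
send = lambda a, b: (a, b) not in links_down
print(all_to_all_fault_scan([0, 1, 2, 3], send))   # -> [(1, 3)]
```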
Li, Qian; Zhang, Xuchen; Liang, Xitong; Zhang, Fang; Wang, Lianzhang; Zhong, Yi
2016-01-01
Translocation of signaling molecules, MAPK in particular, from the cytosol to the nucleus represents a universal key element in initiating the gene program that determines memory consolidation. Translocation mechanisms and their behavioral impact, however, remain to be determined. Here, we report that a highly conserved nuclear transporter, Drosophila importin-7 (DIM-7), regulates the import of training-activated MAPK for consolidation of long-term memory (LTM). We show that silencing DIM-7 functions results in impaired LTM, whereas overexpression of DIM-7 enhances LTM. This DIM-7-dependent regulation of LTM is confined to a consolidation time window and to mushroom body neurons. Imaging data show that bidirectional alteration in DIM-7 expression results in proportional changes in the intensity of training-activated MAPK accumulated within the nuclei of mushroom body neurons during LTM consolidation. Such DIM-7-regulated nuclear accumulation of activated MAPK is observed only in training specified for LTM induction and determines the amplitude, but not the time course, of memory consolidation. PMID:26929354
Active faulting on the island of Crete (Greece)
NASA Astrophysics Data System (ADS)
Caputo, Riccardo; Catalano, Stefano; Monaco, Carmelo; Romagnoli, Gino; Tortorici, Giuseppe; Tortorici, Luigi
2010-10-01
In order to characterize and quantify the Middle-Late Quaternary and ongoing deformation within the Southern Aegean forearc, we analyse the major tectonic structures affecting the island of Crete and its offshore. The normal faults typically consist of 4-30-km-long dip-slip segments locally organised in more complex fault zones. They separate carbonate and/or metamorphic massifs, in the footwall block, from loose to poorly consolidated alluvial and colluvial materials within the hangingwall. All these faults show clear evidence of recent re-activations and trend parallel to two principal directions: WNW-ESE and NNE-SSW. Based on all available data for both onland and offshore structures (morphological and structural mapping, satellite imagery and air-photograph remote sensing, as well as the analysis of seismic profiles and the investigation of marine terraces and Holocene raised notches along the island coasts), for each fault we estimate and constrain some of the principal seismotectonic parameters, particularly the fault kinematics, the cumulative amount of slip and the slip rate. Following simple assumptions and empirical relationships, maximum expected magnitudes and mean recurrence periods are also suggested. Summing up the contribution to crustal extension provided by the two major fault sets, we calculate both arc-normal and arc-parallel long-term strain rates. The occurrence of slightly deeper and more external low-angle thrust planes associated with the incipient continental collision occurring in western Crete is also analysed. Although these contractional structures can generate stronger seismic events (M ~ 7.5), they are probably much rarer and thus provide only a minor contribution to the overall morphotectonic evolution of the island and the forearc. A comparison of our geologically based results with those obtained from GPS measurements shows good agreement, suggesting that the present-day crustal deformation has probably been active since the Middle Quaternary and is mainly related to the seismic activity of upper crustal normal faults characterized by frequent shallow (<20 km) moderate-to-strong seismic events, seldom alternating with stronger earthquakes occurring along blind low-angle thrust planes probably ramping from a deeper aseismic detachment (ca. 25 km). This apparently contradictory co-existence of juxtaposed upper tensional and lower compressional tectonic regimes is in agreement with the geodynamics of the region, characterised by continental collision with Nubia and the Aegean mantle wedging.
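The abstract does not name its empirical relationships; as an example of the kind of scaling involved, the widely used Wells and Coppersmith (1994) all-slip-type regression relates moment magnitude to surface rupture length, and recurrence follows from per-event slip and slip rate:

```python
import math

def max_expected_magnitude(rupture_length_km):
    # Wells & Coppersmith (1994), all slip types: M = 5.08 + 1.16 log10(L)
    return 5.08 + 1.16 * math.log10(rupture_length_km)

def recurrence_yr(slip_per_event_m, slip_rate_mm_yr):
    # years needed to accumulate one event's slip at the long-term rate
    return slip_per_event_m * 1000.0 / slip_rate_mm_yr

for L in (4, 15, 30):                      # the 4-30 km segments mapped on Crete
    print(f"L = {L:2d} km -> M ~ {max_expected_magnitude(L):.1f}")
print(f"1 m slip at 1 mm/yr -> recurrence ~ {recurrence_yr(1.0, 1.0):.0f} yr")
```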
NASA Astrophysics Data System (ADS)
Elliott, A. J.; Oskin, M. E.; Banesh, D.; Gold, P. O.; Hinojosa-Corona, A.; Styron, R. H.; Taylor, M. H.
2012-12-01
Differencing repeat terrestrial lidar scans of the 2010 M7.2 El Mayor-Cucapah (EMC) earthquake rupture reveals the rapid onset of surface processes that simultaneously degrade and preserve evidence of coseismic fault rupture in the landscape and paleoseismic record. We surveyed fresh fault rupture two weeks after the 4 April 2010 earthquake, then repeated these surveys one year later. We imaged fault rupture through four substrates varying in degree of consolidation and scarp facing direction, recording modification due to a range of aeolian, fluvial, and hillslope processes. Using lidar-derived DEM rasters to calculate the topographic differences between years results in aliasing errors because GPS uncertainty between years (~1.5 cm) exceeds lidar point spacing (<1.0 cm), shifting the raster sampling of the point cloud. Instead, we coregister each year's scans by iteratively minimizing the horizontal and vertical misfit between neighborhoods of points in each raw point cloud. With the misfit between datasets minimized, we compute the vertical difference between points in each scan within a specified neighborhood. Differencing results reveal two variables controlling the type and extent of erosion: cohesion of the substrate controls the degree to which hillslope processes affect the scarp, while scarp facing direction controls whether more effective fluvial erosion can act on the scarp. In poorly consolidated materials, large portions (>50% along-strike distance) of the scarp crest are eroded up to 5 cm by a combination of aeolian abrasion and diffusive hillslope processes, such as rainsplash and mass wasting, while in firmer substrate (i.e., bedrock mantled by fault gouge) there is no detectable hillslope erosion. On the other hand, where small gullies cross downhill-facing scarps (<5% along-strike distance), fluvial erosion has caused 5-50 cm of headward scarp retreat in bedrock. Thus, although aeolian and hillslope processes operate over a greater along-strike distance, fluvial processes concentrated in pre-existing bedrock gullies transport a far greater volume of material across the scarp. Substrate cohesiveness dictates the degree to which erosive processes act to relax the scarp (e.g., gravels erode more easily than bedrock). However, scarp locations that favor fluvial processes suffer rapid, localized erosion of vertical scarp faces, regardless of substrate. Differential lidar also reveals debris cones formed at the base of the scarp below locations of scarp crest erosion. These indicate the rapid growth of a colluvial wedge. Where a fissure occupies the base of the scarp we observe nearly complete in-filling by silt and sand moved by both mass wasting and fluvial deposition, indicating that fissure fills observed in paleoseismic trenches likely bracket the age of an earthquake to within one year. We find no evidence of differential postseismic tectonic deformation across the fault within the ~100 m aperture of our surveys.
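A minimal sketch of the neighborhood vertical-differencing step described above (the iterative coregistration is assumed already done; the synthetic surface, search radius, and point counts are invented):

```python
import numpy as np
from scipy.spatial import cKDTree

def vertical_difference(cloud_t0, cloud_t1, radius=0.25):
    """For each t1 point, z minus the mean z of t0 points within radius (m) in plan view."""
    tree = cKDTree(cloud_t0[:, :2])
    dz = np.full(len(cloud_t1), np.nan)
    for i, p in enumerate(cloud_t1):
        idx = tree.query_ball_point(p[:2], r=radius)
        if idx:                              # skip areas with no overlap
            dz[i] = p[2] - cloud_t0[idx, 2].mean()
    return dz                                # negative = erosion, positive = deposition

xy = np.random.rand(10_000, 2) * 10.0        # 10 m x 10 m patch
z = 0.1 * np.sin(xy[:, 0]) + 0.05 * xy[:, 1] # smooth synthetic surface
t0 = np.column_stack([xy, z])
t1 = t0 + [0.0, 0.0, -0.02]                  # uniform 2 cm of lowering
print(np.nanmean(vertical_difference(t0, t1)))   # ~ -0.02
```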
Geologic map of the Santa Ana Pueblo quadrangle, Sandoval County, New Mexico
Personius, Stephen F.
2002-01-01
The Santa Ana Pueblo quadrangle is located in the northern part of the Albuquerque basin, which is the largest basin or graben within the Rio Grande rift. The quadrangle is underlain by poorly consolidated sedimentary rocks of the Santa Fe Group and is dominated by Santa Ana Mesa, a volcanic tableland underlain by basalt flows of the San Felipe volcanic field. The San Felipe volcanic field is the largest area of basaltic lavas exposed in the Albuquerque basin. The structural fabric of the quadrangle is dominated by dozens of generally north striking, east- and west-dipping normal faults associated with the Neogene Rio Grande rift.
Evaluation of reliability modeling tools for advanced fault tolerant systems
NASA Technical Reports Server (NTRS)
Baker, Robert; Scheper, Charlotte
1986-01-01
The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (the difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
Groppi, Diane E; Alexis, Claudine E; Sugrue, Chiara F; Bevis, Cynthia C; Bhuiya, Tawfiqul A; Crawford, James M
2013-07-01
To describe our experience, both in meeting challenges and in reporting outcomes, of the consolidation of anatomic pathology services in the North Shore-LIJ Health System in February 2011. We addressed issues of governance, personnel, physical plant, quality programming, connectivity, and education. The highly regulated nature of the laboratory industry and the fact that patient care necessarily never pauses require that such a consolidation take place without a break in service or degradation in turnaround time and quality while engaging personnel at all levels in the extra duties related to consolidation. Subspecialization has allowed us to better meet the needs of our in-system health care community while increasing our access to the competitive outreach marketplace.
Equity in Educational Finance and A Study of the Impact of Block Grants in a Selected State.
ERIC Educational Resources Information Center
Moody, Charles D., Sr.; Kearney, C. Philip
1984-01-01
The 1981 enactment of the Education Consolidation and Improvement Act Chapter 2 (ECIA-Chapter 2), which consolidated 28 separate categorical federal aid programs into a single block grant, has had policy and fiscal impacts in Michigan. Policy debate centers on the inherent tension between equity, particularly equity defined as equal treatment of…
ERIC Educational Resources Information Center
Goodwin, Kenneth L., Jr.
2012-01-01
During the 2010-2011 school year, schools throughout the Red Clay Consolidated School District were expected to implement Professional Learning Communities (PLCs); however, little to no guidance was provided to school-level administrators and teacher teams. Not surprisingly, many schools implemented team meetings that were not aligned with…
Map and data for Quaternary faults and folds in New Mexico
Machette, M.N.; Personius, S.F.; Kelson, K.I.; Haller, K.M.; Dart, R.L.
1998-01-01
The "World Map of Major Active Faults" Task Group is compiling a series of digital maps for the United States and other countries in the Western Hemisphere that show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds; the companion database includes published information on these seismogenic features. The Western Hemisphere effort is sponsored by International Lithosphere Program (ILP) Task Group H-2, whereas the effort to compile a new map and database for the United States is funded by the Earthquake Reduction Program (ERP) through the U.S. Geological Survey. The maps and accompanying databases represent a key contribution to the new Global Seismic Hazards Assessment Program (ILP Task Group II-O) for the International Decade for Natural Disaster Reduction. This compilation, which describes evidence for surface faulting and folding in New Mexico, is the third of many similar State and regional compilations that are planned for the U.S. The compilation for West Texas is available as U.S. Geological Survey Open-File Report 96-002 (Collins and others, 1996) and the compilation for Montana will be released as a Montana Bureau of Mines product (Haller and others, in press).
Technical know-how relevant to planning of borehole investigation for fault characterization
NASA Astrophysics Data System (ADS)
Mizuno, T.; Takeuchi, R.; Tsuruta, T.; Matsuoka, T.; Kunimaru, T.; Saegusa, H.
2011-12-01
As part of the national R&D program for geological disposal of high-level radioactive waste (HLW), which includes broad scientific study of the deep geological environment, JAEA has established the Mizunami Underground Research Laboratory (MIU) in Central Japan as a generic underground research laboratory (URL) facility. The MIU Project focuses on crystalline rocks. In fractured rock, a fault is one of the major discontinuity structures controlling groundwater flow conditions. It is important to estimate the geological, hydrogeological, hydrochemical and rock mechanical characteristics of faults, and then to evaluate their role in the engineering design of the repository and the assessment of the long-term safety of HLW disposal. Therefore, investigations for fault characterization have been performed in the MIU Project to estimate these characteristics and to evaluate existing conceptual and/or numerical models of the geological environment. Investigations related to faults have been conducted based on the conventional concept that a fault consists of a "fault core (FC)" characterized by the distribution of faulted rocks and a "fractured zone (FZ)" along the FC. As the investigations progressed, it furthermore became clear that an "altered zone (AZ)", characterized by alteration of the host rocks to clay minerals, can also develop around the FC. The intensity of alteration in the AZ generally decreases with distance from the FC, and the AZ transitions to the FZ. Therefore, an investigation program focusing on the properties of the AZ is required for revising the existing conceptual and/or numerical models of the geological environment. In this study, procedures for planning fault characterizations have been summarized based on the technical know-how learned through the MIU Project, as part of the development of the Knowledge Management System performed by JAEA under a contract with the Ministry of Economy, Trade and Industry within its R&D supporting program for developing geological disposal technology in 2010. Taking into account the experience from fault characterization in the MIU Project, an optimized procedure for an investigation program is summarized as follows: 1) definition of the investigation aim; 2) confirmation of the current understanding of the geological environment; 3) specification and prioritization of the data to be obtained; 4) selection of the methodology for obtaining the data; 5) specification of the sequence of the investigations; and 6) establishment of a drilling and casing program including optional cases and taking into account potential problems. Several geological conceptual models incorporating the uncertainty of geological structures were illustrated to define the investigation aim and to confirm the current uncertainties. These models were also available to establish optional cases by predicting the type and location of potential problems. The procedures and case study related to the establishment of the investigation program are summarized in this study and can be applied to site characterization work conducted by the implementing body (NUMO) in future candidate areas.
NASA Astrophysics Data System (ADS)
Tao, C.; Liang, J.; Zhang, H.; Li, H.; Egorov, I. V.; Liao, S.
2016-12-01
The Dragon Horn area (49.7°E) is located at the west end of the EW-trending Segment 28 of the Southwest Indian Ridge, between the Indomed and Gallieni FZ. The segment is characterized by highly asymmetric topography. The northern flank is deeper and develops typical parallel linear fault escarpments, whereas the southern flank, where the Dragon Horn lies, is shallower and bears corrugations. The indicative corrugated surface, which extends some 5×5 km, was interpreted to be of Dragon Flag OCC origin (Zhao et al., 2013). A neo-volcanic ridge extends along the middle of the rifted valley and is bounded by two non-transform offsets to the east and west. Our investigations revealed 6 hydrothermal fields/anomalies in this area, including 2 confirmed sulfide fields, 1 carbonate field, and 3 hydrothermal anomalies inferred from methane and turbidity data from the 2016 AUV survey. The Longqi-1 (Dragon Flag) vent system lies at the northwest edge of the Dragon Flag OCC. It is one of the largest hydrothermal venting systems along mid-ocean ridges, with a maximum temperature at vent site DFF6 of the 'M zone' of up to 379.3 °C (Tao et al., 2016). Massive sulfides (49.73 °E, 37.78 °S) were sampled 10 km east of Longqi-1, representing independent hydrothermal activity controlled by its own local structures. According to geological mapping and interpretation, both sulfide fields are located on the hanging wall of the Dragon Flag OCC detachment. Combined with the inferred hydrothermal anomaly to the east of the massive sulfide site, we propose that they are controlled by different fault phases during the detachment of the oceanic core complex. Moreover, consolidated carbonate sediments were widely observed and sampled on the corrugated surface and its west side; they are proposed to have precipitated during the serpentinization of ultramafic rocks, representing a low-temperature hydrothermal process. These hydrothermal activities, distributed within 20 km, may be controlled by the same Dragon Flag OCC. Acknowledgement: This work was supported by the National Basic Research Program of China (973 Program) under contract No. 2012CB417305, and the China Ocean Mineral Resources R & D Association "Twelfth Five-Year" Major Program under contracts No. DY125-11-R-01 and DY125-11-R-05.
Havens: Explicit Reliable Memory Regions for HPC Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hukerikar, Saurabh; Engelmann, Christian
2016-01-01
Supporting error resilience in future exascale-class supercomputing systems is a critical challenge. Due to transistor scaling trends and increasing memory density, scientific simulations are expected to experience more interruptions caused by transient errors in the system memory. Existing hardware-based detection and recovery techniques will be inadequate to manage the presence of high memory fault rates. In this paper we propose a partial memory protection scheme based on region-based memory management. We define the concept of regions called havens that provide fault protection for program objects. We provide reliability for the regions through a software-based parity protection mechanism. Our approach enables critical program objects to be placed in these havens. The fault coverage provided by our approach is application agnostic, unlike algorithm-based fault tolerance techniques.
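A minimal sketch of the parity idea behind havens, in Python rather than the authors' runtime: one XOR parity block protects N equal-size blocks, so a block flagged as corrupted (detection itself would come from a per-block checksum or a hardware flag) can be rebuilt. The Haven class and its interface are invented for illustration, not the paper's API.

    # Toy parity-protected memory region ("haven"): one XOR parity block
    # over N equal-size data blocks; a block known to be corrupted can be
    # rebuilt from the parity and the surviving blocks (erasure repair).
    class Haven:
        def __init__(self, block_size=64):
            self.block_size = block_size
            self.blocks = []
            self.parity = bytearray(block_size)

        def put(self, data: bytes) -> int:
            block = bytearray(data.ljust(self.block_size, b"\0"))
            for i in range(self.block_size):
                self.parity[i] ^= block[i]
            self.blocks.append(block)
            return len(self.blocks) - 1

        def recover(self, lost: int) -> bytes:
            out = bytearray(self.parity)
            for j, block in enumerate(self.blocks):
                if j != lost:
                    for i in range(self.block_size):
                        out[i] ^= block[i]
            return bytes(out)

    h = Haven()
    idx = h.put(b"critical state vector")
    h.blocks[idx][2] ^= 0x40                    # simulated transient bit flip
    h.blocks[idx] = bytearray(h.recover(idx))   # repaired from parity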
Physical properties and Consolidation behavior of sediments from the N. Japan subduction zone
NASA Astrophysics Data System (ADS)
Valdez, R. D., II; Lauer, R. M.; Ikari, M.; Kitajima, H.; Saffer, D. M.
2013-12-01
Sediment hydraulic properties, consolidation state, and ambient pore pressure development are key parameters that affect fluid migration, deformation, and the slip behavior and mechanical strength of subduction zone megathrusts. In order to better understand the dynamics and mechanisms of large subduction earthquakes, Integrated Ocean Drilling Program (IODP) Expedition 343 drilled into the toe of the Japan Trench subduction zone in a region of large shallow slip in the M 9.0 Tohoku earthquake, as part of the Japan Trench Fast Drilling Project (J-FAST). Here, we report on two constant rate of strain (CRS) uniaxial consolidation experiments and two triaxial deformation experiments on bedded claystone and clayey mudstone core samples collected from the frontal prism and subducted sediment section cored at Site C0019, 2.5 km landward of the Japan Trench, from depths of 697.18 and 831.45 mbsf. The goals of our experiments were: (1) to define the hydraulic and acoustic properties of sediments that host the subduction megathrust fault that slipped in the M 9.0 Tohoku earthquake; and (2) to constrain the in-situ consolidation state and its implications for in-situ stress. The permeability-porosity trends are similar for the two samples; both exhibit permeability that decreases systematically with increasing effective stress and decreasing porosity, and that varies log-linearly with porosity. Permeabilities of material from the frontal prism decrease from 5×10^-18 m^2 at 5 MPa effective stress to 3.0×10^-19 m^2 at 70 MPa, and porosities decrease from 51% to 29%, while permeabilities of the subducted sediment sample decrease from 5×10^-18 m^2 at 5 MPa to 3.6×10^-19 m^2 at 90 MPa, and porosities decrease from 49% to 36%. In-situ permeabilities for the prism and underthrust sediment samples, estimated using the laboratory-defined permeability-porosity relationships, are 4.9×10^-18 m^2 and 3.7×10^-18 m^2, respectively. Elastic wavespeeds increase systematically with increasing effective stress. P-wave velocities (Vp) in the frontal prism sample increase from 2.1 km/s at 8 MPa to 2.7 km/s at 55 MPa effective stress, and velocities in the underthrust sediment sample increase from 2.3 km/s at 6 MPa to 3.0 km/s at 76.5 MPa. Estimated in-situ Vp values for the frontal prism and underthrust sediment samples are 2.1 km/s and 2.4 km/s, respectively. These are slightly higher than both the logging-while-drilling (LWD) measurements and shipboard velocity measurements on discrete samples. We also estimated pre-consolidation pressures (Pc) for each sample using the work-stress method. Comparing Pc with the present-day in-situ vertical stress calculated from shipboard bulk density data, we find that both samples are severely overconsolidated. We report this in terms of the overconsolidation ratio (OCR), defined as the ratio of Pc to the in-situ stress expected for the case of normal consolidation. Values of OCR for the prism and underthrust samples are 3.95 and 4.28, respectively. This overconsolidation is broadly consistent with fully drained (non-overpressured) conditions, and may reflect uplift and unroofing of the sediments following peak burial greater than their current depth, a significant contribution from lateral tectonic stresses leading to an effective stress far greater than expected for the case of uniaxial burial, or cementation that leads to apparent overconsolidation.
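The two quantitative relationships in this abstract are straightforward to reproduce. A sketch using the end-member values quoted above; the in-situ porosity and preconsolidation numbers in the last lines are illustrative placeholders, not values from the report:

    import numpy as np

    # Log-linear permeability-porosity fit, log10(k) = a + b*phi, from the
    # frontal-prism end members quoted in the abstract.
    phi = np.array([0.51, 0.29])          # porosity
    k = np.array([5e-18, 3.0e-19])        # permeability, m^2
    b, a = np.polyfit(phi, np.log10(k), 1)
    k_est = 10 ** (a + b * 0.45)          # k at an assumed in-situ porosity

    # Overconsolidation ratio: preconsolidation stress over the vertical
    # effective stress expected for normal consolidation (values assumed).
    Pc, sigma_v = 19.8, 5.0               # MPa, illustrative only
    print(f"OCR = {Pc / sigma_v:.2f}")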
Advanced information processing system: Fault injection study and results
NASA Technical Reports Server (NTRS)
Burkhardt, Laura F.; Masotto, Thomas K.; Lala, Jaynarayan H.
1992-01-01
The objective of the AIPS program is to achieve a validated fault tolerant distributed computer system. The goals of the AIPS fault injection study were: (1) to present the fault injection study components addressing the AIPS validation objective; (2) to obtain feedback for fault removal from the design implementation; (3) to obtain statistical data regarding fault detection, isolation, and reconfiguration responses; and (4) to obtain data regarding the effects of faults on system performance. The parameters that must be varied to create a comprehensive set of fault injection tests are described, along with the subset of test cases selected, the test case measurements, and the test case execution. Both pin-level hardware faults using a hardware fault injector and software-injected memory mutations were used to test the system. An overview is provided of the hardware fault injector and the associated software used to carry out the experiments. Detailed specifications of the faults and the test results are given for the I/O network and for the AIPS fault-tolerant processor. The results are summarized and conclusions are given.
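A sketch of the second fault class used in the study, software-injected memory mutation, reduced to its essentials; the checksum here is only a stand-in for AIPS's actual detection, isolation, and reconfiguration machinery:

    # Flip one bit in a target byte buffer and check whether the (toy)
    # detection layer notices; real experiments would log detection,
    # isolation, and reconfiguration times instead of a boolean.
    def inject_bit_flip(memory: bytearray, byte_index: int, bit: int) -> None:
        memory[byte_index] ^= 1 << bit

    def checksum(memory: bytearray) -> int:
        s = 0
        for byte in memory:
            s ^= byte
        return s

    mem = bytearray(b"flight-critical state")
    reference = checksum(mem)
    inject_bit_flip(mem, byte_index=3, bit=5)
    print("fault detected:", checksum(mem) != reference)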
New Madrid Seismotectonic Study: activities during fiscal year 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschbach, T.C.
1985-04-01
The New Madrid Seismotectonic Study is a coordinated program of geological, geophysical, and seismological investigations of the area within a 200-mile radius of New Madrid, Missouri. The study is designed to define the structural setting and tectonic history of the area in order to realistically evaluate earthquake risks in the siting of nuclear facilities. Our studies concentrated on defining the boundaries of a proposed rift complex in the area, as well as establishing the relationships of the east-west trending fault systems with the northwest-trending faults of the Wabash Valley and New Madrid areas. There were 204 earthquakes located in 1983. In addition, the earthquake swarm in north-central Arkansas continued throughout the year, and 45,000 earthquakes have been recorded there since January 1982. Current seismic activity in the Anna, Ohio, area appears to be related to the northwest-trending Fort Wayne rift and possibly to the rift's contact with a low-density pluton. Fault studies of the Rough Creek-Shawneetown Fault System showed mostly high-angle normal faults with a master fault that is a high-angle south-dipping reverse fault. Trenching of terrace deposits along the Kentucky River Fault System confirmed some anomalous conditions in terrace deposits previously indicated by electrical resistivity and augering programs. Thermal and chemical data from groundwater in the Mississippi Embayment appear to be useful in localizing deep faults that cut through the aquifers. Early indications from studies of jointing in Indiana are that the direction of major joint sets will be useful in determining regional stress directions. No Quaternary faulting was found in the Indiana or Illinois fault studies.
Drilling Automation Demonstrations in Subsurface Exploration for Astrobiology
NASA Technical Reports Server (NTRS)
Glass, Brian; Cannon, H.; Lee, P.; Hanagud, S.; Davis, K.
2006-01-01
This project proposes to study subsurface permafrost microbial habitats at a relevant Arctic Mars-analog site (Haughton Crater, Devon Island, Canada) while developing and maturing the subsurface drilling and drilling automation technologies that will be required by post-2010 missions. It builds on earlier drilling technology projects to add permafrost and ice-drilling capabilities to 5 m with a lightweight drill that will be automatically monitored and controlled in situ. Frozen cores obtained with this drill under sterilized protocols will be used in testing three hypotheses pertaining to near-surface physical geology and ground H2O ice distribution, viewed as a habitat for microbial life in subsurface ice and ice-consolidated sediments. The automation technologies employed will demonstrate hands-off diagnostics and drill control, using novel vibrational dynamical analysis methods and model-based reasoning to monitor and identify drilling fault states before and during faults. Three field deployments, to a Mars-analog site with frozen impact crater fallback breccia, will support the science goals, provide a rigorous test of drilling automation and lightweight permafrost drilling, and leverage past experience with the field site's particular logistics.
Cell death proteomics database: consolidating proteomics data on cell death.
Arntzen, Magnus Ø; Bull, Vibeke H; Thiede, Bernd
2013-05-03
Programmed cell death is a ubiquitous process of utmost importance for the development and maintenance of multicellular organisms. More than 10 different forms of programmed cell death have been discovered. Several proteomics analyses have been performed to gain insight into the proteins involved in the different forms of programmed cell death. To consolidate these studies, we have developed the cell death proteomics (CDP) database, which comprises data from apoptosis, autophagy, cytotoxic granule-mediated cell death, excitotoxicity, mitotic catastrophe, paraptosis, pyroptosis, and Wallerian degeneration. The CDP database is available as a web-based database to compare protein identifications and quantitative information across different experimental setups. The proteomics data of 73 publications were integrated and unified with protein annotations from UniProt-KB and gene ontology (GO). Currently, more than 6,500 records of more than 3,700 proteins are included in the CDP. Comparing apoptosis and autophagy using overrepresentation analysis of GO terms, the majority of enriched processes were found in both, but some clear differences were also perceived. Furthermore, the analysis revealed differences and similarities of the proteome between autophagosomal and overall autophagy. The CDP database represents a useful tool to consolidate data from proteome analyses of programmed cell death and is available at http://celldeathproteomics.uio.no.
Examining "One Grant, One Loan." NASFAA Task Force Report
ERIC Educational Resources Information Center
National Association of Student Financial Aid Administrators, 2016
2016-01-01
Growing concern over the complexity of the federal financial aid system and a push toward simplification has led to increased attention toward streamlining the federal student aid programs. Specifically, several proposals and policy papers have recommended consolidating the federal aid programs into one grant program and one loan program, commonly…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... Management Program Power Marketing Initiative to the Boulder Canyon Project AGENCY: Western Area [email protected] . Information regarding Western's Boulder Canyon Project (BCP) Post-2017 remarketing efforts, the Energy Management and Planning Program (Program), and the Conformed General Consolidated Power...
China Report, Red Flag, Number 22, 16 November 1982.
1982-12-29
Congress Documents (pp 47-48) (XIN CHANGZHENG commentator); The Intellectuals Are a Force on Which the Modernization Program Depends (p 48) (Zhang De...) ...in a planned and methodical manner under leadership from top to bottom. The consolidation of the party will be linked with administrative structural reform... the work of consolidating the party should be linked with administrative structural reform and the reorganization of enterprises and institutions.
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Div. of Human Resources.
In response to the requirements of the Higher Education Amendments of 1986, this report addresses the impact of the two-year-old Student Loan Consolidation Program. Principal findings of the investigation concern the higher interest costs to the borrower that are brought about by longer payment plans and the fact that the government's subsidy costs…
Comprehensive School Alienation Program, Guidelines.
ERIC Educational Resources Information Center
Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.
This document presents guidelines developed by the Hawaii State Department of Education's Comprehensive School Alienation Program to consolidate and strengthen the delivery of services to alienated students. It is intended to assist district staff, school administrators, and project personnel in planning and implementing program activities and…
Fault detection and accommodation testing on an F100 engine in an F-15 airplane
NASA Technical Reports Server (NTRS)
Myers, L. P.; Baer-Riedhart, J. L.; Maxwell, M. D.
1985-01-01
The fault detection and accommodation (FDA) methodology for digital engine-control systems may range from simple comparisons of redundant parameters to the more complex and sophisticated observer models of the entire engine system. Evaluations of the various FDA schemes are done using analytical methods, simulation, and limited-altitude-facility testing. Flight testing of the FDA logic has been minimal because of the difficulty of inducing realistic faults in flight. A flight program was conducted to evaluate the fault detection and accommodation capability of a digital electronic engine control in an F-15 aircraft. The objective of the flight program was to induce selected faults and evaluate the resulting actions of the digital engine controller. Comparisons were made between the flight results and predictions. Several anomalies were found in flight and during the ground test. Simulation results showed that the inducement of dual pressure failures was not feasible since the FDA logic was not designed to accommodate these types of failures.
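The simplest FDA scheme named above, comparison of redundant parameters, can be sketched in a few lines; the threshold and the model-estimate tie-breaker are illustrative, not the F100 controller's logic:

    # Compare two redundant channels; on disagreement, keep whichever is
    # closer to a synthesized (model-based) estimate of the parameter.
    def select_parameter(chan_a, chan_b, model_est, tol):
        if abs(chan_a - chan_b) <= tol:
            return 0.5 * (chan_a + chan_b)      # channels agree: average
        return min((chan_a, chan_b), key=lambda v: abs(v - model_est))

    print(select_parameter(101.0, 100.4, model_est=100.0, tol=1.0))  # 100.7
    print(select_parameter(101.0, 140.0, model_est=100.0, tol=1.0))  # 101.0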
NASA Astrophysics Data System (ADS)
Nedorub, O. I.; Knapp, C. C.
2012-12-01
The tectonic history of the Eastern North American Margin (ENAM) incorporates two cycles of continental assembly, multiple pulses of orogeny, rifting, and post-rift geodynamic evolution. This is reflected in the heterogeneous lithosphere of the ENAM, which contains fault structures that originated in the Paleozoic to Mesozoic eras. The South Georgia Rift basin is probably the largest Mesozoic graben within its boundaries that is associated with the breakup of Pangea. It is composed of smaller sub-basins which appear to be bounded by high-angle normal faults, some of which may have been inverted in late Cretaceous and Cenozoic time. Paleozoic structures may have been reactivated in Cenozoic time as well. The ENAM is characterized by a N-NE maximum horizontal compressive stress direction. This maximum compressional stress field is sub-parallel to the strike of the Atlantic Coast province fault systems. The Camden, Augusta, Allendale, and Pen Branch faults are four of the many such reactivated faults along the southern part of the ENAM. These faults are now buried under 0-400 m of loosely consolidated Cretaceous and Cenozoic age sediments and thus are either only partially mapped or currently not recognized. The objectives of this study include mapping the subsurface expression and geometry of these faults and investigating the post-Cretaceous deformation and possible causes of fault reactivation on a passive margin. This study employs an integrated geophysical approach to investigate the upper 200 m at identified locations of the above-mentioned faults. 2-D high-resolution shallow seismic reflection and refraction methods, gravity surveys, GPR, 2-D electrical resistivity, and well data are used for analysis and interpretation. Preliminary results suggest that the Camden fault shows signs of Cenozoic reactivation through an approximately 30 m offset, NW side up, mainly along a steeply dipping fault zone at the basal contact of the Coastal Plain sediments with the Carolina Piedmont. Drill-hole and seismic data along the Augusta profile show that there is a significant offset (approximately 7 m), down to the SE, of the Pinehurst and older Cretaceous deposits. The Pen Branch fault seismic profile shows evidence of Cenozoic reactivation and inversion. The youngest discontinuous reflector (the top of the Dry Branch Formation) is offset by 1-4 m and constrains the latest fault movement to be Middle Eocene in age. A NW-SE well-derived cross-section across the Allendale fault shows that there is no significant offset above 50 m below sea level (top of the Late Eocene Black Mingo Group); however, a SW-NE cross-section shows an approximately 21 m offset, NE side up, across a newly postulated fault striking NW-SE. The top of the oldest undeformed formation (Middle Eocene Santee Limestone) and the top of the youngest deformed unit (Late Eocene Black Mingo Group) constrain a time frame for the latest deformation of the Coastal Plain sediments to between approximately 50 and 40 Ma. The results of this research provide an opportunity to address Cenozoic tectonism in South Carolina, advance the current understanding of the structure of the rift basins, and update the database used for the ongoing CO2 sequestration project, the local hydrology, and the Savannah River Site safety evaluation.
Data report: Compressibility, permeability, and grain size of shallow sediments, sites 1194 and 1198
Dugan, Brandon; Marone, Chris; Hong, Tiancong; Migyanka, Misty; Anselmett, Flavio S.; Isern, Alexandra R.; Blum, Peter; Betzler, Christian
2006-01-01
Uniaxial strain consolidation experiments were conducted to determine elastic and plastic properties and to estimate the permeability of sediments from 0 to 200 meters below seafloor at Ocean Drilling Program Sites 1194 and 1198. Plastic deformation is described by compression indices, which range from 0.19 to 0.37. Expansion indices, the elastic deformation measured during unload/reload cycles on samples, vary from 0.02 to 0.029. Consolidation experiments provide lower bounds on permeability between 5.4×10^-16 m^2 and 1.9×10^-18 m^2, depending on the consolidation state of the sample.
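For readers unfamiliar with the indices, a compression index is simply the slope of the void-ratio curve against the log of effective stress; a quick sketch with illustrative numbers (not taken from the data report):

    import numpy as np

    # Compression index Cc = -d(e)/d(log10 sigma') from two points on the
    # virgin consolidation curve; the inputs here are illustrative.
    sigma = np.array([0.4, 4.0])    # effective stress, MPa
    e = np.array([1.10, 0.82])      # void ratio
    Cc = -(e[1] - e[0]) / (np.log10(sigma[1]) - np.log10(sigma[0]))
    print(f"Cc = {Cc:.2f}")         # 0.28, within the reported 0.19-0.37 range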
Development of a dynamic coupled hydro-geomechanical code and its application to induced seismicity
NASA Astrophysics Data System (ADS)
Miah, Md Mamun
This research describes the importance of hydro-geomechanical coupling in the geologic subsurface environment arising from fluid injection at geothermal plants, large-scale geological CO2 sequestration for climate mitigation, enhanced oil recovery, and hydraulic fracturing during well construction in the oil and gas industries. A sequential computational code is developed to capture the multiphysics interaction by linking the flow simulation code TOUGH2 and the geomechanics modeling code PyLith. The numerical formulation of each code is discussed to demonstrate its modeling capabilities. The computational framework involves sequential coupling and the solution of two sub-problems: fluid flow through fractured and porous media, and reservoir geomechanics. For each time step of the flow calculation, the pressure field is passed to the geomechanics code to compute the effective stress field and fault slip. A simplified permeability model is implemented in the code that accounts for the permeability of porous and saturated rocks subject to confining stresses. The accuracy of the TOUGH-PyLith coupled simulator is tested by simulating Terzaghi's 1D consolidation problem. The coupled poroelasticity modeling capability is validated by benchmarking against Mandel's problem. The code is used to simulate both quasi-static and dynamic earthquake nucleation and slip distribution on a fault from the combined effect of far-field tectonic loading and fluid injection, using an appropriate fault constitutive friction model. Results from the quasi-static induced earthquake simulations show a delayed response in earthquake nucleation. This is attributed to the increased total stress in the domain and to not accounting for pressure on the fault. This issue is resolved in the final chapter by simulating a single-event earthquake dynamic rupture. Simulation results show that fluid pressure has a positive effect on slip nucleation and subsequent crack propagation. This is confirmed by a sensitivity analysis showing that an increase in injection well distance results in delayed slip nucleation and rupture propagation on the fault.
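The sequential coupling described here reduces to a simple loop: a flow step produces pressure, the geomechanics step turns it into effective stress and a slip check, and a stress-dependent permeability closes the loop. A toy sketch with stand-in solvers; the real TOUGH2/PyLith interfaces are far more involved:

    import math
    from dataclasses import dataclass

    @dataclass
    class State:
        pressure: float = 10.0        # pore pressure, MPa
        stress: float = 30.0          # total mean stress, MPa
        permeability: float = 1e-15   # m^2

    def flow_step(s, dt):             # stand-in for the TOUGH2 solve
        s.pressure += 0.1 * dt        # injection raises pore pressure
        return s.pressure

    def geomech_step(s, p):           # stand-in for the PyLith solve
        sigma_eff = s.stress - p      # Terzaghi effective stress
        slipped = sigma_eff < 12.0    # toy Coulomb-style slip criterion
        return sigma_eff, slipped

    s, t, dt = State(), 0.0, 1.0
    while t < 100.0:
        p = flow_step(s, dt)
        sigma_eff, slipped = geomech_step(s, p)
        s.permeability = 1e-15 * math.exp(-0.05 * sigma_eff)  # toy stress law
        t += dt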
Stress-sensitivity of The Hydraulic Properties of A Fault Gouge
NASA Astrophysics Data System (ADS)
Harrington, J. F.; Horseman, S. T.; Hama, K.; Metcalfe, R.
The Tono Mine is located about 350 km southwest of Tokyo and is the site of the most extensive uranium deposits in Japan. The geological setting comprises Tertiary (Mizunami Group) sedimentary rocks overlying Cretaceous granitic basement rocks. In ascending order, the sedimentary rocks are the Toki Lignite-bearing Formation (conglomerate, interbedded sandstone and mudstone), the Akeyo Formation (tuffaceous sandstone) and the Oidawara Formation (siltstone and mudstone). The Tsukiyoshi Fault cuts through this sequence and is a reverse fault, dipping to the south at 60-70 degrees, with a throw of about 30 metres. As part of its hydrogeological studies, JNC is evaluating the impact of the fault on groundwater flow in the Tertiary sediments. A sample was taken from a borehole in the NATM Drift, where the fault zone contains gouge material with two clay-bearing layers around 2 to 3 cm thick, separated by a 10 to 20 cm thick layer of unconsolidated fine sandy material. The sample was obtained using a triple-tube core barrel fitted with a split sample tube and a diamond bit. A specimen was prepared and consolidated at successive effective stress levels of 2, 6 and 12 MPa. The plot of void ratio against the logarithm of effective stress was found to be essentially linear, with a negative slope, kappa, of 0.036 rising to 0.044 at higher stress levels. The evidence suggests that the gouge is overconsolidated. Hydraulic conductivity and specific storage were also measured at each stress level using the constant flow rate method. Hydraulic conductivity was found to be strongly stress sensitive, falling from 1.84×10^-12 m/s at 2 MPa to 7.9×10^-14 m/s at 12 MPa. Specific storage values were analysed using the critical state soil mechanics approach assuming a stress-dependent pore compressibility. Reasonable agreement was found between the theoretical curve with kappa = 0.036 and the measured values.
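Both reported relationships are two-point log fits and can be checked directly; the void ratios below are illustrative, chosen only to reproduce the quoted kappa:

    import numpy as np

    # Stress-sensitive hydraulic conductivity: log-linear interpolation
    # between the two reported points.
    sigma = np.array([2.0, 12.0])           # effective stress, MPa
    K = np.array([1.84e-12, 7.9e-14])       # hydraulic conductivity, m/s
    b, a = np.polyfit(np.log10(sigma), np.log10(K), 1)
    K_6 = 10 ** (a + b * np.log10(6.0))     # interpolated K at 6 MPa

    # kappa = -d(e)/d(ln sigma'); the void ratios here are illustrative.
    e = np.array([0.600, 0.535])
    kappa = -(e[1] - e[0]) / (np.log(sigma[1]) - np.log(sigma[0]))
    print(f"kappa = {kappa:.3f}")           # ~0.036, as reported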
Fault tolerant architectures for integrated aircraft electronics systems, task 2
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Melliar-Smith, P. M.; Schwartz, R. L.
1984-01-01
The architectural basis for an advanced fault tolerant on-board computer to succeed the current generation of fault tolerant computers is examined. The network error tolerant system architecture is studied with particular attention to intercluster configurations and communication protocols, and to refined reliability estimates. The diagnosis of faults, so that appropriate choices for reconfiguration can be made, is discussed. The analysis relates particularly to the recognition of transient faults in a system with tasks at many levels of priority. The demand-driven data-flow architecture, which appears to have possible application in fault tolerant systems, is described, and work investigating the feasibility of automatic generation of aircraft flight control programs from abstract specifications is reported.
Detailed Geophysical Fault Characterization in Yucca Flat, Nevada Test Site, Nevada
Asch, Theodore H.; Sweetkind, Donald S.; Burton, Bethany L.; Wallin, Erin L.
2009-01-01
Yucca Flat is a topographic and structural basin in the northeastern part of the Nevada Test Site (NTS) in Nye County, Nevada. Between the years 1951 and 1992, 659 underground nuclear tests took place in Yucca Flat; most were conducted in large, vertical excavations that penetrated alluvium and the underlying Cenozoic volcanic rocks. Radioactive and other potential chemical contaminants at the NTS are the subject of a long-term program of investigation and remediation by the U.S. Department of Energy (DOE), National Nuclear Security Administration, Nevada Site Office, under its Environmental Restoration Program. As part of the program, the DOE seeks to assess the extent of contamination and to evaluate the potential risks to humans and the environment from byproducts of weapons testing. To accomplish this objective, the DOE Environmental Restoration Program is constructing and calibrating a ground-water flow model to predict hydrologic flow in Yucca Flat as part of an effort to quantify the subsurface hydrology of the Nevada Test Site. A necessary part of calibrating and evaluating a model of the flow system is an understanding of the location and characteristics of faults that may influence ground-water flow. In addition, knowledge of fault-zone architecture and physical properties is a fundamental component of the containment of the contamination from underground nuclear tests, should such testing ever resume at the Nevada Test Site. The goal of the present investigation is to develop a detailed understanding of the geometry and physical properties of fault zones in Yucca Flat. This study was designed to investigate faults in greater detail and to characterize fault geometry, the presence of fault splays, and the fault-zone width. Integrated geological and geophysical studies have been designed and implemented to work toward this goal. This report describes the geophysical surveys conducted near two drill holes in Yucca Flat, the data analyses performed, and the integrated interpretations developed from the suite of geophysical methodologies utilized in this investigation. Data collection for this activity started in the spring of 2005 and continued into 2006. A suite of electrical geophysical surveys was run in combination with ground magnetic surveys; these surveys resulted in high-resolution subsurface data that portray subsurface fault geometry at the two sites and have identified structures not readily apparent from surface geologic mapping, potential-field geophysical data, or surface-effects fracture maps.
Remotely-triggered Slip in Mexico City Induced by the September 2017 Mw=7.1 Puebla Earthquake.
NASA Astrophysics Data System (ADS)
Solano Rojas, D. E.; Havazli, E.; Cabral-Cano, E.; Wdowinski, S.
2017-12-01
Although the epicenter of the September 19th, 2017 Mw=7.1 Puebla earthquake is located 100 km from Mexico City, the earthquake caused severe destruction in the city, leading to loss of life and property damage. Mexico City is built on a thick clay-rich sedimentary sequence and, hence, is susceptible to seismic acceleration during earthquakes. The sediment layer also causes land subsidence, at rates as high as 350 mm/yr, and surface faulting. The earthquake damage in the eastern part of the city, characterized by the collapse of several buildings, can be explained by seismic amplification. However, the damage in the southern part of the city, characterized by the collapse of small houses and surface faulting, requires a different explanation. We present here geodetic observations suggesting that the surface faulting in Mexico City triggered by the Puebla earthquake occurred in areas already experiencing differential displacements. Our study is based on Sentinel-1A satellite data from before and after the earthquake (September 17th and 29th, 2017). We process the data using Interferometric Synthetic Aperture Radar (InSAR) to produce a coseismic interferogram. We also identify phase discontinuities that can be interpreted as surface faulting using the phase gradient technique (Price and Sandwell, 1998). The results of our analysis reveal the locations and patterns of coseismic phase discontinuities, mainly in the piedmont of the Sierra de Santa Catarina, which agree with the location of earthquake damage reported by official and unofficial sources (GCDMX, 2017; OSM, 2017). The observed phase discontinuities also agree well with the location of preexisting, subsidence-related faults identified during 10 years of field surveys (GCDMX, 2017) and coincide with differential displacements identified using a Fast Fourier Transform residual technique on high-resolution InSAR results from 2012 (Solano-Rojas et al., 2017). We propose that the seismic energy released by the 2017 Mw=7.1 Puebla earthquake induced fast soil consolidation, which remotely triggered slip on the preexisting subsidence-related faults. The slip observed during this earthquake represents a hazard that needs to be considered in future urban development plans of Mexico City.
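The phase gradient technique mentioned above works on the wrapped interferogram, so no unwrapping is needed: differencing neighboring complex samples and taking the angle gives the local phase gradient, and lineaments of anomalously large gradient mark candidate surface faulting. A sketch on synthetic data:

    import numpy as np

    def phase_gradient(igram):
        # wrapped phase difference between neighboring pixels (rad/pixel)
        dx = np.angle(igram[:, 1:] * np.conj(igram[:, :-1]))
        dy = np.angle(igram[1:, :] * np.conj(igram[:-1, :]))
        return dx, dy

    # synthetic unit-amplitude interferogram with a phase step (a "fault")
    phase = np.zeros((100, 100))
    phase[:, 50:] += 2.0                      # discontinuity at column 50
    igram = np.exp(1j * phase)
    dx, _ = phase_gradient(igram)
    print(np.argwhere(np.abs(dx) > 1.0)[:3])  # flags the column-50 step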
NASA Technical Reports Server (NTRS)
Wood, M. E.
1980-01-01
Four-wire, Wye-connected AC power systems exhibit peculiar steady-state fault characteristics when the fourth wire of three-phase induction motors is connected. The loss of one phase of the power source due to a series or shunt fault results in currents higher than anticipated on the remaining two phases. A theoretical approach to compute the fault currents and voltages is developed. A FORTRAN program is included in the appendix.
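A toy version of the computation, for a static impedance load rather than an induction motor (motor back-EMF driving the open phase is what makes the real currents higher, and is what the report models): solve the single nodal equation at the load neutral with phase A open.

    import numpy as np

    V = 120.0 * np.exp(1j * np.deg2rad(np.array([0.0, -120.0, 120.0])))
    Zload = 10.0 + 2.0j       # per-phase load impedance, ohms
    Zn = 0.5 + 0.0j           # neutral (fourth wire) impedance, ohms

    # Phase A open: KCL at the load neutral over the live phases B and C,
    # sum((V_k - Vn)/Zload) = Vn/Zn, solved for the neutral-shift voltage Vn.
    live = [1, 2]
    Vn = (sum(V[k] for k in live) / Zload) / (len(live) / Zload + 1.0 / Zn)
    I = [(V[k] - Vn) / Zload for k in live]   # remaining phase currents
    In = Vn / Zn                              # neutral current
    print([round(abs(i), 2) for i in I], round(abs(In), 2))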
Morrow, C.A.; Byerlee, J.D.
1989-01-01
Transient strength changes are observed in fault gouge materials when the velocity of shearing is varied. A transient stress peak is produced when the strain rate in the gouge is suddenly increased, whereas a transient stress drop results from a sudden change to a slower strain rate. We have studied the mechanism responsible for these observations by performing frictional sliding experiments on sawcut granite samples filled with a layer of several different fault gouge types. Changes in pore volume and strength were monitored as the sliding velocity alternated between fast and slow rates. Pore volume increased at the faster strain rate, indicating a dilation of the gouge layer, whereas volume decreased at the slower rate, indicating compaction. These results verify that gouge dilation is a function of strain rate. Pore volume changed until an equilibrium void ratio of the granular material was reached for a particular rate of strain. Using arguments from soil mechanics, we find that the dense gouge was initially overconsolidated relative to the equilibrium level, whereas the loose gouge was initially underconsolidated relative to this level. Therefore, the transient stress behavior must be due to the overconsolidated state of the gouge at the new rate when the velocity is increased and to the underconsolidated state when the velocity is lowered. Time-dependent compaction was also shown to cause a transient stress response similar to the velocity-dependent behavior. This may be important in natural fault gouges as they become consolidated and stronger with time. In addition, the strain hardening of the gouge during shearing was found to be a function of velocity, rendering it difficult to quantify the change in equilibrium shear stress when velocity is varied under certain conditions.
Mercer County (N.J.) Coordination/Consolidation Demonstration Program
DOT National Transportation Integrated Search
1982-03-01
From November 1977 through June 1981, Mercer County in New Jersey, was the site of an Urban Mass Transportation Administration Service and Methods Demonstration, which coordinated human service agency transportation programs. The Mercer County Coordi...
ECDA of Cased Pipeline Segments
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56-07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on pipeline safety program. Although, not...
Effective Chapter 1 Programs in Oregon.
ERIC Educational Resources Information Center
Berrum, Phyllis
This report describes 11 effective compensatory education programs in Oregon schools funded under Chapter 1 of the Education Consolidation and Improvement Act. One high school, four middle school, and six elementary school programs are profiled. Each profile includes the following information: (1) demographics; (2) staffing; (3) parent…
Care 3 phase 2 report, maintenance manual
NASA Technical Reports Server (NTRS)
Bryant, L. A.; Stiffler, J. J.
1982-01-01
CARE 3 (Computer-Aided Reliability Estimation, version three) is a computer program designed to help estimate the reliability of complex, redundant systems. Although the program can model a wide variety of redundant structures, it was developed specifically for fault-tolerant avionics systems: systems distinguished by the need for extremely reliable performance, since a system failure could well result in the loss of human life. It substantially generalizes the class of redundant configurations that can be accommodated, and includes a coverage model to determine the various coverage probabilities as a function of the applicable fault recovery mechanisms (detection delay, diagnostic scheduling interval, isolation and recovery delay, etc.). CARE 3 further generalizes the class of system structures that can be modeled and greatly expands the coverage model to take into account such effects as intermittent and transient faults, latent faults, error propagation, etc.
Networks consolidation program: Maintenance and Operations (M&O) staffing estimates
NASA Technical Reports Server (NTRS)
Goodwin, J. P.
1981-01-01
The Mark IV-A consolidates deep space and highly elliptical Earth orbiter (HEEO) mission tracking and implements centralized control and monitoring at the deep space communications complexes (DSCC). One of the objectives of the network design is to reduce maintenance and operations (M&O) costs. To determine if the system design meets this objective, an M&O staffing model for Goldstone was developed and used to estimate the staffing levels required to support the Mark IV-A configuration. The study was performed for the Goldstone complex, and the program office translated these estimates to the overseas complexes to derive the network estimates.
Fault diagnostic instrumentation design for environmental control and life support systems
NASA Technical Reports Server (NTRS)
Yang, P. Y.; You, K. C.; Wynveen, R. A.; Powell, J. D., Jr.
1979-01-01
As a development phase moves toward flight hardware, system availability becomes an important design aspect, requiring high reliability and maintainability. As part of continuous development efforts, a program to evaluate, design, and demonstrate advanced instrumentation fault diagnostics was successfully completed. Fault tolerance designs for reliability and other instrumentation capabilities to increase maintainability were evaluated and studied.
NASA Technical Reports Server (NTRS)
Landano, M. R.; Easter, R. W.
1984-01-01
Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.
The Realization of Drilling Fault Diagnosis Based on Hybrid Programming with Matlab and VB
NASA Astrophysics Data System (ADS)
Wang, Jiangping; Hu, Yingcai
This paper presents a method using hybrid programming with Matlab and VB, based on ActiveX, to design a drilling accident prediction and diagnosis system, so that the powerful computational and graphical display functions of Matlab and the visual development interface of VB are fully combined. The main interface of the diagnosis system is written in VB, and the analysis and fault diagnosis are implemented with neural network toolboxes in Matlab. The system has a favorable interactive interface, and fault example validation shows that the diagnosis results are feasible and can meet the demands of drilling accident prediction and diagnosis.
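The same ActiveX (COM Automation) pattern can be sketched from Python in place of VB. "Matlab.Application" is Matlab's COM server ProgID and Execute/GetWorkspaceData are its documented automation methods, but release-dependent details vary, so treat this as an assumption-laden sketch (Windows, Matlab, and the pywin32 package are required; the paper's neural-network analysis would additionally need the relevant Matlab toolbox):

    import win32com.client

    # Attach to Matlab's COM Automation server and delegate the numerics
    # to Matlab, keeping the user interface in the host language.
    matlab = win32com.client.Dispatch("Matlab.Application")
    matlab.Execute("x = magic(4); s = sum(x(:));")
    s = matlab.GetWorkspaceData("s", "base")   # pull the result back
    print(s)                                   # 136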
[The strict sense nursing postgraduation in Brazil: advances and perspectives].
Scochi, Carmen Gracinda Silvan; Munari, Denize Bouttelet; Gelbcke, Francine Lima; Erdmann, Alacoque Lorenzini; de Gutiérrez, Maria Gaby Rivero; Rodrigues, Rosalina Aparecida Partezani
2013-09-01
Nursing is a specific field of knowledge and social practice that has been consolidated and strengthened as a science. In Brazil, it has developed due to the growth and qualification of stricto sensu post-graduate programs. This study aims to present a historical review of the stricto sensu post-graduate nursing courses in Brazil and to reflect on their evolution, progress, challenges and future perspectives. It explores the creation of stricto sensu post-graduate courses, highlighting the movement to build a culture of academic and professional post-graduation in nursing. The historical path of their consolidation, expansion, achievement of excellence and international visibility over four decades is presented, along with the challenges and future perspectives. The post-graduate programs in the field have contributed to the advancement and consolidation of scientific and technological knowledge and innovation in nursing and health care, having as their philosophy respect for diversity and the free exchange of ideas, the improvement of quality of life and health, and the effectiveness of citizenship.
Coastal Inlets Research Program. Barrier Island Migration Over a Consolidating Substrate
2009-09-01
the toe of the dune to the high water line) for full development of eolian transport. However, the original Shore Protection Manual (1984...tested. Barrier islands overlying a compressible substrate are more likely to have reduced dune elevations due to consolidation, incur overall...migration when the dune reaches a critical elevation with respect to the prevalent storm conditions. Initial large-scale infusion of sand from an...
E.C.I.A. Chapter 1, Part B, Institutionalized Facilities Program, 1989-90. OREA Report.
ERIC Educational Resources Information Center
Miller, Ronald C.
This report evaluates a program funded under the Education Consolidation and Improvement Act (ECIA), Chapter 1, Part B, in New York (New York). The Program for Neglected and Delinquent Children, District 75/Citywide Institutionalized Facilities Program provides after-school supplementary instruction in prevocational skills, activities of daily…
ERIC Educational Resources Information Center
Turner, W. E.; Riley, Gerald
Education Consolidation Improvement Act (ECIA) Chapter 1 programs were conducted in 29 elementary schools in the Wichita (Kansas) Public Schools during the 1983-1984 school year. Major programs were corrective reading, mathematics, and prekindergarten. Smaller programs for children in neglected and delinquent institutions were conducted. A reduced…
Long-Term Pavement Performance Automated Faulting Measurement
DOT National Transportation Integrated Search
2015-02-01
This study focused on identifying transverse joint locations on jointed plain concrete pavements using an automated joint detection algorithm and computing faulting at these locations using Long-Term Pavement Performance (LTPP) Program profile data c...
Consolidated principles for screening based on a systematic review and consensus process.
Dobrow, Mark J; Hagens, Victoria; Chafe, Roger; Sullivan, Terrence; Rabeneck, Linda
2018-04-09
In 1968, Wilson and Jungner published 10 principles of screening that often represent the de facto starting point for screening decisions today; 50 years on, are these principles still the right ones? Our objectives were to review published work that presents principles for population-based screening decisions since Wilson and Jungner's seminal publication, and to conduct a Delphi consensus process to assess the review results. We conducted a systematic review and modified Delphi consensus process. We searched multiple databases for articles published in English in 1968 or later that were intended to guide population-based screening decisions, described development and modification of principles, and presented principles as a set or list. Identified sets were compared for basic characteristics (e.g., number, categorization), a citation analysis was conducted, and principles were iteratively synthesized and consolidated into categories to assess evolution. Participants in the consensus process assessed the level of agreement with the importance and interpretability of the consolidated screening principles. We identified 41 sets and 367 unique principles. Each unique principle was coded to 12 consolidated decision principles that were further categorized as disease/condition, test/intervention or program/system principles. Program or system issues were the focus of 3 of Wilson and Jungner's 10 principles, but comprised almost half of all unique principles identified in the review. The 12 consolidated principles were assessed through 2 rounds of the consensus process, leading to specific refinements to improve their relevance and interpretability. No gaps or missing principles were identified. Wilson and Jungner's principles are remarkably enduring, but increasingly reflect a truncated version of contemporary thinking on screening that does not fully capture subsequent focus on program or system principles. Ultimately, this review and consensus process provides a comprehensive and iterative modernization of guidance to inform population-based screening decisions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION MINIMUM ACADEMIC STANDARDS FOR THE BASIC EDUCATION OF INDIAN CHILDREN AND NATIONAL CRITERIA FOR DORMITORY SITUATIONS Homeliving Programs Waivers and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... predicted topological properties of superconductors in two dimensions, to program fundamental couplings at near-atomic...
Air Traffic Control: Remote Radar For Grand Junction
DOT National Transportation Integrated Search
1996-11-01
In 1983, the Federal Aviation Administration (FAA) began a nationwide program : of consolidating air traffic control facilities to gain the benefits of : automation and any attendant cost savings. As part of this program, FAA : conducted several stud...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 23 Highways 1 2010-04-01 2010-04-01 false Purpose. 230.401 Section 230.401 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CIVIL RIGHTS EXTERNAL PROGRAMS Construction Contract... contract compliance program, including compliance reviews, consolidated compliance reviews, and the...
NASA Technical Reports Server (NTRS)
Rogers, William H.
1993-01-01
In rare instances, flight crews of commercial aircraft must manage complex systems faults in addition to all their normal flight tasks. Pilot errors in fault management have been attributed, at least in part, to an incomplete or inaccurate awareness of the fault situation. The current study is part of a program aimed at assuring that the types of information potentially available from an intelligent fault management aiding concept developed at NASA Langley called 'Faultfinder' (see Abbott, Schutte, Palmer, and Ricks, 1987) are an asset rather than a liability: additional information should improve pilot performance and aircraft safety, but it should not confuse, distract, overload, mislead, or generally exacerbate already difficult circumstances.
NASA Technical Reports Server (NTRS)
Lee, Charles; Alena, Richard L.; Robinson, Peter
2004-01-01
Starting from an ISS fault-tree example, we migrated to decision trees and present a method to convert fault trees into decision trees. The method shows that visualizing the root cause of a fault becomes easier and that manipulating the tree becomes more programmatic via available decision-tree programs. The decision-tree visualization used for diagnostics is straightforward and easy to understand. For ISS real-time fault diagnosis, the status of the systems can be shown by passing the signals through the trees and seeing where traversal stops. Another advantage of decision trees is that they can learn fault patterns and predict future faults from historical data. The learning need not be limited to static data sets: by accumulating real-time data sets, the decision trees can gain and store fault patterns and recognize them when they recur.
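A minimal sketch of the conversion target: represent the tree as nested AND/OR gates over monitored signals and walk it to see where evaluation stops. The gate structure below is invented for illustration, not the ISS example from the paper:

    # Fault tree as nested tuples: ("AND"|"OR", child, child, ...); leaves
    # are signal names. root_causes returns the failing basic events.
    FAULT_TREE = ("OR",
                  ("AND", "pump_A_fail", "pump_B_fail"),  # loss of flow
                  "controller_fail")

    def root_causes(node, signals):
        if isinstance(node, str):                  # basic event (leaf)
            return [node] if signals.get(node) else []
        op, *children = node
        per_child = [root_causes(c, signals) for c in children]
        if op == "AND" and not all(per_child):
            return []                              # AND gate not satisfied
        return [x for sub in per_child for x in sub]

    signals = {"pump_A_fail": True, "pump_B_fail": True}
    print(root_causes(FAULT_TREE, signals))  # ['pump_A_fail', 'pump_B_fail']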
NASA Technical Reports Server (NTRS)
Shaver, Charles; Williamson, Michael
1986-01-01
The NASA Ames Research Center sponsors a research program for the investigation of Intelligent Flight Control Actuation systems. The use of artificial intelligence techniques in conjunction with algorithmic techniques for autonomous, decentralized fault management of flight-control actuation systems is explored under this program. The design, development, and operation of the interface for laboratory investigation under this program are documented. The interface, architecturally based on the Intel 8751 microcontroller, is an interrupt-driven system designed to receive a digital message from an ultrareliable fault-tolerant control system (UFTCS). The interface links the UFTCS to an electronic servo-control unit, which controls a set of hydraulic actuators. It was necessary to build a UFTCS emulator (also based on the Intel 8751) to provide signal sources for testing the equipment.
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.
2009-01-01
As part of the FDIR (Fault Detection, Isolation, and Recovery) Project for the Constellation Program, a task called the Legacy Benchmarking Task was designed to document, as accurately as possible, the FDIR processes and resources used by the Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.
NASA Astrophysics Data System (ADS)
Sahin, Sefa; Yildirim, Cengiz; Akif Sarikaya, Mehmet; Tuysuz, Okan; Genc, S. Can; Ersen Aksoy, Murat; Ertekin Doksanalti, Mustafa
2016-04-01
Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks as they interact with cosmic rays. By modelling measured 36Cl concentrations, we can obtain information about the history of earthquake activity. However, several factors may affect the production of rare nuclides, such as the geometry of the fault, the topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover, and the denudation rate on the scarp. Recently developed models provide a method to infer the timing of earthquakes and slip rates on limited scales by taking these parameters into account. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in southwestern Anatolia and contains several normal fault scarps formed within limestone, which are appropriate for cosmogenic chlorine-36 (36Cl) dating models. Since it has a well-preserved scarp, we focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and a height of 12-15 m. 128 continuous samples were collected from top to bottom of the fault scarp for analysis of cosmogenic 36Cl concentrations. The main purpose of this study is to analyze the factors affecting the production rates and the amount of cosmogenic 36Cl accumulated. The 36Cl concentrations are measured by AMS laboratories. From the local production rates and the measured concentrations, we can calculate exposure ages of the samples. Recent research has elucidated each step of the application of this method in the Matlab programming language (e.g. Schlagenhauf et al., 2010), which is vitally helpful for generating models of the Quaternary activity of normal faults. We, however, wanted to build a user-friendly program in the open-source programming language R (GNU Project) that can help those without knowledge of complex mathematical programming, making the calculations as easy and understandable as possible. Through our code, the physical parameters, statistical analyses, and graphics of the fault models can be generated on any platform. This project is supported by the Scientific and Technological Research Council of Turkey (TUBITAK, Grant number: 113Y436). This study was conducted under Decision of the Council of Ministers No. 2013/5387, dated 30.09.2013, with the permission of the Knidos Presidency of Excavation, within the scope of the Knidos Excavation and Research carried out on behalf of Selcuk University and the Ministry of Culture and Tourism. Keywords: Knidos, geomorphology, modelling, cosmogenic surface exposure dating, chlorine-36
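For a scarp segment with no erosion or inheritance, the exposure age follows from the standard nuclide build-up equation N = (P/lambda)(1 - e^(-lambda*t)); a sketch with illustrative production and concentration values (the full Schlagenhauf-style models add geometry, shielding, and slip-history terms):

    import numpy as np

    lam = np.log(2) / 3.01e5    # 36Cl decay constant, 1/yr (t1/2 ~ 301 kyr)
    P = 20.0                    # local production rate, atoms/(g yr), assumed
    N = 1.5e5                   # measured concentration, atoms/g, assumed

    t = -np.log(1.0 - N * lam / P) / lam   # invert the build-up equation
    print(f"exposure age ~ {t:.0f} yr")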
Model-Based Fault Tolerant Control
NASA Technical Reports Server (NTRS)
Kumar, Aditya; Viassolo, Daniel
2008-01-01
The Model Based Fault Tolerant Control (MBFTC) task was conducted under the NASA Aviation Safety and Security Program. The goal of MBFTC is to develop and demonstrate real-time strategies to diagnose and accommodate anomalous aircraft engine events such as sensor faults, actuator faults, or turbine gas-path component damage that can lead to in-flight shutdowns, aborted takeoffs, asymmetric thrust/loss of thrust control, or engine surge/stall events. A suite of model-based fault detection algorithms was developed and evaluated. Based on the performance and maturity of the developed algorithms, two approaches were selected for further analysis: (i) multiple-hypothesis testing, and (ii) neural networks; both used residuals from an Extended Kalman Filter to detect the occurrence of the selected faults. A simple fusion algorithm was implemented to combine the results from each algorithm to obtain an overall estimate of the identified fault type and magnitude. The identification of the fault type and magnitude enabled the use of an online fault accommodation strategy to correct for the adverse impact of these faults on engine operability, thereby enabling continued engine operation in their presence. The performance of the fault detection and accommodation algorithm was extensively tested in a simulation environment.
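The residual-based detection common to both selected approaches can be sketched as a chi-square test on the Kalman filter innovations; the vectors below are illustrative, standing in for the engine model's filter outputs:

    import numpy as np

    # Normalized innovation squared (NIS) is chi-square distributed under
    # the no-fault hypothesis; exceeding a quantile threshold flags a fault.
    def fault_flag(r, S, threshold):
        nis = float(r @ np.linalg.solve(S, r))
        return nis > threshold

    r = np.array([0.8, -1.9])                 # innovation (residual) vector
    S = np.array([[0.5, 0.1], [0.1, 0.4]])    # innovation covariance
    print(fault_flag(r, S, threshold=9.21))   # 9.21 ~ chi2(2) 99% point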
Adaptive Control Allocation for Fault Tolerant Overactuated Autonomous Vehicles
2007-11-01
Casavola, A.; Garone, E. (2007). Adaptive Control Allocation for Fault Tolerant Overactuated Autonomous Vehicles. RTO-MP-AVT-145, UNCLASSIFIED/UNLIMITED. Control allocation problem (CAP): given a virtual input v(t...
Reliability model derivation of a fault-tolerant, dual, spare-switching, digital computer system
NASA Technical Reports Server (NTRS)
1974-01-01
A computer based reliability projection aid, tailored specifically for application in the design of fault-tolerant computer systems, is described. Its more pronounced characteristics include the facility for modeling systems with two distinct operational modes, measuring the effect of both permanent and transient faults, and calculating conditional system coverage factors. The underlying conceptual principles, mathematical models, and computer program implementation are presented.
Computer-Aided Reliability Estimation
NASA Technical Reports Server (NTRS)
Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.
1986-01-01
CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate the reliability of complex, redundant, fault-tolerant systems. The program was specifically designed for the evaluation of fault-tolerant avionics systems; however, CARE III is general enough for use in evaluating other systems as well.
An experiment in software reliability
NASA Technical Reports Server (NTRS)
Dunham, J. R.; Pierce, J. L.
1986-01-01
The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
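Repetitive-run modeling in miniature: rerun each faulty version on many random inputs, estimate a per-fault failure probability, and compare the rates on a log scale, where the experiment's approximate log-linear pattern would appear as roughly even spacing. The rates below are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    true_rates = [3e-1, 3e-2, 3e-3]     # illustrative per-run fault rates
    n = 10_000                          # repetitive runs per fault
    estimates = [rng.binomial(n, p) / n for p in true_rates]
    print(np.log10(np.maximum(estimates, 1.0 / n)))  # ~evenly spaced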
Australian Personal Enrichment Education and Training Programs. Statistics 1996. An Overview.
ERIC Educational Resources Information Center
National Centre for Vocational Education Research, Leabrook (Australia).
This publication presents a consolidated national picture of activity in recreation, leisure, and personal enrichment programs in Australia. It also details highlights, key features, and characteristics of activity in personal enrichment programs in 1996. Information has been collected from two main training provider groups: adult community…
PAWS/STEM - PADE APPROXIMATION WITH SCALING AND SCALED TAYLOR EXPONENTIAL MATRIX (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Butler, R. W.
1994-01-01
Traditional fault-tree techniques for analyzing the reliability of large, complex systems fail to model the dynamic reconfiguration capabilities of modern computer systems. Markov models, on the other hand, can describe fault-recovery (via system reconfiguration) as well as fault-occurrence. The Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs provide a flexible, user-friendly, language-based interface for the creation and evaluation of Markov models describing the behavior of fault-tolerant reconfigurable computer systems. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. The calculation of the probability of entering a death state of a Markov model (representing system failure) requires the solution of a set of coupled differential equations. Because of the large disparity between the rates of fault arrivals and system recoveries, Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized in different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set consists of the SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923); the PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs that interpret the same SURE language but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. PAWS/STEM was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The package is written in PASCAL, ANSI-compliant C, and FORTRAN 77. The standard distribution medium for the VMS version of PAWS/STEM (LAR-14165) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of PAWS/STEM (LAR-14920) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. PAWS/STEM was developed in 1989 and last updated in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
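As a concrete illustration of the computation such tools automate (a sketch, not PAWS/STEM itself), the snippet below solves a tiny three-state reconfiguration model for the probability of entering its death state via the matrix exponential; the rates are invented to exhibit the stiffness the abstract mentions, and SciPy's expm is itself Pade-based, in the spirit of PAWS:

```python
# Probability of system failure from a stiff Markov model: dp/dt = p Q.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 3.6e5   # fault rate vs. recovery rate (per hour): very stiff
# States: 0 = all good, 1 = fault pending recovery, 2 = failure (absorbing)
Q = np.array([[-2 * lam,  2 * lam,     0.0],
              [      mu, -(mu + lam),  lam],
              [     0.0,       0.0,    0.0]])

p0 = np.array([1.0, 0.0, 0.0])   # start fully operational
T = 10.0                         # mission time, hours
p = p0 @ expm(Q * T)             # transient state probabilities at time T
print(f"P(system failure by T) ~ {p[2]:.3e}")
```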
PAWS/STEM - PADE APPROXIMATION WITH SCALING AND SCALED TAYLOR EXPONENTIAL MATRIX (SUN VERSION)
NASA Technical Reports Server (NTRS)
Butler, R. W.
1994-01-01
Traditional fault-tree techniques for analyzing the reliability of large, complex systems fail to model the dynamic reconfiguration capabilities of modern computer systems. Markov models, on the other hand, can describe fault-recovery (via system reconfiguration) as well as fault-occurrence. The Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs provide a flexible, user-friendly, language-based interface for the creation and evaluation of Markov models describing the behavior of fault-tolerant reconfigurable computer systems. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. The calculation of the probability of entering a death state of a Markov model (representing system failure) requires the solution of a set of coupled differential equations. Because of the large disparity between the rates of fault arrivals and system recoveries, Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized in different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set consists of the SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923); the PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs that interpret the same SURE language but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. PAWS/STEM was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The package is written in PASCAL, ANSI-compliant C, and FORTRAN 77. The standard distribution medium for the VMS version of PAWS/STEM (LAR-14165) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of PAWS/STEM (LAR-14920) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. PAWS/STEM was developed in 1989 and last updated in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy
2009-01-01
Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
NASA Technical Reports Server (NTRS)
Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven
1994-01-01
The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service for flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by the Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. It contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, architecture, hardware design, operating systems design, systems performance measurements, and analytical models.
CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure-space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between loops in the graphs. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single-point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same techniques as the fault tree cut set code, except that it includes all upstream digraph nodes in the cut sets for a given node and checks for cycles in the digraph during the solution process. CUTSETS solves for specified nodes and will not automatically solve for all upstream digraph nodes. The cut sets will be output as a text file. CUTSETS includes a utility program that will convert the popular COD format digraph model description files into text input files suitable for use with the CUTSETS programs. FEAT (MSC-21873) and FIRM (MSC-21860), available from COSMIC, are examples of programs that produce COD format digraph model description files that may be converted for use with the CUTSETS programs. CUTSETS is written in C-language to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. CUTSETS is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is included on the distribution medium. Sun and SunOS are trademarks of Sun Microsystems, Inc.
DEC, DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc.
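A minimal sketch of the recursive cut-set expansion the CUTSETS abstract describes: AND gates combine child cut sets, OR gates union them, and non-minimal sets are discarded; the dictionary encoding of the tree is an assumed illustration, not CUTSETS' input format:

```python
# Recursive top-down minimal cut set calculation for a small fault tree.
from itertools import product

def cut_sets(node, tree):
    kind, children = tree.get(node, ("basic", []))
    if kind == "basic":
        return [frozenset([node])]
    child_sets = [cut_sets(c, tree) for c in children]
    if kind == "or":
        sets = [s for cs in child_sets for s in cs]   # union of alternatives
    else:  # "and": every combination of one cut set per child
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    # keep only minimal sets (no proper subset also present)
    return [s for s in sets if not any(t < s for t in sets)]

tree = {
    "TOP": ("and", ["G1", "C"]),
    "G1":  ("or",  ["A", "B"]),
}
print(cut_sets("TOP", tree))   # -> [{A, C}, {B, C}] as frozensets
```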
Permeability of the San Andreas Fault Zone at Depth
NASA Astrophysics Data System (ADS)
Rathbun, A. P.; Song, I.; Saffer, D.
2010-12-01
Quantifying fault rock permeability is important toward understanding both the regional hydrologic behavior of fault zones and poro-elastic processes that affect fault mechanics by mediating effective stress. These include long-term fault strength as well as dynamic processes that may occur during earthquake slip, including thermal pressurization and dilatancy hardening. Despite its importance, measurements of fault zone permeability for relevant natural materials are scarce, owing to the difficulty of coring through active fault zones at seismogenic depths. Most existing measurements of fault zone permeability are from altered surface samples or from thinner, lower-displacement faults than the SAF. Here, we report on permeability measurements conducted on gouge from the actively creeping Central Deformation Zone (CDZ) of the San Andreas Fault, sampled in the SAFOD borehole at a depth of ~2.7 km (Hole G, Run 4, sections 4, 5). The matrix of the gouge in this interval is predominantly composed of particles <10 µm, with ~5 vol% clasts of serpentinite, very fine-grained sandstone, and siltstone. The 2.6 m-thick CDZ represents the main fault trace and hosts ~90% of the active slip on the SAF at this location, as documented by repeated casing deformation surveys. We measured permeability in two different configurations: (1) in a uniaxial pressure cell, in which a sample is placed into a rigid steel ring that imposes a zero lateral strain condition and is subjected to axial load, and (2) in a standard triaxial system under isostatic stress conditions. In the uniaxial configuration, we obtained permeabilities at axial effective stresses up to 90 MPa, and in the triaxial system up to 10 MPa. All experiments were conducted on cylindrical subsamples of the SAFOD core 25 mm in diameter, with lengths ranging from 18 mm to 40 mm, oriented for flow approximately perpendicular to the fault. In uniaxial tests, permeability is determined by running constant-rate-of-strain (CRS) tests up to 90 MPa axial stress. In these tests, axial stress is increased via a constant rate of displacement, and the excess pore pressure buildup at the base of the sample is measured. Stress, pore pressure, and strain are monitored to calculate the coefficient of consolidation and volumetric compressibility in addition to permeability. In triaxial experiments, permeability is measured by flow-through tests under constant-head boundary conditions. Permeability of the CDZ rapidly decreases to ~10^-19 m^2 by 20 MPa axial stress in our CRS tests. Over axial stresses of 20-85 MPa, permeability decreases log-linearly with effective stress from 8x10^-20 m^2 to 1x10^-20 m^2. Flow-through tests in the triaxial system under isostatic conditions yield permeabilities of 2.2x10^-19 m^2 and 1x10^-20 m^2 at 5 and 10 MPa, respectively. Our results are consistent with published geochemical data from SAFOD mud gas samples and inferred pore pressures during drilling [Zoback et al., 2010], which together suggest that the fault is a barrier to regional fluid flow. Our results indicate that the permeability of the fault core is sufficiently low to result in effectively undrained behavior during slip, thus allowing dynamic processes including thermal pressurization and dilatancy hardening to affect slip behavior.
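For illustration, a small sketch fitting the reported log-linear permeability-stress trend to the two quoted CRS values; the functional form k = 10^(a + b·σ) is an assumption consistent with the stated log-linear decrease, and the interpolated value is illustrative only:

```python
# Fit log10(k) = a + b*sigma to the two quoted CRS data points.
import numpy as np

stress = np.array([20.0, 85.0])   # axial effective stress, MPa
perm = np.array([8e-20, 1e-20])   # permeability, m^2

slope, intercept = np.polyfit(stress, np.log10(perm), 1)

def k(sigma_eff):
    """Interpolated permeability (m^2) at effective stress sigma_eff (MPa)."""
    return 10 ** (intercept + slope * sigma_eff)

print(f"k(50 MPa) ~ {k(50.0):.2e} m^2")   # ~3e-20 m^2 with these inputs
```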
75 FR 35962 - Special Evaluation Assistance for Rural Communities and Households Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-24
... and Households (SEARCH) Program as authorized by Section 306(a)(2) of the Consolidated Farm and Rural Development Act (CONACT) (7 U.S.C. 1926(a)(2)). The amendment added the new SEARCH grant program under which... Assistance for Rural Communities and Households Program (SEARCH). This catalog is available on a subscription...
Software Implemented Fault-Tolerant (SIFT) user's guide
NASA Technical Reports Server (NTRS)
Green, D. F., Jr.; Palumbo, D. L.; Baltrus, D. W.
1984-01-01
Program development for a Software Implemented Fault Tolerant (SIFT) computer system is accomplished in the NASA LaRC AIRLAB facility using a DEC VAX-11 to interface with eight Bendix BDX 930 flight control processors. The interface software which provides this SIFT program development capability was developed by AIRLAB personnel. This technical memorandum describes the application and design of this software in detail, and is intended to assist both the user in performance of SIFT research and the systems programmer responsible for maintaining and/or upgrading the SIFT programming environment.
The use of automatic programming techniques for fault tolerant computing systems
NASA Technical Reports Server (NTRS)
Wild, C.
1985-01-01
It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.
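As a toy illustration of assertions derived from abstract data type axioms and used as runtime error detectors (the stack ADT and its invariants are assumptions for illustration, not the paper's system):

```python
# Executable assertions synthesized from ADT-style axioms for a stack.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, x):
        old_size = len(self._items)
        self._items.append(x)
        # assertions derived from the ADT axioms:
        assert len(self._items) == old_size + 1, "size must grow by one"
        assert self._items[-1] == x, "top(push(s, x)) == x"

    def pop(self):
        assert self._items, "pop requires a non-empty stack"
        old_size = len(self._items)
        x = self._items.pop()
        assert len(self._items) == old_size - 1, "size must shrink by one"
        return x

s = Stack()
s.push(42)
assert s.pop() == 42   # assertions act as in-line error detectors
```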
Hartzell, S.; Leeds, A.; Frankel, A.; Williams, R.A.; Odum, J.; Stephenson, W.; Silva, W.
2002-01-01
The Seattle fault poses a significant seismic hazard to the city of Seattle, Washington. A hybrid, low-frequency, high-frequency method is used to calculate broadband (0-20 Hz) ground-motion time histories for a M 6.5 earthquake on the Seattle fault. Low frequencies (<1 Hz) are calculated by a stochastic method that uses a fractal subevent size distribution to give an ω^-2 displacement spectrum. Time histories are calculated for a grid of stations and then corrected for the local site response using a classification scheme based on the surficial geology. Average shear-wave velocity profiles are developed for six surficial geologic units: artificial fill, modified land, Esperance sand, Lawton clay, till, and Tertiary sandstone. These profiles together with other soil parameters are used to compare linear, equivalent-linear, and nonlinear predictions of ground motion in the frequency band 0-15 Hz. Linear site-response corrections are found to yield unreasonably large ground motions. Equivalent-linear and nonlinear calculations give peak values similar to those of the 1994 Northridge, California, earthquake and those predicted by regression relationships. Ground-motion variance is estimated for (1) randomization of the velocity profiles, (2) variation in source parameters, and (3) choice of nonlinear model. Within the limits of the models tested, the results are found to be most sensitive to the nonlinear model and soil parameters, notably the overconsolidation ratio.
Cell boundary fault detection system
Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN
2011-04-19
An apparatus and program product determine a nodal fault along the boundary, or face, of a computing cell. Nodes on adjacent cell boundaries communicate with each other, and the communications are analyzed to determine if a node or connection is faulty.
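A minimal sketch of the idea in this family of patent records: paired nodes across a cell face exchange tokens, and a missing or mismatched echo flags the node or link. The in-memory "network" dictionary is an illustrative stand-in for real message passing, not the patented mechanism:

```python
# Boundary fault check: compare tokens exchanged across a cell face.
def check_boundary(face_a, face_b, network):
    """Flag node pairs whose cross-boundary exchange failed."""
    faults = []
    for a, b in zip(face_a, face_b):
        sent = network.get((a, b))    # token a sent toward b
        echoed = network.get((b, a))  # token b echoed back toward a
        if sent is None or echoed != sent:
            faults.append((a, b))
    return faults

# Healthy links echo the token; node 2's link drops it.
network = {(0, 10): "t0", (10, 0): "t0",
           (1, 11): "t1", (11, 1): "t1",
           (2, 12): "t2", (12, 2): None}
print(check_boundary([0, 1, 2], [10, 11, 12], network))  # -> [(2, 12)]
```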
Design and Implementation of a Distributed Version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.
1994-01-01
Distributed NEPP is a new version of the NASA Engine Performance Program that runs in parallel on a collection of Unix workstations connected through a network. The program is fault-tolerant, efficient, and shows significant speed-up in a multi-user, heterogeneous environment. This report describes the issues involved in designing distributed NEPP, the algorithms the program uses, and the performance distributed NEPP achieves. It develops an analytical model to predict and measure the performance of the simple distribution, multiple distribution, and fault-tolerant distribution algorithms that distributed NEPP incorporates. Finally, the appendices explain how to use distributed NEPP and document the organization of the program's source code.
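A hedged sketch of the fault-tolerant distribution idea the abstract describes: a coordinator farms engine cases out to workers and retries a case elsewhere when a worker fails. The flaky in-process "workers" are illustrative stand-ins for networked workstations, not NEPP's actual algorithm:

```python
# Fault-tolerant master/worker case distribution with retry on failure.
import random

def run_cases(cases, workers, max_retries=3):
    results, pending = {}, list(cases)
    while pending:
        case = pending.pop()
        for attempt in range(max_retries):
            worker = random.choice(workers)
            try:
                results[case] = worker(case)
                break
            except RuntimeError:
                continue          # worker "failed"; reassign the case
        else:
            raise RuntimeError(f"case {case} failed {max_retries} times")
    return results

def flaky_worker(case):
    if random.random() < 0.3:     # simulated machine failure
        raise RuntimeError("worker lost")
    return case ** 2              # stand-in for an engine simulation

print(run_cases(range(5), [flaky_worker]))
```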
ERIC Educational Resources Information Center
Collier, Herbert I.
1978-01-01
Energy conservation programs at Louisiana State University reduced energy use 23 percent. The programs involved computer controlled power management systems, adjustment of building temperatures and lighting levels to prescribed standards, consolidation of night classes, centralization of chilled water systems, and manual monitoring of heating and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-31
... School Closure and Consolidation, currently authorized by OMB Control Number 1076-0164, to the Office of... do so. III. Data OMB Control Number: 1076-0164. Title: Home-living Programs and School Closure and...
Study of fault-tolerant software technology
NASA Technical Reports Server (NTRS)
Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.
1984-01-01
Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications on hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.
NASA Technical Reports Server (NTRS)
Smith, T. B., III; Lala, J. H.
1984-01-01
The FTMP architecture is a high-reliability computer concept modeled after a homogeneous multiprocessor architecture. Elements of the FTMP are operated in tight synchronism with one another, and hardware fault-detection and fault-masking are provided that are transparent to the software. Operating system design and user software design are thus greatly simplified. Performance of the FTMP is also comparable to that of a simplex equivalent because of the efficiency of the fault-handling hardware. The FTMP project constructed an engineering module of the FTMP, programmed the machine, and extensively tested the architecture through fault injection and other stress testing. This testing confirmed the soundness of the FTMP concepts.
Simulation of an Air Cushion Vehicle
1977-03-01
[Report-documentation-page fragment (OCR residue), Cambridge, Massachusetts 02139: Final Report for Period January 1975 - December 1976; approved for public release; NAVTRAEQUIPCEN 75-C-0057-1. The recoverable program-entry text lists simulated processor faults: overflow, floating point fault, decimal arithmetic fault, and watchdog timer runout.]
Dynamics of hydrocarbon vents: Focus on primary porosity
NASA Astrophysics Data System (ADS)
Johansen, C.; Shedd, W.; Abichou, T.; Pineda-Garcia, O.; Silva, M.; MacDonald, I. R.
2012-12-01
This study investigated the dynamics of hydrocarbon release by monitoring activity of a single vent at a 1215 m deep site in the Gulf of Mexico (GC600). An autonomous camera, deployed by the submersible ALVIN, was programmed to capture a close-up image every 4 seconds for approximately 3.5 hours. The images provided the ability to study the gas hydrate outcrop site (measuring 5.2 x 16.3 cm) in an undisturbed state. The outcrop included an array of 38 tube-like vents through which dark brown oil bubbles are released at rates ranging from 0 to 8 bubbles per minute. The average release from all the separate vents was 59.5 bubbles per minute, for a total released volume of 106.38 cm^3 per minute. The rate of bubble release decreased toward the end of the observation interval, which coincided approximately with the tidal minimum. Ice worms (Hesiocaeca methanicola, Desbruyères & Toulmond, 1998) were abundant at the vent site. The image sequence showed the ice worms actively moving in and out of burrows in the mound. It has been speculated that Hesiocaeca methanicola contributes to gas hydrate decomposition by creating burrows and depressions in the gas hydrate matrix (Fisher et al., 2000). Ice worm burrows could generate pathways for the passage of oil and gas through the gas hydrate mound. Gas hydrates commonly occur along active and/or passive continental margins (Kennicutt et al., 1988a). The release of oil and gas at this particular hydrocarbon seep site is along a passive continental margin and is controlled primarily by active salt tectonics as opposed to the movement of continental tectonic plates (Salvador, 1987). We propose a descriptive model governing the release of gas and oil from deep sub-bottom reservoirs at depths of 3000-5000 m (MacDonald, 1998), through consolidated and unconsolidated sediments, and finally through gas hydrate deposits at the sea floor. The oil and gas escape from the source rock and/or reservoir through at least three degrees of porosity (i.e., traveling through faulted consolidated sediment, unconsolidated sediment, and finally the gas hydrate outcroppings described here). The oil and gas travel from the sub-bottom reservoir along what is thought to be an interface between the salt and sediment, and then up a fault in the consolidated sediment. When it reaches the unconsolidated sediments, vertical pathways bifurcate, owing to the lack of sediment strength, allowing the oil and gas to reach different clusters of hydrocarbon vents at the sea floor. Hydrocarbon vents are formed and sustained by a combination of pressure, temperature, and gas solubility (Peltzer & Brewer, 2000), creating persistent primary-porosity conduits from which the bubbles escape at different rates depending on the size of the tubes. Previous research has examined the effect of temperature fluxes on hydrocarbon outcroppings (MacDonald et al., 2005); however, a focus on the dynamics at this level of primary porosity is lacking. By determining the rate and size of bubbles and the pore-size distribution of the hydrocarbon outcropping, we can explore its hydraulic properties. Examination of biological and physical effects, such as the role of ice worms and the effect of tides, therefore allows a better understanding of the dynamics and persistence of hydrocarbon vent outcroppings.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-24
... Activities; Submission to OMB for Review and Approval; Comment Request; Landfill Methane Outreach Program... the electronic docket, go to www.regulations.gov . Title: Landfill Methane Outreach Program (Renewal... consolidated in 40 CFR part 9. Abstract: The Landfill Methane Outreach Program (LMOP), created by EPA as part...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-08
... Laundering Compliance Program) and adopt new Rule 3310--NYSE Amex Equities (Anti-Money Laundering Compliance... amendments, NASD Rule 3011 (Anti- Money Laundering Compliance Program) and related Interpretive Material NASD IM-3011-1 and 3011-2 as consolidated FINRA Rule 3310 (Anti-Money Laundering Compliance Program), and...
1985-07-01
[OCR fragment of a FORTRAN program listing for a settlement calculation; the recoverable FORMAT statements print "SETTLEMENT DUE TO CONSOLIDATION", "SETTLEMENT DUE TO DESICCATION", and a truncated "SURFACE ..." result.]
Martin, Peter
1984-01-01
From July 1978 to January 1980, water levels in the southern part of the Santa Barbara ground-water basin declined more than 100 feet. These water-level declines resulted from increases in municipal pumping since July 1978. The increase in municipal pumping was part of a basin-testing program designed to determine the usable quantity of ground water in storage. The pumping, centered in the city less than 1 mile from the coast, has caused water-level declines to altitudes below sea level in the main water-bearing zones. As a result, the ground-water basin would be subject to saltwater intrusion if the study-period pumpage were maintained or increased. Data indicate that saltwater intrusion has degraded the quality of the water yielded from six coastal wells. During the study period, the six coastal wells all yielded water with chloride concentrations in excess of 250 milligrams per liter, and four of the wells yielded water with chloride concentrations in excess of 1,000 milligrams per liter. Previous investigators believed that saltwater intrusion was limited to the shallow part of the aquifer, directly adjacent to the coast. The possibility of saltwater intrusion into the deeper water-bearing deposits in the aquifer was thought to be remote because an offshore fault truncates these deeper deposits so that they lie against consolidated rocks on the seaward side of the fault. Results of this study indicate, however, that ocean water has intruded the deeper water-bearing deposits, and to a much greater extent than in the shallow part of the aquifer. Apparently the offshore fault is not an effective barrier to saltwater intrusion. No physical barriers are known to exist between the coast and the municipal well field. Therefore, if the pumping rate maintained during the basin-testing program were continued, the degraded water along the coast could move inland and contaminate the municipal supply wells. The time required for the degraded water to move from the coast to the nearest supply well is estimated, using Darcy's equation, to be about 20 years. Management alternatives for controlling saltwater intrusion in the Santa Barbara area include (1) decreasing municipal pumping, (2) increasing the quantity of water available for recharge by releasing surplus water from surface reservoirs to Mission Creek, (3) artificially recharging the basin using injection wells, and (4) locating municipal supply wells farther from the coast and spacing them farther apart in order to minimize drawdown. Continued monitoring of water levels and water quality would enable assessment of the effectiveness of the control measures employed.
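A back-of-envelope version of the Darcy travel-time estimate mentioned above. Every parameter value below is an illustrative assumption chosen only to show the arithmetic (distance divided by seepage velocity K·i/n), not the report's calibrated inputs; with these numbers the answer lands near the reported ~20 years:

```python
# Darcy travel-time estimate for coastal saltwater moving toward a well.
K = 15.0     # hydraulic conductivity, m/day (assumed)
i = 0.002    # hydraulic gradient toward the pumped well (assumed)
n = 0.25     # effective porosity (assumed)
L = 800.0    # coast-to-well distance, m (assumed)

v = K * i / n                  # average linear (seepage) velocity, m/day
t_years = L / v / 365.25
print(f"seepage velocity ~ {v:.3f} m/day; travel time ~ {t_years:.0f} years")
```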
NASA Astrophysics Data System (ADS)
Shao, Renping; Li, Jing; Hu, Wentao; Dong, Feifei
2013-02-01
Higher-order cumulants (HOC) are a relatively new theory and technique of modern signal analysis. Spectrum entropy clustering (SEC) is a statistical data-mining method that extracts useful characteristics from large amounts of nonlinear, non-stationary data. Following a discussion of the characteristics of HOC theory and the SEC method, this paper introduces the associated signal-processing techniques and the particular merits of nonlinear coupling-characteristic analysis for processing random, non-stationary signals. A new clustering analysis and diagnosis method for detecting multiple gear damage is also proposed, combining HOC and SEC for damage detection and diagnosis of the gear system. Noise is suppressed by HOC, and coupling features are extracted to separate the characteristic signal at different speeds and frequency bands; under these circumstances the weak signal characteristics in the system are emphasized and multi-fault characteristics are extracted. The SEC data-mining method is then applied to analyze and diagnose six signals at running speeds of 300 r/min, 900 r/min, 1200 r/min, and 1500 r/min: no fault, short crack in the tooth root, long crack in the tooth root, short crack at the pitch circle, long crack at the pitch circle, and tooth wear. The research shows that this combined detection and diagnosis method can also identify the degree of damage for some faults. On this basis, a virtual instrument for damage detection and fault diagnosis of the gear system was developed by combining the advantages of MATLAB and VC++, employing component-object-model technology, adopting mixed-programming methods, and calling the program transformed from an *.m file under VC++. The software provides functions for collecting vibration signals from the gear, analyzing and processing signals, extracting features, visualizing graphics, detecting and diagnosing faults, and monitoring. Finally, testing and verification show that the developed system can effectively detect and diagnose faults in an actual operating gear transmission system.
NASA Technical Reports Server (NTRS)
Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.
1990-01-01
The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
ERIC Educational Resources Information Center
Harrison, Helene W.
Comprising 27 classrooms in grade levels 1-6, the program was primarily designed to provide bilingual education for pupils with limited English-speaking ability. However, due to parental requests, monolingual English-speakers (almost 16 percent of participants) were accepted into the program. Of the 717 pupils participating in the program, 84.6 percent were Mexican American.…
ERIC Educational Resources Information Center
Harrison, Helene W.
The program is primarily designed to provide bilingual education for pupils, in grades K-6, with limited English-speaking ability. Due to parental requests, monolingual English-speakers (approximately 16 percent of enrollment) have been accepted into the program. Of the 529 children enrolled in the program, 89 percent have Spanish surnames. Objectives for the…
1986-06-01
[Table-of-contents fragment: The Navy Department: An Overview of the Process; The DOD Context; The Planning Process in Summary; Program Planning (August-January); Programming (January-April); Final POM Input; Department of the Navy Consolidated Planning and Programming Guidance; Defense Guidance.]
Influence of overconsolidated condition on permeability evolution in silica sand
NASA Astrophysics Data System (ADS)
Kimura, S.; Kaneko, H.; Ito, T.; Nishimura, O.; Minagawa, H.
2013-12-01
The permeability of sediments is an important factor for the production of natural gas from gas hydrate-bearing layers. Methane hydrate is regarded as one of the potential resources of natural gas. Coring and logging results suggest that a large amount of methane hydrate exists in the Nankai Trough, offshore central Japan, where many folds and faults have been observed. In the present study, we investigate the permeability of silica sand specimens forming an artificial fault zone after large-displacement shear in the ring-shear test under normally consolidated and overconsolidated conditions. No significant influence of overconsolidation ratio (OCR) on permeability evolution is found. The permeability reduction is strongly influenced by the magnitude of normal stress during large-displacement shearing. The grain-size distribution and structure of the shear zone of specimens after shearing at each normal stress level are analyzed by a laser-scattering particle analyzer and a scanning electron microscope, respectively. The results indicate that the reductions in grain size and porosity due to particle crushing are factors in the permeability reduction. This study is financially supported by METI and the Research Consortium for Methane Hydrate Resources in Japan (the MH21 Research Consortium).
SB 1082 -- Unified hazardous materials/waste program: Local implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, W.
California Senate Bill 1082 was signed into law in the fall of 1993 because business and industry believed there were too many hazardous materials inspectors asking the same questions, looking at the same items and requiring similar information on several variations of the same form. Industry was not happy with the large diversity of programs, each with its own inspectors, permits and fees, essentially doing what industry believed was the same inspection. SB 1082 will allow local city and county agencies to apply to the California Environmental Protection Agency to become a Certified Unified Program Agency (CUPA) or work with a CUPA as a Participating Agency (PA) to manage specific program elements. The CUPA will unify six regulatory programs including hazardous waste/tiered permitting, aboveground storage tanks, underground storage tanks, business and area plans/inventory or disclosure, acutely hazardous materials/risk management prevention and Uniform Fire Code programs related to hazardous materials inventory/plan requirements. The bill requires the CUPA to (1) implement a permit consolidation program; (2) implement a single fee system with a state surcharge; (3) consolidate, coordinate and make consistent any local or regional requirements or guidance documents; and (4) implement a single unified inspection and enforcement program.
Flight Crew Integration (FCI) ISS Crew Comments Database & Products Summary
NASA Technical Reports Server (NTRS)
Schuh, Susan
2016-01-01
These crew debrief data support the design and development of vehicles, hardware, requirements, procedures, and processes, as well as issue resolution, lessons learned, and consolidation and trending for current Programs; much of the data is also used to support development of future Programs.
Reflections on Educating Educational Administrators.
ERIC Educational Resources Information Center
Miklos, Erwin; Chapman, Donald
This paper presents a history of educational administration among Canada's provinces, discusses the status of university preparation programs, and explores theories of formalized and alternative approaches. Before the 1950's, little interest existed in administrators' formal preparation. Consolidation of schools led to graduate programs, but the…
NASA Astrophysics Data System (ADS)
Lin, Y. K.; Ke, M. C.; Ke, S. S.
2016-12-01
A fault is commonly considered active if it has moved one or more times in the last 10,000 years and is likely to produce another earthquake in the future. The relationship between fault reactivation and surface deformation has been of concern since the Chi-Chi earthquake (M=7.2) in 1999. Investigations of well-known disastrous earthquakes in recent years indicate that surface deformation is controlled by the three-dimensional geometric shape of the fault. Because surface deformation can cause dangerous damage to critical infrastructure, buildings, roads, and power, water, and gas lines, it is very important to make pre-disaster risk assessments with a 3D active fault model to reduce the serious economic losses, injuries, and deaths caused by large earthquakes. The approach to building the 3D active fault model can be categorized as (1) field investigation, (2) digitizing profile data, and (3) building the 3D model. In this research, we first tracked the location of the fault scarp in the field, then combined the seismic profiles (which had been balanced) and historical earthquake data to build the underground fault-plane model using the SKUA-GOCAD program. Finally, we compared the results from a trishear model (written by Richard W. Allmendinger, 2012) and the PFC-3D program (Itasca) and obtained the calculated extent of the deformation area. From analysis of the surface deformation produced by the Hsin-Chu Fault, we conclude that the damage zone approaches 68 286 m, the magnitude is 6.43, and the offset is 0.6 m; on that basis we estimate the population casualties and building damage from an M=6.43 earthquake in the Hsin-Chu area, Taiwan. In the future, for accurate application to earthquake disaster prevention, we need to further consider groundwater effects and the soil-structure interaction induced by faulting.
On boundary-element models of elastic fault interaction
NASA Astrophysics Data System (ADS)
Becker, T. W.; Schott, B.
2002-12-01
We present the freely available, modular, and UNIX command-line based boundary-element program interact. It is yet another implementation of Crouch and Starfield's (1983) 2-D and Okada's (1992) half-space solutions for constant slip on planar fault segments in an elastic medium. Using unconstrained or non-negative, standard-package matrix routines, the code can solve for slip distributions on faults given stress boundary conditions, or vice versa, both in a local or global reference frame. Based on examples of complex fault geometries from structural geology, we discuss the effects of different stress boundary conditions on the predicted slip distributions of interacting fault systems. Such one-step calculations can be useful to estimate the moment-release efficiency of alternative fault geometries, and so to evaluate the likelihood that one or another system may be realized in nature. A further application of the program is the simulation of cyclic fault rupture based on simple static-kinetic friction laws. We comment on two issues. First, the appropriate rupture algorithm: cellular models of seismicity often employ an exhaustive rupture scheme in which fault cells fail if some critical stress is reached, cells then slip once-only by a given amount, and the redistributed stress is subsequently used to check for triggered activations on other cells. We show that this procedure can lead to artificial complexity in seismicity if time-to-failure is not calculated carefully because of numerical noise. Second, we address the question of whether foreshocks can be viewed as direct expressions of a simple statistical distribution of frictional strength on individual faults. Repetitive failure models based on a random distribution of frictional coefficients initially show irregular seismicity. By repeatedly selecting weaker patches, the fault then evolves into a quasi-periodic cycle. Each time, the pre-mainshock events build up the cumulative moment release in a non-linear fashion. These temporal seismicity patterns roughly resemble the accelerated moment-release features that are sometimes observed in nature.
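A minimal sketch of the one-step calculation a program like interact performs: assemble an interaction matrix of stress-per-unit-slip coefficients and solve for the slip distribution that satisfies a stress boundary condition. The 3x3 matrix entries are made-up stand-ins for the Crouch & Starfield / Okada influence coefficients:

```python
# One-step boundary-element solve: K s = -tau for slip s.
import numpy as np

# K[i][j] = shear stress at segment i due to unit slip on segment j
K = np.array([[-1.0,  0.2,  0.1],
              [ 0.2, -1.0,  0.2],
              [ 0.1,  0.2, -1.0]])   # self-stiffness negative, coupling positive
tau = np.array([1.0, 1.0, 1.0])      # applied shear stress on each segment

slip = np.linalg.solve(K, -tau)      # stress boundary condition: K s + tau = 0
print(slip)                          # interacting segments slip unequally
```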
The PAWS and STEM reliability analysis programs
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Stevenson, Philip H.
1988-01-01
The PAWS and STEM programs are new design/validation tools. These programs provide a flexible, user-friendly, language-based interface for the input of Markov models describing the behavior of fault-tolerant computer systems. These programs produce exact solutions of the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. PAWS uses a Pade approximation as a solution technique; STEM uses a Taylor series as a solution technique. Both programs have the capability to solve numerically stiff models. PAWS and STEM possess complementary properties with regard to their input space, and an additional strength of these programs is that they accept input compatible with the SURE program. If used in conjunction with SURE, PAWS and STEM provide a powerful suite of programs to analyze the reliability of fault-tolerant computer systems.
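To make the contrast concrete, the sketch below implements a STEM-style scaled, truncated Taylor-series matrix exponential and checks it against SciPy's Pade-based expm (the family of technique PAWS uses); the two-state generator matrix is illustrative:

```python
# Taylor-series matrix exponential with scaling and squaring vs. Pade-based expm.
import numpy as np
from scipy.linalg import expm

def expm_taylor(A, terms=30, squarings=10):
    """exp(A) via scaling, truncated Taylor series, and repeated squaring."""
    B = A / (2 ** squarings)                 # scale so the series converges fast
    X, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms + 1):
        term = term @ B / k                  # next Taylor term B^k / k!
        X = X + term
    for _ in range(squarings):               # undo the scaling
        X = X @ X
    return X

Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])
print(np.max(np.abs(expm_taylor(Q) - expm(Q))))  # agreement near machine precision
```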
Algorithm-Based Fault Tolerance Integrated with Replication
NASA Technical Reports Server (NTRS)
Some, Raphael; Rennels, David
2008-01-01
In a proposed approach to programming and utilization of commercial off-the-shelf computing equipment, a combination of algorithm-based fault tolerance (ABFT) and replication would be utilized to obtain high degrees of fault tolerance without incurring excessive costs. The basic idea of the proposed approach is to integrate ABFT with replication such that the algorithmic portions of computations would be protected by ABFT, and the logical portions by replication. ABFT is an extremely efficient, inexpensive, high-coverage technique for detecting and mitigating faults in computer systems used for algorithmic computations, but does not protect against errors in logical operations surrounding algorithms.
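A classic ABFT illustration in the spirit of this record: checksum-encoded matrix multiplication, where an injected error violates one row and one column checksum, which both detects and locates the fault. This is generic ABFT, not the proposal's specific scheme:

```python
# Algorithm-based fault tolerance via row/column checksums on C = A B.
import numpy as np

def abft_matmul(A, B):
    Ac = np.vstack([A, A.sum(axis=0)])                 # checksum row on A
    Br = np.hstack([B, B.sum(axis=1, keepdims=True)])  # checksum column on B
    return Ac @ Br                                     # full checksum product

rng = np.random.default_rng(0)
A, B = rng.random((3, 3)), rng.random((3, 3))
C = abft_matmul(A, B)

C[1, 1] += 0.5                                # inject a fault into one entry
row_err = np.abs(C[:-1, :-1].sum(axis=0) - C[-1, :-1]) > 1e-9  # column checksums
col_err = np.abs(C[:-1, :-1].sum(axis=1) - C[:-1, -1]) > 1e-9  # row checksums
print("faulty entry at", (int(np.argmax(col_err)), int(np.argmax(row_err))))
```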
Mechanical Models of Fault-Related Folding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, A. M.
2003-01-09
The subject of the proposed research is fault-related folding and ground deformation. The results are relevant to oil-producing structures throughout the world, to understanding of damage that has been observed along and near earthquake ruptures, and to earthquake-producing structures in California and other tectonically active areas. The objectives of the proposed research were to provide a unified mechanical infrastructure for studies of fault-related folding and to present the results in computer programs with graphical user interfaces (GUIs) so that structural geologists and geophysicists can model a wide variety of fault-related folds (FaRFs).
NASA Technical Reports Server (NTRS)
Clune, E.; Segall, Z.; Siewiorek, D.
1984-01-01
A program of experiments has been conducted at NASA-Langley to test the fault-free performance of a Fault-Tolerant Multiprocessor (FTMP) avionics system for next-generation aircraft. Baseline measurements of an operating FTMP system were obtained with respect to the following parameters: instruction execution time, frame size, and the variation of clock ticks. The mechanisms of frame stretching were also investigated. The experimental results are summarized in a table. Areas of interest for future tests are identified, with emphasis given to the implementation of a synthetic workload generation mechanism on FTMP.
Analytical Approaches to Guide SLS Fault Management (FM) Development
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.
2012-01-01
Extensive analysis is needed to determine the right set of FM capabilities, providing the most coverage without significantly increasing the cost, false-positive/false-negative (FP/FN) rates, and complexity of the overall vehicle systems. Strong collaboration with the stakeholders is required to support the determination of the best triggers and response options. The SLS Fault Management process has been documented in the Space Launch System Program (SLSP) Fault Management Plan (SLS-PLAN-085).
Lane, Michael
2013-06-28
Proposed drill sites for intermediate depth temperature gradient holes and/or deep resource confirmation wells. Temperature gradient contours based on shallow TG program and faults interpreted from seismic reflection survey are shown, as are two faults interpreted by seismic contractor Optim but not by Oski Energy, LLC.
Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN
2008-10-14
An apparatus, program product and method check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
Archer, Charles Jens [Rochester, MN; Pinnow, Kurt Walter [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian Edward [Rochester, MN
2012-02-07
An apparatus, program product and method check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward
2010-02-23
An apparatus and program product check for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.
NASA Technical Reports Server (NTRS)
Ferell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Goerz, Jesse; Brown, Barbara
2010-01-01
This paper's main purpose is to detail issues and lessons learned regarding designing, integrating, and implementing Fault Detection Isolation and Recovery (FDIR) for Constellation Exploration Program (CxP) Ground Operations at Kennedy Space Center (KSC).
Hukerikar, Saurabh; Teranishi, Keita; Diniz, Pedro C.; ...
2017-02-11
In the presence of accelerated fault rates, which are projected to be the norm on future exascale systems, it will become increasingly difficult for high-performance computing (HPC) applications to accomplish useful computation. Due to the fault-oblivious nature of current HPC programming paradigms and execution environments, HPC applications are insufficiently equipped to deal with errors. We believe that HPC applications should be enabled with capabilities to actively search for and correct errors in their computations. The redundant multithreading (RMT) approach offers lightweight replicated execution streams of program instructions within the context of a single application process, whereas the use of complete redundancy incurs significant overhead to application performance.
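A minimal sketch of the RMT idea: two replicated execution streams of the same computation inside one process, with a comparison to detect silent divergence. The thread pool and fault-injection hook are illustrative assumptions, not the paper's runtime:

```python
# Redundant multithreading: replicate a computation and compare results.
from concurrent.futures import ThreadPoolExecutor

def compute(x, inject_fault=False):
    result = sum(i * i for i in range(x))
    return result + 1 if inject_fault else result   # optional injected error

def rmt_run(x):
    with ThreadPoolExecutor(max_workers=2) as pool:
        a = pool.submit(compute, x)   # replica stream 1
        b = pool.submit(compute, x)   # replica stream 2
        ra, rb = a.result(), b.result()
    if ra != rb:
        raise RuntimeError("replica divergence: possible soft error")
    return ra

print(rmt_run(1000))   # replicas agree -> value returned
```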
Software fault-tolerance by design diversity DEDIX: A tool for experiments
NASA Technical Reports Server (NTRS)
Avizienis, A.; Gunningberg, P.; Kelly, J. P. J.; Lyu, R. T.; Strigini, L.; Traverse, P. J.; Tso, K. S.; Voges, U.
1986-01-01
The use of multiple versions of a computer program, independently designed from a common specification, to reduce the effects of an error is discussed. If these versions are designed by independent programming teams, it is expected that a fault in one version will not have the same behavior as any fault in the other versions. Since the errors in the output of the versions are different and uncorrelated, it is possible to run the versions concurrently, cross-check their results at prespecified points, and mask errors. A DEsign DIversity eXperiments (DEDIX) testbed was implemented to study the influence of common mode errors which can result in a failure of the entire system. The layered design of DEDIX and its decision algorithm are described.
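The cross-check-and-mask step that DEDIX performs at prespecified points can be illustrated with a minimal majority vote over version outputs. This is a sketch of the general N-version idea only; DEDIX's layered decision algorithm is more elaborate.

```python
from collections import Counter

def vote(results):
    """Cross-check outputs of independently designed versions at a
    prespecified point and mask errors by majority vote (a minimal sketch
    of the decision step, not DEDIX's actual layered algorithm)."""
    winner, count = Counter(results).most_common(1)[0]
    if count > len(results) // 2:
        return winner
    raise RuntimeError("no majority: correlated or multiple faults")

# Three versions, one faulty:
print(vote([42, 42, 41]))  # -> 42; the faulty version's output is masked
```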
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix B provides a description of Browns Ferry, Unit 1, plant systems and the failure evaluation of those systems as they apply to accidents at Browns Ferry. Information is presented concerning front-line system fault analysis; support system fault analysis; human error models and probabilities; and generic control circuit analyses.
Formal specification and mechanical verification of SIFT - A fault-tolerant flight control system
NASA Technical Reports Server (NTRS)
Melliar-Smith, P. M.; Schwartz, R. L.
1982-01-01
The paper describes the methodology being employed to demonstrate rigorously that the SIFT (software-implemented fault-tolerant) computer meets its requirements. The methodology uses a hierarchy of design specifications, expressed in the mathematical domain of multisorted first-order predicate calculus. The most abstract of these, from which almost all details of mechanization have been removed, represents the requirements on the system for reliability and intended functionality. Successive specifications in the hierarchy add design and implementation detail until the PASCAL programs implementing the SIFT executive are reached. A formal proof that a SIFT system in a 'safe' state operates correctly despite the presence of arbitrary faults has been completed all the way from the most abstract specifications to the PASCAL program.
Evaluating The Role Of Payment Policy In Driving Vertical Integration In The Oncology Market.
Alpert, Abby; Hsi, Helen; Jacobson, Mireille
2017-04-01
The health care industry has experienced massive consolidation over the past decade. Much of the consolidation has been vertical (with hospitals acquiring physician practices) instead of horizontal (with physician practices or hospitals merging with similar entities). We documented the increase in vertical integration in the market for cancer care in the period 2003-15, finding that the rate of hospital or health system ownership of practices doubled from about 30 percent to about 60 percent. The two most commonly cited explanations for this consolidation are a 2005 Medicare Part B payment reform that dramatically reduced reimbursement for chemotherapy drugs, and the expansion of hospital eligibility for the 340B Drug Discount Program under the Affordable Care Act (ACA). To evaluate the evidence for these explanations, we used difference-in-differences methods to assess whether consolidation increased more in areas with greater exposure to each policy than in areas with less exposure. We found little evidence that either policy contributed to vertical integration. Rather, increased consolidation in the market for cancer care may be part of a broader post-ACA trend toward integrated health care systems.
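For readers unfamiliar with the method, a difference-in-differences design of the kind the authors describe can be sketched in a few lines. The data and variable names below are hypothetical, not the study's; the coefficient on the interaction term is the DiD estimate of the policy's effect on integration.

```python
# Illustrative difference-in-differences setup of the kind described above
# (not the authors' code; the data frame is an invented toy panel).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "integrated": [0.30, 0.35, 0.31, 0.58, 0.29, 0.33, 0.30, 0.57],
    "treated":    [0,    0,    1,    1,    0,    0,    1,    1],  # high policy exposure
    "post":       [0,    1,    0,    1,    0,    1,    0,    1],  # after the reform
})
model = smf.ols("integrated ~ treated * post", data=df).fit()
# The interaction coefficient estimates the extra post-policy consolidation
# in high-exposure markets relative to low-exposure ones.
print(model.params["treated:post"])
```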
Geologic map of the Bernalillo NW quadrangle, Sandoval County, New Mexico
Koning, Daniel J.; Personius, Stephen F.
2002-01-01
The Bernalillo NW quadrangle is located in the northern part of the Albuquerque basin, which is the largest basin or graben within the Rio Grande rift. The quadrangle is underlain by poorly consolidated sedimentary rocks of the Santa Fe Group. These rocks are best exposed in the southwestern part of the quadrangle in the Rincones de Zia, a badland topography cut by northward-flowing tributary arroyos of the Jemez River. The Jemez River flows through the northern half of the quadrangle; extensive fluvial and eolian deposits cover bedrock units along the river. The structural fabric of the quadrangle is dominated by dozens of generally north-striking, east- and west-dipping normal faults and minor folds associated with the Neogene Rio Grande rift.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1984-01-01
Tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; networks consolidation program; and network sustaining are described.
Shea, Katheryn E; Wagner, Elizabeth L; Marchesani, Leah; Meagher, Kevin; Giffen, Carol
2017-02-01
Reducing costs by improving storage efficiency has been a focus of the National Heart, Lung, and Blood Institute (NHLBI) Biologic Specimen Repository (Biorepository) and Biologic Specimen and Data Repositories Information Coordinating Center (BioLINCC) programs for several years. Study specimen profiles were compiled using the BioLINCC collection catalog. Cost assessments and calculations of the return on investment from consolidating or reducing a collection were developed and implemented. Over the course of 8 months, the NHLBI Biorepository evaluated 35 collections that consisted of 1.8 million biospecimens. A total of 23 collections were selected for consolidation, with a total of 1.2 million specimens located in 21,355 storage boxes. The consolidation resulted in a savings of 4055 boxes of various sizes and 10.2 mechanical freezers (∼275 cubic feet) worth of space. As storage costs in a biorepository increase over time, the development and use of information technology tools to assess the potential advantage and feasibility of vial consolidation can reduce maintenance expenses.
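A minimal sketch of the kind of return-on-investment screen described above: every figure except the 4055 freed boxes is a hypothetical placeholder, not an NHLBI cost.

```python
# Back-of-the-envelope return-on-investment test of the kind the abstract
# describes (all parameters below are hypothetical placeholders).
def consolidation_payback(boxes_freed, boxes_per_freezer,
                          annual_cost_per_freezer, labor_cost):
    """Years until one-time consolidation labor is repaid by freezer savings."""
    freezers_freed = boxes_freed / boxes_per_freezer
    annual_savings = freezers_freed * annual_cost_per_freezer
    return labor_cost / annual_savings

# e.g., 4055 boxes freed at ~400 boxes/freezer, $1,500/freezer-year, $20k labor:
print(round(consolidation_payback(4055, 400, 1500, 20000), 1), "years")
```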
Feng, Xing Lin
2017-02-01
Policy makers in China are considering consolidating the country's fragmented health insurance programs. This system consists of three components. The Urban Employee Basic Medical Insurance (UEBMI) covers formal employees, the New Cooperative Medical Scheme (NCMS) covers rural residents, and the Urban Resident Basic Medical Insurance (URBMI) covers urban residents. Consolidation could, in theory, create a more efficient health system that is better able to address noncommunicable diseases. Using national survey data from 2011 to 2013, I found that 44% to 76% of cases of hypertension, diabetes, and dyslipidemia went undiagnosed among Chinese adults aged 45 and older. I found that the UEBMI enrollees had a greater number of health checks and 10% higher rates of diagnosis. Assuming that this level of efficiency would be possible under an integrated system, I conducted microsimulation analyses to project future benefits. Such consolidation could result in 46.2 million new diagnoses, and 30.0 million of these cases would be controlled.
Industry evolution through consolidation: Implications for addiction treatment.
Corredoira, Rafael A; Kimberly, John R
2006-10-01
Drawing on experiences in other industries, this article argues that the business of addiction treatment is likely to be transformed by the advent of a period of consolidation, in which a number of small independent programs will be acquired by larger, better capitalized, and managerially more sophisticated enterprises. Consolidation will be driven by opportunities to leverage new technologies, to exploit new regulatory initiatives, and to introduce economies of scale and scope into an industry that is currently highly fragmented. The process is likely to result in segmentation of the market, with the coexistence of large, generalist, highly standardized firms and a number of small highly specialized firms. When an industry consolidates, the types and quality of services provided can improve through the adoption of best practices and through increased competition among larger providers. If these larger providers are publicly traded, however, efforts to improve will inevitably be influenced by pressures to maintain or increase quarter-to-quarter earnings and share prices, leaving open the long-term impact on service quality.
Use of Virtual Mission Operations Center Technology to Achieve JPDO's Virtual Tower Vision
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Paulsen, Phillip E.
2006-01-01
The Joint Program Development Office has proposed that the Next Generation Air Transportation System (NGATS) consolidate control centers. NGATS would be managed from a few strategically located facilities with virtual towers and TRACONs. This consolidation is about combining the delivery locations for these services, not about decreasing service. By consolidating these locations, cost savings on the order of $500 million have been projected. Evolving to space-based communication, navigation, and surveillance offers the opportunity to reduce or eliminate much of the ground-based infrastructure cost. Dynamically adjusted airspace offers the opportunity to reduce the number of sectors and boundary inconsistencies; eliminate or reduce "handoffs;" and eliminate the distinction between Towers, TRACONs, and Enroute Centers. To realize a consolidation vision for air traffic management, there must be investment in networking. One technology that holds great potential is the use of Virtual Mission Operations Centers (VMOCs) to provide secure, automated, intelligent management of the NGATS. This paper provides a conceptual framework for incorporating VMOCs into the NGATS.
A coverage and slicing dependencies analysis for seeking software security defects.
He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin
2014-01-01
Software security defects have a serious impact on software quality and reliability. Security flaws in a software system are a major hidden danger to its operation, and as the scale of software increases, its vulnerabilities become much more difficult to find. Once these vulnerabilities are exploited, they may lead to great loss. In this situation, some experts have put forward the concept of Software Assurance, and automated fault localization is one part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing; each method has its own strengths and weaknesses for locating faults. In this paper, we put forward a new method, the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we propose a new automated fault localization method that remains fully automated and lossless while narrowing the basic localization unit to a single statement, which makes localization more accurate. Several experiments show that our method is more effective, and we analyze its effectiveness against existing methods across different kinds of faults.
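As background, the coverage-based half of the proposed hybrid can be illustrated with a Tarantula-style suspiciousness ranking; this sketch omits the slicing and reverse data-dependence machinery that the paper adds.

```python
# Minimal coverage-based fault localization (Tarantula-style suspiciousness),
# one ingredient of the hybrid approach described above.
def suspiciousness(cov, outcomes):
    """cov[t] = set of statements executed by test t; outcomes[t] = True if passed."""
    failed = [t for t in cov if not outcomes[t]]
    passed = [t for t in cov if outcomes[t]]
    stmts = set().union(*cov.values())
    scores = {}
    for s in stmts:
        ef = sum(s in cov[t] for t in failed) / max(len(failed), 1)
        ep = sum(s in cov[t] for t in passed) / max(len(passed), 1)
        scores[s] = ef / (ef + ep) if ef + ep else 0.0
    return sorted(scores.items(), key=lambda kv: -kv[1])

cov = {"t1": {1, 2, 3}, "t2": {1, 3}, "t3": {1, 3}}
outcomes = {"t1": False, "t2": True, "t3": True}
print(suspiciousness(cov, outcomes))  # statement 2, covered only by the failing test, ranks first
```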
Progress in Computational Simulation of Earthquakes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert
2006-01-01
GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.
Cost and benefits design optimization model for fault tolerant flight control systems
NASA Technical Reports Server (NTRS)
Rose, J.
1982-01-01
Requirements and specifications for a method of optimizing the design of fault-tolerant flight control systems are provided. Algorithms that could be used for developing new and modifying existing computer programs are also provided, with recommendations for follow-on work.
Bruning, Oliver
2018-05-23
Overview of the operation and upgrade plans for the machine. Upgrade studies and task forces. The Chamonix 2010 discussions led to five new task forces: planning for a long shutdown in 2012 for splice consolidation; long-term consolidation planning for the injector complex; an SPS upgrade task force (accelerated program for the SPS upgrade); the PSB upgrade and its implications for the PS (e.g., radiation); and the LHC High Luminosity project (investigating planning for one upgrade by 2018-2020). A dedicated study was also launched on doubling the beam energy in the LHC (HE-LHC).
V&V of Fault Management: Challenges and Successes
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Costello, Ken; Ohi, Don; Lu, Tiffany; Newhouse, Marilyn
2013-01-01
This paper describes the results of a special breakout session of the NASA Independent Verification and Validation (IV&V) Workshop held in the fall of 2012 entitled "V&V of Fault Management: Challenges and Successes." The NASA IV&V Program is in a unique position to interact with projects across all of the NASA development domains. Using this unique opportunity, the IV&V program convened a breakout session to enable IV&V teams to share their challenges and successes with respect to the V&V of Fault Management (FM) architectures and software. The presentations and discussions provided practical examples of pitfalls encountered while performing V&V of FM, including the lack of consistent designs for implementing fault monitors and the fact that FM information is not centralized but scattered among many diverse project artifacts. The discussions also solidified the need for an early commitment to developing FM in parallel with the spacecraft systems as well as clearly defining FM terminology within a project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
The systems resilience research community has developed methods to manually insert additional source-program level assertions to trap errors, and also devised tools to conduct fault injection studies for scalar program codes. In this work, we contribute the first vector-oriented LLVM-level fault injector, VULFI, to help study the effects of faults in vector architectures that are of growing importance, especially for vectorizing loops. Using VULFI, we conduct a resiliency study of nine real-world vector benchmarks using Intel's AVX and SSE extensions as the target vector instruction sets, and offer the first reported understanding of how faults affect vector instruction sets. We take this work further toward automating the insertion of resilience assertions during compilation. This is based on our observation that during intermediate (e.g., LLVM-level) code generation to handle full and partial vectorization, modern compilers exploit (and explicate in their code documentation) critical invariants. These invariants are turned into error-checking code. We confirm the efficacy of these automatically inserted low-overhead error detectors for vectorized for-loops.
NASA Astrophysics Data System (ADS)
Rundle, J.; Rundle, P.; Donnellan, A.; Li, P.
2003-12-01
We consider the problem of the complex dynamics of earthquake fault systems, and whether numerical simulations can be used to define an ensemble forecasting technology similar to that used in weather and climate research. To effectively carry out such a program, we need 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high-performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike-slip faults extending throughout California, from the Mexico-California border to the Mendocino Triple Junction. We use the historic data set of earthquakes larger than magnitude M > 6 to define the frictional properties of all 654 fault segments (degrees of freedom) in the model. Previous versions of Virtual California had used only 215 fault segments to model the strike-slip faults in southern California. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a small Beowulf cluster consisting of 10 CPUs. We are also planning to run the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution, and to assess scaling properties of the code. We present results of simulations both as static images and as mpeg movies, so that the dynamical aspects of the computation can be assessed by the viewer. We also compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
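One of the statistics mentioned, the magnitude-frequency relation, can be illustrated with the standard Aki maximum-likelihood b-value estimator. The catalog below is synthetic, not Virtual California output.

```python
# Example of one statistic the abstract mentions: a Gutenberg-Richter
# magnitude-frequency fit via the Aki (1965) maximum-likelihood estimator,
# b = log10(e) / (mean(M) - Mc), applied to a synthetic catalog.
import numpy as np

rng = np.random.default_rng(0)
Mc = 6.0                                   # catalog completeness magnitude
# Magnitudes above Mc are exponential with rate b*ln(10); simulate b = 1.0:
mags = Mc + rng.exponential(scale=1 / (1.0 * np.log(10)), size=5000)
b = np.log10(np.e) / (mags.mean() - Mc)    # maximum-likelihood b-value
print(f"estimated b-value: {b:.2f}")       # ~1.0 for this synthetic catalog
```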
Consolidated environmental regulation in West Virginia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flannery, D.M.; Beckett, K.G.; McThomas, M.P.
1995-05-01
In 1994, West Virginia enacted the single largest piece of legislation in its history. The 1,400-page bill that made up this legislation was the crowning achievement of more than a decade of efforts to consolidate and streamline West Virginia's environmental regulatory programs. The result has been the empowerment of the West Virginia Division of Environmental Protection (DEP) as the centerpiece of environmental regulation in West Virginia. This Article explores the principal initiatives leading to the passage of the legislation empowering the DEP. In addition, it analyzes the substantive provisions of the DEP's legislative authority and the relationship of that authority to other agencies. Finally, this Article identifies additional areas for the refinement of West Virginia's environmental regulatory programs.
ERIC Educational Resources Information Center
Deutsch, Nancy L.; Jones, Jeffrey N.
2008-01-01
Authority is an important component of adult-youth relations. Little work has been done exploring authority outside of families and classrooms. This article consolidates findings from two studies of urban after-school programs. The article examines youths' experiences of authority in after-school programs, compares those with their reports of…
Block Grant-Funded Educational Programs: An Untapped Source of Employees for the Healthcare Industry
ERIC Educational Resources Information Center
Gordon, Claudette Veronica
2009-01-01
The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWORA) placed a lifetime capitation of five years on welfare benefits. This Bill consolidated welfare and employment programs into the block grant known as Temporary Assistance for Needy Families (TANF). The benefit package allows operation of programs in accordance with…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... Program. This announcement contains the consolidated names and addresses of the award recipients under... Awards, HOPE VI Main Street Grant Program, Fiscal Year (FY) 2011 and 2012 AGENCY: Office of the Assistant Secretary for Public and Indian Housing, HUD. ACTION: Announcement of funding awards. SUMMARY: In accordance...
ERIC Educational Resources Information Center
Harte, Wendy; Reitano, Paul
2016-01-01
The assessment task of the final course in a bachelor of secondary education program is examined for opportunities for preservice geography teachers to achieve the course aims of integrating, consolidating, applying, and reflecting on the knowledge and skills they have learned during their initial teacher education program. The results show that…
NASA Technical Reports Server (NTRS)
Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John
1994-01-01
This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. The NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.
Can grain size sensitive creep lubricate faults during earthquake propagation?
NASA Astrophysics Data System (ADS)
De Paola, N.; Holdsworth, R.; Viti, C.; Collettini, C.; Bullock, R. J.; Faoro, I.
2014-12-01
In the shallow portion of crustal fault zones, fracturing and cataclasis are thought to be the dominant processes during earthquake propagation. In the lower crust/upper mantle, viscous flow is inferred to facilitate aseismic creep along shear zones. Recent studies show that slip zones (SZs), in natural and experimental carbonate seismic faults, are made of nanograins with a polygonal texture, a microstructure consistent with deformation by grain boundary sliding (GBS) mechanisms. Friction experiments performed on calcite fine-grained gouges, at slip speed v = 1 m/s, normal stress sn = 18 MPa, displacements d = 0.009-1.46 m, and room temperature and humidity, show a four-stage evolution of the fault strength: SI) attainment of an initial value, f = 0.67; SII) increase up to a peak value, f = 0.82; SIII) sudden decrease to a low steady-state value, f = 0.18; and SIV) sudden increase to a final value, f = 0.44, during sample deceleration. Samples recovered at the end of each displacement-controlled experiment (Stages I-IV) show the following microstructural evolution of the SZ material: SI) poorly consolidated, made of fine-grained (1 < D < 5 microns), angular clasts formed by brittle fracturing and cataclasis; SII) cohesive, made of larger clasts of calcite (D ≈ 1 micron), exhibiting a high density of free dislocations and hosting subgrains (D ≤ 200 nm), dispersed within calcite nanograins; SIII) made of nanograin aggregates exhibiting polygonal grain boundaries and 120° triple junctions between equiaxial grains. The grains display no preferred elongation, no crystal preferred orientation, and low free dislocation densities, possibly due to high-temperature (> 900 °C) GBS creep deformation. Our microstructural observations suggest that GBS mechanisms can operate in geological materials deformed at high strain rates along frictionally heated seismogenic slip surfaces. The observed microstructures in experimental slip zones are strikingly similar to those predicted by theoretical studies, and to those observed during experiments on metals and fine-grained carbonates deformed at T > 900 °C, where superplastic behaviour due to GBS has been inferred. A regime of frictionally induced GBS could thus account for the dynamic weakening of carbonate faults during earthquake propagation in nature.
NASA Astrophysics Data System (ADS)
Olive-Garcia, C.; van Wyk de Vries, B.
2014-12-01
The Chaîne des Puys volcanic field in central France became a celebrated mecca for 18th/19th-century scientists only once the volcanoes were 'discovered'. Beforehand they were only hills, but the ability to interpret landscape with prior knowledge allowed these early geologists to create a popular understanding of the geology. Since that time, the Chaîne des Puys has become a well-known volcanic site to a worldwide audience through textbooks, tourism, and commerce (look at a Volvic water bottle!). To the 19th-century geologists, the Limagne escarpment was just as fascinating, but lacking the ability to fully interpret this rift margin, the idea of a fault did not percolate down to the general public. With the advent of the current UNESCO project, it became clear that the geological link between the volcanoes and the fault could be exploited, not only to raise the profile of the volcanoes, but to create a greater awareness of the tectonics among the greater public. Not only have the volcanoes become better known and more clearly understood than previously, but the fault has begun to emerge as a feature in public consciousness. We will demonstrate the many communication techniques at all levels that have been used in the project. We explain the rationale behind creating a geological scale model that works on processes as well as landforms to raise public awareness. The success is that we show how geological features can be made readable by the general public, something highly important for conservation of heritage, but also for risk perception. The increased education efforts of the scientists have also led to an increase in science. The Chaîne des Puys and Limagne fault project was discussed at the 38th session of the World Heritage UNESCO committee in June 2014 and was acknowledged to have Outstanding Universal Value; the future challenge for this project is to consolidate the outreach and to work with other sites to increase the public perception of Earth sciences. The more informed and participatory the public is, the more efficient the communication can be.
Perspectives on the Chaine Des Puys and Limagne Fault UNESCO World Heritage Project
NASA Astrophysics Data System (ADS)
van Wyk de Vries, Benjamin; Olive, Cécile
2015-04-01
The Chaîne des Puys and Limagne fault project is acknowledged to have Outstanding Universal Value (38th session of the World Heritage UNESCO committee, June 2014). One ongoing challenge for the project is to consolidate the outreach and to work with other sites to increase the public perception of Earth sciences. The Chaîne des Puys volcanic field in central France became a celebrated mecca for 18th/19th-century scientists only once the volcanoes were 'discovered'. Beforehand they were only hills, but the ability to interpret landscape with prior knowledge allowed these early geologists to create a popular understanding of the geology. Since that time, the Chaîne des Puys has become a well-known volcanic site to a worldwide audience through textbooks, tourism, and commerce. To the 19th-century geologists, the Limagne escarpment was just as fascinating, but lacking the ability to fully interpret this rift margin, the idea of a fault did not percolate down to the general public. With the advent of the current UNESCO project, it became clear that the geological link between the volcanoes and the fault could be exploited, not only to raise the profile of the volcanoes, but to create a greater awareness of the tectonics among the greater public. Not only have the volcanoes become better known and more clearly understood than previously, but the fault has begun to emerge as a feature in public consciousness. We will demonstrate the many communication techniques at all levels that have been used in the project. We explain the rationale behind creating a geological scale model that works on processes as well as landforms to raise public awareness. The success is that we show how geological features can be made readable by the general public, something highly important for conservation of heritage, but also for risk perception. The increased education efforts of the scientists have also led to an increase in science. The more informed and participatory the public is, the more efficient the communication can be.
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.
This report describes and evaluates high school programs funded under Chapter 1, Part B, of the Education Consolidation and Improvement Act (ECIA) and administered by the Institutionalized Facilities Program of the New York City Public Schools in 1989-90. The program is designed to address the educational needs of students in facilities for…
NASA Technical Reports Server (NTRS)
Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.
1992-01-01
Digital computing systems needed for Army programs such as the Computer-Aided Low Altitude Helicopter Flight Program and the Armored Systems Modernization (ASM) vehicles may be characterized by high computational throughput and input/output bandwidth, hard real-time response, high reliability and availability, and maintainability, testability, and producibility requirements. In addition, such a system should be affordable to produce, procure, maintain, and upgrade. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and constructed under a three-year program comprised of a conceptual study, detailed design and fabrication, and demonstration and validation phases. Described here are the results of the conceptual study phase of the AFTA development. Given here is an introduction to the AFTA program, its objectives, and key elements of its technical approach. A format is designed for representing mission requirements in a manner suitable for first order AFTA sizing and analysis, followed by a discussion of the current state of mission requirements acquisition for the targeted Army missions. An overview is given of AFTA's architectural theory of operation.
NASA Technical Reports Server (NTRS)
Abdel-Gawad, M. (Principal Investigator); Silverstein, J.; Tubbesing, L.
1973-01-01
The author has identified the following significant results. ERTS-1 imagery covering the eastern California-Nevada seismic belt were utilized to study the fault pattern in relation to the distribution of earthquake epicenters and Quaternary volcanic rocks. Many suspected faults not previously mapped were identified. These include several suspected shear zones in Nevada, faults showing evidence of recent breakage, and major lineaments. Highly seismic areas are generally characterized by Holocene faulting and Quaternary volcanic activity. However, several major fault segments showing evidence of recent breakage are associated with little or no seismicity. The tectonic pattern strongly suggests that the eastern California-Nevada seismic belt coincides with a major crustal rift associated with zones of lateral shear. New data on potentially active fault zones have direct practical applications in national and local earthquake hazard reduction programs. Positive contacts have been made with Kern and Ventura Counties to make results of this investigation available for application to their earthquake hazards definition projects.
Clustered, rectangular lakes of the Canadian Old Crow Basin
NASA Astrophysics Data System (ADS)
Allenby, Richard J.
1989-12-01
This paper investigates the origin and development of the tightly clustered lakes within the Old Crow and Bluefish basins utilizing Landsat imagery, SEASAT Synthetic Aperture Radar (SAR), and the available scientific literature. The Old Crow Basin and the smaller, neighboring Bluefish Basin are located in the northwest Yukon Territory of Canada, 150 km south of the Beaufort Sea and just east of the Canadian-Alaskan border. Both basins, situated in Pleistocene lake deposits of sand, gravel, silt, and peat, are characterized by numerous, densely clustered, rectangular or arrowhead-shaped shallow lakes with linear shorelines. The straight edges of these lakes exhibit strong, nearly orthogonal preferred alignments directed northwest and northeast. These lakes evidently originated as relatively small thaw or thermokarst lakes that subsequently coalesced into larger lakes with edges and orientations controlled by a fracture pattern in the consolidated underlying rocks, possibly the Old Crow Granite. The fracture pattern may be the result of horizontal Tertiary or later compressional forces along the Kaltag/Porcupine Fault, or it may have originated in the relatively undeformed, consolidated basinal sediments as a result of downwarping and subsequent uplifting. The lake-forming process is ongoing, with new lakes being formed to replace older lakes in all stages of being obliterated.
Multi-directional fault detection system
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward
2010-11-23
An apparatus, program product and method check for nodal faults in a group of nodes comprising a center node and all adjacent nodes. The center node concurrently communicates with the immediately adjacent nodes in three dimensions. The communications are analyzed to determine a presence of a faulty node or connection.
Multi-directional fault detection system
Archer, Charles Jens [Rochester, MN]; Pinnow, Kurt Walter [Rochester, MN]; Ratterman, Joseph D [Rochester, MN]; Smith, Brian Edward [Rochester, MN]
2009-03-17
An apparatus, program product and method check for nodal faults in a group of nodes comprising a center node and all adjacent nodes. The center node concurrently communicates with the immediately adjacent nodes in three dimensions. The communications are analyzed to determine a presence of a faulty node or connection.
Multi-directional fault detection system
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward
2010-06-29
An apparatus, program product and method check for nodal faults in a group of nodes comprising a center node and all adjacent nodes. The center node concurrently communicates with the immediately adjacent nodes in three dimensions. The communications are analyzed to determine a presence of a faulty node or connection.
Library and Program Information Services. A HUD Handbook.
ERIC Educational Resources Information Center
Department of Housing and Urban Development, Washington, DC.
This Handbook explains the services and policies of the Library and Information Division. In October 1957, the Public Housing Administration, Federal Housing Administration, and Office of the Administrator libraries were consolidated into one library. The Program Information Center was added, additional Regional Office libraries were…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-06
... percent of youth with individualized education programs (IEPs) graduating with a regular high school... 100,000 schools through EDFacts. \\4\\ Education Department General Administrative Regulations (EDGAR... Department programs (e.g., Consolidated State Performance Report under the Elementary and Secondary Education...
40 CFR 282.84 - North Dakota State-Administered Program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... administered by the North Dakota Department of Health and Consolidated Laboratories, was approved by EPA... 11, 1991 and it was effective on December 10, 1991. (b) North Dakota has primary responsibility for enforcing its underground storage tank program. However, EPA retains the authority to exercise its...
Information Studies for the Business Sector in Spanish Universities
ERIC Educational Resources Information Center
Canavate, Antonio Munoz; Hipola, Pedro
2008-01-01
The management of information in the business world constitutes a single consolidated area within undergraduate and graduate study programs in Librarianship and Information Science. This article describes information studies for the business sector within Spain, including the university programs known as Diplomatura in Librarianship and…
Cash Management Program Reaps Financial Rewards.
ERIC Educational Resources Information Center
Saylor, Joan Nesenkar
1984-01-01
Basic components of a New Jersey district's profitable cash management program include consolidating funds using a negotiated bank agreement, a short term investment policy, accurate flowcharts for precise planning, and revenue and expenditure analysis. Data collection and analysis and the alternative of using a bank service agreement are…
Reduced Resources and the Academic Program
ERIC Educational Resources Information Center
DeCosmo, Richard
1978-01-01
Analyzes the concepts of "planning and reallocation" and "pruning and grafting" as approaches to reducing academic costs when reductions are inevitable. The former concept involves long-range planning and organizational analysis of present programs and future goals; the latter focuses on elimination and consolidation of courses in accordance with…
Planetary Gearbox Fault Detection Using Vibration Separation Techniques
NASA Technical Reports Server (NTRS)
Lewicki, David G.; LaBerge, Kelsen E.; Ehinger, Ryan T.; Fetty, Jason
2011-01-01
Studies were performed to demonstrate the capability to detect planetary gear and bearing faults in helicopter main-rotor transmissions. The work supported the Operations Support and Sustainment (OSST) program with the U.S. Army Aviation Applied Technology Directorate (AATD) and Bell Helicopter Textron. Vibration data from the OH-58C planetary system were collected on a healthy transmission as well as with various seeded-fault components. Planetary fault detection algorithms were used with the collected data to evaluate fault detection effectiveness. Planet gear tooth cracks and spalls were detectable using the vibration separation techniques. Sun gear tooth cracks were not discernibly detectable from the vibration separation process. Sun gear tooth spall defects were detectable. Ring gear tooth cracks were only clearly detectable by accelerometers located near the crack location or directly across from the crack. Enveloping provided an effective method for planet bearing inner- and outer-race spalling fault detection.
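The enveloping step credited above with planet-bearing fault detection can be sketched as Hilbert-transform demodulation followed by a spectrum of the envelope. The signal and frequencies below are synthetic and illustrative, not OH-58C values.

```python
# Sketch of envelope analysis for bearing fault detection: rectify a
# resonance-band vibration signal with the Hilbert transform and look for
# the defect impact rate in the envelope spectrum (synthetic signal).
import numpy as np
from scipy.signal import hilbert

fs = 20_000                                 # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3_000 * t)     # structural resonance
defect = (1 + np.sign(np.sin(2 * np.pi * 107 * t))) / 2   # 107 Hz impacts
x = defect * carrier + 0.1 * np.random.default_rng(1).standard_normal(t.size)

envelope = np.abs(hilbert(x))               # analytic-signal magnitude
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print(freqs[spec.argmax()])                 # peaks near the 107 Hz defect rate
```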
Map and database of Quaternary faults and folds in Colombia and its offshore regions
Paris, Gabriel; Machette, Michael N.; Dart, Richard L.; Haller, Kathleen M.
2000-01-01
As part of the International Lithosphere Program’s “World Map of Major Active Faults,” the U.S. Geological Survey (USGS) is assisting in the compilation of a series of digital maps of Quaternary faults and folds in Western Hemisphere countries. The maps show the locations, ages, and activity rates of major earthquake-related features such as faults and fault-related folds. They are accompanied by databases that describe these features and document current information on their activity in the Quaternary. To date, the project has published fault and fold maps for Costa Rica (Montero and others, 1998), Panama (Cowan and others, 1998), Venezuela (Audemard and others, 2000), Bolivia/Chile (Lavenu and others, 2000), and Argentina (Costa and others, 2000). The project is a key part of the Global Seismic Hazards Assessment Program (ILP Project II-0) for the International Decade for Natural Disaster Reduction.
An architecture for consolidating multidimensional time-series data onto a common coordinate grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shippert, Tim; Gaustad, Krista
Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogenous dimensionality, and are hard to implement in a consistent manner for different datastreams. These challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
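A minimal sketch of the core idea, consolidating a two-dimensional time series onto a common grid by chaining one-dimensional interpolations, is shown below; it is illustrative only and not the ARM implementation (function and variable names are invented).

```python
# Sketch of consolidating data onto a common grid with a sequence of
# one-dimensional transformations, in the spirit of the framework above.
import numpy as np

def to_common_grid(values, src_time, src_height, dst_time, dst_height):
    """Apply two 1-D interpolations in sequence: time first, then height."""
    on_time = np.array([np.interp(dst_time, src_time, values[:, j])
                        for j in range(values.shape[1])]).T
    return np.array([np.interp(dst_height, src_height, row) for row in on_time])

vals = np.arange(12.0).reshape(4, 3)        # 4 times x 3 heights
out = to_common_grid(vals, np.array([0, 1, 2, 3]), np.array([10, 20, 30]),
                     np.array([0.5, 1.5, 2.5]), np.array([15, 25]))
print(out.shape)                            # (3, 2): new time x new height
```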
Geotechnical properties of sediments from North Pacific and Northern Bermuda Rise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silva, A J; Laine, E P; Lipkin, J
1980-01-01
Studies of geotechnical properties for the Sub-seabed Disposal Program have been oriented toward sediment characterization related to effectiveness as a containment medium and determination of detailed engineering behavior. Consolidation tests of the deeper samples in the North Pacific clays indicate that the sediment column is normally consolidated. The in-situ coefficient of permeability (k) within the cored depth of 25 meters is relatively constant at 10^-7 cm/sec. Consolidated undrained (CIU) triaxial tests indicate stress-strain properties characteristic of saturated clays, with effective angles of friction of 35° for smectite and 31° for illite. These results are being used in computer modeling efforts. Some general geotechnical property data from the Bermuda Rise are also discussed.
24 CFR 91.15 - Submission date.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., and homeless needs assessment, market analysis, and strategic plan must be submitted at least once... program management, coordinate consolidated plans with time periods used for cooperation agreements, other...
Fault Management Technology Maturation for NASA's Constellation Program
NASA Technical Reports Server (NTRS)
Waterman, Robert D.
2010-01-01
This slide presentation reviews the maturation of fault management technology in preparation for the Constellation Program. It includes a review of the Space Shuttle Main Engine (SSME) and a discussion of a couple of incidents with the shuttle main engine and tanking that indicated the necessity for predictive maintenance. Included is a review of the planned Ares I-X Ground Diagnostic Prototype (GDP) and further information about detection and isolation of faults using the Testability Engineering and Maintenance System (TEAMS). Another system being readied for use, the Inductive Monitoring System (IMS), detects anomalies: IMS automatically learns how the system behaves and alerts operations if the current behavior is anomalous. The comparison of STS-83 and STS-107 (i.e., the Columbia accident) is shown as an example of the anomaly detection capabilities.
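The IMS behavior described, learning nominal system behavior and flagging departures from it, can be sketched as cluster-distance monitoring. This is a simplification for illustration, not the NASA implementation.

```python
# Rough sketch of distance-based anomaly monitoring in the spirit of IMS
# (a simplification, not the NASA implementation): summarize nominal sensor
# vectors as cluster centers, then flag telemetry far from every cluster.
import numpy as np

def learn(nominal, k=3, iters=20, seed=0):
    """Tiny k-means to summarize nominal behavior as cluster centers."""
    rng = np.random.default_rng(seed)
    centers = nominal[rng.choice(len(nominal), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((nominal[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([nominal[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers

def anomalous(x, centers, threshold):
    return np.min(np.linalg.norm(centers - x, axis=1)) > threshold

nominal = np.random.default_rng(1).normal(size=(500, 4))
centers = learn(nominal)
print(anomalous(np.array([0.1, 0.0, -0.2, 0.1]), centers, 3.0))  # False: nominal
print(anomalous(np.array([9.0, 9.0, 9.0, 9.0]), centers, 3.0))   # True: anomaly
```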
Boundary integral solutions for faults in flowing rock
NASA Astrophysics Data System (ADS)
Wei, Wei
We develop new boundary-integral solutions for faulting in viscous rock and implement the solutions numerically with a boundary-element computer program called Faux_Pas. In the solutions, large permanent rock deformations near faults are treated with velocity discontinuities within linear, incompressible, creeping, viscous flows. The faults may have zero strength or a finite strength that is constant or varies with deformation. Large deformations are achieved by integrating step by step with the fourth-order Runge-Kutta method; with this method, the boundaries and passive markers are updated dynamically. Faux_Pas has been applied to straight and curved elementary faults, and to compound faults composed of two or more elementary faults, such as listric and dish faults, all subjected to simple shear, shortening, and lengthening. It reproduces the essential geometric elements seen in seismic profiles of fault-related folds associated with listric thrust faults in the Bighorn Basin of Wyoming, with dish faults in the Appalachians of Pennsylvania, the Parry Islands of Canada, and the San Fernando Valley, California, and with listric normal faults in the Gulf of Mexico. Faux_Pas also predicts that some of these fault-related structures will include fascinating minor folds, especially in the footwall of the fault, that have been recognized earlier but have not been known to be related to the faulting. Some of these minor folds are potential structural traps. Faux_Pas is superior in several respects to current geometric techniques of balancing profiles, such as the "fault-bend fold" construction. With Faux_Pas, both the hanging wall and footwall are deformable, the faults are mechanical features, the cross sections are automatically balanced and, most important, the solutions are based on the first principles of mechanics. With the geometric techniques, folds are drawn only in the hanging wall, the faults are simply lines, the cross sections are arbitrarily balanced and, most important, the drawings are based on unsubstantiated rules of thumb. Faux_Pas provides the first rational tool for the study of fault-related folds.
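The step-by-step integration the abstract describes can be illustrated with a single fourth-order Runge-Kutta update of marker positions through a velocity field; the analytic shear flow below is purely illustrative and not a Faux_Pas kernel.

```python
# Minimal fourth-order Runge-Kutta step of the kind used to advect boundary
# points and passive markers through a velocity field (illustrative only).
import numpy as np

def rk4_step(pos, velocity, dt):
    """Advance marker positions pos (N x 2) through velocity(pos) by dt."""
    k1 = velocity(pos)
    k2 = velocity(pos + 0.5 * dt * k1)
    k3 = velocity(pos + 0.5 * dt * k2)
    k4 = velocity(pos + dt * k3)
    return pos + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

simple_shear = lambda p: np.column_stack([p[:, 1], np.zeros(len(p))])  # vx = y
markers = np.array([[0.0, 1.0], [0.0, 2.0]])
print(rk4_step(markers, simple_shear, 0.1))  # markers translate with the shear
```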
NASA Astrophysics Data System (ADS)
Lawson, M. J.; Yin, A.; Rhodes, E. J.
2015-12-01
Steep landscapes are known to provide sediment to sink regions, but petrological factors can often dominate basin sedimentation. Within Eureka Valley, in northwestern Death Valley National Park, normal faulting has exposed a steep cliff face on the western margin of the Last Chance Range, with four kilometers of vertical relief from the valley floor and an angle of repose of nearly 38 degrees. The cliff face is composed of Cambrian limestone and dolomite, including the Bonanza King, Carrara, and Wood Canyon formations. Interacting with local normal faulting, these units preferentially break off the cliff face in coherent blocks, which result in landslide deposits rather than finer grained material within the basin. The valley is well known for a large sand dune, which derives its sediment from distal sources to the north instead of from the adjacent Last Chance Range cliff face. During the Holocene, sediment has been sourced primarily from the northerly Willow Wash and Cucomungo Canyon, a relatively small drainage (less than 80 km²) within the Sylvan Mountains. Within this drainage, the Jurassic quartz monzonite of Beer Creek is heavily fractured due to motion of the Fish Lake Valley - Death Valley fault zone. Thus, the quartz monzonite is more easily eroded than the well-consolidated limestone and dolomite that form the Last Chance Range cliff face. As well, the resultant eroded material is smaller grained, and thus more easily transported, than the limestone. Consequently, this work highlights an excellent example of the strong influence that source material can have on basin sedimentation.
Factors That Affect Software Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.
1991-01-01
Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability; a piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. We do this in order to decrease the likelihood that faults will remain undetected during testing.
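Information loss can be made concrete with a toy example (mine, not the paper's): an operation that collapses many internal states onto one output hides faults from testing.

```python
# Illustration of information loss (an invented example, not Voas's):
# the modulo discards most of its input, so a corrupted internal state often
# maps to the same output as the correct state and the fault stays hidden.
def low_testability(x):
    state = x * 31 + 7       # imagine a fault corrupts this intermediate value
    return state % 2         # information loss: many states -> one output

# A fault that changes state from 38 to 40 is invisible at the output:
print(low_testability(1))    # 0  (correct state 38)
print((1 * 31 + 9) % 2)      # 0  (faulty state 40: identical output)
```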
New Madrid seismotectonic study. Activities during fiscal year 1982
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buschbach, T.C.
1984-04-01
The New Madrid Seismotectonic Study is a coordinated program of geological, geophysical, and seismological investigations of the area within a 200-mile radius of New Madrid, Missouri. The study is designed to define the structural setting and tectonic history of the area in order to realistically evaluate earthquake risks in the siting of nuclear facilities. Fiscal year 1982 included geological and geophysical studies aimed at better definition of the east-west trending fault systems - the Rough Creek and Cottage Grove systems - and the northwest-trending Ste. Genevieve faulting. A prime objective was to determine the nature and history of faulting and to establish the relationship between that faulting and the northeast-trending faults of the Wabash Valley and New Madrid areas. 27 references, 61 figures.
Object-Oriented Algorithm For Evaluation Of Fault Trees
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1992-01-01
Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
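The object-oriented evaluation style can be sketched with event and gate objects, where caching shared subtrees stands in for the reduced calls on repeated events; note that a rigorous treatment of repeated basic events requires cut-set methods, so this is an illustration only, not the cited algorithm.

```python
# Sketch of object-oriented fault-tree evaluation with independent basic
# events; memoizing shared subtrees illustrates the reuse of repeated-event
# results that the abstract highlights (an illustration, not the NASA code).
class Event:
    def __init__(self, p):
        self.p = p
    def prob(self, cache):
        return self.p

class Gate:
    def __init__(self, kind, children):
        self.kind, self.children = kind, children
    def prob(self, cache):
        if id(self) not in cache:                 # repeated subtrees solved once
            ps = [c.prob(cache) for c in self.children]
            q = 1.0
            for p in ps:
                q *= p if self.kind == "AND" else (1 - p)
            cache[id(self)] = q if self.kind == "AND" else 1 - q
        return cache[id(self)]

pump, valve = Event(0.01), Event(0.02)
top = Gate("OR", [Gate("AND", [pump, valve]), Event(0.001)])
print(top.prob({}))   # system failure probability, ~0.0012
```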
1988-08-20
"Increasing Reliability of Multiversion Fault-Tolerant Software Design by Modularization," Junryo Miyashita, Department of Computer Science, California State University at San Bernardino. Systems built from multiple independently developed versions of a program shall be referred to as "multiversion fault-tolerant software design". One problem of developing multiple versions of a program is the high cost.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
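The sampling step of such a procedure can be illustrated with a plain (non-adaptive) importance-sampling estimate of a small failure probability; the limit-state function below is invented for the example and is not the authors' scheme.

```python
# Minimal importance-sampling estimate of a small failure probability
# (a generic sketch, not the adaptive method in the paper). Failure is
# g(x) < 0; samples are drawn from a density shifted toward the failure
# region and reweighted by the likelihood ratio.
import numpy as np
from scipy.stats import norm

g = lambda x: 4.0 - x                      # failure when x > 4 (g < 0)
rng = np.random.default_rng(0)
x = rng.normal(loc=4.0, scale=1.0, size=100_000)     # biased sampling density
w = norm.pdf(x, 0, 1) / norm.pdf(x, 4, 1)            # likelihood ratio
p_fail = np.mean((g(x) < 0) * w)
print(p_fail)                              # close to 1 - Phi(4) ~ 3.2e-5
```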
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.
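The gate arithmetic on trapezoidal fuzzy probabilities can be sketched with the common componentwise approximation; the basic-event numbers below are hypothetical, and the paper's intuitionistic fuzzy treatment and expert weighting add further machinery.

```python
# Sketch of AND/OR gate arithmetic on trapezoidal fuzzy probabilities
# (a, b, c, d), using the common componentwise approximation for small
# probabilities (an illustration, not the paper's full method).
def and_gate(ps):
    out = (1.0, 1.0, 1.0, 1.0)
    for p in ps:
        out = tuple(x * y for x, y in zip(out, p))
    return out

def or_gate(ps):
    # 1 - prod(1 - p); complementing reverses the trapezoid's order
    comp = and_gate([tuple(1 - v for v in reversed(p)) for p in ps])
    return tuple(1 - v for v in reversed(comp))

spark = (0.01, 0.02, 0.03, 0.04)     # hypothetical basic-event probabilities
dust  = (0.05, 0.06, 0.08, 0.10)
print(and_gate([spark, dust]))       # fuzzy P(spark AND dust cloud)
print(or_gate([spark, dust]))        # fuzzy P(spark OR dust cloud)
```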
DOT National Transportation Integrated Search
2001-05-01
The purpose of this project was to undertake a consolidated comprehensive review of the Florida Department of Transportation Transit Corridor Program. Technical Memorandum Number One provides a summary of all transit corridor projects either under wa...
Assessing the Agriculture Teacher Workforce in New England
ERIC Educational Resources Information Center
Uricchio, Cassandra Kay
2011-01-01
High quality teachers are an essential piece of the agricultural education model and directly influence the quality of the total program. However, there has been a steady consolidation and elimination of agricultural education teacher preparation programs in New England. In order to understand how this trend affected agricultural education in this…
Financing Excellence in Public Education. Focus 13.
ERIC Educational Resources Information Center
Benderson, Albert
"A Nation at Risk" and other recent reports have focused public attention on excellence in education. During the same period, the federal government has cut aid to education by almost 20 percent and consolidated federal funding into block grant programs, which some critics have claimed are less efficient than programs before…
24 CFR 982.154 - ACC reserve account.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false ACC reserve account. 982.154... and PHA Administration of Program § 982.154 ACC reserve account. (a) HUD may establish and maintain an unfunded reserve account for the PHA program from available budget authority under the consolidated ACC...
24 CFR 982.154 - ACC reserve account.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false ACC reserve account. 982.154... and PHA Administration of Program § 982.154 ACC reserve account. (a) HUD may establish and maintain an unfunded reserve account for the PHA program from available budget authority under the consolidated ACC...
DOT National Transportation Integrated Search
2010-06-01
On June 28, 2007, PHMSA released a Broad Agency Announcement (BAA), DTPH56- 07-BAA-000002, seeking white papers on individual projects and consolidated Research and Development (R&D) programs addressing topics on their pipeline safety program. Althou...
Student Selection for Selective Educational Programs Using Multiple Criteria.
ERIC Educational Resources Information Center
Jenkins, Jerry A.
This paper shows that multiple sources of data reflecting educational progress may be used with relative ease to systematically, objectively, and accurately place students in selective educational programs, such as those funded under Chapter 1 of the Education Consolidation and Improvement Act, using readily available commercial microcomputer…
77 FR 37910 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-25
... Program Instructions (PIs). The training and data grants are governed by the ``new grant'' PI and the basic grant is governed by the ``basic grant'' PI. Current PIs require separate applications and program... and reporting processes by consolidating the PIs into one single PI and requiring one single...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... questions please contact: Emily Andrew (703-235-2182), Privacy Officer, National Protection and Programs... U.S.C. 552a, the Department of Homeland Security (DHS)/National Protection and Programs Directorate... Screening Database (TSDB). The TSDB is the Federal government's consolidated and integrated terrorist...
School-to-Work Apprenticeship. Project Manual 1993-1994.
ERIC Educational Resources Information Center
Lee Coll., Baytown, TX.
With Perkins tech prep funds, Lee College (Baytown, Texas), working with the Gulf Coast Tech Prep Consortium and the Goose Creek Consolidated Independent School District, developed a school-to-work apprenticeship model for tech prep programs. An advisory committee provided guidance in identifying targeted apprenticeable jobs, program content, and…
76 FR 38943 - Notice of Order Soliciting Community Proposals
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... in receiving a grant under the Small Community Air Service Development Program. The full text of the... 2011 Full-Year Continuing Appropriations Act (Pub. L. 112-10 (extending the FY 2010 Consolidated... recipient. Full community participation is a key goal of this program as demonstrated by the statute's focus...
The aircraft energy efficiency active controls technology program
NASA Technical Reports Server (NTRS)
Hood, R. V., Jr.
1977-01-01
Broad outlines of the NASA Aircraft Energy Efficiency Program for expediting the application of active controls technology to civil transport aircraft are presented. Advances in propulsion and airframe technology to cut down on fuel consumption and fuel costs, a program for an energy-efficient transport, and integrated analysis and design technology in aerodynamics, structures, and active controls are envisaged. Fault-tolerant computer systems and fault-tolerant flight control system architectures are under study. Contracts with leading manufacturers for research and development work on wing-tip extensions and winglets for the B-747, a wing load alleviation system, elastic mode suppression, maneuver-load control, and gust alleviation are mentioned.
ERIC Educational Resources Information Center
Berney, Tomi D.; Stern, Lucia
Chapter I/Pupils with Compensatory Educational Needs programs in English as a Second Language (ESL) served students at 78 high schools in New York City, supplementing tax-levy-funded ESL classes in those schools serving limited-English-proficient (LEP) students. Chapter I of the Educational Consolidation and Improvement Act funded ESL and…
ERIC Educational Resources Information Center
Marks, Ellen L.
This paper reviews and summarizes available information on the neglected or delinquent youth population, on education programs for delinquent youth, and on the Neglected or Delinquent (N or D) program funded under Chapter 1 of the Education Consolidation and Improvement Act that serves a portion of that population residing in state-operated or…
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Research, Evaluation, and Assessment.
This report evaluates a program funded under Chapter 1, Part B, of the Education Consolidation and Improvement Act in New York (New York). The New York City Division of Special Education administers the Institutionalized Facilities Program to provide instruction to neglected and delinquent children and adolescents residing in group homes and…
ERIC Educational Resources Information Center
Shaul, Marnie S.
This report evaluates how effectively four federally funded bilingual education block grant programs--Program Development and Implementation Grants, Program Enhancement Projects, Comprehensive School Grants, Systemwide Improvement Grants--used $163 million in fiscal year 2000 to serve children with limited English proficiency (LEP). There are four…
Tanriverdi, Ozgur; Barista, Ibrahim; Paydas, Semra; Nayir, Erdinc; Karakas, Yusuf
2017-01-01
In this study, we aimed to determine the perspectives of medical and radiation oncologists regarding consolidation radiotherapy in patients with a complete response after chemotherapy for Hodgkin’s and non-Hodgkin’s lymphomas. The survey was designed to identify demographic and occupational features of medical and radiation oncologists and their views on the application of consolidation radiotherapy in their clinical practices, based on a five-point Likert scale (never, rarely, sometimes, often, and always). The study covered 263 of the 935 physicians working in the oncology field as either medical or radiation oncologists; the rate of return on the invitations to participate was 28%. The majority of the participants were male radiation oncologists who had worked as university hospital officials for between 5 and 10 years, and the mean age was 38 ± 14 years. Although the NCCN guidelines were the most commonly followed international guidelines among the physicians, the majority of respondents suggested that the guidelines were unclear in their recommendations for consolidative radiotherapy. The dose for consolidative radiotherapy in lymphoma patients was indicated as 40 Gy by 49% of the physicians, and the most common cause of hesitancy concerning consolidative radiation treatment was the risk of secondary malignancies as a long-term adverse effect (54%). In conclusion, we suggest that medical oncologists could be most active in the treatment of lymphoma through a continuous training program about lymphomas and current national guidelines. PMID:29172293
24 CFR 982.157 - Budget and expenditure.
Code of Federal Regulations, 2011 CFR
2011-04-01
... amounts contracted under the consolidated ACC. (c) Intellectual property rights. Program receipts may not... judgment of infringement of intellectual property rights. (Approved by the Office of Management and Budget...
Consolidated fuel reprocessing program
NASA Astrophysics Data System (ADS)
1985-02-01
Improved processes and components for the Breeder Reprocessing Engineering Test (BRET) were identified and developed, along with the design, procurement, and development of prototypic equipment. Integrated testing of process equipment and flowsheets prototypical of a pilot-scale full reprocessing plant, and testing of the prototypical remote features of specific complex components in the system, are provided. Information to guide the long-range activities of the Consolidated Fuel Reprocessing Program (CFRP), a focal point for foreign exchange activities, and support in specialized technical areas are described. Research and development activities in HTGR fuel treatment technology are being conducted. Head-end process and laboratory-scale development efforts, as well as studies specific to HTGR fuel, are reported. The development of off-gas treatment processes has generic application to fuel reprocessing; progress in this work is also reported.
Proposal for Land Consolidation Project Solutions for Selected Problem Areas
NASA Astrophysics Data System (ADS)
Wojcik-Len, Justyna; Strek, Zanna
2017-12-01
One of the economic tools for supporting agricultural policy is the set of activities implemented under the Rural Development Program (RDP). By encouraging agricultural activities and creating equal opportunities for the development of farms, including in areas with unfavourable environmental conditions characterized by low-productivity soils exposed to degradation, decision makers can contribute to improving the spatial structure of rural areas. In Poland, agricultural problem areas (regions) are one of the major concerns. In view of this situation, the aim of this article was to characterize the problem areas in question and propose land consolidation project solutions for selected fragments of those areas. This paper presents the results of a literature review and an analysis of geodetic and cartographic data regarding the problem areas. The process of land consolidation, one of the technical and legal instruments supporting the development of rural areas, was characterized. The study allowed the authors to establish criteria for selecting agricultural problem areas for land consolidation. To develop a proposal for rational management of the problem areas, key general criteria (location, topography, soil quality and usefulness) and specific criteria were defined and assigned weights. A conception of alternative development of the agricultural problem areas was created as part of a land consolidation project. The results were used to create a methodology for the development of agricultural problem areas to be employed during land consolidation in rural areas. Every agricultural space includes areas with unfavourable environmental and soil conditions determined by natural or anthropogenic factors. Development of agricultural problem areas through land consolidation should take into account the specific functions assigned to these areas in land use plans, as well as comply with legal regulations.
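As a hypothetical illustration of the weighted-criteria selection step, the sketch below scores candidate areas against the four general criteria named above; the weights and 1-5 scores are invented, with higher scores taken to mean more favourable conditions, so the lowest-scoring areas rank first for consolidation.

```python
# Invented weighted-criteria scoring; not the paper's actual weights or data.
criteria_weights = {"location": 0.2, "topography": 0.2,
                    "soil_quality": 0.35, "usefulness": 0.25}

candidate_areas = {
    "area_A": {"location": 4, "topography": 2, "soil_quality": 1, "usefulness": 2},
    "area_B": {"location": 3, "topography": 4, "soil_quality": 2, "usefulness": 3},
    "area_C": {"location": 2, "topography": 3, "soil_quality": 4, "usefulness": 4},
}

def weighted_score(scores):
    # weighted sum over the general criteria
    return sum(criteria_weights[c] * v for c, v in scores.items())

ranked = sorted(candidate_areas, key=lambda a: weighted_score(candidate_areas[a]))
print("consolidation priority (most problematic first):", ranked)
```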
Optimization of Second Fault Detection Thresholds to Maximize Mission POS
NASA Technical Reports Server (NTRS)
Anzalone, Evan
2018-01-01
In order to support manned spaceflight safety requirements, the Space Launch System (SLS) has defined program-level requirements for key systems to ensure successful operation under single fault conditions. To accommodate this with regards to Navigation, the SLS utilizes an internally redundant Inertial Navigation System (INS) with built-in capability to detect, isolate, and recover from first failure conditions and still maintain adherence to performance requirements. The unit utilizes multiple hardware- and software-level techniques to enable detection, isolation, and recovery from these events in terms of its built-in Fault Detection, Isolation, and Recovery (FDIR) algorithms. Successful operation is defined in terms of sufficient navigation accuracy at insertion while operating under worst-case single sensor outages (gyroscope and accelerometer faults at launch). In addition to first fault detection and recovery, the SLS program has also levied requirements relating to the capability of the INS to detect a second fault, tracking any unacceptable uncertainty in knowledge of the vehicle's state. This detection functionality is required in order to feed abort analysis and ensure crew safety. Increases in navigation state error and sensor faults can drive the vehicle outside of its operational as-designed environments and outside of its performance envelope, causing loss of mission or, worse, loss of crew. The criteria for operation under second faults allow for a larger set of achievable missions in terms of potential fault conditions, because the INS operates at the edge of its capability. As this performance is defined and controlled at the vehicle level, it allows for the use of system-level margins to increase the probability of mission success on the operational edges of the design space. Due to the implications of the vehicle response to abort conditions (such as a potentially failed INS), it is important to consider a wide range of failure scenarios in terms of both magnitude and time. As such, the Navigation team is taking advantage of the INS's capability to schedule and change fault detection thresholds in flight. These values are optimized along a nominal trajectory in order to maximize the probability of mission success and to reduce the probability of false positives (defined as cases in which the INS would report a second fault condition, resulting in loss of mission, though the vehicle would still meet insertion requirements within system-level margins). This paper describes an optimization approach using Genetic Algorithms to tune the threshold parameters to maximize vehicle resilience to second fault events as a function of potential fault magnitude and time of fault over an ascent mission profile. The analysis approach and a performance assessment of the results are presented to demonstrate the applicability of this process to second fault detection for maximizing the mission probability of success.
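A minimal sketch of the genetic-algorithm idea under stated assumptions: a two-parameter threshold schedule (early and late flight) and a toy cost model trading false alarms against missed detections. None of the numbers reflect SLS data.

```python
# Toy GA for threshold tuning; the fitness model is invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

def fitness(thresholds):
    # toy trade-off: low thresholds raise false alarms, high ones miss faults
    t_early, t_late = thresholds
    p_false = np.exp(-t_early) + 0.5 * np.exp(-t_late)
    p_miss = 0.01 * (t_early**2 + t_late**2)
    return -(p_false + 10.0 * p_miss)   # maximize a mission-success proxy

pop = rng.uniform(0.1, 5.0, size=(40, 2))          # initial population
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    # tournament selection between random pairs
    i, j = rng.integers(len(pop), size=(2, len(pop)))
    parents = np.where((scores[i] > scores[j])[:, None], pop[i], pop[j])
    # uniform crossover with a shifted partner, then Gaussian mutation
    mask = rng.random(pop.shape) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    children += rng.normal(0.0, 0.1, size=pop.shape)
    pop = np.clip(children, 0.05, 10.0)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("tuned thresholds (early, late):", best)
```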
NASA Astrophysics Data System (ADS)
Rusu-Anghel, S.; Ene, A.
2017-05-01
The quality of electric energy capture and the operational safety of the equipment depend essentially on the technical state of the contact line (CL). The present method for determining the technical state of the CL, based on inspections scheduled in advance, is no longer efficient, because faults can occur in areas not covered by the schedule and therefore cannot be remediated. A different management method for the repair and maintenance of the CL is needed, based on its real state, which must be very well known. In this paper a new method for detecting faults in the CL is described. It is based on analysis of the variation of the pantograph-CL contact force in the dynamic regime. Using mathematical modelling and experimental tests, it was established that each type of fault generates a ‘signature’ in the contact force diagram. The identification of these signatures can be accomplished by an informatics system which will provide the fault location, its type and, in the future, the probable evolution of the CL technical state. The contact force is measured optically using a railway inspection trolley carrying appropriate equipment. The analysis of the desired parameters can be accomplished in real time by a data acquisition system based on dedicated software.
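A minimal sketch of signature flagging under stated assumptions: a synthetic contact-force trace, a moving-average baseline, and a 3-sigma excursion rule standing in for the paper's signature identification.

```python
# Synthetic contact-force anomaly flagging; not the paper's informatics system.
import numpy as np

rng = np.random.default_rng(3)
x = np.arange(0.0, 2000.0, 0.5)                  # position along the line (m)
force = 120.0 + rng.normal(0.0, 4.0, x.size)     # nominal contact force (N)
force[2400:2410] += 45.0                         # invented fault near 1200 m

window = 200
baseline = np.convolve(force, np.ones(window) / window, mode="same")
residual = force - baseline
flags = np.zeros(x.size, dtype=bool)
inner = slice(window, x.size - window)           # skip filter edge effects
flags[inner] = np.abs(residual[inner]) > 3.0 * residual[inner].std()
print("flagged positions (m):", x[flags])
```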
NASA Astrophysics Data System (ADS)
Saldaña, S. C.; Snelson, C. M.; Taylor, W. J.; Beachly, M.; Cox, C. M.; Davis, R.; Stropky, M.; Phillips, R.; Robins, C.; Cothrun, C.
2007-12-01
The Pahrump Fault system is located in the central Basin and Range region and consists of three main fault zones: the Nopah range front fault zone, the State Line fault zone and the Spring Mountains range fault zone. The State Line fault zone is made up of northwest-trending dextral strike-slip faults that run parallel to the Nevada-California border. Previous geologic and geophysical studies conducted in and around Stewart Valley, located ~90 km from Las Vegas, Nevada, have constrained the location of the State Line fault zone to within a few kilometers. The goals of this project were to use seismic methods to definitively locate the northwesternmost trace of the State Line fault and produce pseudo 3-D seismic cross-sections that can then be used to characterize the subsurface geometry and determine the slip of the State Line fault. During July 2007, four seismic lines were acquired in Stewart Valley: two normal and two parallel to the mapped traces of the State Line fault. Presented here are preliminary results from the two seismic lines acquired normal to the fault. These lines were acquired using a 144-channel Geode system with 4.5 Hz vertical geophones set out at 5 m intervals to produce a 595 m long profile to the north and a 715 m long profile to the south. The vibroseis was programmed to produce an 8 s linear sweep from 20-160 Hz. These data have excellent signal-to-noise ratios and reveal subsurface lithology that will subsequently be used to resolve the subsurface geometry of the State Line fault. This knowledge will then enhance our understanding of the evolution of the State Line fault. Knowing how the State Line fault has evolved gives insight into stick-slip fault evolution for the region and may improve understanding of how stress has been partitioned from larger strike-slip systems such as the San Andreas fault.
NASA Astrophysics Data System (ADS)
Mahya, M. J.; Sanny, T. A.
2017-04-01
The Lembang and Cimandiri faults are active faults in West Java that threaten people living near them with earthquake and surface-deformation risks. To determine the deformation, GPS measurements around the Lembang and Cimandiri faults were conducted, and the data were processed to obtain the horizontal velocity at each GPS station, by the Graduate Research of Earthquake and Active Tectonics (GREAT) Department of Geodesy and Geomatics Engineering Study Program, ITB. The purpose of this study is to model the displacement distribution, as a deformation parameter, in the area along the Lembang and Cimandiri faults using a 2-dimensional boundary element method (BEM), with the horizontal velocities, corrected for the effect of Sunda plate horizontal movement, as the input. The modeling assumes that deformation occurs in a homogeneous and isotropic medium and that the stresses acting on the faults are elastostatic. The results of the modeling show that the Lembang fault has a left-lateral slip component and is divided into two segments. A lineament oriented in the southwest-northeast direction is observed near Tangkuban Perahu Mountain, separating the eastern and western segments of the Lembang fault. The displacement pattern of the Cimandiri fault shows that it is divided into an eastern segment with a right-lateral slip component and a western segment with a left-lateral slip component, separated by a northwest-southeast oriented lineament at the western part of Gede Pangrango Mountain. The displacement between the Lembang and Cimandiri faults is nearly zero, indicating that the two faults are not connected to each other and that this area is relatively safe for infrastructure development.
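The BEM itself is not reproduced here, but the elastic-dislocation arithmetic that displacement modeling along strike-slip faults builds on can be illustrated with the classic screw-dislocation profile v(x) = (s/pi) * arctan(x/D); the slip rate and locking depth below are invented, not BEM results for Lembang or Cimandiri.

```python
# Classic interseismic screw-dislocation profile; parameters are invented.
import numpy as np

def fault_parallel_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    # surface velocity at distance x from a long, buried strike-slip fault
    return (slip_rate_mm_yr / np.pi) * np.arctan(x_km / locking_depth_km)

x = np.array([-50.0, -10.0, -1.0, 1.0, 10.0, 50.0])   # km from fault trace
v = fault_parallel_velocity(x, slip_rate_mm_yr=6.0, locking_depth_km=15.0)
for xi, vi in zip(x, v):
    print(f"x = {xi:6.1f} km -> v = {vi:+.2f} mm/yr")
```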
ERIC Educational Resources Information Center
Department of Education, Washington, DC.
Chapter 1 of the Education Consolidation and Improvement Act of 1981 (ECIA) provides financial assistance to state (SEAs) and local educational agencies (LEAs) to meet special educational needs under the same formula that governed the allocation of Title I funds. The guidance in this document only pertains to the Chapter 1 program that provides…
Enhancing Security by System-Level Virtualization in Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei
Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.
Implementation of a research prototype onboard fault monitoring and diagnosis system
NASA Technical Reports Server (NTRS)
Palmer, Michael T.; Abbott, Kathy H.; Schutte, Paul C.; Ricks, Wendell R.
1987-01-01
Due to the dynamic and complex nature of in-flight fault monitoring and diagnosis, a research effort was undertaken at NASA Langley Research Center to investigate the application of artificial intelligence techniques for improved situational awareness. Under this research effort, concepts were developed and a software architecture was designed to address the complexities of onboard monitoring and diagnosis. This paper describes the implementation of these concepts in a computer program called FaultFinder. The implementation of the monitoring, diagnosis, and interface functions as separate modules is discussed, as well as the blackboard designed for the communication of these modules. Some related issues concerning the future installation of FaultFinder in an aircraft are also discussed.
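A minimal sketch of the blackboard pattern the FaultFinder architecture is described as using: modules share state only through a common blackboard rather than calling each other directly. The module logic and the parameter names (e.g. EPR) are invented illustrations.

```python
# Toy blackboard coordination between a monitor and a diagnoser; invented logic.
class Blackboard:
    def __init__(self):
        self.entries = {}

    def post(self, key, value):
        self.entries[key] = value

    def read(self, key, default=None):
        return self.entries.get(key, default)

def monitor(bb, sensed_epr):
    # monitoring module: posts a symptom when a parameter leaves its band
    if sensed_epr < 1.1:
        bb.post("symptom", {"param": "EPR", "value": sensed_epr})

def diagnose(bb):
    # diagnosis module: reacts to posted symptoms, posts a hypothesis
    symptom = bb.read("symptom")
    if symptom and symptom["param"] == "EPR":
        bb.post("hypothesis", "possible engine thrust loss")

bb = Blackboard()
monitor(bb, sensed_epr=0.95)
diagnose(bb)
print(bb.read("hypothesis"))
```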
Advanced instrumentation concepts for environmental control subsystems
NASA Technical Reports Server (NTRS)
Yang, P. Y.; Schubert, F. H.; Gyorki, J. R.; Wynveen, R. A.
1978-01-01
Design, evaluation and demonstration of advanced instrumentation concepts for improving performance of manned spacecraft environmental control and life support systems were successfully completed. Concepts to aid maintenance following fault detection and isolation were defined. A computer-guided fault correction instruction program was developed and demonstrated in a packaged unit which also contains the operator/system interface.
An architecture for consolidating multidimensional time-series data onto a common coordinate grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shippert, Tim; Gaustad, Krista
2016-12-16
Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogenous dimensionality, and are hard to implement in a consistent manner for different datastreams. These challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
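A minimal sketch of the series-of-one-dimensional-transformations idea, assuming simple linear interpolation along each axis in turn; ARM's operational framework adds the consistency and quality-control machinery the abstract refers to.

```python
# Sequential 1-D regridding of a synthetic 2-D field; not the ARM framework.
import numpy as np

def regrid_1d(values, src, dst, axis):
    # apply 1-D linear interpolation along one axis of an N-D array
    values = np.moveaxis(values, axis, -1)
    out = np.apply_along_axis(lambda row: np.interp(dst, src, row), -1, values)
    return np.moveaxis(out, -1, axis)

t_src, h_src = np.linspace(0, 10, 21), np.linspace(0, 2, 11)
field = np.sin(t_src)[:, None] * np.exp(-h_src)[None, :]   # shape (21, 11)

t_dst, h_dst = np.linspace(0, 10, 50), np.linspace(0, 2, 25)
on_common = regrid_1d(regrid_1d(field, t_src, t_dst, axis=0),
                      h_src, h_dst, axis=1)
print(on_common.shape)   # (50, 25)
```

Because each pass is one-dimensional, the same transform can be applied uniformly to datastreams whose dimensions differ, which is what makes the approach attractive for non-homogeneous data.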
NASA Technical Reports Server (NTRS)
Shontz, W. D.; Records, R. M.; Antonelli, D. R.
1992-01-01
The focus of this project is on alerting pilots to impending events in such a way as to provide the additional time required for the crew to make critical decisions concerning non-normal operations. The project addresses pilots' need for support in diagnosis and trend monitoring of faults as they affect decisions that must be made within the context of the current flight. Monitoring and diagnostic modules developed under the NASA Faultfinder program were restructured and enhanced using input data from an engine model and real engine fault data. Fault scenarios were prepared to support knowledge base development activities on the MONITAUR and DRAPhyS modules of Faultfinder. An analysis of the information requirements for fault management was included in each scenario. A conceptual framework was developed for systematic evaluation of the impact of context variables on pilot action alternatives as a function of event/fault combinations.
NASA Astrophysics Data System (ADS)
Sahin, S.; Yıldırım, C.; Sarıkaya, M. A.; Tuysuz, O.; Genç, S. C.; Aksoy, M. E.; Doksanaltı, M. E.; Benedetti, L.
2016-12-01
Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks as they interact with cosmic rays. Through modelling of measured 36Cl concentrations, we can obtain information on the history of earthquake activity. Yet several factors may affect the production of rare nuclides, such as the geometry of the fault, the topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover, and the denudation rate on the scarp. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in Southwestern Anatolia and contains several normal fault scarps formed within limestone, which are appropriate for cosmogenic chlorine-36 dating. Since it has a well-preserved scarp, we have focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and a height of 12-15 m. 128 continuous samples from top to bottom of the fault scarp were collected for analysis of cosmogenic 36Cl concentrations. Recent research has elucidated each step of the application of this method in Matlab (e.g. Schlagenhauf et al., 2010), which is vitally helpful for modelling the activity of normal faults. We, however, wanted to build a user-friendly program in the open-source programming language R that can help those without knowledge of complex mathematics or programming, making the calculations as easy as possible. We set out to obtain accurate conclusions by comparing and contrasting our results with synthetic profiles and previous studies of limestone fault scarps. The preliminary results indicate that at least three major earthquakes or earthquake-cluster events occurred on the Mezarlık fault within the past 20 kyr; over 10 meters of displacement took place between the late Pleistocene and the early Holocene. The estimated ages of those three large slip events are 18.7, 15.1 and 10.8 ka, respectively. This study was conducted under Decision of the Council of Ministers No. 2013/5387 dated 30.09.2013 and with the permission of the Knidos Presidency of Excavation, within the scope of the Knidos Excavation and Research carried out on behalf of Selçuk University and the Ministry of Culture and Tourism. This study was supported by TÜBİTAK (Project No: 113Y436).
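A minimal sketch of the forward model such a program evaluates: each slip event exposes a new band of the scarp, and a sample's modeled 36Cl reflects how long its band has been exposed. The event ages and total displacement follow the abstract's preliminary results, but the production rate and per-event slips are invented, and depth production, inheritance, and decay during burial are ignored.

```python
# Toy forward model of 36Cl buildup on a fault scarp; simplified assumptions.
import math

P_SURF = 50.0                       # surface 36Cl production (atoms/g/yr), assumed
LAMBDA = math.log(2) / 301e3        # 36Cl decay constant (half-life ~301 kyr)

# (event age in yr, slip in m), oldest first; slips are invented
events = [(18_700, 4.0), (15_100, 3.5), (10_800, 3.0)]

def conc_at_height(h):
    # exposure starts at the event that raised height h above the ground surface
    exposed_since = 0.0
    top = sum(slip for _, slip in events)
    for age, slip in events:
        top -= slip
        if h >= top:
            exposed_since = age
            break
    return (P_SURF / LAMBDA) * (1.0 - math.exp(-LAMBDA * exposed_since))

for h in (9.0, 5.0, 1.0):           # sample heights on the scarp (m)
    print(f"height {h:.0f} m: modeled 36Cl = {conc_at_height(h):.2e} atoms/g")
```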
Crone, Anthony J.; Haller, Kathleen M.; Maharrey, Joseph Z.
2009-01-01
The U.S. Geological Survey's (USGS) Earthquake Hazards Program (EHP) has the responsibility to provide nationwide information and knowledge about earthquakes and earthquake hazards as a step to mitigating earthquake-related losses. As part of this mission, USGS geologists and geophysicists continue to study faults and structures that have the potential to generate large and damaging earthquakes. In addition, the EHP, through its External Grants Program (hereinafter called Program), supports similar studies by scientists employed by state agencies, academic institutions, and independent employers. For the purposes of earthquake hazard investigations, the Nation is geographically subdivided into tectonic regions. One such region is the Intermountain West (IMW), which here is broadly defined as starting at the eastern margin of the Rocky Mountains in New Mexico, Colorado, Wyoming, and Montana and extending westward to the east side of the Sierra Nevada mountains in eastern California and into the Basin and Range-High Plateaus of eastern Oregon and Washington. The IMW contains thousands of faults that have moved in Cenozoic time, hundreds of which have evidence of Quaternary movement, and thus are considered to be potential seismic sources. Ideally, each Quaternary fault should be studied in detail to evaluate its rate of activity in order to model the hazard it poses. The study of a single fault requires a major commitment of time and resources, and given the large number of IMW faults that ideally should be studied, it is impractical to expect that all IMW Quaternary faults can be fully evaluated in detail. A more realistic approach is to prioritize a list of IMW structures that potentially pose a significant hazard and to focus future studies on those structures. Accordingly, in June 2008, a two-day workshop was convened at the USGS offices in Golden, Colorado, to seek information from representatives of selected State Geological Surveys in the IMW and with knowledgeable regional experts to identify the important structures for future studies. Such a priority list allows Program managers to guide the limited resources toward studies of features that are deemed to potentially pose the most serious hazards in the IMW. It also provides the scientific community with a list of structures to investigate because they are deemed to pose a substantial hazard to population centers or critical structures. The IMW encompasses all or large parts of 12 states, including Arizona, New Mexico, extreme west Texas, Colorado, Utah, Nevada, eastern California, eastern Oregon, eastern Washington, Idaho, western Wyoming, and western Montana. In Utah, and more recently in Nevada, geoscientists have taken steps to evaluate geologic data related to well-studied faults and to develop a statewide priority list of hazardous structures. In contrast to Utah and Nevada, the other IMW states contain substantially fewer Quaternary faults, so there have not been any previous efforts to develop similar priority lists. This workshop was organized to address this matter and create a more balanced perspective of priorities throughout the entire IMW region. Because working groups and workshops had already been convened to specifically deal with Quaternary fault priorities in Utah and Nevada, this workshop specifically emphasized structures outside of these two states.
SEISRISK II; a computer program for seismic hazard estimation
Bender, Bernice; Perkins, D.M.
1982-01-01
The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability, which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
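The Poisson occurrence assumption gives the hazard arithmetic a simple closed form: with annual exceedance rate r at a site, the probability of at least one exceedance in T years is 1 - exp(-rT). A minimal sketch with an invented hazard curve:

```python
# Poisson exceedance arithmetic on an invented hazard curve; not SEISRISK II.
import math

def prob_exceedance(annual_rate, years):
    # probability of at least one exceedance in the given time window
    return 1.0 - math.exp(-annual_rate * years)

# toy hazard curve: (peak acceleration in g, annual exceedance rate)
hazard_curve = [(0.1, 2e-2), (0.2, 5e-3), (0.3, 2e-3), (0.4, 8e-4)]
for accel, rate in hazard_curve:
    p = prob_exceedance(rate, 50)
    print(f"{accel:.1f} g: P(exceed in 50 yr) = {p:.3f}")
```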
NPS Government Purchase Card Program: An Analysis of Internal Controls
2014-03-01
...approving official; APC, agency program coordinator; CCPMD, Consolidated Card Program Management Division; CH, cardholder; COSO, Committee of Sponsoring... "correct, and minimize fraud, waste, and abuse" (DPAP, 2011, p. 2-2). To minimize risks, the management and internal controls should have support from... "three interrelated subjects: enterprise risk management (ERM), internal control, and fraud deterrence" (para. 6). The five components of an...
ERIC Educational Resources Information Center
Reimers, Fernando
Three case studies show innovative education programs that provide quality basic education with equity. After explaining the significance of educational innovation for democracy in Latin America and the constraints on educational development, the investigation of the three programs follows. The program of Fe y Alegria (Faith and Joy) in 12…
41 CFR 101-26.501-3 - Consolidated purchase program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 26-PROCUREMENT..., purchased in the aggregate by group to the extent practical. These procurements are designed to obtain the...
21 CFR 1002.7 - Submission of data and reports.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Document Control (HFZ-309), Office of Communication, Education, and Radiation Programs, 9200 Corporate Blvd..., or model family of the same product category, a “common aspects report” consolidating similar...
76 FR 31299 - Request for Applications for the Veterinary Medicine Loan Repayment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-31
... release of the Veterinary Medicine Loan Repayment Program (VMLRP) Request for Applications (RFA) at http...) application package has been made available at http://www.nifa.usda.gov/vmlrp and applications are due by... Education. The NSLDS website can be found at http://www.nslds.ed.gov . Individuals who consolidated their...
Direct Loans: A Simple, Convenient, Flexible Way To Finance Your Education.
ERIC Educational Resources Information Center
Department of Education, Washington, DC. Student Financial Assistance.
This pamphlet provides basic facts about the U.S. Department of Education's Direct Loan Program for students. These loans include Federal Direct Stafford/Ford Loans, Federal Direct Unsubsidized Stafford/Ford Loans, Direct PLUS Loans, and Direct Consolidation Loans. The pamphlet's sections are: (1) "The Direct Loan Program: What Is It?";…
34 CFR 79.3 - What programs and activities of the Department are subject to these regulations?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Metropolitan Development Act. (b) If a program or activity of the Department that provides Federal financial.... (5) Direct payments to individuals. (6) Financial transfers for which the Department has no funding... the Education Consolidation and Improvement Act of 1981). (7) Research and development national in...
17 CFR 23.600 - Risk Management Program for swap dealers and major swap participants.
Code of Federal Regulations, 2013 CFR
2013-04-01
... at the consolidated entity level. (iii) The Risk Management Program shall include policies and...; and whether the product requires a novel pricing methodology or presents novel legal and regulatory... management unit, as to whether the new product would materially alter the overall entity-wide risk profile of...
17 CFR 23.600 - Risk Management Program for swap dealers and major swap participants.
Code of Federal Regulations, 2014 CFR
2014-04-01
... at the consolidated entity level. (iii) The Risk Management Program shall include policies and...; and whether the product requires a novel pricing methodology or presents novel legal and regulatory... management unit, as to whether the new product would materially alter the overall entity-wide risk profile of...
77 FR 14538 - Announcement of Funding Awards Family Unification Program (FUP) Fiscal Year (FY) 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... announcement contains the consolidated names and addresses of the award recipients for this year under the FUP.... Appendix A Fiscal Year 2010 Funding Awards for the Family Unification Program Recipient Address City State... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. 5415-FA-15] Announcement of Funding Awards...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-19
... Program Instructions (PIs). The training and data grants are governed by the ``new grant'' PI and the basic grant is governed by the ``basic grant'' PI. Current PIs require separate applications and program... and reporting processes by consolidating the PIs into one single PI and requiring one single...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-26
... participating in any one of CPD's four formula grant programs to determine each jurisdiction's compliance with... proposal. The information is collected from all localities and states participating in any one of CPD's four formula grant programs to determine each jurisdiction's compliance with statutory and regulatory...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-08
... delete NYSE Rule 445 (Anti-Money Laundering Compliance Program) and adopt new Rule 3310 (Anti-Money..., subject to certain amendments, NASD Rule 3011 (Anti- Money Laundering Compliance Program) and related Interpretive Material NASD IM-3011-1 and 3011-2 as consolidated FINRA Rule 3310 (Anti-Money Laundering...
Technology survey of computer software as applicable to the MIUS project
NASA Technical Reports Server (NTRS)
Fulbright, B. E.
1975-01-01
Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.
Chapter 1 In Action. Virginia Evaluation Summary, 1979-83.
ERIC Educational Resources Information Center
Hill, Macio H.
This booklet summarizes evaluation information from the first year of Chapter 1 programs (funded under the Education Consolidation and Improvement Act of 1981) in Virginia, and presents highlights from the last 18 years of the Title I program (funded under the Elementary and Secondary Education Act). The summary, which reflects the results of…
NASA Technical Reports Server (NTRS)
Stiffler, J. J.; Bryant, L. A.; Guccione, L.
1979-01-01
A computer program was developed as a general-purpose reliability tool for fault-tolerant avionics systems. The computer program requirements, together with several appendices containing computer printouts, are presented.
Active, capable, and potentially active faults - a paleoseismic perspective
Machette, M.N.
2000-01-01
Maps of faults (geologically defined source zones) may portray seismic hazards with a wide range of completeness depending on which types of faults are shown. Three fault terms - active, capable, and potential - are used in a variety of ways for different reasons or applications. Nevertheless, to be useful for seismic-hazards analysis, fault maps should encompass a time interval that includes several earthquake cycles. For example, if the common recurrence in an area is 20,000-50,000 years, then maps should include faults that are 50,000-100,000 years old (two to five typical earthquake cycles), thus allowing for temporal variability in slip rate and recurrence intervals. Conversely, in more active areas such as plate boundaries, maps showing faults that are <10,000 years old should include those with at least 2 to as many as 20 paleoearthquakes. For the International Lithosphere Program's Task Group II-2 Project on Major Active Faults of the World, our maps and database will show five age categories and four slip-rate categories that allow one to select differing time spans and activity rates for seismic-hazard analysis depending on tectonic regime. The maps are accompanied by a database that describes evidence for Quaternary faulting, geomorphic expression, and paleoseismic parameters (slip rate, recurrence interval, and time of most recent surface faulting). These maps and databases provide an inventory of faults that would be defined as active, capable, and potentially active for seismic-hazard assessments.
Validation of Helicopter Gear Condition Indicators Using Seeded Fault Tests
NASA Technical Reports Server (NTRS)
Dempsey, Paula; Brandon, E. Bruce
2013-01-01
A "seeded fault test" in support of a rotorcraft condition based maintenance program (CBM), is an experiment in which a component is tested with a known fault while health monitoring data is collected. These tests are performed at operating conditions comparable to operating conditions the component would be exposed to while installed on the aircraft. Performance of seeded fault tests is one method used to provide evidence that a Health Usage Monitoring System (HUMS) can replace current maintenance practices required for aircraft airworthiness. Actual in-service experience of the HUMS detecting a component fault is another validation method. This paper will discuss a hybrid validation approach that combines in service-data with seeded fault tests. For this approach, existing in-service HUMS flight data from a naturally occurring component fault will be used to define a component seeded fault test. An example, using spiral bevel gears as the targeted component, will be presented. Since the U.S. Army has begun to develop standards for using seeded fault tests for HUMS validation, the hybrid approach will be mapped to the steps defined within their Aeronautical Design Standard Handbook for CBM. This paper will step through their defined processes, and identify additional steps that may be required when using component test rig fault tests to demonstrate helicopter CI performance. The discussion within this paper will provide the reader with a better appreciation for the challenges faced when defining a seeded fault test for HUMS validation.
Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi
2013-01-01
Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of the measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance and then proceed with data fusion in the sensor array. The average, self-adaptive, and coefficient-of-variance data fusion methods are used in this study. The pH electrode is fabricated by depositing a ruthenium dioxide (RuO2) sensing membrane onto a silicon substrate with a sputtering system, and eight RuO2 pH electrodes are fabricated to form the sensor array for this study. PMID:24351636
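A minimal sketch of the screen-then-fuse sequence, assuming a crude median-distance fault screen and coefficient-of-variance weighting; the readings are synthetic and the paper's LabVIEW implementation is not reproduced.

```python
# Synthetic fault screening plus coefficient-of-variance fusion for a pH array.
import numpy as np

readings = np.array([                 # rows: sensors, cols: repeated pH reads
    [6.98, 7.01, 7.00, 6.99],
    [7.02, 7.00, 7.03, 7.01],
    [9.80, 9.75, 9.90, 9.85],         # drifted (faulty) electrode
    [6.97, 7.02, 6.99, 7.00]])

means = readings.mean(axis=1)
healthy = np.abs(means - np.median(means)) < 0.5     # crude fault screen
cv = readings[healthy].std(axis=1) / means[healthy]  # coefficient of variance
weights = (1.0 / cv) / np.sum(1.0 / cv)              # steadier sensors weigh more
fused = np.sum(weights * means[healthy])
print(f"fused pH: {fused:.3f} (kept {healthy.sum()} of {len(means)} sensors)")
```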
Investigating an API for resilient exascale computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stearley, Jon R.; Tomkins, James; VanDyke, John P.
2013-05-01
Increased HPC capability comes with increased complexity, part counts, and fault occurrences. Increasing the resilience of systems and applications to faults is a critical requirement facing the viability of exascale systems, as the overhead of traditional checkpoint/restart is projected to outweigh its benefits due to fault rates outpacing I/O bandwidths. As faults occur and propagate throughout hardware and software layers, pervasive notification and handling mechanisms are necessary. This report describes an initial investigation of fault types and programming interfaces to mitigate them. Proof-of-concept APIs are presented for the frequent and important cases of memory errors and node failures, and a strategy is proposed for filesystem failures. These involve changes to the operating system, runtime, I/O library, and application layers. While a single API for fault handling among hardware and OS and application system-wide remains elusive, the effort increased our understanding of both the mountainous challenges and the promising trailheads.
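A minimal sketch of the register-a-handler pattern such APIs suggest, written here as ordinary Python for illustration; the fault classes, the runtime shim, and the handler actions are all invented, not the report's proof-of-concept interfaces.

```python
# Toy fault-notification registry; invented, not the report's APIs.
from typing import Callable, Dict, List

handlers: Dict[str, List[Callable[[dict], None]]] = {}

def on_fault(kind: str, handler: Callable[[dict], None]) -> None:
    # application registers how it wants a fault class handled
    handlers.setdefault(kind, []).append(handler)

def raise_fault(kind: str, info: dict) -> None:
    # a real runtime would invoke this from the OS / MPI layer, not user code
    for handler in handlers.get(kind, []):
        handler(info)

on_fault("memory_error", lambda i: print("re-fetch page at", hex(i["address"])))
on_fault("node_failure", lambda i: print("respawn rank", i["rank"]))

raise_fault("memory_error", {"address": 0x7F3A00})
raise_fault("node_failure", {"rank": 17})
```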
NASA Astrophysics Data System (ADS)
Luther, A. L.; Axen, G. J.; Selverstone, J.; Khalsa, N.
2009-12-01
Classical fault-mechanics theory does not adequately explain slip on “weak” faults oriented at high angles to the regional maximum stress direction, such as the San Andreas Fault and low-angle normal faults. One hypothesis is that stress rotation due to fault-weakening mechanisms allows slip, which may be testable using detailed paleostress analyses of minor faults and tensile fractures. Preliminary data from the footwalls of the Whipple detachment (WD) and the West Salton detachment (WSD) suggest lateral and/or vertical stress rotations. Three inversion programs that use different fault-slip datasets are compared. 1) FaultKin (Marrett and Allmendinger ‘90; Cladouhos and Allmendinger ‘93) determines the principal strain directions using only faults with striae and known slip senses; principal stress orientations are determined assuming coaxiality. To date, FaultKin results appear to be the most reproducible, but it is difficult to find enough faults with striae and slip sense in the small outcrop areas of our study. 2) Slick.bas (Ramsey and Lisle ‘00) uses a grid search to find the best-fit stress tensor from fault and striae orientations, but does not accept slip sense. This program can yield erroneous stress fields that predict slip senses opposite those known for some faults (particularly faults at a high angle to sigma 1). 3) T-TECTO 2.0 (Zalohar and Vrabec ‘07) applies a Gaussian approach, using orientations of faults and striae, the slip senses of any faults for which they are known, plus tensile fractures. We expect that this flexibility of input data types will be best, but testing is preliminary. Paleostress analyses assume that minor faults slipped in response to constant, homogeneous stress fields. We use shear and tensile fractures and cross-cutting relationships from the upper ~25 m of both footwalls to test for spatial and temporal changes to the paleostress field. Paleostress analysis of fractures ~0.3 - 2 m below the WSD on the N limb of an antiform suggests that sigma 3 plunges moderately (~45 degrees) W, sigma 1 plunges gently S, and sigma 2 is steep, consistent with wrench-related folding about E-W trends during WSD slip. However, tensile fractures in the immediately overlying ultracataclasite yield sigma 3 with a shallow W plunge (~4 degrees). In a synformal trough, Reidel shears in the upper 1-2 m of the WSD footwall suggest a moderately (~50 degrees) E-plunging sigma 1. Deeper (2-10 m) in the footwall, shear fractures have different but consistent orientations, suggesting a change in the stress field. Preliminary results from several sets of shear fractures in the WD footwall suggest that sigma 1 is steep (~75-90 degrees) in the chlorite breccia zone (implying low shear traction) but is shallower (~45 degrees) in the deeper damage zone. Prior work (Axen & Selverstone ‘94) found that sigma 1 becomes steep again at greater depths. Continued testing of paleostress analysis methods and several other datasets are in progress to confirm our results.
Topal, Savaş; Özkul, Mehmet
2014-01-01
The NW-trending Denizli basin of SW Turkey is one of the neotectonic grabens in the Aegean extensional province. It is bounded by normal faults on both its southern and northern margins. The basin is filled by Neogene and Quaternary terrestrial deposits. The Late Miocene to Late Pliocene Kolankaya formation crops out along the NW-trending Karakova uplift in the Denizli basin. It is a typical fluviolacustrine succession that thickens and coarsens upward, comprising poorly consolidated sand, gravelly sand, siltstone and marl. Various soft-sediment deformation structures occur in the formation, especially in fine- to medium-grained sands, silts and marls: load structures, flame structures, clastic dikes (sand and gravelly-sand dikes), disturbed layers, laminated convolute beds, slumps and synsedimentary faulting. The deformation mechanism and driving force for the soft-sediment deformation are related essentially to gravitational instability, dewatering, liquefaction-liquidization, and brittle deformation. Field data and the wide lateral extent of the structures, as well as regional geological data, show that most of the deformation is related to seismicity, and the structures are interpreted as seismites. The existence of seismites in the Kolankaya Formation is evidence for continuing tectonic activity in the study area during the Neogene and is consistent with the occurrence of paleoearthquakes of magnitude >5. PMID:25152909
Seismic Microzonation of the City of Cali (Western Colombia)
NASA Astrophysics Data System (ADS)
Dimate, C.; Romero, J.; Ojeda, A.; Garcia, J.; Alvarado, C.
2007-05-01
The city of Cali is located on the western margin of the Cauca Valley, in the flat area between the Western and Central cordilleras of the Colombian Andes, 70 km east of the Eastern Pacific Subduction Zone. Even though present seismic activity associated with the nearest faults is low, historical records demonstrate that earthquakes have caused damage in the city reaching intensity VIII (EMS). Those earthquakes originated from diverse sources: the intermediate-depth Benioff zone, near and far continental crustal faults, and the Pacific Subduction Zone. Taking into account the location of the city and the seismologic history of the region, neotectonic and seismological studies extending over a region of about 120000 km2 were required to compute the seismic hazard. Construction of the geotechnical model of the city included detailed geological mapping, geophysical profiling, single-station ambient vibration tests, and the deployment of a 12-station accelerographic network. Geotechnical properties of the soils were determined by mechanical perforations, CPTU (piezocone) and CPT (static penetration) tests, flat plate dilatometer (DMT) tests, and down-hole tests, which were complemented in the laboratory by consolidation analyses and static and cyclic triaxial tests. As a result, ten geotechnical zones were outlined and characterized. Finally, expected ground motions were calculated at 39 sites in the city using numerical modeling methods.
FINDS: A fault inferring nonlinear detection system programmers manual, version 3.0
NASA Technical Reports Server (NTRS)
Lancraft, R. E.
1985-01-01
Detailed software documentation of the digital computer program FINDS (Fault Inferring Nonlinear Detection System), Version 3.0, is provided. FINDS is a highly modular and extensible computer program designed to monitor and detect sensor failures while providing reliable state estimates. In this version of the program, the FINDS methodology is used to detect, isolate, and compensate for failures in simulated avionics sensors used by the Advanced Transport Operating Systems (ATOPS) Transport System Research Vehicle (TSRV) in a Microwave Landing System (MLS) environment. This report is intended to serve as a programmer's guide to aid in the maintenance, modification, and revision of the FINDS software.
Fault detection and initial state verification by linear programming for a class of Petri nets
NASA Technical Reports Server (NTRS)
Rachell, Traxon; Meyer, David G.
1992-01-01
The authors present an algorithmic approach to determining when the marking of an LSMG (live safe marked graph) or an LSFC (live safe free choice) net is in the set of live safe markings M. Once the marking of a net is determined to be in M, a later determination that the marking is not in M indicates a fault. It is shown how linear programming can be used to determine if m is an element of M. The worst-case computational complexity of each algorithm is bounded by the number of linear programs that must be solved.
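A minimal sketch of the linear-programming flavor of such a test, using the Petri-net state equation m = m0 + Cx with x >= 0 on a toy three-place marked graph; note that real-valued LP feasibility is a relaxation (necessary, not sufficient) of true reachability, so this is an illustration of the machinery, not the paper's exact algorithms.

```python
# LP feasibility check of the Petri-net state equation on a toy marked graph.
import numpy as np
from scipy.optimize import linprog

# incidence matrix C: rows are places, columns are transitions
C = np.array([[-1,  0,  1],
              [ 1, -1,  0],
              [ 0,  1, -1]])
m0 = np.array([1, 0, 0])          # initial (live, safe) marking

def state_equation_feasible(m):
    # does some nonnegative firing-count vector x satisfy m = m0 + C x?
    delta = np.asarray(m) - m0
    res = linprog(c=np.zeros(C.shape[1]), A_eq=C, b_eq=delta,
                  bounds=[(0, None)] * C.shape[1], method="highs")
    return res.status == 0        # 0 = optimum found, i.e. feasible

print(state_equation_feasible([0, 1, 0]))   # True: one firing of t1
print(state_equation_feasible([1, 1, 0]))   # False: token count not conserved
```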
Identification of the Polaris Fault using lidar and shallow geophysical methods
Hunter, Lewis E.; Powers, Michael H.; Burton, Bethany L.
2017-01-01
As part of the U.S. Army Corps of Engineers' (USACE) Dam Safety Assurance Program, Martis Creek Dam near Truckee, CA, is under evaluation for earthquake and seepage hazards. The investigations to date have included LiDAR (Light Detection and Ranging) and a wide range of geophysical surveys. The LiDAR data led to the discovery of an important and previously unknown fault tracing very near and possibly under Martis Creek Dam. The geophysical surveys of the dam foundation area confirm evidence of the fault in the area.
New cooperative seismograph networks established in southern California
Hill, D.P.
1974-01-01
Southern California has more active faults located close to large, urban population centers than any other region in the United States. Reduction of the risk to life and property posed by potential earthquakes along these active faults is a primary motivation for a cooperative earthquake research program between the U.S. Geological Survey and major universities in Southern California.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... corresponding terms in alphabetical order below: AIDS Acquired Immune Deficiency Syndrome ARD Assessment... resource-intensive than SNF residents, especially SNF post-acute care patients. These commenters stated...
40 CFR 65.102 - Alternative means of emission limitation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROGRAMS (CONTINUED) CONSOLIDATED FEDERAL AIR RULE Equipment Leaks § 65.102 Alternative means of emission... equipment leaks of a regulated material may apply to the Administrator for approval of an alternative means...
NASA Astrophysics Data System (ADS)
Rohmer, J.; Tremosa, J.; Marty, N. C. M.; Audigane, P.
2017-10-01
In the present study, we assess the potential for initiating ductile failure in a fractured caprock due to the chemical alteration of its mechanical properties under pressure increase induced by CO2 leakage and fixed in situ boundary conditions. To this end, 2D coupled reactive-transport simulations were set up using the Opalinus Clay formation as an analogue for a caprock layer. The fractured system was viewed as a compartmentalised system that consists of a main highly permeable pathway, a moderately permeable damage zone and the intact rock. The outputs of the numerical simulations (mineral fraction, porosity changes, gas saturation, pore-fluid pressure) were converted into parameter changes of the yield surface by viewing the rock material of the three compartments (fault, damage zone and intact rock) as a composite system that consists of a clayey solid material, pores and mineral inclusions (such as carbonate and quartz). Three alteration processes were considered: (1) the effect of the mineral fraction and porosity evolution on the yield surface, (2) changes in the resulting poro-elastic properties and (3) the suction effect, i.e. the bounding effect induced by the presence of two phases, water and CO2. Our numerical investigations showed that the decrease in the friction coefficient remained negligible during leakage; the alteration mainly decreased the pre-consolidation stress. Consequently, the damage zone of the fractured system became more collapsible over time, driven by the low-to-moderate pressure build-up of the fluid penetrating the fault (1 MPa in our case). For the considered case, the initiation of ductile failure is likely under conditions of fixed vertical stress and zero lateral strain. This process could potentially limit the spatial spreading of CO2-induced alteration, although this remains very site-specific. We recommend that characterisation efforts be intensified to obtain better insight into the properties of fracture systems in caprock-like formations (with special attention to their initial overconsolidation ratio).
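A minimal sketch of how a degrading pre-consolidation stress can move a fixed stress state onto the yield cap, using modified Cam-Clay as a stand-in for the paper's composite yield surface; all numbers are invented.

```python
# Modified Cam-Clay cap check under a shrinking pre-consolidation stress.
def cam_clay_yield(p, q, M, pc):
    # f < 0: elastic; f >= 0: yielding (ductile cap failure for p > pc/2)
    return q**2 + M**2 * p * (p - pc)

p, q = 12.0, 4.0          # mean effective and deviatoric stress (MPa), fixed
M = 0.8                   # critical-state slope (friction term), held constant
for pc in (20.0, 16.0, 13.0):   # pre-consolidation stress degrading with time
    f = cam_clay_yield(p, q, M, pc)
    state = "yielding" if f >= 0 else "elastic"
    print(f"pc = {pc:4.1f} MPa -> f = {f:7.2f} ({state})")
```

With the stress state held fixed by the boundary conditions, the chemically shrinking cap alone is enough to carry the material from elastic behaviour into yield, which is the mechanism the study describes.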
Tanriverdi, Ozgur; Barista, Ibrahim; Paydas, Semra; Nayir, Erdinc; Karakas, Yusuf
2017-11-26
In this study, we aimed to determine the perspectives of medical and radiation oncologists regarding consolidation radiotherapy in patients with a complete response after chemotherapy for Hodgkin’s and non-Hodgkin’s lymphomas. The survey was designed to identify demographic and occupational features of medical and radiation oncologists and their views on the application of consolidation radiotherapy in their clinical practices, based on a five-point Likert scale (never, rarely, sometimes, often, and always). The study covered 263 of the 935 physicians working in the oncology field as either medical or radiation oncologists, a response rate of 28%. The majority of the participants were male radiation oncologists who had worked for 5 to 10 years at a university hospital; the mean age was 38 ± 14 years. Although the NCCN guidelines were the international guidelines most commonly followed by the physicians, the majority of respondents suggested that the guidelines were unclear regarding recommendations for consolidative radiotherapy. The dose administered for consolidative radiotherapy in lymphoma patients was indicated as 40 Gy by 49% of the physicians, and the most common cause of hesitancy concerning consolidative radiation treatment was the risk of secondary malignancies as a long-term adverse effect (54%). In conclusion, we suggest that medical oncologists could be most active in the treatment of lymphoma through a continuous training program about lymphomas and current national guidelines.
Computing Fault Displacements from Surface Deformations
NASA Technical Reports Server (NTRS)
Lyzenga, Gregory; Parker, Jay; Donnellan, Andrea; Panero, Wendy
2006-01-01
Simplex is a computer program that calculates the locations and displacements of subterranean faults from data on Earth-surface deformations. The calculation involves inversion of a forward model (given a point source representing a fault, the forward model calculates the surface deformations) for the displacements and strains caused by a fault located in an isotropic, elastic half-space. The inversion uses nonlinear, multiparameter estimation techniques. The input surface-deformation data can be in multiple formats, with absolute or differential positioning, and can be derived from multiple sources, including interferometric synthetic-aperture radar, the Global Positioning System, and strain meters. Parameters can be constrained or free. Estimates can be calculated for single or multiple faults, and estimates of parameters are accompanied by reports of their covariances and uncertainties. Simplex has been tested extensively against forward models and against other means of inverting geodetic data and seismic observations.
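As a rough illustration of the inversion idea, the sketch below fits a simple point-source forward model to synthetic surface-deformation data by nonlinear least squares. This is not Simplex itself: a Mogi-type point source stands in for the elastic-dislocation forward model, and all names and values are assumptions for the example.

```python
# Hedged sketch of the inversion idea behind Simplex (not its actual code):
# fit a point-source forward model to surface deformation by nonlinear
# least squares. A Mogi-type source stands in for the elastic-dislocation
# forward model; names and values are illustrative.
import numpy as np
from scipy.optimize import least_squares

def forward(params, x):
    """Vertical surface displacement of a point source at depth d."""
    strength, depth = params
    return strength * depth / (depth**2 + x**2) ** 1.5

# Synthetic "observed" data from a known source, plus noise.
rng = np.random.default_rng(0)
x_obs = np.linspace(-20.0, 20.0, 41)          # km along a surface profile
true = (50.0, 5.0)                            # source strength, depth (km)
d_obs = forward(true, x_obs) + rng.normal(0, 0.01, x_obs.size)

# Invert: minimize residuals between model prediction and observations.
fit = least_squares(lambda p: forward(p, x_obs) - d_obs, x0=(10.0, 2.0))
print("estimated (strength, depth):", fit.x)  # recovers roughly (50, 5)
```

The real program inverts a fault dislocation model with more parameters, but the structure (forward model, residuals, nonlinear estimator, covariance reporting) is the same.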
Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Goldberg, J.; Kautz, W. H.; Melliar-Smith, P. M.; Green, M. W.; Levitt, K. N.; Schwartz, R. L.; Weinstock, C. B.
1984-01-01
SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
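The masking-and-reconfiguration scheme described above reduces to a small amount of logic. A minimal sketch, assuming three redundant processors and integer results (both invented for the example), is:

```python
# Minimal sketch of SIFT-style fault masking (illustrative, not SIFT code):
# identical computations run on redundant processors; a majority vote masks
# a faulty result, and disagreeing processors are removed from service.
from collections import Counter

def vote(results):
    """Return the majority value among redundant results, or None if none."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) // 2 else None

processors = {"P1": 42, "P2": 42, "P3": 41}   # P3 produced a faulty result
majority = vote(list(processors.values()))
print("masked result:", majority)             # 42

# Reconfiguration: keep only processors that agreed with the majority.
healthy = {p: r for p, r in processors.items() if r == majority}
print("remaining processors:", sorted(healthy))
```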
Runtime Verification in Context: Can Optimizing Error Detection Improve Fault Diagnosis?
NASA Technical Reports Server (NTRS)
Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette
2010-01-01
Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.
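To make the connection between monitoring and error detection concrete, here is a minimal runtime-verification sketch: a finite-state monitor observes an event stream and flags the first violation of a typestate property. The property ("no read after close"), the event names, and the monitor structure are all invented for illustration; overhead-reduction techniques like those the paper discusses would decide which events actually get observed.

```python
# Hedged illustration of runtime verification: a finite-state monitor checks
# an API-usage property ("no read after close") over an event stream.
class FileMonitor:
    def __init__(self):
        self.state = "closed"

    def observe(self, event):
        transitions = {("closed", "open"): "open",
                       ("open", "read"): "open",
                       ("open", "close"): "closed"}
        nxt = transitions.get((self.state, event))
        if nxt is None:
            # Error detected; a diagnosis layer could map this back to the
            # component that emitted the offending event.
            raise RuntimeError(f"property violated: {event!r} in state {self.state!r}")
        self.state = nxt

m = FileMonitor()
try:
    for e in ["open", "read", "close", "read"]:   # last event violates the property
        m.observe(e)
except RuntimeError as err:
    print("monitor:", err)
```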
Earthquake-caused subsidence events of the Duck Flats at the eastern end of the Knik Arm, Alaska
NASA Astrophysics Data System (ADS)
Reeder, J. W.
2012-12-01
A 5 km NS-trending gas pipeline trench, excavated in 1984 across the Duck Flats at the eastern end of the Knik Arm about 50 km NE of Anchorage, Alaska, exposed two continuous buried peat horizons. Two bulk C-14 dates for the upper buried peat horizon were determined to be 790 ± 160 and 775 ± 170 ybp. The depth of this peat horizon varied from 1.0 to 1.8 m. The deeper paleopeat horizon had a single bulk C-14 date of 1190 ± 80 ybp and varied from 1.7 to greater than 2.4 m (the depth of the trench). A third, deeper paleopeat horizon was confirmed in 2012 by hand auger at a depth of 3.7 m. Turbulent mixing of organics (principally grass) with tidal silt and clay immediately above both of the trench paleopeat horizons is interpreted to reflect tsunami flooding. The March 27, 1964, earthquake caused recognized subsidence of up to 0.3 m at the southern end of the trench, based on tidal deposits above 1964 peats. This subsidence was caused by consolidation of Matanuska and Knik fluvial deposits immediately to the S and by some tectonic subsidence. The 1964 peat horizon was not recognized along the rest of the trench, possibly because of poor near-surface winter exposures or simply because the 1964 peat horizon is also part of the present surface. The existence of these continuous paleopeat horizons is significant because they reflect subsidence events not expected with 1964-type megathrust subduction. In fact, the paleopeat C-14 age dates correlate better with recognized earthquake events of the Castle Mountain fault, an intraplate fault 20 km to the NW, than with recognized 1964-type megathrust events. However, movements on regional crustal faults such as the Castle Mountain fault likely would not be enough to account for the large amounts of subsidence observed on the Duck Flats. Instead, these subsidence events probably reflect sudden tectonic movements of the Pacific plate beneath the North American plate in this region. The process would involve flat-slab subduction of the Yakutat microplate coupled to the Pacific plate. Such movements might have extended to, and possibly at times combined with, 1964-type megathrust movements principally to the SE, as well as with movements of regional faults such as the Castle Mountain fault. The potential for such continental megathrust earthquakes should be included in any future earthquake hazard considerations for this region.
ERIC Educational Resources Information Center
Administration for Children, Youth, and Families (DHHS), Washington, DC. Head Start Bureau.
This document consolidates, clarifies, and updates federal regulations on Head Start services for children with disabilities. The regulations are designed to complement the Head Start Program Performance Standards governing services to all enrolled children. Specifically, these regulations require Head Start programs to: (1) design comprehensive…
Coast Guard Deepwater Acquisition Programs: Background, Oversight Issues, and Options for Congress
2009-12-23
... Congressional Research Service, Library of Congress... Northrop Grumman Ship Systems (NGSS). ICGS was awarded an indefinite delivery, indefinite quantity (ID/IQ) contract for the Deepwater program that... sustainment is not a Deepwater program but is displayed to align with the FY2009 Consolidated Security, Disaster Assistance, and Continuing Appropriations
An effective approach for road asset management through the FDTD simulation of the GPR signal
NASA Astrophysics Data System (ADS)
Benedetto, Andrea; Pajewski, Lara; Adabi, Saba; Kusayanagi, Wolfgang; Tosti, Fabio
2015-04-01
Ground-penetrating radar is a non-destructive tool widely used in many fields of application, including pavement engineering surveys. Over the last decade, the need for further breakthroughs capable of assisting end-users and practitioners, as decision-support systems, in more effective road asset management has been increasing. In more detail, and despite the high potential of this non-destructive tool and the consolidated results it has obtained over the years, pavement distress manuals are still based on visual inspections, so that generally only the effects, and not the causes, of faults are taken into account. In this framework, the use of simulation can represent an effective solution for supporting engineers and decision-makers in understanding the deep responses of both revealed and unrevealed damages. In this study, the potential of finite-difference time-domain simulation of the ground-penetrating radar signal is analyzed by simulating several types of flexible pavement at different center frequencies of investigation typically used for road surveys. For these purposes, the numerical simulator GprMax2D, implementing the finite-difference time-domain method, was used, proving to be a highly effective tool for detecting road faults. Comparisons with simplified undisturbed modelled pavement sections showed promising agreement with theoretical expectations, and good prospects for detecting the shape of damages were demonstrated. Therefore, electromagnetic modelling has proved to be a valuable support system in diagnosing the causes of damages, even for early or unrevealed faults. Further perspectives of this research will focus on the modelling of more complex scenarios capable of representing more accurately the real boundary conditions of road cross-sections. Acknowledgements - This work has benefited from networking activities carried out within the EU funded COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar".
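For readers unfamiliar with the method behind GprMax2D, the sketch below is a toy one-dimensional FDTD (Yee-scheme) update loop in normalized units: a Gaussian pulse propagates down a column and partially reflects at a higher-permittivity layer, which is the kind of contrast a GPR survey images. This is an illustrative toy, not GprMax code; the grid size, permittivity, and source parameters are assumptions.

```python
# Toy 1D FDTD (Yee) sketch of the method GprMax2D implements; illustrative
# only, in normalized units with a Courant factor of 0.5.
import numpy as np

n_cells, n_steps = 400, 900
ez = np.zeros(n_cells)          # electric field on integer grid points
hy = np.zeros(n_cells - 1)      # magnetic field on staggered half points

eps = np.ones(n_cells)          # relative permittivity profile
eps[250:] = 6.0                 # higher-permittivity layer at depth (assumed)

for t in range(n_steps):
    hy += 0.5 * (ez[1:] - ez[:-1])                     # H update (curl of E)
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1]) / eps[1:-1]   # E update (curl of H)
    ez[20] += np.exp(-((t - 60) / 20.0) ** 2)          # soft Gaussian source

# Energy returning toward the source reveals the buried interface.
print("max field in the upper half after propagation:", np.abs(ez[:200]).max())
```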
NASA Astrophysics Data System (ADS)
Gammaldi, S.; Amoroso, O.; D'Auria, L.; Zollo, A.
2018-05-01
Multi-2D imaging of the Solfatara Crater, inside the Campi Flegrei Caldera, was obtained from the joint interpretation of geophysical evidence and the new active seismic dataset acquired during the RICEN experiment (EU project MEDSUV) in 2014. We used a total of 17,894 first P-wave arrival times, manually picked on pre-processed waveforms recorded along two 1D profiles criss-crossing the inner Solfatara crater, and performed a tomographic inversion based on a multi-scale strategy and a Bayesian estimation of velocity parameters. The resulting tomographic images provide evidence for a low-velocity (500-1500 m/s), water-saturated deeper layer in the west, near the outcropping evidence of the Fangaia, contrasted by a high-velocity (2000-3200 m/s) layer correlated with a consolidated tephra deposit. A layer in the transitional velocity range (1500-2000 m/s) suggests the possible presence of a gas-rich accumulation volume. From the P-wave velocity model, we infer a detailed image of the gas migration path to the Earth's surface. The gases coming from the deep hydrothermal plume accumulate in the central and most depressed area of the Solfatara, trapped by the meteoric-water-saturated layer. The gases are then transmitted through a buried fault toward the eastern part of the crater, where the ring faults facilitate their release, as confirmed by the fumaroles. Starting from the eastern surface evidence of gas release at the Bocca Grande and Bocca Nuova fumaroles, and the presence of the central deeper plume, we suggest that a fault situated in the central part of the crater represents the main buried conduit between them and plays a key role.
NASA Astrophysics Data System (ADS)
Yang, Wen-Xian
2006-05-01
Available machine fault diagnostic methods show unsatisfactory performance in both on-line and intelligent analyses because their operations involve intensive calculation and are labour intensive. Aiming to improve this situation, this paper describes the development of an intelligent approach using the Genetic Programming (GP) method. Because the constructed mathematical model is simple to evaluate, different kinds of machine faults may be diagnosed correctly and quickly, and human input is significantly reduced in the process of fault diagnosis. The effectiveness of the proposed strategy is validated by an illustrative example, in which three kinds of valve states inherent in a six-cylinder/four-stroke-cycle diesel engine, i.e. normal condition, valve-tappet clearance and gas leakage faults, are identified. In the example, 22 mathematical functions were specially designed and 8 easily obtained signal features are used to construct the diagnostic model. Differing from existing GPs, the diagnostic tree used in the algorithm is constructed in an intelligent way by applying a power-weight coefficient to each feature. The power-weight coefficients vary adaptively between 0 and 1 during the evolutionary process. Moreover, different evolutionary strategies are employed for selecting the diagnostic features and functions, respectively, so that the mathematical functions are sufficiently utilized while repeated use of signal features is avoided. The experimental results are illustrated diagrammatically.
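The adaptive power-weight idea can be caricatured in a few lines. The sketch below is a drastically simplified stand-in for the paper's method: instead of evolving GP expression trees, it evolves only a vector of feature weights in [0, 1] with a mutate-and-select loop, scoring each candidate by nearest-centroid classification accuracy on synthetic three-class "valve state" data. Every name, dimension, and value is an assumption for the example.

```python
# Drastically simplified stand-in for the paper's GP diagnosis (weights only,
# no expression trees): evolve power-weight coefficients in [0, 1] for each
# signal feature so a nearest-centroid classifier separates fault classes.
import numpy as np

rng = np.random.default_rng(1)
n_feat = 8                                         # 8 signal features (as in the paper)
centers = rng.normal(0, 1, (3, n_feat))            # 3 valve states (classes)
X = np.vstack([c + rng.normal(0, 0.6, (40, n_feat)) for c in centers])
y = np.repeat([0, 1, 2], 40)

def accuracy(w):
    Z = X * w                                      # apply feature weights
    cz = np.array([Z[y == k].mean(axis=0) for k in range(3)])
    pred = ((Z[:, None, :] - cz) ** 2).sum(axis=2).argmin(axis=1)
    return (pred == y).mean()

w = rng.uniform(0, 1, n_feat)                      # initial power-weights
for _ in range(200):                               # mutate-and-select loop
    cand = np.clip(w + rng.normal(0, 0.1, n_feat), 0, 1)
    if accuracy(cand) >= accuracy(w):
        w = cand
print("diagnostic accuracy:", accuracy(w), "weights:", w.round(2))
```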
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pashin, J.C.; Raymond, D.E.; Rindsberg, A.K.
1997-08-01
Gilbertown Field is the oldest oil field in Alabama and produces oil from chalk of the Upper Cretaceous Selma Group and from sandstone of the Eutaw Formation along the southern margin of the Gilbertown fault system. Most of the field has been in primary recovery since establishment, but production has declined to marginally economic levels. This investigation applies advanced geologic concepts designed to aid implementation of improved recovery programs. The Gilbertown fault system is detached at the base of Jurassic salt. The fault system began forming as a half graben and evolved into a full graben by the Late Cretaceous. Conventional trapping mechanisms are effective in Eutaw sandstone, whereas oil in Selma chalk is trapped in faults and fault-related fractures. Burial modeling establishes that the subsidence history of the Gilbertown area is typical of extensional basins and includes a major component of sediment loading and compaction. Surface mapping and fracture analysis indicate that faults offset strata as young as Miocene and that joints may be related to regional uplift postdating fault movement. Preliminary balanced structural models of the Gilbertown fault system indicate that synsedimentary growth factors need to be incorporated into the basic equations of area balance to model strain and predict fractures in Selma and Eutaw reservoirs.
75 FR 22736 - Notice of Request for Applications for the Veterinary Medicine Loan Repayment Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-30
... (RFA) at http://www.nifa.usda.gov/vmlrp . DATES: The FY 2010 Veterinary Medicine Loan Repayment Program (VMLRP) application package has been made available at http://www.nifa.usda.gov/vmlrp and applications... http://www.nslds.ed.gov . Individuals who consolidated their DVM loans with non-educational loans or...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... announced in a Federal Register notice published on May 22, 2009 for Lead Based Paint Hazard Control and... for the Lead Based Paint Hazard Control Grant Program under the Consolidated Appropriations Act, 2009... Tahoe, 1901 Airport Road, Suite 107, South Lake Tahoe, CA 96150, $1,500,000; State of Connecticut...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-29
... of Public Law 113-6. 1. Additional Eligibility Criteria applicable to the $270 million provided by... deficiencies at such schools. Pursuant to Section 8108 of Public Law 113-6, the Consolidated and Further Continuing Appropriations Act, 2013, Congress made available an additional $270 million for the program and...
34 CFR 200.29 - Consolidation of funds in a schoolwide program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... assessment under § 200.83; and (ii) Document that these needs have been met. (2) Indian education. The school... meet the intent and purposes of that program to ensure that the needs of the intended beneficiaries of... those parents, or both, first to meet the unique educational needs of migratory students that result...
34 CFR 200.29 - Consolidation of funds in a schoolwide program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... assessment under § 200.83; and (ii) Document that these needs have been met. (2) Indian education. The school... meet the intent and purposes of that program to ensure that the needs of the intended beneficiaries of... those parents, or both, first to meet the unique educational needs of migratory students that result...
34 CFR 200.29 - Consolidation of funds in a schoolwide program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... assessment under § 200.83; and (ii) Document that these needs have been met. (2) Indian education. The school... meet the intent and purposes of that program to ensure that the needs of the intended beneficiaries of... those parents, or both, first to meet the unique educational needs of migratory students that result...
Ground-water resources in Mendocino County, California
Farrar, C.D.
1986-01-01
Mendocino County includes about 3,500 sq mi of coastal northern California. Ground water is the main source for municipal and individual domestic water systems and contributes significantly to irrigation. Consolidated rocks of the Franciscan Complex are exposed over most of the county. The consolidated rocks are commonly dry and generally supply less than 5 gal/min of water to wells. Unconsolidated fill in the inland valleys consists of gravel, sand, silt, and clay. Low permeability in the fill, caused by fine grain size and poor sorting, limits well yields to less than 50 gal/min in most areas; where the fill is better sorted, yields of 1,000 gal/min can be obtained. Storage capacity estimates for the three largest basins are Ukiah Valley, 90,000 acre-ft; Little Lake Valley, 35,000 acre-ft; and Laytonville Valley, 14,000 acre-ft. Abundant rainfall (35 to 56 in/yr) generally recharges these basins to capacity. Seasonal water-level fluctuations have been nearly constant since the 1950s, except during the 1976-77 drought. The chemical quality of water in basement rocks and valley fill is generally acceptable for most uses. Some areas along fault zones yield water with high boron concentrations (>2 mg/L). Sodium chloride water with dissolved-solids concentrations exceeding 1,000 mg/L is found in deeper parts of Little Lake Valley. (Author's abstract)
Xu, Xing-Wang; Cai, Xin-Ping; Zhong, Jia-You; Song, Bao-Chang; Peters, Stephen G.
2007-01-01
Tertiary (3.78 Ma to 3.65 Ma) biotite-K-feldspar porphyritic bodies intrude Tertiary, poorly consolidated lacustrine sedimentary rocks in the Beiya mineral district in southwestern China. The intrusives are characterized by a microcrystalline and vitreous-cryptocrystalline groundmass, by replacement of some tabular K-feldspar phenocrysts with microcrystalline chlorite and calcite, and by Fe-rich rings surrounding biotite phenocrysts. Peculiar structures, such as contemporaneous contact faults and slickensides, ductile shear zones and flow folds, foliation and lineations, tension fractures, and banded and boudin peperites, are developed along the contact zones of the intrusives. These features are related to the forceful intrusion of the alkaline magmas into the wet Tertiary sediments. The partially consolidated magmas were deformed and flattened by continued forceful magma intrusion that produced boudinaged and banded peperites. These peperites, characterized by oriented deformation fabrics, are classified as tectonic peperites, a new type of peperite; their formation was related to fracturing of the magma caused by forceful intrusion and shear deformation, and to contemporaneous migration and injection of fluidized sediments along the fractures that dismembered the porphyritic magma. Emplacement of the magma into the wet sediments in the Beiya area is interpreted to be related to a large pressure difference rather than to buoyancy forces.
The finite element method in the deformation and consolidation of porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, R.W.; Schrefler, B.A.
1987-01-01
The authors start with an introduction to the physical concepts involved, giving the equations of flow through porous media and the deformation characteristics of soils and rocks. Succeeding chapters deal with the practical implications of these phenomena and explain the application of the theory in both experimental and field work. Details are given of actual incidents, such as the subsidence experienced in Venice and Ravenna. The authors have also formulated a consolidation code, which is detailed at the end of the book, and provide instructions on how to modify the given program.
A distributed programming environment for Ada
NASA Technical Reports Server (NTRS)
Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.
1986-01-01
Despite considerable commercial exploitation of fault-tolerant systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed-computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, a distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.
24 CFR 91.215 - Strategic plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... organizations, community and faith-based organizations, and public institutions, through which the jurisdiction... CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS Local Governments; Contents of... units, rehabilitation of existing units, or acquisition of existing units (including preserving...
Ryan, Holly F.; Conrad, James E.; Paull, C.K.; McGann, Mary
2012-01-01
The San Diego trough fault zone (SDTFZ) is part of a 90-km-wide zone of faults within the inner California Borderland that accommodates motion between the Pacific and North American plates. As for most faults offshore southern California, the slip rate and paleoseismic history of the SDTFZ are unknown. We present new seismic reflection data showing that the fault zone steps across a 5-km-wide stepover to continue for an additional 60 km north of its previously mapped extent. The 1986 Oceanside earthquake swarm is located within the 20-km-long restraining stepover. Farther north, at the latitude of Santa Catalina Island, the SDTFZ bends 20° to the west and may be linked via a complex zone of folds with the San Pedro basin fault zone (SPBFZ). In a cooperative program between the U.S. Geological Survey (USGS) and the Monterey Bay Aquarium Research Institute (MBARI), we measure and date the coseismic offset of a submarine channel that intersects the fault zone near the SDTFZ–SPBFZ junction. We estimate a horizontal slip rate of about 1.5 ± 0.3 mm/yr over the past 12,270 yr.
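The slip-rate arithmetic here is simply cumulative offset divided by the age of the offset feature. The abstract quotes only the rate and the age, so the channel offset in the sketch below is back-calculated from those two numbers and should be read as an assumption, not a measured value.

```python
# Back-of-envelope check of the slip-rate arithmetic (rate = offset / age).
# The offset is back-calculated from the published rate and age, since the
# abstract quotes only the rate; treat it as illustrative.
offset_m = 18.4          # cumulative channel offset, metres (assumed)
age_yr = 12_270          # age of the offset channel, years

rate_mm_per_yr = offset_m * 1000.0 / age_yr
print(f"slip rate ≈ {rate_mm_per_yr:.1f} mm/yr")   # ≈ 1.5 mm/yr
```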
Model Transformation for a System of Systems Dependability Safety Case
NASA Technical Reports Server (NTRS)
Murphy, Judy; Driskell, Stephen B.
2010-01-01
Software plays an increasingly large role in all aspects of NASA's science missions. This has been extended to the identification, management, and control of faults which affect safety-critical functions and, by default, the overall success of the mission. Traditionally, the analysis of fault identification, management, and control has been hardware based. Due to the increasing complexity of systems, there has been a corresponding increase in the complexity of fault management software. The NASA Independent Verification & Validation (IV&V) program is creating processes and procedures to identify and incorporate safety-critical software requirements, along with corresponding software faults, so that potential hazards may be mitigated. This paper, "Specific to Generic ... A Case for Reuse," describes the phases of a dependability and safety study which identifies a new process to create a foundation for reusable assets. These assets support the identification and management of specific software faults and their transformation from specific to generic software faults. This approach also has applications to other systems outside of the NASA environment. This paper addresses how a mission-specific dependability and safety case is being transformed into a generic dependability and safety case which can be reused for any type of space mission, with an emphasis on software fault conditions.
Triaxial testing of Lopez Fault gouge at 150 MPa mean effective stress
Scott, D.R.; Lockner, D.A.; Byerlee, J.D.; Sammis, C.G.
1994-01-01
Triaxial compression experiments were performed on samples of natural granular fault gouge from the Lopez Fault in Southern California. This material consists primarily of quartz and has a self-similar grain size distribution thought to result from natural cataclasis. The experiments were performed at a constant mean effective stress of 150 MPa, to expose the volumetric strains associated with shear failure. The failure strength is parameterized by the coefficient of internal friction μ, based on the Mohr-Coulomb failure criterion. Samples of remoulded Lopez gouge have internal friction μ = 0.6 ± 0.02. In experiments where the ends of the sample are constrained to remain axially aligned, suppressing strain localisation, the sample compacts before failure and dilates persistently after failure. In experiments where one end of the sample is free to move laterally, the strain localises to a single oblique fault at around the point of failure; some dilation occurs but does not persist. A comparison of these experiments suggests that dilation is confined to the region of shear localisation in a sample. Overconsolidated samples have slightly larger failure strengths than normally consolidated samples, and smaller axial strains are required to cause failure. A large amount of dilation occurs after failure in heavily overconsolidated samples, suggesting that dilation is occurring throughout the sample. Undisturbed samples of Lopez gouge, cored from the outcrop, have internal friction in the range μ = 0.4-0.6; the upper end of this range corresponds to the value established for remoulded Lopez gouge. Some kind of natural heterogeneity within the undisturbed samples is probably responsible for their low, variable strength. In samples of simulated gouge, with a more uniform grain size, active cataclasis during axial loading leads to large amounts of compaction. Larger axial strains are required to cause failure in simulated gouge, but the failure strength is similar to that of natural Lopez gouge. Use of the Mohr-Coulomb failure criterion to interpret the results from this study, and other recent studies on intact rock and granular gouge, leads to values of μ that depend on the loading configuration and the intact or granular state of the sample. Conceptual models are advanced to account for these discrepancies. The consequences for strain-weakening of natural faults are also discussed. © 1994 Birkhäuser Verlag.
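For reference, the Mohr-Coulomb parameterization used here says the shear stress a gouge layer can sustain grows linearly with the effective normal stress. A minimal sketch follows, with the cohesion and normal-stress values assumed for illustration; only the μ range comes from the abstract.

```python
# Minimal sketch of the Mohr-Coulomb criterion used to parameterize failure
# strength: shear stress at failure tau = c + mu * sigma_n. Cohesion and
# normal stress below are illustrative, not measured values from the study.
def coulomb_shear_strength(sigma_n, mu, cohesion=0.0):
    """Shear strength (MPa) on a plane under normal stress sigma_n (MPa)."""
    return cohesion + mu * sigma_n

sigma_n = 150.0                      # normal stress, MPa (illustrative)
for mu in (0.4, 0.6):                # range reported for Lopez gouge
    tau = coulomb_shear_strength(sigma_n, mu)
    print(f"mu = {mu:.1f}: failure shear stress = {tau:.0f} MPa")
```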
NASA Technical Reports Server (NTRS)
Tomayko, James E.
1986-01-01
Twenty-five years of spacecraft onboard computer development have resulted in a better understanding of the requirements for effective, efficient, and fault-tolerant flight computer systems. Lessons from eight flight programs (Gemini, Apollo, Skylab, Shuttle, Mariner, Voyager, and Galileo) and three research programs (digital fly-by-wire, STAR, and the Unified Data System) are useful in projecting the computer hardware configuration of the Space Station and the ways in which the Ada programming language will enhance the development of the necessary software. The evolution of hardware technology, fault protection methods, and software architectures used in space flight is reviewed to provide insight into the pending development of such items for the Space Station.
Validation of the SURE Program, phase 1
NASA Technical Reports Server (NTRS)
Dotson, Kelly J.
1987-01-01
Presented are the results of the first phase in the validation of the SURE (Semi-Markov Unreliability Range Evaluator) program. The SURE program gives lower and upper bounds on the death-state probabilities of a semi-Markov model. With these bounds, the reliability of a semi-Markov model of a fault-tolerant computer system can be analyzed. For the first phase in the validation, fifteen semi-Markov models were solved analytically for the exact death-state probabilities and these solutions compared to the corresponding bounds given by SURE. In every case, the SURE bounds covered the exact solution. The bounds, however, had a tendency to separate in cases where the recovery rate was slow or the fault arrival rate was fast.
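To see what the exact solutions in such a comparison look like, the sketch below computes death-state (absorbing-state) probabilities exactly for a small discrete-time Markov chain via B = (I - Q)^(-1) R, where Q holds transitions among transient states and R the transitions into absorbing states. SURE itself bounds the semi-Markov (general holding-time) case; the chain below is invented purely for illustration.

```python
# Hedged sketch: exact death-state probabilities for a small absorbing
# Markov chain, the kind of exact solution SURE's bounds are checked
# against. The chain (3 transient, 2 absorbing states) is invented.
import numpy as np

# Transitions among transient states (working, 1 fault, 2 faults).
Q = np.array([[0.90, 0.08, 0.00],
              [0.00, 0.80, 0.10],
              [0.00, 0.00, 0.50]])
# Transitions from transient states into absorbing states (recovered, failed).
R = np.array([[0.02, 0.00],
              [0.09, 0.01],
              [0.30, 0.20]])

B = np.linalg.solve(np.eye(3) - Q, R)   # absorption probabilities
print("P(absorb in each death state, from each start state):")
print(B.round(4))
```

A validated bounding tool like SURE must bracket each entry of B; the reported separation of the bounds under slow recovery or fast fault arrival corresponds to the rows where transient states are long-lived.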
All row, planar fault detection system
Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D; Smith, Brian Edward
2013-07-23
An apparatus, program product and method for detecting nodal faults may simultaneously cause designated nodes of a cell to communicate with all nodes adjacent to each of the designated nodes. Furthermore, all nodes along the axes of the designated nodes are made to communicate with their adjacent nodes, and the communications are analyzed to determine if a node or connection is faulty.
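A toy version of the detection idea conveys the mechanism: a designated node exchanges messages with each adjacent node, and any silent exchange implicates a faulty node or connection. The grid topology, ping model, and injected fault below are invented for the example (the patented method additionally sweeps the nodes along each axis of the designated nodes).

```python
# Illustrative sketch (not the patented implementation): a designated node
# pings its grid neighbours; a silent link marks a faulty node or connection.
def neighbours(node, size):
    x, y = node
    cand = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    return [(i, j) for i, j in cand if 0 <= i < size and 0 <= j < size]

size = 4
faulty = {(2, 1)}                        # injected fault for the demo

def ping(a, b):
    """Communication succeeds only if both endpoints are healthy."""
    return a not in faulty and b not in faulty

designated = (1, 1)
for nb in neighbours(designated, size):
    if not ping(designated, nb):
        print(f"fault detected on link {designated} -> {nb}")
```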
Development of a Hydrologic Characterization Technology for Fault Zones Phase II 2nd Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karasaki, Kenzi; Doughty, Christine; Gasperikova, Erika
2011-03-31
This is the second report on the three-year program of the second phase of the NUMO-LBNL collaborative project, Development of Hydrologic Characterization Technology for Fault Zones, under the NUMO-DOE/LBNL collaboration agreement. As such, this report is a compendium of the results of Kiho et al. (2011) and those of LBNL.
Scalable Replay with Partial-Order Dependencies for Message-Logging Fault Tolerance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lifflander, Jonathan; Meneses, Esteban; Menon, Harshita
2014-09-22
Deterministic replay of a parallel application is commonly used for discovering bugs or to recover from a hard fault with message-logging fault tolerance. For message passing programs, a major source of overhead during forward execution is recording the order in which messages are sent and received. During replay, this ordering must be used to deterministically reproduce the execution. Previous work in replay algorithms often makes minimal assumptions about the programming model and application in order to maintain generality. However, in many cases, only a partial order must be recorded due to determinism intrinsic in the code, ordering constraints imposed by the execution model, and events that are commutative (their relative execution order during replay does not need to be reproduced exactly). In this paper, we present a novel algebraic framework for reasoning about the minimum dependencies required to represent the partial order for different concurrent orderings and interleavings. By exploiting this theory, we improve on an existing scalable message-logging fault tolerance scheme. The improved scheme scales to 131,072 cores on an IBM BlueGene/P with up to 2x lower overhead than one that records a total order.
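The core saving can be shown in a few lines: log an ordering entry only for receives whose relative order is not already fixed by the program, and skip commutative ones. The sketch below is a hedged illustration of that idea, not the paper's algebraic framework; the message format and the commutativity flag are invented.

```python
# Hedged sketch of partial-order message logging: during forward execution,
# record receive order only for messages whose relative order matters;
# commutative receives need no log entry. Replay enforces the logged order.
class ReceiveLog:
    def __init__(self):
        self.order = []                      # partial order actually recorded

    def record(self, msg, commutative):
        if not commutative:                  # order-irrelevant: skip logging
            self.order.append(msg["id"])

    def replay_sequence(self):
        return list(self.order)

log = ReceiveLog()
log.record({"id": "m1", "op": "add"}, commutative=True)     # not logged
log.record({"id": "m2", "op": "write"}, commutative=False)  # logged
log.record({"id": "m3", "op": "add"}, commutative=True)     # not logged
log.record({"id": "m4", "op": "write"}, commutative=False)  # logged
print("entries needed for deterministic replay:", log.replay_sequence())
```

A total-order logger would record all four receives; here only half the entries are kept, which is the kind of forward-execution overhead reduction the paper scales up.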
NASA Technical Reports Server (NTRS)
Brock, L. D.; Lala, J.
1986-01-01
The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.
Fault tree applications within the safety program of Idaho Nuclear Corporation
NASA Technical Reports Server (NTRS)
Vesely, W. E.
1971-01-01
Computerized fault tree analyses are used to obtain both qualitative and quantitative information about the safety and reliability of an electrical control system that shuts the reactor down when certain safety criteria are exceeded, in the design of a nuclear plant protection system, and in an investigation of a backup emergency system for reactor shutdown. The fault tree yields the modes by which the system failure or accident will occur, the most critical failure or accident causing areas, detailed failure probabilities, and the response of safety or reliability to design modifications and maintenance schemes.
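The quantitative side of fault tree analysis reduces, for independent basic events, to two combination rules: an AND gate multiplies probabilities, and an OR gate combines them as one minus the product of the complements. A minimal sketch with an invented two-channel shutdown tree (not a system from the safety program described above):

```python
# Minimal fault-tree evaluation sketch (illustrative, not the cited codes):
# for independent basic events, OR gates give 1 - prod(1 - p) and AND gates
# give prod(p). The small tree below is invented for the example.
from math import prod

def p_or(*ps):
    return 1.0 - prod(1.0 - p for p in ps)

def p_and(*ps):
    return prod(ps)

# Top event: shutdown fails = both redundant channels fail, where each
# channel fails if its sensor OR its relay fails. Probabilities assumed.
sensor, relay = 1e-3, 5e-4
channel = p_or(sensor, relay)
top = p_and(channel, channel)
print(f"channel failure ≈ {channel:.2e}, top event ≈ {top:.2e}")
```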
Operations management system advanced automation: Fault detection isolation and recovery prototyping
NASA Technical Reports Server (NTRS)
Hanson, Matt
1990-01-01
The purpose of this project is to address the global fault detection, isolation, and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies, in addition to traditional software methods, to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.
2015-08-01
faults are incorporated into the system in order to better understand the EMA reliability, and to aid in designing fault detection software for real...to a fixed angle repeatedly and accurately [16]. The motor in the EHA is used to drive a reversible pump tied to a hydraulic cylinder which moves...24] [25] [26]. These test stands are used for the prognostic testing of EMAS that have had mechanical or electrical faults injected into them. The