Sample records for loca standard problem

  1. Posttest analysis of international standard problem 10 using RELAP4/MOD7 [PWR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, M.; Davis, C.B.; Peterson, A.C. Jr.

    RELAP4/MOD7, a best-estimate computer code for calculating thermal and hydraulic phenomena in a nuclear reactor or related system, is the latest version in the RELAP4 code development series. This paper evaluates the capability of RELAP4/MOD7 to calculate refill/reflood phenomena. The evaluation uses data from International Standard Problem 10, which is based on West Germany's KWU PKL refill/reflood experiment K9A. The PKL test facility represents a typical West German four-loop, 1300 MW pressurized water reactor (PWR) at reduced scale while maintaining a prototypical volume-to-power ratio. The PKL facility was designed specifically to simulate the refill/reflood phase of a hypothetical loss-of-coolant accident (LOCA).

  2. Parameter study on the influence of prepressurization on PWR fuel rod behavior during normal operation and hypothetical LOCAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brzoska, B.; Depisch, F.; Fuchs, H.P.

    To analyze the influence of prepressurization on fuel rod behavior, a parametric study has been performed that considers the effects of as-fabricated fuel rod internal prepressure on the normal operation and postulated loss-of-coolant accident (LOCA) rod behavior of a 1300-MW(electric) Kraftwerk Union (KWU) standard pressurized water reactor nuclear power plant. A variation of the prepressure in the range from 15 to 35 bars has only a slight influence on normal operation behavior. Considering the LOCA behavior, only a small temperature increase results from prepressure reduction, while the core-wide straining behavior is improved significantly. The KWU prepressurization takes both conditions into account.

  3. Risk-Informed Margin Management (RIMM) Industry Applications IA1 - Integrated Cladding ECCS/LOCA Performance Analysis - Problem Statement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques; Youngblood, Robert; Frepoli, Cesare

    2015-04-01

    The U. S. NRC is currently proposing rulemaking designated as “10 CFR 50.46c” to revise the LOCA/ECCS acceptance criteria to include the effects of higher burnup on cladding performance and to address several other issues. The NRC is currently resolving public comments, with the final rule expected to be issued in the summer of 2016. The impact of the final 50.46c rule on the industry will involve updating of fuel vendor LOCA evaluation models, NRC review and approval, and licensee submittal of new LOCA evaluations or reanalyses and associated technical specification revisions for NRC review and approval. The rule implementation process, covering both industry and NRC activities, is expected to take 5-10 years following the rule effective date. The need to use advanced cladding designs is expected. A loss of operational margin will result from the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee cost, as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. Consequently, there will be an increased focus on licensee decision making related to LOCA analysis to minimize cost and impact, and to manage margin.

  4. Nuclear power plant cable materials:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Celina, Mathias C.; Gillen, Kenneth T.; Lindgren, Eric Richard

    2013-05-01

    A selective literature review was conducted to assess whether currently available accelerated aging and original qualification data could be used to establish operational margins for the continued use of cable insulation and jacketing materials in nuclear power plant environments. The materials are subject to chemical and physical degradation under extended radiation-thermal-oxidative conditions. Of particular interest were the circumstances under which existing aging data could be used to predict whether aged materials should pass loss of coolant accident (LOCA) performance requirements. Original LOCA qualification testing usually involved accelerated aging simulations of the 40-year expected ambient aging conditions followed by a LOCA simulation. The accelerated aging simulations were conducted under rapid accelerated aging conditions that did not account for many of the known limitations in accelerated polymer aging and therefore did not correctly simulate actual aging conditions. These highly accelerated aging conditions resulted in insulation materials with mostly inert aging processes as well as jacket materials where oxidative damage dropped quickly away from the air-exposed outside jacket surface. Therefore, for most LOCA performance predictions, testing appears to have relied upon heterogeneous aging behavior with oxidation often limited to the exterior of the cable cross-section, a situation which is not comparable with the nearly homogeneous oxidative aging that will occur over decades under low dose rate and low temperature plant conditions. The historical aging conditions are therefore insufficient to determine with reasonable confidence the remaining operational margins for these materials.
This does not necessarily imply that the existing 40-year-old materials would fail if LOCA conditions occurred, but rather that unambiguous statements about the current aging state and anticipated LOCA performance cannot be provided based on original qualification testing data alone. The non-availability of conclusive predictions for the aging conditions of 40-year-old cables implies that the same levels of uncertainty will remain for any re-qualification or extended operation of these cables. The highly variable aging behavior of the range of materials employed also implies that simple, standardized aging tests are not sufficient to provide the required aging data and performance predictions for all materials. It is recommended that focused studies be conducted that would yield the material aging parameters needed to predict aging behaviors under low dose, low temperature plant equivalent conditions and that appropriately aged specimens be prepared that would mimic oxidatively-aged 40- to 60-year-old materials for confirmatory LOCA performance testing. This study concludes that it is not sufficient to expose materials to rapid, high radiation and high temperature levels with subsequent LOCA qualification testing in order to predictively quantify safety margins of existing infrastructure with regard to LOCA performance. We need to better understand how cable jacketing and insulation materials have degraded over decades of power plant operation and how this aging history relates to service life prediction and the performance of existing equipment to withstand a LOCA situation.

  5. Bio-mechanical assessment toward throwing and lifting process of i-LOCA (Innovative Lobster Catcher)

    NASA Astrophysics Data System (ADS)

    Sudiarno, A.; Dewi, D. S.; Putri, M. A.

    2018-04-01

    Indonesia is a country rich in marine resources, one of which is lobster. East Java, an Indonesian province, and especially the regions of Gresik and Lamongan, has very large lobster potential. At present the lobster catch depends mostly on luck: lobsters are caught only when they are unintentionally trapped in fishermen's fish nets, so the catch cannot be optimal. Previous research has produced two versions of i-LOCA (Innovative Lobster Catcher), a special tool for catching lobster. Although it produces a larger catch, the second version of i-LOCA still needs scrutiny, including a bio-mechanical assessment: it has no aid to ease throwing it into and lifting it from the sea, a condition that causes musculoskeletal disorders (MSDs) in the fishermen. This research performs a bio-mechanical assessment of the throwing and lifting process in order to suggest improvements for a third version of i-LOCA. Based on body-moment calculations, we found that the throwing and lifting processes of the third version of i-LOCA were 3 times and 2 times better, respectively, than those of the second version. The Rapid Entire Body Assessment (REBA) scores of the throwing and lifting processes for the third version of i-LOCA were reduced by 5 points compared to the second version.

  6. Rate theory scenarios study on fission gas behavior of U3Si2 under LOCA conditions in LWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, Yinbin; Gamble, Kyle A.; Andersson, David

    Fission gas behavior of U3Si2 under various loss-of-coolant accident (LOCA) conditions in light water reactors (LWRs) was simulated using rate theory. A rate theory model for U3Si2 that covers both steady-state operation and power transients was developed for the GRASS-SST code based on existing research reactor/ion irradiation experimental data and theoretical predictions from density functional theory (DFT) calculations. The steady-state and LOCA condition parameters were either directly provided or inspired by BISON simulations. Due to the absence of in-pile experimental data for U3Si2's fuel performance under LWR conditions at this stage of accident tolerant fuel (ATF) development, a variety of LOCA scenarios were taken into consideration to comprehensively and conservatively evaluate the fission gas behavior of U3Si2 during a LOCA.

  7. Modern Data Analysis techniques in Noise and Vibration Problems

    DTIC Science & Technology

    1981-11-01

    ...Hilbert transforms of one another. This property recurs in the study of causality: what defines a practical criterion characterizing a signal, thus, by... between the direct field and the reflected field are characterized locally by the existence of frequencies at which the interference is total

  8. JPRS Report Science & Technology Japan

    DTIC Science & Technology

    1989-03-02

    Oxychlorides MOCl(n-2) (Organic Metal Salts) Alkoxides M(OR)n Acetylacetonates M(C5H7O2)n Acetates M(C2H3O2)n Oxalates M(C2O4)n/2 2.2 Hydrolysis and Gel... more deeply understanding hydrothermal dynamics during not only a major-rupture LOCA but also a minor-rupture LOCA and clarifying the combination of... hydrothermal dynamics of the coolant from the beginning of the LOCA to its end, using a scale model of a PWR (pressurized water reactor). Under the ROSA-III Plan

  9. Decay Heat Removal from a GFR Core by Natural Convection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Wesley C.; Hejzlar, Pavel; Driscoll, Michael J.

    2004-07-01

    One of the primary challenges for Gas-cooled Fast Reactors (GFR) is decay heat removal after a loss of coolant accident (LOCA). Because the thermal gas-cooled reactors currently under design rely on passive mechanisms to dissipate decay heat, there is a strong motivation to accomplish GFR core cooling through natural phenomena. This work investigates the potential of post-LOCA decay heat removal from a GFR core to a heat sink using an external convection loop. A model was developed in the form of the LOCA-COLA (Loss of Coolant Accident - Convection Loop Analysis) computer code as a means for 1D steady-state convective heat transfer loop analysis. The results show that decay heat removal by means of gas-cooled natural circulation is feasible under elevated post-LOCA containment pressure conditions. (authors)
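As a hedged illustration of what a 1-D steady-state convection-loop balance involves (this is not the LOCA-COLA code; the geometry, gas properties, and the Blasius friction correlation below are all illustrative assumptions), one can equate the buoyancy head to the loop friction loss and iterate on velocity:

```python
import math

# Toy natural-circulation balance: buoyancy head = friction loss.
# All parameter values are illustrative placeholders, not LOCA-COLA inputs.
g, H = 9.81, 10.0              # gravity (m/s^2), hot-to-cold thermal centre offset (m)
rho, beta = 5.0, 3.4e-3        # pressurized gas density (kg/m^3), expansion coeff (1/K)
dT = 200.0                     # core temperature rise (K)
L_loop, D = 40.0, 0.1          # loop length (m), hydraulic diameter (m)
mu = 3.0e-5                    # dynamic viscosity (Pa s)

dp_buoy = rho * g * beta * dT * H   # driving buoyancy pressure head (Pa)
v = 1.0                             # initial velocity guess (m/s)
for _ in range(100):                # fixed-point iteration on velocity
    Re = rho * v * D / mu
    f = 0.316 * Re ** -0.25         # Blasius friction factor (turbulent flow)
    v = math.sqrt(2.0 * dp_buoy / (f * (L_loop / D) * rho))

print(f"natural-circulation velocity ~ {v:.2f} m/s at Re ~ {Re:.0f}")
```

Because the friction factor varies only weakly with velocity (as Re^-0.25), the fixed-point iteration contracts rapidly; a real loop code would additionally track temperature-dependent properties, form losses, and heat transfer along the loop.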

  10. Loss of Coolant Accident (LOCA) / Emergency Core Cooling System (ECCS) Evaluation of Risk-Informed Margins Management Strategies for a Representative Pressurized Water Reactor (PWR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques

    A Risk-Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated loss-of-coolant accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.

  11. Comparisons of Wilks’ and Monte Carlo Methods in Response to the 10CFR50.46(c) Proposed Rulemaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Zou, Ling

    The Nuclear Regulatory Commission (NRC) is proposing a new rulemaking on emergency core cooling system/loss-of-coolant accident (LOCA) performance analysis. In the proposed rulemaking, designated as 10CFR50.46(c), the US NRC put forward an equivalent cladding oxidation criterion as a function of cladding pre-transient hydrogen content. The proposed rulemaking imposes more restrictive and burnup-dependent cladding embrittlement criteria; consequently, nearly all the fuel rods in a reactor core need to be analyzed under LOCA conditions to demonstrate compliance with the safety limits. New analysis methods are required to provide a thorough characterization of the reactor core in order to identify the locations of the limiting rods as well as to quantify the safety margins under LOCA conditions. With the new analysis method presented in this work, the limiting transient case and the limiting rods can be easily identified to quantify the safety margins in response to the proposed new rulemaking. In this work, a best-estimate plus uncertainty (BEPU) analysis capability for large-break LOCA with the new cladding embrittlement criteria using the RELAP5-3D code is established and demonstrated with a reduced set of uncertainty parameters. Both the direct Monte Carlo method and Wilks' nonparametric statistical method can be used to perform uncertainty quantification. Wilks' method has become the de facto industry standard for uncertainty quantification in BEPU LOCA analyses. Despite its widespread adoption by the industry, the use of small sample sizes to infer a statement of compliance with the existing 10CFR50.46 rule has been a major cause of unrealized operational margin in today's BEPU methods. Moreover, the debate on the proper interpretation of Wilks' theorem in the context of safety analyses is not fully resolved, even more than two decades after its introduction into safety analyses in the nuclear industry.
This represents both a regulatory and an application risk in rolling out new methods. With the 10CFR50.46(c) proposed rulemaking, the deficiencies of the Wilks' approach are further exacerbated. The direct Monte Carlo approach offers a robust alternative for uncertainty quantification within the context of BEPU analyses. In this work, the Monte Carlo method is compared with Wilks' method in response to the NRC 10CFR50.46(c) proposed rulemaking.
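The sample-size gap behind this debate can be made concrete with the standard first-order Wilks formula (a generic sketch, not code from the paper; the function name is our own): the largest of n runs bounds the true coverage quantile with probability 1 - coverage^n, so the smallest acceptable n satisfies 1 - coverage^n >= confidence.

```python
import math

def wilks_sample_size(coverage: float, confidence: float) -> int:
    # Smallest n such that 1 - coverage**n >= confidence
    # (first-order, one-sided nonparametric tolerance limit).
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size(0.95, 0.95))  # -> 59, the classic 95/95 sample size
print(wilks_sample_size(0.95, 0.99))  # -> 90
```

By contrast, a direct Monte Carlo estimate of a 95th-percentile figure of merit with tight statistical error typically requires thousands of code runs, which is the cost the paper weighs against the small-sample conservatism of the Wilks approach.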

  12. Preliminary LOCA analysis of the westinghouse small modular reactor using the WCOBRA/TRAC-TF2 thermal-hydraulics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, J.; Kucukboyaci, V. N.; Nguyen, L.

    2012-07-01

    The Westinghouse Small Modular Reactor (SMR) is an 800 MWt (> 225 MWe) integral pressurized water reactor (iPWR) with all primary components, including the steam generator and the pressurizer, located inside the reactor vessel. The reactor core is based on the partial-height 17x17 fuel assembly design used in the AP1000® reactor core. The Westinghouse SMR utilizes passive safety systems and proven components from the AP1000 plant design, with a compact containment that houses the integral reactor vessel and the passive safety systems. A preliminary loss of coolant accident (LOCA) analysis of the Westinghouse SMR has been performed using the WCOBRA/TRAC-TF2 code, simulating a transient caused by a double-ended guillotine (DEG) break in the direct vessel injection (DVI) line. WCOBRA/TRAC-TF2 is a new-generation Westinghouse LOCA thermal-hydraulics code evolving from the US NRC licensed WCOBRA/TRAC code. It is designed to simulate PWR LOCA events from the smallest break size to the largest (DEG cold leg). A significant number of fluid dynamics and heat transfer models were developed or improved in WCOBRA/TRAC-TF2, and a large number of separate effects and integral effects tests were performed for rigorous code assessment and validation. WCOBRA/TRAC-TF2 was introduced into the Westinghouse SMR design phase to support rapid and robust passive cooling system design and to identify thermal-hydraulic phenomena for the development of the SMR Phenomena Identification Ranking Table (PIRT). The LOCA analysis of the Westinghouse SMR demonstrates that the DEG DVI break LOCA is mitigated by the injection and venting from the Westinghouse SMR passive safety systems without core heat-up, achieving long-term core cooling. (authors)

  13. Analysis of LOCA Scenarios in the NIST Research Reactor Before and After Fuel Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baek, J. S.; Cheng, L. Y.; Diamond, D.

    An analysis has been done of hypothetical loss-of-coolant accidents (LOCAs) in the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The purpose of the analysis is to determine whether the peak clad temperature remains below the Safety Limit, which is the blister temperature for the fuel. The configuration of the NBSR considered in the analysis is that projected for the future, when changes will be made so that shutdown pumps do not operate when a LOCA signal is detected. The analysis was done for the present core with high-enriched uranium (HEU) fuel and with the proposed low-enriched uranium (LEU) fuel that would be used after conversion. The analysis consists of two parts. The first examines how the water would drain from the primary system following a break and the possibility of loss of coolant from within the fuel element flow channels. This work is performed using the TRACE system thermal-hydraulic code. The second looks at the fuel clad temperature as a function of time, given that the water may have drained from many of the flow channels and the water in the vessel is in a quasi-equilibrium state. The temperature behavior is investigated using the three-dimensional heat conduction code HEATING7.3. The results in all scenarios considered, for both HEU and LEU fuel, show that the peak clad temperature remains below the blister temperature.

  14. Industry Application ECCS / LOCA Integrated Cladding/Emergency Core Cooling System Performance: Demonstration of LOTUS-Baseline Coupled Analysis of the South Texas Plant Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hongbin; Szilard, Ronaldo; Epiney, Aaron

    Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both the South Texas Project (STP) and Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model, including reactor core, clad/fuel design, and systems thermal hydraulics, based on the STP nuclear power plant, a 4-loop Westinghouse PWR. A RISMC toolkit, named the LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule on Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.

  15. Hydrogen motion in Zircaloy-4 cladding during a LOCA transient

    NASA Astrophysics Data System (ADS)

    Elodie, T.; Jean, D.; Séverine, G.; M-Christine, B.; Michel, C.; Berger, P.; Martine, B.; Antoine, A.

    2016-04-01

    Hydrogen and oxygen are key elements influencing the embrittlement of zirconium-based nuclear fuel cladding during the quench phase following a Loss Of Coolant Accident (LOCA). Understanding the mechanisms influencing the motion of these two chemical elements in the metal is required to fully describe the material embrittlement. High-temperature steam oxidation tests were performed on pre-hydrided Zircaloy-4 samples with hydrogen contents ranging between 11 and 400 wppm prior to the LOCA transient. Using both Electron Probe Micro-Analysis (EPMA) and Elastic Recoil Detection Analysis (μ-ERDA), the partitioning of the chemical elements inside the prior-β phase has been systematically quantified. Image analysis and metallographic examinations were combined to provide average oxygen and hydrogen profiles across the cladding thickness after the LOCA transient. The measured hydrogen profile is far from homogeneous. Experimental distributions are compared to those predicted numerically using calculations from a finite-difference thermo-diffusion code (DIFFOX) developed at IRSN.
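The finite-difference thermo-diffusion calculation mentioned above can be illustrated, in a very reduced form, by an explicit 1-D solve of Fick's second law across the cladding thickness. This is a generic sketch, not the DIFFOX model: the diffusivity, thickness, time span, and boundary conditions are illustrative placeholders, and the real code couples diffusion to the temperature field.

```python
import numpy as np

# Explicit 1-D finite-difference solve of Fick's second law, dc/dt = D d2c/dx2,
# across the cladding thickness. Illustrative parameters only.
L = 600e-6                    # cladding thickness (m)
D = 1.0e-10                   # hydrogen diffusivity (m^2/s), illustrative
nx = 61
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # within the explicit stability limit dt <= dx^2 / (2 D)

c = np.full(nx, 200.0)        # uniform initial hydrogen content (wppm)
c[0] = 0.0                    # outer surface held hydrogen-lean

for _ in range(2000):         # march ~800 s of transient time
    c[1:-1] += (D * dt / dx**2) * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[0] = 0.0                # fixed-concentration outer boundary
    c[-1] = c[-2]             # zero-flux inner boundary

print(f"profile: c[0] = {c[0]:.0f} wppm, c[-1] = {c[-1]:.0f} wppm")
```

Even this toy version produces a strongly inhomogeneous profile, depleted near the surface and much less affected at the inner wall, which is qualitatively the kind of non-uniform hydrogen distribution the abstract reports.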

  16. Spatializing Sexuality in Jaime Hernandez's "Locas"

    ERIC Educational Resources Information Center

    Jones, Jessica E.

    2009-01-01

    Focusing on Jaime Hernandez's "Locas: The Maggie and Hopey Stories," part of the "Love and Rockets" comic series, I argue that the graphic landscape of this understudied comic offers an illustration of the theories of space in relation to race, gender, and sexuality that have been critical to understandings of Chicana…

  17. Large-break LOCA, in-reactor fuel bundle Materials Test MT-6A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, C.L.; Hesson, G.M.; Pilger, J.P.

    1993-09-01

    This is a report on one of a series of experiments to simulate a loss-of-coolant accident (LOCA) using full-length fuel rods for pressurized water reactors (PWRs). The experiments were conducted by Pacific Northwest Laboratory (PNL) under the LOCA Simulation Program sponsored by the US Nuclear Regulatory Commission (NRC). The major objectives of this program were, first, to cause the maximum possible expansion of the cladding on the fuel rods through a short-term adiabatic temperature transient to 1200 K (1700 F), leading to rupture of the cladding; and second, by reflooding the fuel rods, to determine the rate at which the fuel bundle is cooled.

  18. 75 FR 53985 - Arizona Public Service Company, et al., Palo Verde Nuclear Generating Station, Unit 3; Temporary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-02

    ... are authorized by law, will not present an undue risk to public health or safety, and are consistent... Public Health and Safety The underlying purpose of 10 CFR 50.46 is to establish acceptance criteria for... (LOCA) and non-LOCA criteria, mechanical design, thermal hydraulics, seismic, core physics, and...

  19. 75 FR 8139 - Biweekly Notice; Applications and Amendments to Facility Operating Licenses Involving No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... the large break loss-of-coolant accident (LOCA) analysis methodology with a reference to WCAP-16009-P... required by 10 CFR 50.91(a), the licensee has provided its analysis of the issue of no significant hazards... Section 5.6.5 to incorporate a new large break LOCA analysis methodology. Specifically, the proposed...

  20. Modelling of LOCA Tests with the BISON Fuel Performance Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Richard L; Pastore, Giovanni; Novascone, Stephen Rhead

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and the Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  1. Steady state and LOCA analysis of Kartini reactor using RELAP5/SCDAP code: The role of passive system

    NASA Astrophysics Data System (ADS)

    Antariksawan, Anhar R.; Wahyono, Puradwi I.; Taxwim

    2018-02-01

    Safety is the priority for nuclear installations, including research reactors. At the same time, many studies have been carried out to validate the applicability of best-estimate computer codes developed for nuclear power plants to research reactors. This study aims to assess the applicability of the RELAP5/SCDAP code to the Kartini research reactor. Model development and steady-state and LOCA-transient calculations were conducted using RELAP5/SCDAP, and the results are compared with available measurement data from the Kartini research reactor. The results show that the steady-state calculation of the RELAP5/SCDAP model agrees quite well with the available measurement data. In the LOCA transient simulations, the model produced physically reasonable phenomena during the transient, showing the characteristics and performance of the reactor under the LOCA transient. The role of the siphon-breaker hole and of natural circulation in the reactor tank as a passive system was important to keep the reactor in a safe condition. It is concluded that RELAP5/SCDAP can be used as a tool to analyze the thermal-hydraulic safety of the Kartini reactor; however, further assessment to improve the model is still needed.

  2. Pressure suppression system

    DOEpatents

    Gluntz, D.M.

    1994-10-04

    A pressure suppression system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and an enclosed gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel. The GDCS pool includes a plenum for receiving through an inlet the non-condensable gas carried with steam from the drywell following a loss-of-coolant accident (LOCA). A condenser is disposed in the GDCS plenum for condensing the steam channeled therein and to trap the non-condensable gas therein. A method of operation includes draining the GDCS pool following the LOCA and channeling steam released into the drywell following the LOCA into the GDCS plenum for cooling along with the non-condensable gas carried therewith for trapping the gas therein. 3 figs.

  3. Pressure suppression system

    DOEpatents

    Gluntz, Douglas M.

    1994-01-01

    A pressure suppression system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and an enclosed gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel. The GDCS pool includes a plenum for receiving through an inlet the non-condensable gas carried with steam from the drywell following a loss-of-coolant accident (LOCA). A condenser is disposed in the GDCS plenum for condensing the steam channeled therein and to trap the non-condensable gas therein. A method of operation includes draining the GDCS pool following the LOCA and channeling steam released into the drywell following the LOCA into the GDCS plenum for cooling along with the non-condensable gas carried therewith for trapping the gas therein.

  4. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and databases and special applications.

  5. TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childerson, M.T.; Fujita, R.K.

    1985-01-01

    A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small break LOCA data set for TRAC verification. The major phases of a small break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater initiated boiler-condenser mode heat transfer.

  6. Implementation of non-condensable gases condensation suppression model into the WCOBRA/TRAC-TF2 LOCA safety evaluation code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, J.; Cao, L.; Ohkawa, K.

    2012-07-01

    The non-condensable gases condensation suppression model is important for a realistic LOCA safety analysis code. A condensation suppression model for direct contact condensation was previously developed by Westinghouse using first principles. The model is believed to be an accurate description of the direct contact condensation process in the presence of non-condensable gases. The Westinghouse condensation suppression model is further revised by applying a more physical model. The revised condensation suppression model is then implemented into the WCOBRA/TRAC-TF2 LOCA safety evaluation code for both the 3-D module (COBRA-TF) and the 1-D module (TRAC-PF1). A parametric study using the revised Westinghouse condensation suppression model is conducted. Additionally, the performance of the non-condensable gases condensation suppression model is examined in the ACHILLES (ISP-25) separate effects test and the LOFT L2-5 (ISP-13) integral effects test. (authors)

  7. Pressure suppression containment system

    DOEpatents

    Gluntz, Douglas M.; Townsend, Harold E.

    1994-03-15

    A pressure suppression containment system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and a gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel. The wetwell pool includes a plenum for receiving the non-condensable gas carried with steam from the drywell following a loss-of-coolant accident (LOCA). The wetwell plenum is vented to a plenum above the GDCS pool following the LOCA for suppressing pressure rise within the containment vessel. A method of operation includes channeling steam released into the drywell following the LOCA into the wetwell pool for cooling along with the non-condensable gas carried therewith. The GDCS pool is then drained by gravity, and the wetwell plenum is vented into the GDCS plenum for channeling the non-condensable gas thereto.

  8. Pressure suppression containment system

    DOEpatents

    Gluntz, D.M.; Townsend, H.E.

    1994-03-15

    A pressure suppression containment system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and a gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel. The wetwell pool includes a plenum for receiving the non-condensable gas carried with steam from the drywell following a loss-of-coolant-accident (LOCA). The wetwell plenum is vented to a plenum above the GDCS pool following the LOCA for suppressing pressure rise within the containment vessel. A method of operation includes channeling steam released into the drywell following the LOCA into the wetwell pool for cooling along with the non-condensable gas carried therewith. The GDCS pool is then drained by gravity, and the wetwell plenum is vented into the GDCS plenum for channeling the non-condensable gas thereto. 6 figures.

  9. Rate Theory Modeling and Simulation of Silicide Fuel at LWR Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, Yinbin; Ye, Bei; Hofman, Gerard

    As a promising candidate for the accident tolerant fuel (ATF) used in light water reactors (LWRs), the fuel performance of uranium silicide (U3Si2) at LWR conditions needs to be well understood. In this report, a rate theory model was developed based on existing experimental data and density functional theory (DFT) calculations to predict the fission gas behavior in U3Si2 at LWR conditions. The fission gas behavior of U3Si2 can be divided into three temperature regimes. During steady-state operation, the majority of the fission gas stays in intragranular bubbles, whereas the dominance of intergranular bubbles and fission gas release occurs only beyond 1000 K. The steady-state rate theory model was also used as a reference to establish a gaseous swelling correlation of U3Si2 for the BISON code. Meanwhile, an overpressurized bubble model was also developed so that the fission gas behavior during a LOCA can be simulated. LOCA simulation showed that intragranular bubbles are still dominant after a 70-second LOCA, resulting in a controllable gaseous swelling. The fission gas behavior of U3Si2 at LWR conditions is benign according to the rate theory prediction at both steady-state and LOCA conditions, which provides an important reference for the qualification of U3Si2 as an LWR fuel material with excellent fuel performance and enhanced accident tolerance.
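
Rate-theory models of the kind described in this record evolve coupled densities of dissolved fission gas and gas held in bubbles. The sketch below is a deliberately simplified, single-grain toy balance, not the report's U3Si2 model; every coefficient (`beta`, `k_abs`, `b`) is an illustrative placeholder:

```python
# Toy rate-theory balance for fission gas in a single grain (NOT the
# U3Si2 model from the report; all coefficients are illustrative).
# Dissolved gas c is generated at rate beta, absorbed into bubbles at
# rate k_abs*c, and re-dissolved from bubbles at rate b*m.

def evolve_gas(beta, k_abs, b, dt, steps):
    c, m = 0.0, 0.0  # dissolved gas, gas in bubbles (arbitrary units)
    for _ in range(steps):
        to_bubbles = k_abs * c
        from_bubbles = b * m
        c += dt * (beta - to_bubbles + from_bubbles)
        m += dt * (to_bubbles - from_bubbles)
    return c, m

c, m = evolve_gas(beta=1.0, k_abs=0.5, b=0.1, dt=0.01, steps=10000)
# Conservation check: total gas equals generation rate times elapsed time.
print(round(c + m, 3))  # 100.0
```

A full model would make the absorption and resolution rates depend on temperature, diffusion coefficients, and the bubble population, but the bookkeeping structure (generation, capture, re-solution) is the same.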

  10. MODELLING OF FUEL BEHAVIOUR DURING LOSS-OF-COOLANT ACCIDENTS USING THE BISON CODE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pastore, G.; Novascone, S. R.; Williamson, R. L.

    2015-09-01

    This work presents recent developments to extend the BISON code to enable fuel performance analysis during LOCAs. This newly developed capability accounts for the main physical phenomena involved, as well as the interactions among them and with the global fuel rod thermo-mechanical analysis. Specifically, new multiphysics models are incorporated in the code to describe (1) transient fission gas behaviour, (2) rapid steam-cladding oxidation, (3) Zircaloy solid-solid phase transition, (4) hydrogen generation and transport through the cladding, and (5) Zircaloy high-temperature non-linear mechanical behaviour and failure. Basic model characteristics are described, and a demonstration BISON analysis of an LWR fuel rod undergoing a LOCA is presented. Also, as a first step of validation, the code with the new capability is applied to the simulation of experiments investigating cladding behaviour under LOCA conditions. A comparison of the results with the available experimental data on cladding failure due to burst is presented.

  11. Industry Application Emergency Core Cooling System Cladding Acceptance Criteria Early Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo H.; Youngblood, Robert W.; Zhang, Hongbin

    2015-09-01

    The U.S. NRC is currently proposing rulemaking designated as “10 CFR 50.46c” to revise the loss-of-coolant-accident (LOCA)/emergency core cooling system (ECCS) acceptance criteria to include the effects of higher burnup on cladding performance as well as to address other technical issues. The NRC is also currently resolving the public comments, with the final rule expected to be issued in April 2016. The impact of the final 50.46c rule on the industry may involve updating of fuel vendor LOCA evaluation models, NRC review and approval, and licensee submittal of new LOCA evaluations or re-analyses and associated technical specification revisions for NRC review and approval. The rule implementation process, covering both industry and NRC activities, is expected to take 4-6 years following the rule effective date. The new rule may also motivate the use of advanced cladding designs. A loss of operational margin may result from the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee cost, as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. Consequently, there will be an increased focus on licensee decision making related to LOCA analysis to minimize cost and impact, and to manage margin. The proposed rule would apply to all light water reactors and to all cladding types.

  12. Experimental study of Siphon breaker about size effect in real scale reactor design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, S. H.; Ahn, H. S.; Kim, J. M.

    2012-07-01

    A pipe rupture in a nuclear reactor is one of the main causes of a loss-of-coolant accident (LOCA). Siphon breaking is a passive method that can prevent a LOCA. In this study, either a line or a hole is used as a siphon breaker, and the effects of various parameters, such as the siphon-breaker size, pipe rupture point, pipe rupture size, and the presence of an orifice, are investigated using an experimental facility similar in size to a full-scale reactor. (authors)

  13. R&D Plan for RISMC Industry Application #1: ECCS/LOCA Cladding Acceptance Criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques; Zhang, Hongbin; Epiney, Aaron Simon

    The Nuclear Regulatory Commission (NRC) is finalizing a rulemaking change that would revise the requirements in 10 CFR 50.46. In the proposed new rulemaking, designated as 10 CFR 50.46c, the NRC proposes a fuel performance-based equivalent cladding reacted (ECR) criterion as a function of cladding hydrogen content before the accident (pre-transient) in order to include the effects of higher burnup on cladding performance as well as to address other technical issues. A loss of operational margin may result from the more restrictive cladding embrittlement criteria. Initial and future compliance with the rule may significantly increase vendor workload and licensee costs, as a spectrum of fuel rod initial burnup states may need to be analyzed to demonstrate compliance. The Idaho National Laboratory (INL) has initiated a project, as part of the DOE Light Water Reactor Sustainability Program (LWRS), to develop analytical capabilities to support the industry in the transition to the new rule. This project is called Industry Application 1 (IA1) within the Risk-Informed Safety Margin Characterization (RISMC) Pathway of LWRS. The general idea behind the initiative is the development of an Integrated Evaluation Model (IEM). The motivation is to develop a multiphysics framework to analyze how uncertainties are propagated across the stream of physical disciplines and data involved, as well as how risks are evaluated in a LOCA safety analysis as regulated under 10 CFR 50.46c. This IEM is called LOTUS, which stands for LOCA Toolkit for US, and it represents the LWRS Program’s response to the proposed new rulemaking. The focus of this report is to complete an R&D plan describing the demonstration of the LOCA/ECCS RISMC Industry Application #1 using the advanced RISMC Toolkit and methodologies. This report includes the description and development plan for a RISMC LOCA tool that fully couples advanced MOOSE tools already in development in order to characterize and optimize plant safety and operational margins. The advanced MOOSE tools needed to complete this integrated evaluation model are RAVEN, RELAP-7, BISON, and MAMMOTH.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, C.L.; Rausch, W.N.; Hesson, G.M.

    The LOCA Simulation Program in the NRU reactor is the first set of experiments to provide data on the behavior of full-length, nuclear-heated PWR fuel bundles during the heatup, reflood, and quench phases of a loss-of-coolant accident (LOCA). This paper compares the temperature-time histories of 4 experimental test cases with predictions from 4 computer codes: CE-THERM, FRAP-T5, GT3-FLECHT, and TRUMP-FLECHT. The preliminary comparisons between prediction and experiment show that the state-of-the-art fuel codes have large uncertainties and are not necessarily conservative in predicting peak temperatures, turnaround times, and bundle quench times.

  15. Localized Multi-Model Extremes Metrics for the Fourth National Climate Assessment

    NASA Astrophysics Data System (ADS)

    Thompson, T. R.; Kunkel, K.; Stevens, L. E.; Easterling, D. R.; Biard, J.; Sun, L.

    2017-12-01

    We have performed localized analysis of scenario-based datasets for the Fourth National Climate Assessment (NCA4). These datasets include CMIP5-based Localized Constructed Analogs (LOCA) downscaled simulations at daily temporal resolution and 1/16th-degree spatial resolution. Over 45 temperature and precipitation extremes metrics have been processed using LOCA data, including threshold, percentile, and degree-days calculations. The localized analysis calculates trends in the temperature and precipitation extremes metrics for relatively small regions such as counties, metropolitan areas, climate zones, administrative areas, or economic zones. For NCA4, we are currently addressing metropolitan areas as defined by U.S. Census Bureau Metropolitan Statistical Areas. Such localized analysis provides essential information for adaptation planning at scales relevant to local planning agencies and businesses. Nearly 30 such regions have been analyzed to date. Each locale is defined by a closed polygon that is used to extract LOCA-based extremes metrics specific to the area. For each metric, single-model data at each LOCA grid location are first averaged over several 30-year historical and future periods. Then, for each metric, the spatial average across the region is calculated using model weights based on both model independence and reproducibility of current climate conditions. The range of single-model results is also captured on the same localized basis, and then combined with the weighted ensemble average for each region and each metric. For example, Boston-area cooling degree days and maximum daily temperature are shown below for RCP8.5 (red) and RCP4.5 (blue) scenarios. We also discuss inter-regional comparison of these metrics, as well as their relevance to risk analysis for adaptation planning.
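
The degree-day style metrics and the weighted multi-model averaging described in this record can be illustrated with a minimal sketch; the function names, the 65 °F base temperature, and all example numbers are illustrative assumptions, not NCA4 specifics:

```python
# Minimal sketch of two operations from the record: a cooling-degree-days
# (CDD) metric over a daily temperature series, and a weighted multi-model
# average. The 65 degF base and all example values are assumptions.

def cooling_degree_days(daily_mean_temps_f, base_f=65.0):
    """Sum of positive excursions of daily mean temperature above the base."""
    return sum(max(t - base_f, 0.0) for t in daily_mean_temps_f)

def weighted_ensemble_mean(model_values, model_weights):
    """Weighted average across models, e.g. with weights reflecting model
    independence and skill at reproducing current climate."""
    return sum(v * w for v, w in zip(model_values, model_weights)) / sum(model_weights)

week = [70.0, 68.0, 64.0, 72.0, 75.0, 65.0, 61.0]  # daily means, degF
print(cooling_degree_days(week))  # 25.0
print(weighted_ensemble_mean([800.0, 900.0, 1000.0], [0.5, 0.3, 0.2]))  # 870.0
```

In the NCA4 workflow the single-model values would themselves be 30-year, polygon-averaged aggregates before the weighted combination is applied.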

  16. Preliminary phenomena identification and ranking tables for simplified boiling water reactor Loss-of-Coolant Accident scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroeger, P.G.; Rohatgi, U.S.; Jo, J.H.

    1998-04-01

    For three potential Loss-of-Coolant Accident (LOCA) scenarios in the General Electric Simplified Boiling Water Reactor (SBWR), a set of Phenomena Identification and Ranking Tables (PIRT) is presented. The selected LOCA scenarios are typical of the class of small and large breaks generally considered in Safety Analysis Reports. The method used to develop the PIRTs is described. Following a discussion of the transient scenarios, the PIRTs are presented and discussed in both detailed and summarized form. A procedure for future validation of the PIRTs, to enhance their value, is outlined. 26 refs., 25 figs., 44 tabs.

  17. TRAC analyses for CCTF and SCTF tests and UPTF design/operation. [Cylindrical Core Test Facility; Slab Core Test Facility; Upper Plenum Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spore, J.W.; Cappiello, M.W.; Dotson, P.J.

    The analytical support in 1985 for Cylindrical Core Test Facility (CCTF), Slab Core Test Facility (SCTF), and Upper Plenum Test Facility (UPTF) tests involves the posttest analysis of 16 tests that have already been run in the CCTF and the SCTF and the pretest analysis of 3 tests to be performed in the UPTF. Posttest analysis is used to provide insight into the detailed thermal-hydraulic phenomena occurring during the refill and reflood tests performed in CCTF and SCTF. Pretest analysis is used to ensure that the test facility is operated in a manner consistent with the expected behavior of an operating full-scale plant during an accident. To obtain expected behavior of a plant during an accident, two plant loss-of-coolant-accident (LOCA) calculations were performed: a 200% cold-leg-break LOCA calculation for a 2772 MW(t) Babcock and Wilcox plant and a 200% cold-leg-break LOCA calculation for a 3315 MW(t) Westinghouse plant. Detailed results are presented for several CCTF UPI tests and the Westinghouse plant analysis.

  18. Aging, Loss-of-Coolant Accident (LOCA), and high potential testing of damaged cables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, R.A.; Jacobus, M.J.

    1994-04-01

    Experiments were conducted to assess the effects of high potential testing of cables and to assess the survivability of aged and damaged cables under Loss-of-Coolant Accident (LOCA) conditions. High potential testing at 240 Vdc/mil on undamaged cables suggested that no damage was incurred on the selected virgin cables. During aging and LOCA testing, Okonite ethylene propylene rubber (EPR) cables with a bonded jacket experienced unexpected failures. The failures appear to be primarily related to the level of thermal aging and the presence of a bonded jacket that ages more rapidly than the insulation. For Brand Rex crosslinked polyolefin (XLPO) cables, the results suggest that 7 mils of insulation remaining should give the cables a high probability of surviving accident exposure following aging. The voltage necessary to detect when 7 mils of insulation remain on unaged Brand Rex cables is approximately 35 kVdc. This voltage level would almost certainly be unacceptable to a utility for use as a damage assessment tool. However, additional tests indicated that a 35 kVdc voltage application would not damage virgin Brand Rex cables when tested in water. Although two damaged Rockbestos silicone rubber cables also failed during the accident test, no correlation between failures and level of damage was apparent.

  19. A Study of the Jettisoning of JP-4 Fuel in the Atmosphere

    DTIC Science & Technology

    1975-11-01

    ...termined by measuring the stagnation pressure with a pitot probe, shown in Fig. 21, with the spherical tip. The rod located to the right of the pitot probe was located in the same photographic plane as the suspended JP-4 droplet... of Atomization in Carburetors." NACA TM 518, 1929. 13. Lapple, C. E., Henry, J. P., Jr., and Blake, D. E. "Ato...

  20. Overview of Fuel Rod Simulator Usage at ORNL

    NASA Astrophysics Data System (ADS)

    Ott, Larry J.; McCulloch, Reg

    2004-02-01

    During the 1970s and early 1980s, the Oak Ridge National Laboratory (ORNL) operated large out-of-reactor experimental facilities to resolve thermal-hydraulic safety issues in nuclear reactors. The fundamental research ranged from material mechanical behavior of fuel cladding during the depressurization phase of a loss-of-coolant accident (LOCA) to basic heat transfer research in gas- or sodium-cooled cores. The largest facility simulated the initial phase (less than 1 min. of transient time) of a LOCA in a commercial pressurized-water reactor. The nonnuclear reactor cores of these facilities were mimicked via advanced, highly instrumented electric fuel rod simulators locally manufactured at ORNL. This paper provides an overview of these experimental facilities with an emphasis on the fuel rod simulators.

  1. Numerical study of air ingress transition to natural circulation in a high temperature helium loop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franken, Daniel; Gould, Daniel; Jain, Prashant K.

    Here, the generation-IV high temperature gas cooled reactors (HTGRs) are designed with many passive safety features, one of which is the ability to passively remove heat under a loss of coolant accident (LOCA). However, several common reactor designs do not prevent against a large break in the coolant system and may therefore experience a depressurized LOCA. This would lead to air entering into the reactor system via several potential modes of ingress: diffusion, gravity currents, and natural circulation. At the onset of a LOCA, the initial rate of air ingress is expected to be very slow because it is governed by molecular diffusion. However, after several hours, natural circulation would commence, thus, bringing the air into the reactor system at a much higher rate. As a consequence, air ingress would cause the high temperature graphite matrix to oxidize, leading to its thermal degradation and decreased passive heat (decay) removal capability. Therefore, it is essential to understand the transition of air ingress from molecular diffusion to natural circulation in an HTGR system. This paper presents results from a computational fluid dynamics (CFD) model to study the air ingress transition behavior. These results are validated against an h-shaped high temperature helium loop experiment. Details are provided to quantitatively predict the transition time from molecular diffusion to natural circulation.

  2. Numerical study of air ingress transition to natural circulation in a high temperature helium loop

    DOE PAGES

    Franken, Daniel; Gould, Daniel; Jain, Prashant K.; ...

    2017-09-21

    Here, the generation-IV high temperature gas cooled reactors (HTGRs) are designed with many passive safety features, one of which is the ability to passively remove heat under a loss of coolant accident (LOCA). However, several common reactor designs do not prevent against a large break in the coolant system and may therefore experience a depressurized LOCA. This would lead to air entering into the reactor system via several potential modes of ingress: diffusion, gravity currents, and natural circulation. At the onset of a LOCA, the initial rate of air ingress is expected to be very slow because it is governed by molecular diffusion. However, after several hours, natural circulation would commence, thus, bringing the air into the reactor system at a much higher rate. As a consequence, air ingress would cause the high temperature graphite matrix to oxidize, leading to its thermal degradation and decreased passive heat (decay) removal capability. Therefore, it is essential to understand the transition of air ingress from molecular diffusion to natural circulation in an HTGR system. This paper presents results from a computational fluid dynamics (CFD) model to study the air ingress transition behavior. These results are validated against an h-shaped high temperature helium loop experiment. Details are provided to quantitatively predict the transition time from molecular diffusion to natural circulation.

  3. A guide to aviation education resources

    DOT National Transportation Integrated Search

    1993-01-01

    The National Coalition for Aviation Education represents industry and labor, united to promote : aviation education activities and resources; increase public understanding of the importance of aviation; and support educational initiatives at the loca...

  4. Partnership strategies for safety roadside rest areas.

    DOT National Transportation Integrated Search

    2009-01-01

    This project studied the many factors influencing the potential for public private partnerships for Safety : Roadside Rest Areas. It found that Federal and California State laws and regulations represent important : barriers to certain types and loca...

  5. A framework for collaboration in public transit systems

    DOT National Transportation Integrated Search

    1997-05-01

    The 494 transportation corridor stretches eight miles and connects residential suburbs with major commercial areas, including the Mall of America and the Minneapolis-St. Paul International Airport. The corridor includes l-494 as well as parallel loca...

  6. Passive containment cooling system

    DOEpatents

    Billig, P.F.; Cooke, F.E.; Fitch, J.R.

    1994-01-25

    A passive containment cooling system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and a gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel and is vented to the drywell. An isolation pool is disposed above the GDCS pool and includes an isolation condenser therein. The condenser has an inlet line disposed in flow communication with the drywell for receiving the non-condensable gas along with any steam released therein following a loss-of-coolant accident (LOCA). The condenser also has an outlet line disposed in flow communication with the drywell for returning to the drywell both liquid condensate produced upon cooling of the steam and the non-condensable gas for reducing pressure within the containment vessel following the LOCA. 1 figure.

  7. Passive containment cooling system

    DOEpatents

    Billig, Paul F.; Cooke, Franklin E.; Fitch, James R.

    1994-01-01

    A passive containment cooling system includes a containment vessel surrounding a reactor pressure vessel and defining a drywell therein containing a non-condensable gas. An enclosed wetwell pool is disposed inside the containment vessel, and a gravity driven cooling system (GDCS) pool is disposed above the wetwell pool in the containment vessel and is vented to the drywell. An isolation pool is disposed above the GDCS pool and includes an isolation condenser therein. The condenser has an inlet line disposed in flow communication with the drywell for receiving the non-condensable gas along with any steam released therein following a loss-of-coolant accident (LOCA). The condenser also has an outlet line disposed in flow communication with the drywell for returning to the drywell both liquid condensate produced upon cooling of the steam and the non-condensable gas for reducing pressure within the containment vessel following the LOCA.

  8. Influence of chemical composition of zirconium alloy E110 on embrittlement under LOCA conditions - Part 1: Oxidation kinetics and macrocharacteristics of structure and fracture

    NASA Astrophysics Data System (ADS)

    Nikulin, S. A.; Rozhnov, A. B.; Belov, V. A.; Li, E. V.; Glazkina, V. S.

    2011-11-01

    Exploratory investigations of the influence of alloying and impurity content in E110 alloy cladding tubes on behavior under Loss of Coolant Accident (LOCA) conditions have been performed. Three alloys of the E110 type have been tested: E110 alloy of nominal composition Zr-1%Nb (E110), E110 alloy of modified composition Zr-1%Nb-0.12%Fe-0.13%O (E110M), and E110 alloy of nominal composition Zr-1%Nb with reduced impurity content (E110G). Alloys E110 and E110M were manufactured on an electrolytic basis, and alloy E110G was manufactured on the basis of zirconium sponge. High-temperature oxidation tests in steam (T = 1100 °C, 18% equivalent cladding reacted (ECR)) were conducted, and the oxidation kinetics were investigated. Quantitative study of structure and fracture macrocharacteristics was performed by means of optical and electron microscopy. The results were compared with the residual ductility of the specimens. The investigation showed "breakaway oxidation" kinetics and white spalling oxide in the E110 and E110M alloys, while the oxidation kinetics of E110G specimens followed a parabolic law and the specimens had a dense black oxide. Oxygen and iron alloying in the E110 alloy positively changed the macrocharacteristics of structure and fracture. However, in general, it did not improve the resistance to embrittlement under LOCA conditions, apparently because of a strong impurity influence caused by the electrolytic process of zirconium production.

  9. Sensitivity analysis of FeCrAl cladding and U3Si2 fuel under accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamble, Kyle Allan Lawrence; Hales, Jason Dean

    2016-08-01

    The purpose of this milestone report is to highlight the results of sensitivity analyses performed on two accident tolerant fuel concepts: U3Si2 fuel and FeCrAl cladding. The BISON fuel performance code under development at Idaho National Laboratory was coupled to Sandia National Laboratories’ DAKOTA software to perform the sensitivity analyses. Both loss-of-coolant accident (LOCA) and station blackout (SBO) scenarios were analyzed using main effects studies. The results indicate that for FeCrAl cladding the input parameters with the greatest influence on the output metrics of interest (fuel centerline temperature and cladding hoop strain) during the LOCA were the isotropic swelling and fuel enrichment. For U3Si2 the important inputs were found to be the intergranular diffusion coefficient, specific heat, and fuel thermal conductivity. For the SBO scenario, Young’s modulus was found to be influential for FeCrAl, in addition to the isotropic swelling and fuel enrichment. Contrary to the LOCA case, the specific heat of U3Si2 was found to have no effect during the SBO. The intergranular diffusion coefficient and fuel thermal conductivity were still found to be important. The results of the sensitivity analyses have identified areas where further research is required, including fission gas behavior in U3Si2 and irradiation swelling in FeCrAl. Moreover, the results highlight the need to perform the sensitivity analyses on full-length fuel rods for SBO scenarios.

  10. Traffic control at stop sign approaches.

    DOT National Transportation Integrated Search

    2003-04-01

    The objectives of this report were to: a) determine the number of crashes in Kentucky involving a driver disregarding a stop sign and the locations where these occur, b) determine the characteristics of these crashes, c) investigate loca tions with a...

  11. Innovative tools and techniques in identifying highway safety improvement projects : project summary.

    DOT National Transportation Integrated Search

    2017-01-01

    Researchers completed the following activities: - Reviewed the literature, state HSIP processes and practices, and HSIP tools used by various agencies. - Evaluated the applicability of safety assessment methods and tools used by other states and loca...

  12. VHF-FM Emergency Position Indicating Radio Beacon

    DOT National Transportation Integrated Search

    1978-10-01

    This report describes the development and testing of an Emergency Position Indicating Radio Beacon (EPIRB) which operates on Channels 15 and 16 of the Maritime Mobile VHF Band. It provides functions necessary to ensure that distress alerting and loca...

  13. Vehicle kinematics in turns and the role of cornering lamps in driver vision.

    DOT National Transportation Integrated Search

    2010-11-01

    "SAE Recommended Practice J852 and ECE Regulations 119 and 48 for cornering lamps : were compared. Photometric points described in each specification were then compared : to naturalistic low-speed turn trajectories produced by 87 drivers. Future loca...

  14. Church ladies, good girls, and locas: stigma and the intersection of gender, ethnicity, mental illness, and sexuality in relation to HIV risk.

    PubMed

    Collins, Pamela Y; von Unger, Hella; Armbrister, Adria

    2008-08-01

    Inner city women with severe mental illness may carry multiple stigmatized statuses. In some contexts these include having a mental illness, being a member of an ethnic minority group, being an immigrant, being poor, and being a woman who does not live up to gendered expectations. These potentially stigmatizing identities influence both the way women's sexuality is viewed and their risk for HIV infection. This qualitative study applies the concept of intersectionality to facilitate understanding of how these multiple identities intersect to influence women's sexuality and HIV risk. We report the firsthand accounts of 24 Latina women living with severe mental illness in New York City. In examining the interlocking domains of these women's sexual lives, we find that the women seek identities that define them in opposition to the stigmatizing label of "loca" (Spanish for crazy) and bestow respect and dignity. These identities have unfolded through the additional themes of "good girls" and "church ladies". Therefore, in spite of their association with the "loca", the women also identify with faith and religion ("church ladies") and uphold more traditional gender norms ("good girls") that are often undermined by the realities of life with a severe mental illness and the stigma attached to it. However, the participants fall short of their gender ideals and engage in sexual relationships that they experience as disempowering and unsatisfying. The effects of their multiple identities as poor Latina women living with severe mental illness in an urban ethnic minority community are not always additive, but the interlocking effects can facilitate increased HIV risks. Interventions should acknowledge women's multiple layers of vulnerability, both individual and structural, and stress women's empowerment in and beyond the sexual realm.

  15. Effect of excess dietary salt on calcium metabolism and bone mineral in a spaceflight rat model

    NASA Technical Reports Server (NTRS)

    Navidi, Meena; Wolinsky, Ira; Fung, Paul; Arnaud, Sara B.

    1995-01-01

    High levels of salt promote urinary calcium (UCa) loss and have the potential to cause bone mineral deficits if intestinal Ca absorption does not compensate for these losses. To determine the effect of excess dietary salt on the osteopenia that follows skeletal unloading, we used a spaceflight model that unloads the hindlimbs of 200-g rats by tail suspension (S). Rats were studied for 2 wk on diets containing high salt (4 and 8%) and normal calcium (0.45%) and for 4 wk on diets containing 8% salt (HiNa) and 0.2% Ca (LoCa). Final body weights were 9-11% lower in S than in control rats (C) in both experiments, reflecting lower growth rates in S than in C during pair feeding. UCa represented 12% of dietary Ca on HiNa diets and was transiently twofold higher in S than in C during unloading. Net intestinal Ca absorption was consistently 11-18% lower in S than in C. Serum 1,25-dihydroxyvitamin D was unaffected by either LoCa or HiNa diets in S but was increased by LoCa and HiNa diets in C. Despite depressed intestinal Ca absorption in S and a sluggish response of the Ca endocrine system to HiNa diets, UCa loss did not appear to affect the osteopenia induced by unloading. Although any deficit in bone mineral content from HiNa diets may have been too small to detect, or the duration of the study too short for it to manifest, there were clear differences from control levels in the Ca metabolism of the spaceflight model on HiNa diets, indicated by depression of intestinal Ca absorption and its regulatory hormone.

  16. Oxidation of 304 stainless steel in high-temperature steam

    NASA Astrophysics Data System (ADS)

    Ishida, Toshihisa; Harayama, Yasuo; Yaguchi, Sinnosuke

    1986-08-01

    An experiment on the oxidation of 304 stainless steel was performed in steam between 900°C and 1350°C, using spare cladding from the reactor of the nuclear-powered ship Mutsu. This temperature range is appropriate for analysis of a postulated loss-of-coolant accident (LOCA) in an LWR. The oxidation kinetics were found to obey the parabolic law during the first period of 8 min. After the first period, the parabolic reaction rate constant decreased for heating temperatures between 1100°C and 1250°C. At 1250°C in particular, a marked decrease was observed in the oxide scale-forming kinetics when the surface had initially been treated by mechanical polishing, which imparted a residual stress. This enhanced oxidation resistance was attributed to the presence of a chromium-enriched layer, detected with an X-ray microanalyzer. The oxidation kinetics equation obtained for the first 8 min is applicable to model calculations of a hypothetical LOCA in an LWR employing 304 stainless steel cladding.
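The parabolic law cited above means the oxidation weight gain w grows with time t as w² = k·t. A minimal sketch of this relation; the rate constant below is an assumed illustrative value, not the paper's fitted constant:

```python
import math

def weight_gain(kp: float, t_s: float) -> float:
    """Parabolic oxidation kinetics: w^2 = kp * t.

    kp  -- parabolic rate constant in g^2/cm^4/s (illustrative below)
    t_s -- exposure time in seconds
    Returns oxidation weight gain w in g/cm^2.
    """
    return math.sqrt(kp * t_s)

# Assumed rate constant, for demonstration only (not the paper's fit):
kp = 1.0e-8  # g^2/cm^4/s

# Doubling the exposure time scales the weight gain by sqrt(2):
ratio = weight_gain(kp, 8 * 60) / weight_gain(kp, 4 * 60)
print(f"gain(8 min) / gain(4 min) = {ratio:.3f}")  # 1.414
```

The square-root time dependence is what distinguishes protective (parabolic) scale growth from linear kinetics, and it is why the post-8-min decrease in the rate constant reported above further slows oxidation.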

  17. Hot Cell Installation and Demonstration of the Severe Accident Test Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linton, Kory D.; Burns, Zachary M.; Terrani, Kurt A.

    A Severe Accident Test Station (SATS) capable of examining the oxidation kinetics and accident response of irradiated fuel and cladding materials for design basis accident (DBA) and beyond design basis accident (BDBA) scenarios has been successfully installed and demonstrated in the Irradiated Fuels Examination Laboratory (IFEL), a hot cell facility at Oak Ridge National Laboratory. The two test station modules provide the various temperature profiles, steam, and thermal shock conditions necessary for integral loss of coolant accident (LOCA) testing, defueled oxidation quench testing, and high temperature BDBA testing. The installation of the SATS system restores the domestic capability to examine postulated and extended LOCA conditions on spent fuel and cladding and provides a platform for evaluation of advanced fuel and accident tolerant fuel (ATF) cladding concepts. This document reports on the successful in-cell demonstration testing of unirradiated Zircaloy-4. It also contains descriptions of the integral test facility capabilities, installation activities, and out-of-cell benchmark testing to calibrate and optimize the system.

  18. Joy Development Properties, LLC, Pleasant Valley, Iowa and Summit Concrete, Inc., LeClaire, Iowa - Clean Water Act Public Notice

    EPA Pesticide Factsheets

    The EPA is providing notice of a proposed Administrative Penalty Assessment against Joy Development Properties, LLC and Summit Concrete, Inc., for alleged violations at the companies’ residential construction site known as the Schutter Farms Addition loca

  19. The art in getting flocks and herds to flerds

    USDA-ARS?s Scientific Manuscript database

    Flerds (small ruminants that consistently stay near cattle under free-ranging conditions) offer four distinct advantages over simply stocking flocks and herds to carry out mixed-species stocking. One of the main advantages flerds offer is added protection from canine predation, reduced time in loca...

  20. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from economic, environmental, and safety standpoints. The latter is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns center on a design's behavior during the worst-case accident scenario, the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine, in particular the acute whole-body early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community; the Appendix constitutes its User's Manual. The theory behind early dose calculations, including the mobilization of activation products, is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it is necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products that is mobilized and thus becomes the source for the dose. BESAFE II is designed to determine the temperature history of each region of a design and the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, the SiC-He tokamak is shown to be the better design from an accident safety standpoint. It is also found that doses derived from temperature-dependent mobilization data differ from those predicted using set mobilization categories such as those that involve Piet fractions, demonstrating the need for more experimental data on fusion materials.
    Possible future improvements and modifications to BESAFE II are discussed in Chapter 6, for example the addition of environmental indices such as a waste disposal index. The biggest improvement would be an expanded database of activation product mobilization covering a larger spectrum of fusion reactor materials. The ultimate goal is for BESAFE II to become part of a systems design program that includes economic factors, allowing both safety and the cost of electricity to influence design.
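The approach described above can be sketched as follows: step through a region's LOCA temperature history and accumulate the temperature-dependent mobilized fraction of activation products. The Arrhenius rate form, its constants, and all names here are illustrative assumptions, not BESAFE II's actual models:

```python
import math

def mobilization_rate(temp_k: float) -> float:
    """Assumed Arrhenius-type mobilization rate in fraction/s.
    The prefactor and activation temperature are illustrative only."""
    return 1.0e-3 * math.exp(-8000.0 / temp_k)

def mobilized_fraction(temps_k, dt_s: float) -> float:
    """Integrate first-order mobilization over a temperature history;
    the result is the cumulative fraction mobilized, always below 1."""
    remaining = 1.0
    for t_k in temps_k:
        remaining *= math.exp(-mobilization_rate(t_k) * dt_s)
    return 1.0 - remaining

# Hypothetical LOCA transient (K): heat-up, plateau, slow cool-down
history = ([600.0 + 5.0 * i for i in range(100)]
           + [1100.0] * 200
           + [1100.0 - 2.0 * i for i in range(200)])
print(f"mobilized fraction: {mobilized_fraction(history, dt_s=60.0):.4f}")
```

Because the rate is a strong function of temperature, the plateau portion of the transient dominates the mobilized inventory, which is why the thermal response calculation matters so much for the dose.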

  1. False Memories for Shape Activate the Lateral Occipital Complex

    ERIC Educational Resources Information Center

    Karanian, Jessica M.; Slotnick, Scott D.

    2017-01-01

    Previous functional magnetic resonance imaging evidence has shown that false memories arise from higher-level conscious processing regions rather than lower-level sensory processing regions. In the present study, we assessed whether the lateral occipital complex (LOC)--a lower-level conscious shape processing region--was associated with false…

  2. AROMATASE ACTIVITY IN THE OVARY OF MOSQUITOFISH GAMBUSIA HOLBROOKI, COLLECTED FROM THE FENHOLLOWAY AND ECONFINA RIVERS, FLORIDA (

    EPA Science Inventory

    Scientists are increasingly aware of the adverse effects of environmental contaminants, including their ability to alter the normal development and reproduction of wildlife species by modifying the endocrine system. Female mosquitofish living downstream of a paper mill plant loca...

  3. Imaging Near-Earth Electron Densities Using Thomson Scattering

    DTIC Science & Technology

    2009-01-15

    geocentric solar magnetospheric (GSM) coordinates. TECs were initially computed from a viewing location at the Sun-Earth L1 Lagrange point for both...further find that an elliptical Earth orbit (apogee ~30 RE) is a suitable lower-cost option for a demonstration mission. 5. SIMULATED OBSERVATIONS: We

  4. Multitarget Tracking Studies.

    DTIC Science & Technology

    1981-07-01

    ...reference indicates that pole and zero locations have a significant effect on convergence rates...when the poles and...LATTICE ALGORITHMS (LADO): 1. The Normalized AR Lattice (ARN). This algorithm implements the normalized algorithm described in [23]. The AR coefficients are

  5. 77 FR 53923 - Biweekly Notice;

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ... to be publicly disclosed. The NRC posts all comment submissions at http://www.regulations.gov as well... (psig) to 49.7 psig for the design basis loss-of-coolant accident (LOCA). In support of the revised Pa... analysis. The Pa remains below the containment design pressure of 50 psig because of the change in the...

  6. APT Blanket System Loss-of-Coolant Accident (LOCA) Based on Initial Conceptual Design - Case 4: External Pressurizer Surge Line Break Near Inlet Header

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports documenting accident scenario simulations for the Accelerator Production of Tritium (APT) blanket heat removal systems. The simulations were performed in support of the Preliminary Safety Analysis Report (PSAR) for the APT.

  7. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  8. Effect of sampling location on L* values and pH measurements and their relationship in broiler breast fillets

    USDA-ARS?s Scientific Manuscript database

    Lightness (CIELAB L*) and pH values are the most widely measured quality indicators for broiler breast fillets (pectoralis major). Measurement of L* values with a spectrophotometer can be done through Specular Component Included (SCI) or Specular Component Excluded (SCE) modes. The intra-fillet loca...

  9. A combination of sexual and ecological divergence contributes to the spread of a chromosomal rearrangement during initial stages of speciation

    USDA-ARS?s Scientific Manuscript database

    Chromosomal rearrangements between sympatric species often contain multiple loci contributing to assortative mating, local adaptation, and hybrid sterility. When and how these associations arise during the process of speciation remains a subject of debate. Here, we address the relative roles of loca...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faidy, C.; Gilles, P.

    The objective of the seminar was to present the current state of the art in Leak-Before-Break (LBB) methodology development, validation, and application in an international forum. With particular emphasis on industrial applications and regulatory policies, the seminar provided an opportunity to compare approaches, experiences, and codifications developed by different countries. The seminar was organized into four topic areas: status of LBB applications; technical issues in LBB methodology; complementary requirements (leak detection and inspection); and LBB assessment and margins. As a result of this seminar, an improved understanding of LBB, gained through the sharing of viewpoints from different countries, permits consideration of: simplified pipe support design and possible elimination of loss-of-coolant-accident (LOCA) mechanical consequences for specific cases; defense-in-depth types of applications without support modifications; and support of safety cases for plants designed without the LOCA hypothesis. In support of these activities, better estimates of the limits of the LBB approach should follow, as well as improvements in codifying methodologies. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  11. Estimation of ring tensile properties of steam oxidized Zircaloy-4 fuel cladding under simulated LOCA condition

    NASA Astrophysics Data System (ADS)

    Shriwastaw, R. S.; Sawarn, Tapan K.; Banerjee, Suparna; Rath, B. N.; Dubey, J. S.; Kumar, Sunil; Singh, J. L.; Bhasin, Vivek

    2017-09-01

    The present study estimates the ring tensile properties of Indian Pressurised Heavy Water Reactor (IPHWR) fuel cladding made of Zircaloy-4, subjected to experiments under simulated loss-of-coolant-accident (LOCA) conditions. Isothermal steam oxidation experiments were conducted on clad tube specimens at temperatures from 900 to 1200 °C, at intervals of 50 °C, for different soaking periods, with subsequent quenching in water at ambient temperature. The specimens that survived quenching were then subjected to an ambient-temperature ring tension test (RTT). The microstructure was correlated with the mechanical properties. The yield strength (YS) and ultimate tensile strength (UTS) increased initially with rising oxidation temperature and duration but then decreased with further oxidation. Ductility is adversely affected by rising oxidation temperature and longer holding time. A higher fraction of load-bearing phase and a lower oxygen content in it ensure higher residual ductility. The cladding shows almost zero ductility in the RTT when the load-bearing phase fraction is less than 0.72 and its average oxygen concentration is greater than 0.58 wt%.
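The zero-ductility thresholds quoted above can be written as a simple screening predicate. The threshold values come from the abstract; the function and argument names are assumed for illustration:

```python
def predicts_zero_ductility(phase_fraction: float, oxygen_wt_pct: float) -> bool:
    """Screening predicate from the thresholds quoted in the abstract:
    near-zero ring-tensile ductility when the load-bearing phase
    fraction is below 0.72 and its average oxygen content exceeds
    0.58 wt%. Names and signature are assumed for illustration."""
    return phase_fraction < 0.72 and oxygen_wt_pct > 0.58

print(predicts_zero_ductility(0.65, 0.70))  # True: embrittled
print(predicts_zero_ductility(0.80, 0.40))  # False: retains ductility
```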

  12. Post-quench ductility evaluation of Zircaloy-4 and select iron alloys under design basis and extended LOCA conditions

    NASA Astrophysics Data System (ADS)

    Yan, Y.; Keiser, J. R.; Terrani, K. A.; Bell, G. L.; Snead, L. L.

    2014-05-01

    Oxidation experiments were conducted at 1200 °C in flowing steam with tubing specimens of Zircaloy-4, types 317 and 347 stainless steel, and the commercial FeCrAl alloy APMT. The purpose was to determine the oxidation behavior and post-quench ductility under postulated and extended LOCA conditions. The parabolic rate constant for Zircaloy-4 tubing samples at 1200 °C was determined to be k = 2.173 × 10⁻⁷ g²/cm⁴/s, in excellent agreement with the Cathcart-Pawel correlation. The APMT alloy experienced the slowest oxidation rate among all materials examined in this work. The ductility of post-quenched samples was evaluated by ring compression tests at 135 °C. For Zircaloy-4, the ductile-to-brittle transition occurs at an equivalent cladding reacted (ECR) of 19.3%. SS-347 was still ductile after being oxidized for 2400 s (CP-ECR ≈ 50%), but the maximum load was reduced significantly owing to the reduction in metal layer thickness. No ductility decrease was observed for post-quenched APMT samples oxidized for up to 4 h.
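The comparison against the Cathcart-Pawel correlation can be sketched numerically. The Arrhenius constants below are commonly cited values for the Cathcart-Pawel oxygen weight-gain correlation, and the wall thickness is an assumed illustrative value; verify both against the original sources before use:

```python
import math

R_CAL = 1.987  # gas constant in cal/(mol*K)

def cp_rate_constant(temp_k: float) -> float:
    """Cathcart-Pawel parabolic rate constant for oxygen weight gain,
    w^2 = K * t with K in g^2/cm^4/s. The pre-factor and activation
    energy are commonly cited values; verify against the original."""
    return 0.1811 * math.exp(-39_940.0 / (R_CAL * temp_k))

def cp_ecr_percent(time_s: float, temp_k: float, wall_cm: float) -> float:
    """Equivalent cladding reacted (%): weight gain as a fraction of the
    oxygen uptake at complete single-sided oxidation of the wall.
    Uses rho_Zr = 6.5 g/cm^3 and the O/Zr mass ratio in ZrO2, 32/91.22."""
    w = math.sqrt(cp_rate_constant(temp_k) * time_s)  # g/cm^2
    w_full = 6.5 * wall_cm * (32.0 / 91.22)           # g/cm^2
    return 100.0 * w / w_full

k_1200c = cp_rate_constant(1473.0)  # 1200 C in kelvin
print(f"K(1200 C) = {k_1200c:.3e} g^2/cm^4/s")
# Wall thickness of 0.057 cm is an assumed illustrative value:
print(f"CP-ECR after 1000 s: {cp_ecr_percent(1000.0, 1473.0, 0.057):.1f}%")
```

With these constants the computed rate constant at 1200 °C comes out within a few percent of the 2.173 × 10⁻⁷ g²/cm⁴/s measured value quoted above, which is the sense in which the abstract reports "excellent agreement."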

  13. Radiological dose in Muria peninsula from SB-LOCA event

    NASA Astrophysics Data System (ADS)

    Sunarko; Suud, Zaki

    2017-01-01

    A dose assessment for accident conditions is performed for the Muria Peninsula region using the source-term from the Three Mile Island unit 2 SB-LOCA accident. The isotopes Xe-133, Kr-88, I-131, and Cs-137 are considered in the calculation. The effluent is assumed to be released from a 50 m stack. A Lagrangian particle dispersion method (LPDM) employing a non-Gaussian dispersion coefficient in a 3-dimensional mass-consistent wind field is used to obtain the periodic surface-level concentration, which is then time-integrated to obtain the spatial distribution of ground-level dose. In a 1-hour simulation, segmented plumes of 60-second duration are released, involving a total of 18,000 particles. Simulations using 6-hour worst-case meteorological data from the Muria Peninsula result in a peak external dose of around 1.668 mSv for the low scenario and 6.892 mSv for the high scenario in dry conditions. Wet conditions, with 5 mm/hour or 10 mm/hour of rain for the whole duration of the simulation, have only a minor effect on dose. The peak external dose is below the regulatory limit of 50 mSv for effective skin dose from external gamma exposure.
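A Lagrangian particle dispersion calculation of the kind described can be sketched as a random-walk advection of particles in a mean wind. The particle count, time step, and release height follow the abstract; the wind speed, turbulence scale, and binning grid are assumptions:

```python
import random

random.seed(0)                  # reproducible demonstration
N_PART = 18_000                 # particle count, as in the abstract
DT = 60.0                       # s, segment duration from the abstract
N_STEP = 60                     # one hour of 60 s steps
U_WIND = 3.0                    # m/s mean wind along x (assumed)
SIGMA = 1.0                     # m/s turbulent velocity scale (assumed)
STACK_H = 50.0                  # m, release height from the abstract

# Advect every particle in the mean wind plus a Gaussian turbulent kick.
particles = [[0.0, 0.0, STACK_H] for _ in range(N_PART)]
for _ in range(N_STEP):
    for p in particles:
        p[0] += (U_WIND + random.gauss(0.0, SIGMA)) * DT
        p[1] += random.gauss(0.0, SIGMA) * DT
        p[2] = max(0.0, p[2] + random.gauss(0.0, SIGMA) * DT)  # clamp at ground

# Crude surface-level concentration proxy: particles below 100 m,
# binned into 1 km downwind cells.
bins = {}
for x, y, z in particles:
    if z < 100.0:
        bins[int(x // 1000)] = bins.get(int(x // 1000), 0) + 1
print(f"{sum(bins.values())} near-surface particles in {len(bins)} cells")
```

A production LPDM replaces the constant Gaussian kick with position-dependent, non-Gaussian turbulence statistics and a mass-consistent 3-D wind field, as the abstract describes, but the advect-then-bin structure is the same.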

  14. Manpower Requirements Report for FY 1982

    DTIC Science & Technology

    1981-02-01

    Specifically included are program elements for industrial preparedness, second destination transportation, property disposal, production engineering ...artillery, and combat engineers. Army policy accepts the fact that women will serve in locations throughout the battlefield, will be expected to... industrial engineering work measurement techniques and computerized models such as the Logistics Composite Model (LCOM). MEP policy emanates from the

  15. Effects of short-term tocopherol (T) feeding on structure-localized protein tyrosine nitration (pTN) patterns of mitochondrial ATPase following endotoxin (LPS) challenge in beef calves.

    USDA-ARS?s Scientific Manuscript database

    Mitochondrial ATPase/Complex-V (MCV) is an electron transport chain (ETC) component needed for ATP synthesis. The ETC, exquisitely sensitive to proinflammatory mediators (PIM), generates oxynitrogen reactants leading to pTN formation as mitochondrial membrane leakage occurs. Immunohistochemical loca...

  16. Evaluation of a barley core collection for spot form net blotch reaction reveals distinct genotype specific pathogen virulence and host susceptibility

    USDA-ARS?s Scientific Manuscript database

    Spot form net blotch (SFNB) caused by Pyrenophora teres Drechs. f. maculata Smedeg., (anamorph Drechslera teres [Sacc.] Shoem.) is a major foliar disease of barley (Hordeum vulgare L.) worldwide. SFNB epidemics have recently been observed in major barley producing countries, suggesting that the loca...

  17. Patient to Health Team Communications Preferences and Perceptions of Secure Messaging

    DTIC Science & Technology

    2017-04-25

    Ellicott City, MD, 25-27 April 2017, in accordance with MDWI 41-108, has been approved and assigned local file # 17202. 2. Pertinent biographic...scholarly activities of our professional staff and students, which is an essential component of Wilford Hall Ambulatory Surgical Center (WHASC

  18. Enhanced thermal conductivity oxide nuclear fuels by co-sintering with BeO: II. Fuel performance and neutronics

    NASA Astrophysics Data System (ADS)

    McCoy, Kevin; Mays, Claude

    2008-04-01

    The fuel rod performance and neutronics of enhanced thermal conductivity oxide (ECO) nuclear fuel with BeO have been compared to those of standard UO2 fuel. The standards of comparison were that the ECO fuel should have the same infinite neutron-multiplication factor kinf at end of life and provide the same energy extraction per fuel assembly over its lifetime. The BeO displaces some uranium, so equivalence with standard UO2 fuel was obtained by increasing the burnup and slightly increasing the enrichment. The COPERNIC fuel rod performance code was adapted to account for the effect of BeO on thermal properties. The materials considered were standard UO2, UO2 with 4.0 vol.% BeO, and UO2 with 9.6 vol.% BeO. The smaller amount of BeO was assumed to provide increases in thermal conductivity of 0, 5, or 10%, whereas the larger amount was assumed to provide an increase of 50%. A significant improvement in performance was seen, as evidenced by reduced temperatures, internal rod pressures, and fission gas release, even with modest (5-10%) increases in thermal conductivity. The benefits increased monotonically with increasing thermal conductivity. Improvements in LOCA initialization performance were also seen. A neutronic calculation considered a transition from standard UO2 fuel to ECO fuel. The calculation indicated that only a small increase in enrichment is required to maintain kinf at end of life. The smallness of the change was attributed to the neutron-multiplication reaction of Be with fast neutrons and the moderating effect of BeO. Adoption of ECO fuel was predicted to provide a net reduction in uranium cost. Requirements for industrial hygiene were found to be comparable to those for processing of UO2.

  19. Strategy and Airpower

    DTIC Science & Technology

    2011-01-01

    myopia often leads otherwise competent observers to underestimate significantly the new technology’s potential. Two business examples stand out: in...direction. With precision of effect combined with precision of impact, bloodless war becomes a reality. To this point, we have tried to make the...against virtually all of the centers of gravity directly related to strategic objectives, regardless of their location. Because it can bring many

  20. Atmospheric Models For Over-Ocean Propagation Loss

    DTIC Science & Technology

    2015-08-24

    Radiosonde balloons are launched daily at selected locations, and measure temperature, dew point temperature, and air pressure as they ascend. Radiosondes...different times of year and locations. The result was used to estimate high-reliability SHF/EHF air-to-surface radio link performance in a maritime...environment. I. INTRODUCTION: Air-to-surface radio links differ from typical satellite communications links in that the path elevation angles are lower

  1. Fluid Dynamic Analysis of Volcanic Tremor,

    DTIC Science & Technology

    1982-10-01

    information regarding the fluid system... Fiske (1969), Kilauea volcano: the 1967-68 summit eruption... tremor magnitudes and source locations... Koyanagi (1981), deep volcanic tremor and magma ascent mechanism under Kilauea, Hawaii... Seismological Society of America, vol. 40, p. 175-194... Omori, F. ... Keywords: fluid dynamics, seismology, tremors, volcanoes. Low-frequency (< 10 Hz) volcanic earthquakes

  2. Joint Force Quarterly. Number 15, Spring 1997

    DTIC Science & Technology

    1997-06-01

    headquarters to extract information from sensors on the vehicle without bothering crew members with extraneous reports. Position location devices on...change in how they do business. Air Force lean logistics and Army velocity management programs are literal springboards for quantum improvements in...Spring 1997: Victory smiles upon those who anticipate the changes in the character of war, not upon those who wait to adapt themselves after the changes

  3. Beam Research Program

    DTIC Science & Technology

    1984-04-01

    wavelengths. A direct application of such a laser is isotope separation. 2. For a brief status report of the Laboratory’s high-explosive flash...operation in the fall of 1982. in a 50-MeV Advanced Test Accelerator Facility (the ATA) that we are constructing at our high-explosives test loca...chemical explosives in target-damage studies. Potential hazards associated with the ATA experiments were considered in choosing our site. LLNL’s

  4. RELAP5/MOD2 analysis of a postulated "cold leg SBLOCA" simultaneous to a "total black-out" event in the Jose Cabrera Nuclear Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rebollo, L.

    1992-04-01

    Several beyond-design-basis cold leg small-break LOCA postulated scenarios, based on the "lessons learned" in the OECD-LOFT LP-SB-3 experiment, have been analyzed for the Westinghouse single-loop Jose Cabrera Nuclear Power Plant belonging to the Spanish utility UNION ELECTRICA FENOSA, S.A. The analysis has been done by the utility in the Thermal-Hydraulic & Accident Analysis Section of the Engineering Department of the Nuclear Division. The RELAP5/MOD2/36.04 code has been used on a CYBER 180/830 computer, and the simulation includes the 6 in. RHRS charging line, the 2 in. pressurizer spray, and the 1.5 in. CVCS make-up line piping breaks. The assumption of a "total black-out condition" coincident with the occurrence of the event has been made in order to consider a plant degraded condition with total active failure of the ECCS. As a result of the analysis, estimates of the "time to core overheating startup" as well as an evaluation of alternate operator measures to mitigate the consequences of the event have been obtained. Finally, a proposal for improving the LOCA emergency operating procedure (E-1) has been suggested.

  6. Safety margins in zircaloy oxidation and embrittlement criteria for emergency core cooling system acceptance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williford, R.E.

    1986-09-01

    Current emergency core cooling system acceptance criteria for light water reactors specify that, under loss-of-coolant accident (LOCA) conditions, the Baker-Just (BJ) correlation must be used to calculate Zircaloy-steam oxidation, calculated peak cladding temperatures (PCT) must not exceed 1204°C, and calculated oxidation must not exceed 17% equivalent cladding reacted (ECR). An appropriately defined minimum margin of safety was estimated for each of these criteria. The currently required BJ oxidation correlation provides margins only over the 1100 to 1500°C temperature range at the 95% confidence level. The PCT margins for thermal shock and handling failures are adequate at oxidation temperatures above 1204°C for up to 210 and 160 s, respectively, at the 95% confidence level. The ECR thermal shock and handling margins at the 50 and 95% confidence levels, respectively, range between 2 and 7% ECR for the BJ correlation, but vanish at temperatures above 1100 to 1160°C for the best-estimate Cathcart-Pawel correlation. However, use of the Cathcart-Pawel correlation for "design basis" LOCA calculations can be justified at the 85 to 88% confidence level if cooling rate effects can be neglected.
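The margin-at-confidence-level reasoning used in the abstract can be illustrated generically: from a sample of measured-to-predicted oxidation ratios, form a one-sided 95% upper bound and check the remaining margin to the 17% ECR limit. All numbers below are synthetic and purely for illustration:

```python
import statistics

# Synthetic measured/predicted oxidation ratios for some correlation:
ratios = [0.92, 1.01, 0.97, 1.05, 0.88, 1.10, 0.95, 1.02, 0.99, 1.04]
mean = statistics.fmean(ratios)
sd = statistics.stdev(ratios)
upper95 = mean + 1.645 * sd    # normal-theory one-sided 95% bound

predicted_ecr = 14.0           # % ECR from a best-estimate calculation (synthetic)
bounded_ecr = predicted_ecr * upper95
margin = 17.0 - bounded_ecr    # remaining margin to the 17% ECR limit
print(f"95% bound on ECR: {bounded_ecr:.1f}%, margin: {margin:.1f}%")
```

Tightening the required confidence level inflates the bound and shrinks the margin, which is why the abstract can report margins that exist at the 50% level but vanish at the 95% level.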

  7. Analysis of Loss-of-Coolant Accidents in the NIST Research Reactor - Early Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baek, Joo S.; Diamond, David

    A study of the fuel temperature during the early phase of a loss-of-coolant accident (LOCA) in the NIST research reactor (NBSR) was completed. Previous studies had been reported in the preliminary safety analysis report for the conversion of the NBSR from high-enriched uranium (HEU) fuel to low-enriched (LEU) fuel. Those studies had focused on the most vulnerable LOCA situation, namely, a double-ended guillotine break in the time period after reactor trip when water is drained from either the coolant channels inside the fuel elements or the region outside the fuel elements. The current study fills in a gap in the analysis, which is the early phase of the event when there may still be water present but the reactor is at power or immediately after reactor trip and pumps have tripped. The calculations were done, for both the current HEU-fueled core and the proposed LEU core, with the TRACE thermal-hydraulic systems code. Several break locations and different break sizes were considered. In all cases the increase in the clad (or fuel meat) temperature was relatively small, so that a large margin remained to the temperature threshold for blistering (the Safety Limit for the NBSR).

  8. A review for identification of initiating events in event tree development process on nuclear power plants

    NASA Astrophysics Data System (ADS)

    Riyadi, Eko H.

    2014-09-01

    An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, such as by a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is therefore to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effects analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By applying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients, and external events.
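The grouping step described above can be sketched with the three broad categories named in the abstract (LOCA, transients, external events); the individual events listed here are generic illustrations, not taken from any specific PSA:

```python
from collections import defaultdict

# Generic illustrative initiating events, tagged with their category:
identified_events = [
    ("large-break LOCA", "LOCA"),
    ("small-break LOCA", "LOCA"),
    ("loss of offsite power", "transient"),
    ("turbine trip", "transient"),
    ("seismic event", "external"),
    ("external flooding", "external"),
]

# Step two: group the identified initiating events by category, so that
# one event tree can be developed per group rather than per event.
groups = defaultdict(list)
for name, category in identified_events:
    groups[category].append(name)

for category, events in sorted(groups.items()):
    print(f"{category}: {', '.join(events)}")
```

In a real PSA the grouping key is the set of safety functions challenged (or the combination of system responses), but the mechanics are the same: events that demand the same plant response share one event tree.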

  9. Fallon Geothermal Exploration Project, Naval Air Station, Fallon, Nevada.

    DTIC Science & Technology

    1980-05-01

    magnetotelluric studies. LINEAMENT ANALYSIS: As part of the initial phase of the Fallon Exploration Project, a composite lineament analysis of the region...Nevada. United States Geological Survey Bulletin 750, 1924, pp. 79-86. Hoover, D. B., R. M. Senterfit, and Bruce Radtke. Telluric Profile Location Map and Telluric Data for the Salt Wells Known Geothermal Resource Area, Nevada. United States Geological Survey Open File Report 77-66F, 1977. Horton

  10. The California Debris Commission: A History

    DTIC Science & Technology

    1981-01-01

    the pipe...more freely in the horizontal plane, while vertical movement was...elastic packing in the joint instead of two stable instrument to handle...report of January 1880 painted a dark and sobering picture. Following two months of intense...duplicate and triplicate taxation, and (4) it had not the...isolated cases it is possible to impound debris without injury; also, that locations exist in the cañons of the different mining streams in the Sierra

  11. The Effects of Alarm Display, Processing, and Availability on Crew Performance

    DTIC Science & Technology

    2000-11-01

    snow Instrumentation line leakage Small LOCA Steam generator tube rupture Small feedwater leakage inside containment Cycling of main steam...implemented. • Due to primary pressure controller failure, pressurizer heater banks cycle between on and off. 8.00 CF1 CF2 CF3 CF4 CF5...temperatures after the high-pressure pre-heaters flows into the steam generators number of active emergency feedwater pumps openings of the condensate

  12. Operation TEAPOT. Report of the Test Manager Joint Test Organization

    DTIC Science & Technology

    1981-11-01

    Analyses of air, water, and milk samples were made at the laboratory at Mercury. 6.4 MONITORING PROCEDURES A mobile surface monitoring group consisting of...additional weather observations proved very useful in weather analysis and forecasting as well as for monitoring winds aloft prior to shot time. The...data were also valuable in post analysis for accurate plotting of fallout and determining cloud trajectories. The location and type of operations of

  13. Better Vision Through Manipulation

    DTIC Science & Technology

    2002-01-01

    correlation can be used as a "signature" to identify parts of the scene that are being influenced by...map useful for generating arm, head, and trunk...down reaching. This can be overcome by training up a function to estimate the location...This movement will be in response to the nudge of the arm...behavioural evidence. Experimental Brain Research, 143:335-341. Gibson, J. J. (1977). The theory of affordances. In Shaw, R. and Bransford, J., (Eds

  14. Particle Size Determination in Small Solid Propellant Rocket Motors Using the Diffractively Scattered Light Method.

    DTIC Science & Technology

    1982-10-01

    calibrated by using spherical glass beads and aluminum oxide powder. Measurements were successfully made at both locations. Because...determined using measurements of diffractively scattered laser power spectra. The apparatus was calibrated by using spherical glass beads and aluminum oxide powder. Measurements were successfully made at both locations. Because of the presence of char agglomerates in the exhaust, continued effort is

  15. Core-power and decay-time limits for disabled automatic-actuation of LOFT ECCS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, G.H.

    1978-11-22

    The Emergency Core Cooling System (ECCS) for the LOFT reactor may need to be disabled for modifications or repairs of hardware or instrumentation, or for component testing, during periods when the reactor system is hot and pressurized; it may therefore be desirable to disable the ECCS without cooling down and depressurizing the reactor. The proposed policy disables the automatic actuation of the LOFT ECCS while retaining the manual actuation capability. Automatic actuation can be safely disabled, without subjecting the fuel cladding to unacceptable temperatures, once the LOFT power decays to 33 kW; this power level permits a maximum delay of 20 minutes following a LOCA for the manual actuation of the ECCS. For the operating power of the L2-2 Experiment, the required decay periods (with operating periods of 40 and 2000 hours) are about 21 and 389 hours, respectively. With operating periods of 40 and 2000 hours at Core-I full power, the required decay periods are about 42 and 973 hours, respectively. After these decay periods the automatic actuation of the LOFT ECCS can be disabled, assuming a maximum delay of 20 minutes following a LOCA for the manual actuation of the ECCS. The automatic and manual lineup of the ECCS may be waived if decay power is less than 11 kW.
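
    The decay periods quoted above can be roughed out with a simple decay-heat correlation. The sketch below uses the Way-Wigner approximation, P/P0 ≈ 6.22e-3 [t^-0.2 - (t+T)^-0.2] with t and T in seconds; the operating power and thresholds used here are illustrative assumptions, not the LOFT analysis itself.

```python
def decay_power(p0_kw, t_op_s, t_cool_s):
    """Way-Wigner approximation for fission-product decay power.

    p0_kw    : operating power before shutdown [kW]
    t_op_s   : operating period at that power [s]
    t_cool_s : cooling (decay) time since shutdown [s]
    """
    return 6.22e-3 * p0_kw * (t_cool_s**-0.2 - (t_cool_s + t_op_s)**-0.2)

def decay_time_to_reach(p0_kw, t_op_s, target_kw):
    """Find (by bisection) the cooling time at which decay power
    has dropped to target_kw."""
    lo, hi = 1.0, 1e9
    while hi - lo > 1.0:
        mid = 0.5 * (lo + hi)
        if decay_power(p0_kw, t_op_s, mid) > target_kw:
            lo = mid
        else:
            hi = mid
    return hi

# Illustrative numbers only (not LOFT's): 50 MW operating power,
# 40-hour vs. 2000-hour operating periods, 33 kW threshold.
for hours in (40, 2000):
    t = decay_time_to_reach(50e3, hours * 3600, 33.0)
    print(f"{hours} h operation -> ~{t/3600:.0f} h decay to reach 33 kW")
```

    The qualitative behavior matches the abstract: a longer operating period builds a larger fission-product inventory, so the required decay period before disabling automatic actuation grows substantially.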

  16. A review for identification of initiating events in event tree development process on nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id

    2014-09-30

    An initiating event is any event, internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss-of-coolant accident (LOCA). These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example through a comprehensive engineering evaluation and construction of a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by the combination of system responses required. The purpose of this paper is to discuss the identification of initiating events in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effects analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By applying this method to the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients, and external events.
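
    The grouping-by-system-response idea can be illustrated with a toy event tree: given an initiating-event frequency and success/failure probabilities for each challenged safety function, every path through the tree gets a frequency, and (in this deliberately simplified sketch) any path with a failed function is binned as a core-damage candidate. The initiators, frequencies, and probabilities below are invented for illustration.

```python
from itertools import product

# Hypothetical initiating-event frequencies (per reactor-year) and the
# failure probabilities of the safety functions each one challenges.
initiators = {
    "small LOCA": (3e-3, {"HPI": 1e-3, "RHR": 1e-4}),
    "transient":  (1.0,  {"scram": 1e-5, "aux feedwater": 1e-4}),
}

for name, (freq, functions) in initiators.items():
    # Enumerate every success(True)/failure(False) branch combination.
    for outcome in product([True, False], repeat=len(functions)):
        p = freq
        for ok, q in zip(outcome, functions.values()):
            p *= (1 - q) if ok else q
        status = "OK" if all(outcome) else "core-damage candidate"
        branches = ", ".join(f"{fn}={'S' if ok else 'F'}"
                             for fn, ok in zip(functions, outcome))
        print(f"{name}: {branches}: {p:.2e}/yr [{status}]")
```

    Real event trees are more discriminating (some single failures still end in a safe state), but the mechanics are the same: branch frequencies over each initiator sum back to the initiating-event frequency.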

  17. Diagnostics of Loss of Coolant Accidents Using SVC and GMDH Models

    NASA Astrophysics Data System (ADS)

    Lee, Sung Han; No, Young Gyu; Na, Man Gyun; Ahn, Kwang-Il; Park, Soo-Yong

    2011-02-01

    As a means of effectively managing severe accidents at nuclear power plants, it is important to identify and diagnose accident initiating events within a short time interval after the accidents by observing the major measured signals. The main objective of this study was to diagnose loss of coolant accidents (LOCAs) using artificial intelligence techniques, such as SVC (support vector classification) and GMDH (group method of data handling). The SVC and GMDH models were used to identify the break location and estimate the break size of the LOCA, respectively. The 300 accident simulation data sets (based on MAAP4) were used to develop the SVC and GMDH models, and 33 test data sets were used to independently confirm whether the models work well. The measured signals from the reactor coolant system, steam generators, and containment at a nuclear power plant served as model inputs, specifically their 60-s time-integrated values. The simulation results confirmed that the proposed SVC model can identify the break location and that the proposed GMDH models can estimate the break size accurately. In addition, even when measurement errors exist and safety systems actuate, the proposed SVC and GMDH models can identify the break location without misclassification and accurately estimate the break size.
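
    The classification step reduces to mapping a feature vector of time-integrated signals to a discrete break-location label. The sketch below uses a nearest-centroid classifier as a standard-library stand-in for the paper's SVC, with entirely made-up three-feature "signals"; it illustrates the diagnosis idea, not the authors' models.

```python
import math
import random

random.seed(0)

LOCATIONS = ["hot leg", "cold leg", "SG tube"]

def synth_sample(loc):
    """Fabricated 3-feature 'time-integrated signal' vector per location."""
    base = {"hot leg": (5.0, 1.0, 0.5),
            "cold leg": (1.0, 5.0, 0.5),
            "SG tube": (0.5, 0.5, 5.0)}[loc]
    return [b + random.gauss(0, 0.3) for b in base]

# Training set: 100 simulated accidents per break location.
train = [(synth_sample(loc), loc) for loc in LOCATIONS for _ in range(100)]

# Nearest-centroid "training": average the feature vectors per class.
centroids = {}
for loc in LOCATIONS:
    vecs = [x for x, l in train if l == loc]
    centroids[loc] = [sum(col) / len(vecs) for col in zip(*vecs)]

def classify(x):
    """Assign the label of the closest class centroid."""
    return min(centroids, key=lambda loc: math.dist(x, centroids[loc]))

print(classify(synth_sample("cold leg")))
```

    An SVC draws maximum-margin boundaries rather than comparing centroid distances, but the input/output contract is identical, which is why measurement noise (here the Gaussian term) is the natural robustness test.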

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walston, S; Rowland, M; Campbell, K

    It is difficult to track the location of a melted core in a GE BWR with Mark I containment during a beyond-design-basis accident. The Cooper Nuclear Station provided a baseline of normal material distributions and shielding configurations for the GE BWR with Mark I containment. Starting with source terms for a design-basis accident, methods and remote observation points were investigated to allow tracking of a melted core during a beyond-design-basis accident. The design of the GE BWR with Mark I containment highlights an amazing poverty of expectations regarding a common-mode failure of all reactor core cooling systems resulting in a beyond-design-basis accident from the simple loss of electric power. This design is shown in Figure 1. The station blackout accident scenario has been consistently identified as the leading contributor to calculated probabilities for core damage. While NRC-approved models and calculations provide guidance for indirect methods to assess core damage during a beyond-design-basis loss-of-coolant accident (LOCA), there appears to be no established method to track the location of the core directly should the LOCA include a degree of fuel melt. We came to the conclusion that, starting with detailed calculations which estimate the release and movement of gaseous and soluble fission products from the fuel, selected dose readings in specific rooms of the reactor building should allow the location of the core to be verified.

  19. ENcentive: A Framework for Intelligent Marketing in Mobile Peer-To-Peer Environments

    DTIC Science & Technology

    2005-01-01

    trade and communication strategies, mobile electronic marketing, intelligent agents, collaborative eCommerce 1. INTRODUCTION With the explosion of...requests the promotion (since Jeff is a coffee drinker). MH2 signs the promotion with Susan's eNcentive ID. At 6pm, Jeff decides to take advantage of the...to become valid, a user has a choice of remaining in his current location and being able to take advantage of the promotion. The eNcentive Ad

  20. Predicting the Behavior of Asphalt Concrete Pavements in Seasonal Frost Areas Using Nondestructive Techniques

    DTIC Science & Technology

    1990-11-01

    and psychrometers: the location of these gauges is shown in Figure 16....cm-diameter wooden dowel approximately 122 cm in length, with 4.0-mm holes...thermocouple psychrometers were the third set of sensors used. A detailed description of these sensors can be found in a paper by Brown and...Figure 19. Freezing of test sections....changes in resistance with temperature in TS 2. These psychrometers were not evaluated for this report.

  1. The history of Fort Leavenworth, 1937 - 1951

    DTIC Science & Technology

    1951-01-01

    research assistance on the bibliography and some of the appendixes; Public Library, Leavenworth, Kansas, for use of facilities; Mr. Cleve Williams for...throughout its active service on a Snake River run, from Pocatello, Idaho, north to (Act or Competition staged by Local Citizens)...Post as usual 35th Division and Maj Gen William K. Herndon on June 9, 1940. A few days later the War of the 24th Division, National Guard, were on

  2. Development of Facility Type Information Packages for Design of Air Force Facilities.

    DTIC Science & Technology

    1983-03-01

    solution. For example, the optimum size and location of windows for the incorporation of a passive solar heating system varies with location, time...conditioning load estimate M. Energy impact statement N. Majcom review comments O. Solar energy systems...Information which could help in the development...and Passive solar systems. All facilities should have some aspects of passive solar incorporated into the design. Active solar systems should be con

  3. A Steady State and Dynamic Analysis of a Mooring System.

    DTIC Science & Technology

    1977-03-25

    drag on subsurface buoy Dxs, Dys, Dzs Cable drag components in cable coordinates...Distance between calculated location and actual location of...ship...Modulus of elasticity of cable...Highest natural frequency of system...Horizontal distance between anchor and ship...TQD Tension in cable from ship at subsurface buoy TBB Tension in cable from anchor at subsurface buoy...Tension in n'th cable segment

  4. Reconstruction of Acoustic Exposure on Orcas in Haro Strait

    DTIC Science & Technology

    2009-01-01

    Resident killer whales (Orcinus orca) (J pod).1 The class shadowed the J pod from their boat, recording its behavior, the GPS location of the...one of the resident pods of orcas, raising the question of the sonar's impact on them. Due to two coincidental activities, this question can be...addressed in detail. Coinciding with Shoup's transit, a marine mammal class from Friday Harbor Labs led by Dr. David Bain was observing a pod of Southern

  5. Archeological Investigations in Cochiti Reservoir, New Mexico. Volume 2. Excavation and Analysis 1975 Season.

    DTIC Science & Technology

    1977-01-01

    groups who used this ratio should be considered...them to process wild seeds and other vegetal material. The trough metate has been considered a specialized...transition and as an indication of the importance assumed by wild vegetal material...was composed entirely of jar forms with wide orifice...not as difficult to...faunal resource utilization from the excavated site locations...For example, Ovis/Capra meat is considered most edible when the animal is young. Meat

  6. International Workshop on Millimeter Waves (1992) Held in Orvieto, Italy on April 22-24, 1992

    DTIC Science & Technology

    1992-04-24

    Millimeter-Wave Circuits T. Yoneyama. Recent Developments of NRD-guides...Advanced Design Techniques for Linear...designed for the sky radiation measurement. - consideration of typical flight altitudes of 300 m to...Its output delivers a mean-value of the relevant...Electrolytic Processes: Anodic Etching and Cathodic metal deposition. - increase of Surface to Volume Ratio...no damage

  7. Design and Implementation of a Consolidated Airfield at McMurdo, Antarctica

    DTIC Science & Technology

    2014-09-01

    DESTROY THIS REPORT WHEN NO LONGER NEEDED. DO NOT RETURN IT TO THE ORIGINATOR. ERDC/CRREL TR-14-22 iii Contents Abstract...the current location of the white ice runway (the wheeled runway at Pegasus) is about 1/3 mile WSW of where it was when it was originally ...ft below the surface. This is not surprising; when the original runway was established in 1991-92, there were regions where the ice needed to be

  8. Thermal-hydraulic modeling needs for passive reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, J.M.

    1997-07-01

    The U.S. Nuclear Regulatory Commission has received an application for design certification from the Westinghouse Electric Corporation for an Advanced Light Water Reactor design known as the AP600. As part of the design certification process, the USNRC uses its thermal-hydraulic system analysis codes to independently audit the vendor calculations. The focus of this effort has been the small break LOCA transients that rely upon the passive safety features of the design to depressurize the primary system sufficiently so that gravity-driven injection can provide a stable source for long-term cooling. Large break LOCAs have also been considered, but as the involved phenomena do not appear to be appreciably different from those of current plants, they are not discussed in this paper. Although the SBLOCA scenario does not appear to threaten core coolability - indeed, heatup is not even expected to occur - there have been concerns about the performance of the passive safety systems. For example, the passive systems drive flows with small heads, consequently requiring more precision in the analysis for passive plants than for current plants with active systems. For the analysis of SBLOCAs and operating transients, the USNRC uses the RELAP5 thermal-hydraulic system analysis code. To assure the applicability of RELAP5 to the analysis of these transients for the AP600 design, a four-year program of code development and assessment has been undertaken.

  9. Assessment of safety margins in zircaloy oxidation and embrittlement criteria for ECCS acceptance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williford, R.E.

    1986-04-01

    Current Emergency Core Cooling System (ECCS) Acceptance Criteria for light-water reactors include certain requirements pertaining to calculations of core performance during a Loss of Coolant Accident (LOCA). The Baker-Just correlation must be used to calculate Zircaloy-steam oxidation, calculated peak cladding temperatures (PCT) must not exceed 1204°C, and calculated oxidation must not exceed 17% equivalent cladding reacted (17% ECR). The minimum margin of safety was estimated for each of these criteria, based on research performed in the last decade. Margins were defined as the amounts of conservatism over and above the expected extreme values computed from the data base at specified confidence levels. The currently required Baker-Just oxidation correlation provides margins only over the 1100°C to 1500°C temperature range at the 95% confidence level. The PCT margins for thermal shock and handling failures are adequate at oxidation temperatures above 1204°C for 210 and 160 seconds, respectively, at the 95% confidence level. ECR thermal shock and handling margins at the 50% and 95% confidence levels, respectively, range between 2% and 7% ECR for the Baker-Just correlation, but vanish at temperatures between 1100°C and 1160°C for the best-estimate Cathcart-Pawel correlation. Use of the Cathcart-Pawel correlation for LOCA calculations can be justified at the 85% to 88% confidence level if cooling rate effects can be neglected. 75 refs., 21 figs.
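
    Both correlations named above are parabolic rate laws of Arrhenius form, w² = K·t with K = A·exp(-Q/RT), where w is the oxidation weight gain. The sketch below shows only this structure; the rate constants are placeholders, not the published Baker-Just or Cathcart-Pawel coefficients, so these numbers should not be read as licensing values.

```python
import math

R = 1.987  # cal/(mol*K)

def parabolic_weight_gain(A, Q, T_kelvin, t_seconds):
    """Isothermal parabolic oxidation: w = sqrt(K * t), K = A*exp(-Q/RT).

    A [units of w^2 per second] and Q [cal/mol] are placeholder
    constants here, not the licensed correlation coefficients.
    """
    K = A * math.exp(-Q / (R * T_kelvin))
    return math.sqrt(K * t_seconds)

# Illustrative comparison: oxidation grows with both time and temperature.
A_demo, Q_demo = 3.3e7, 45500.0  # placeholder Arrhenius parameters
for T in (1273.0, 1477.0):       # ~1000 C and ~1204 C
    w = parabolic_weight_gain(A_demo, Q_demo, T, 300.0)
    print(f"T = {T:.0f} K, 300 s -> relative weight gain {w:.3f}")
```

    The parabolic form is why margins erode so quickly near the 1204°C limit: the rate constant is exponential in temperature, while extra time only enters under a square root.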

  10. Development and Assessment of CFD Models Including a Supplemental Program Code for Analyzing Buoyancy-Driven Flows Through BWR Fuel Assemblies in SFP Complete LOCA Scenarios

    NASA Astrophysics Data System (ADS)

    Artnak, Edward Joseph, III

    This work seeks to illustrate the potential benefits afforded by implementing aspects of fluid dynamics, especially the latest computational fluid dynamics (CFD) modeling approach, through numerical experimentation and the traditional discipline of physical experimentation to improve the calibration of the severe reactor accident analysis code, MELCOR, in one of several spent fuel pool (SFP) complete loss-of-coolant accident (LOCA) scenarios. While the scope of experimental work performed by Sandia National Laboratories (SNL) extends well beyond that which is reasonably addressed by our allotted resources and computational time in accordance with initial project allocations to complete the report, these simulated case trials produced a significant array of supplementary high-fidelity solutions and hydraulic flow-field data in support of SNL research objectives. Results contained herein show FLUENT CFD model representations of a 9x9 BWR fuel assembly in conditions corresponding to a complete loss-of-coolant accident scenario. In addition to the CFD model developments, a MATLAB-based control-volume model was constructed to independently assess the 9x9 BWR fuel assembly under similar accident scenarios. The data produced from this work show that FLUENT CFD models are capable of resolving complex flow fields within a BWR fuel assembly in the realm of buoyancy-induced mass flow rates and that characteristic hydraulic parameters from such CFD simulations (or physical experiments) are reasonably employed in corresponding constitutive correlations for developing simplified numerical models of comparable solution accuracy.
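
    The buoyancy-induced mass flow rates mentioned above can be bounded with a one-equation natural-circulation balance: the buoyancy head ρ·g·β·ΔT·H must equal the hydraulic loss K·ṁ²/(2·ρ·A²). All numbers below are illustrative assumptions, not values from the SNL experiments or the FLUENT models.

```python
import math

def natural_circulation_mdot(rho, beta, dT, H, K, A, g=9.81):
    """Steady buoyancy-driven mass flow through a heated channel.

    Balances the buoyancy head rho*g*beta*dT*H against the hydraulic
    loss K*mdot^2/(2*rho*A^2) and solves for mdot [kg/s].
    """
    return math.sqrt(2.0 * rho**2 * g * beta * dT * H * A**2 / K)

# Illustrative air properties and channel geometry (assumed, not SNL data).
mdot = natural_circulation_mdot(
    rho=1.0,      # kg/m^3, hot-air density
    beta=2.4e-3,  # 1/K, thermal expansion coefficient
    dT=200.0,     # K, channel-to-ambient temperature difference
    H=4.0,        # m, heated height
    K=20.0,       # total loss coefficient (form + friction)
    A=0.01,       # m^2, flow area
)
print(f"estimated natural-circulation flow: {mdot*1000:.1f} g/s")
```

    This is exactly the kind of constitutive correlation the abstract describes: the loss coefficient K is the characteristic hydraulic parameter that a CFD run (or experiment) supplies to a simplified control-volume model.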

  11. Core cooling under accident conditions at the high flux beam reactor (HFBR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, P.; Cheng, L.; Fauske, H.

    In certain accident scenarios, e.g., loss-of-coolant accidents (LOCAs), all forced-flow cooling is lost. Decay heating causes a temperature increase in the core coolant, and the resulting thermal buoyancy causes a reversal of the flow direction to a natural circulation mode. Although there was experimental evidence during the reactor design period (1958-1963) that the heat removal capacity in the fully developed natural circulation cooling mode was relatively high, it was not possible to make a confident prediction of the heat removal capacity during the transition from downflow to natural circulation. In a LOCA scenario where even limited fuel damage occurs and natural circulation is established, fission product gases could be carried from the damaged fuel by steam into areas where operator access is required to maintain the core in a coolable configuration. This would force evacuation of the building and lead to extensive core damage. As a result, the HFBR was shut down by the Department of Energy (DOE) and an extensive review of the HFBR was initiated. To address this issue, BNL developed a model designed to predict the heat removal limit during flow reversal that was found to be in good agreement with the test results. Currently, a thermal-hydraulic test program is being developed to provide a more realistic and defensible estimate of the flow-reversal heat removal limit so that the reactor power level can be increased.

  12. APT Blanket Thermal Analyses of Top Horizontal Row 1 Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadday, M.A.

    1999-09-20

    The Accelerator Production of Tritium (APT) cavity flood system (CFS) is designed to be the primary safeguard for the integrity of the blanket modules during loss-of-coolant accidents (LOCAs). For certain large break LOCAs the CFS also provides backup for the residual heat removal systems (RHRs) in cooling the target assemblies. In the unlikely event that the internal flow passages in a blanket module or target assembly dry out, decay heat in the metal structures will be dissipated to the CFS through the module or assembly walls (i.e., rung outer walls). The target assemblies consist of tungsten targets encased in steel conduits, and they can safely sustain high metal temperatures. Under internally dry conditions, the cavity flood fluid will cool the target assemblies with vigorous nucleate boiling on the external surfaces. However, the metal structures in the blanket modules consist of lead clad in aluminum, and they have a long-term exposure temperature limit currently set to 150 degrees C. Simultaneous LOCAs in both the target and blanket heat removal systems (HRS) could result in dryout of the target ladders, as well as the horizontal blanket modules above the target. The cavity flood coolant would boil on the outside surfaces of the target ladder rungs, and the resultant steam could reduce the effectiveness of convection heat transfer from the blanket modules to the cavity flood coolant. A two-part analysis was conducted to ascertain whether the cavity flood system can adequately cool the blanket modules above the targets, even when boiling is occurring on the outer surfaces of the target ladder rungs. The first part of the analysis was to model transient thermal conduction in the front top horizontal row 1 module (i.e., the top horizontal modules nearest the incoming beam), while varying parametrically the convection heat transfer coefficient (htc) for the external surfaces exposed to the cavity flood flow. This part of the analysis demonstrated that the module could adequately conduct heat to the outer module surfaces, given reasonable values for the convection heat transfer coefficients. The second part of the analysis consisted of two-phase flow modeling of the natural circulation of the cavity flood fluid past the top modules. Slots in the top shield allow the cavity flood fluid to circulate. The required width for these slots, to prevent steam from backing up and blanketing the outer surfaces of the top modules, was determined.
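
    The parametric first part of that analysis can be caricatured with a quasi-steady surface-temperature balance: sweep the external heat transfer coefficient and check the wall temperature against the 150°C limit. The decay power, area, and htc values below are invented for illustration and are unrelated to the actual APT module model.

```python
def lumped_wall_temperature(q_w, h, A, T_coolant):
    """Quasi-steady surface temperature for decay heat q_w [W] removed
    through area A [m^2] to coolant at T_coolant [C] with heat transfer
    coefficient h [W/(m^2 K)]:  T = T_coolant + q / (h * A)."""
    return T_coolant + q_w / (h * A)

# Parametric sweep over the external htc (all values illustrative).
q, A, T_cool = 50e3, 10.0, 100.0   # 50 kW decay heat, 10 m^2, 100 C coolant
for h in (50.0, 500.0, 2000.0):
    T = lumped_wall_temperature(q, h, A, T_cool)
    flag = "OK" if T <= 150.0 else "exceeds 150 C limit"
    print(f"h = {h:6.0f} W/m2K -> wall ~{T:6.1f} C ({flag})")
```

    The sweep shows why steam blanketing matters: degrading the effective htc (as vapor accumulates on the rungs) drives the quasi-steady wall temperature up toward the lead/aluminum limit.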

  13. Evaluating the influence of motor control on selective attention through a stochastic model: the paradigm of motor control dysfunction in cerebellar patient.

    PubMed

    Veneri, Giacomo; Federico, Antonio; Rufa, Alessandra

    2014-01-01

    Attention allows us to selectively process the vast amount of information with which we are confronted, prioritizing some aspects of information and ignoring others by focusing on a certain location or aspect of the visual scene. Selective attention is guided by two cognitive mechanisms: saliency of the image (bottom up) and endogenous mechanisms (top down). These two mechanisms interact to direct attention and plan eye movements; then, the movement profile is sent to the motor system, which must constantly update the command needed to produce the desired eye movement. A new approach is described here to study how eye motor control could influence this selection mechanism in clinical behavior: two groups of patients (SCA2 and late-onset cerebellar ataxia, LOCA) with well-known problems of motor control were studied; patients performed a cognitively demanding task; the results were compared to a stochastic model based on Monte Carlo simulations and a group of healthy subjects. The analytical procedure evaluated some energy functions for understanding the process. The implemented model suggested that patients performed an optimal visual search, reducing intrinsic noise sources. Our findings theorize a strict correlation between the "optimal motor system" and the "optimal stimulus encoders."

  14. Development of an AFIT (Air Force Institute of Technology) ADP System Network Model.

    DTIC Science & Technology

    1983-12-01

    printers. Lastly, after a batch event finishes 'printing', the turnaround time for each computer and location may be collected. Resource Module. The...(IJOBS,A(10)), (IORG,A(12)), (IMEM,A(13)) C C**** IF JOB JUST FINISHED CPU OR PRINTER IN WAIT MODE THEN SCHEDULE C**** THINK TIME C IF...Modules: CPUAIJ I C Scheduled by: none C C FUNCTION XIOSEC(ICLASS,ISIZE) PARAMETER (MAXCLS=6,MAXSIZ=6) C COMMON BLOCKS COMMON /FACTOR/ DISTRB

  15. APT Blanket System Loss-of-Coolant Accident (LOCA) Based on Initial Conceptual Design - Case 2: with Beam Shutdown Only

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report. This report documents the results of simulations of a Loss-of-Flow Accident (LOFA) in which power is lost to all of the pumps that circulate water in the blanket region, the accelerator beam is shut off, and neither the residual heat removal nor the cavity flood systems operate.

  16. Public Response to a Near-Miss Nuclear Accident Scenario Varying in Causal Attributions and Outcome Uncertainty.

    PubMed

    Cui, Jinshu; Rosoff, Heather; John, Richard S

    2018-05-01

    Many studies have investigated public reactions to nuclear accidents. However, few studies focused on more common events when a serious accident could have happened but did not. This study evaluated public response (emotional, cognitive, and behavioral) over three phases of a near-miss nuclear accident. Simulating a loss-of-coolant accident (LOCA) scenario, we manipulated (1) attribution for the initial cause of the incident (software failure vs. cyber terrorist attack vs. earthquake), (2) attribution for halting the incident (fail-safe system design vs. an intervention by an individual expert vs. a chance coincidence), and (3) level of uncertainty (certain vs. uncertain) about risk of a future radiation leak after the LOCA is halted. A total of 773 respondents were sampled using a 3 × 3 × 2 between-subjects design. Results from both MANCOVA and structural equation modeling (SEM) indicate that respondents experienced more negative affect, perceived more risk, and expressed more avoidance behavioral intention when the near-miss event was initiated by an external attributed source (e.g., earthquake) compared to an internally attributed source (e.g., software failure). Similarly, respondents also indicated greater negative affect, perceived risk, and avoidance behavioral intentions when the future impact of the near-miss incident on people and the environment remained uncertain. Results from SEM analyses also suggested that negative affect predicted risk perception, and both predicted avoidance behavior. Affect, risk perception, and avoidance behavior demonstrated high stability (i.e., reliability) from one phase to the next. © 2017 Society for Risk Analysis.
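
    The 3 × 3 × 2 between-subjects design described above can be enumerated directly: three initiating-cause attributions crossed with three halting attributions and two uncertainty levels give 18 cells, so 773 respondents yields roughly 43 per cell. The round-robin assignment below is a sketch of the design structure, not the authors' sampling procedure.

```python
from itertools import product

causes = ["software failure", "cyber terrorist attack", "earthquake"]
halts = ["fail-safe design", "individual expert", "chance coincidence"]
uncertainty = ["certain", "uncertain"]

# Every experimental condition (cell) in the factorial design.
cells = list(product(causes, halts, uncertainty))
print(f"{len(cells)} cells, ~{773 // len(cells)} respondents per cell")

# Round-robin assignment of respondent IDs to conditions (illustrative).
assignment = {rid: cells[rid % len(cells)] for rid in range(773)}
print(assignment[0])
```
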

  17. Severe Accident Test Station Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snead, Mary A.; Yan, Yong; Howell, Michael

    The purpose of the ORNL severe accident test station (SATS) is to provide a platform for evaluation of advanced fuels under projected beyond design basis accident (BDBA) conditions. The SATS delivers the capability to map the behavior of advanced fuel concepts under accident scenarios across various temperature and pressure profiles, steam and steam-hydrogen gas mixtures, and thermal shock. The overall facility will include parallel capabilities for examination of fuels and irradiated materials (in-cell) and non-irradiated materials (out-of-cell) at BDBA conditions as well as design basis accident (DBA) or loss of coolant accident (LOCA) conditions. A supporting analytical infrastructure to provide the data needs for the fuel-modeling components of the Fuel Cycle Research and Development (FCRD) program will also be put in place in a parallel manner. This design report contains the information for the first, second, and third phases of design and construction of the SATS. The first phase consisted of the design and construction of an out-of-cell BDBA module intended for examination of non-irradiated materials. The second phase of this work was to construct the BDBA in-cell module to test irradiated fuels and materials, as well as the module for DBA (i.e., LOCA) testing out-of-cell. The third phase was to build the in-cell DBA module. The details of the design constraints and requirements for the in-cell facility have been closely captured during the deployment of the out-of-cell SATS modules to ensure effective future implementation of the in-cell modules.

  18. Transfer of 45Ca and 36Cl at the blood-nerve barrier of the sciatic nerve in rats fed low or high calcium diets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wadhwani, K.C.; Murphy, V.A.; Rapoport, S.I.

    1991-04-01

    Unidirectional fluxes of 45Ca, 36Cl, and [3H]mannitol from blood into the sciatic nerve and cerebral cortex were determined from 5- and 15-min uptakes of these tracers after an intravenous (i.v.) bolus injection in awake rats. Rats were fed diets for 8 wk that had either a low (0.01% wt/wt), normal (0.67%), or high (3%) Ca content. Plasma [Ca] was 32% less and 11% more in rats fed low (LOCA) and high Ca diets (HICA), respectively, than in rats fed a normal Ca diet (CONT). The mean permeability-surface area product (PA) of 45Ca at the blood-nerve barrier was about eightfold higher than at the blood-brain barrier in the same animals and did not differ significantly between groups (P > 0.05). Mean PA ratios of 45Ca/36Cl for the blood-nerve and blood-brain barriers in CONT rats, 0.52 ± 0.04 and 0.40 ± 0.02, respectively, were not significantly different from corresponding ratios in the LOCA and HICA groups, and corresponded to the aqueous limiting diffusion ratio (0.45). The authors' results show no evidence for concentration-dependent transport of Ca over a plasma [Ca] range of 0.8-1.4 mmol/liter at the blood-nerve barrier of the rat peripheral nerve, and suggest that Ca and Cl exchange slowly between nerve and blood via paracellular pathways.

  19. The R&D PERFROI Project on Thermal Mechanical and Thermal Hydraulics Behaviors of a Fuel Rod Assembly during a Loss of Coolant Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Repetto, G.; Dominguez, C.; Durville, B.

    The safety principle in case of a LOCA is to preserve the short- and long-term coolability of the core. The associated safety requirements are to ensure the resistance of the fuel rods to quench and post-quench loads and to maintain a coolable geometry in the core. An R&D program has been launched by IRSN, with the support of EDF, to perform both experimental and modeling activities within the frame of the LOCA transient, on technical issues such as: - flow blockage within a fuel rod bundle and its potential impact on coolability, - fuel fragment relocation in the ballooned areas and its potential impact on cladding PCT (Peak Cladding Temperature) and on the maximum oxidation rate, - potential loss of cladding integrity under quench and post-quench loads. The PERFROI project (2014-2019), focusing on the first issue above, is structured in two axes: 1. axis 1: thermal mechanical behavior covering deformation and rupture of cladding, taking into account the contact between fuel rods; specific research at the LaMCoS laboratory focuses on hydrogen behavior in cladding alloys and its impact on the mechanical behavior of the rod; and 2. axis 2: thermal hydraulics study of a partially blocked region of the core (a ballooned area, taking into account fuel relocation with local overpower) during the cooling phase by water injection; more detailed activities, foreseen in collaboration with the LEMTA laboratory, will focus on the characterization of two-phase flows with heat transfer in deformed structures.

  20. Development and Validation of Accident Models for FeCrAl Cladding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamble, Kyle Allan Lawrence; Hales, Jason Dean

    2016-08-01

    The purpose of this milestone report is to present the work completed regarding material model development for FeCrAl cladding and to highlight the results of applying these models to Loss of Coolant Accidents (LOCAs) and Station Blackouts (SBOs). With the limited experimental data available (essentially only the data used to create the models), true validation is not possible. In the absence of another alternative, qualitative comparisons during postulated accident scenarios between FeCrAl- and Zircaloy-4-clad rods have been completed, demonstrating the superior performance of FeCrAl.

  1. Emergency heat removal system for a nuclear reactor

    DOEpatents

    Dunckel, Thomas L.

    1976-01-01

    A heat removal system for nuclear reactors serving as a supplement to an Emergency Core Cooling System (ECCS) during a Loss of Coolant Accident (LOCA) comprises a plurality of heat pipes having one end in heat transfer relationship with either the reactor pressure vessel, the core support grid structure or other in-core components and the opposite end located in heat transfer relationship with a heat exchanger having heat transfer fluid therein. The heat exchanger is located external to the pressure vessel whereby excessive core heat is transferred from the above reactor components and dissipated within the heat exchanger fluid.

  2. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3. Part 2.

    DTIC Science & Technology

    1983-09-01

    [The abstract for this record is OCR residue from the code documentation. The recoverable fragment describes the PLAINT (GTD) routine, called from FLDDRV, whose purpose is to determine whether a ray traveling from a given source location, or a source ray reflected from plate MP, passes through a given point.]

  3. Westinghouse Small Modular Reactor passive safety system response to postulated events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M. C.; Wright, R. F.

    2012-07-01

    The Westinghouse Small Modular Reactor (SMR) is an 800 MWt (>225 MWe) integral pressurized water reactor. This paper is part of a series of four describing the design and safety features of the Westinghouse SMR; it focuses in particular on the passive safety features and the safety system response. The Westinghouse SMR design incorporates many features to minimize the effects of, and in some cases eliminate the possibility of, postulated accidents. The small size of the reactor and the low power density limit the potential consequences of an accident relative to a large plant. The integral design eliminates large loop piping, which significantly reduces the flow area of postulated loss of coolant accidents (LOCAs). The Westinghouse SMR containment is a high-pressure, compact design that normally operates at a partial vacuum, which facilitates heat removal from the containment during LOCA events. The containment is submerged in water, which also aids heat removal and provides an additional radionuclide filter. The Westinghouse SMR safety system design is passive, is based largely on the passive safety systems used in the AP1000® reactor, and provides mitigation of all design basis accidents without the need for AC electrical power for a period of seven days. Frequent faults, such as reactivity insertion events and loss of power events, are protected against by first shutting down the nuclear reaction by inserting control rods, then providing cold, borated water through a passive, buoyancy-driven flow. Decay heat removal is provided using a layered approach that includes the passive removal of heat by the steam drum and an independent passive heat removal system that transfers heat from the primary system to the environment. Less frequent faults such as loss of coolant accidents are mitigated by passive injection of a large quantity of water that is readily available inside containment. An automatic depressurization system is used to reduce the reactor pressure in a controlled manner to facilitate the passive injection. Long-term decay heat removal is accomplished using the passive heat removal systems augmented by heat transfer through the containment vessel to the environment. The passive injection systems are designed so that the fuel remains covered and effectively cooled throughout the event. As with the frequent faults, the passive systems provide effective cooling without the need for AC power for seven days following the accident. Connections are available to add additional water to cool the plant indefinitely. The response of the safety systems of the Westinghouse SMR to various initiating faults has been examined. Among them, two accidents, an extended station blackout event and a LOCA event, have been evaluated to demonstrate how the plant will remain safe in the unlikely event that either should occur. (authors)

  4. The role of the uncertainty in code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barre, F.

    1997-07-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two types of criticism can be made. First, conservative margins can be too large, and it may be possible to reduce the cost of the plant or its operation with a best-estimate approach. Second, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, overestimating the residual power during a small break LOCA can lead to an overprediction of the swell level and thus an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, mainly dealing with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgment. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation.

  5. The desirability of education in didactic skills according to medical interns.

    PubMed

    Kloek, Anne T; Verbakel, Joshua R A; Bernard, Simone E; Evenboer, Januska; Hendriks, Eef J; Stam, Hanneke

    2012-12-01

    Since all doctors at some point in their career will be faced with their role as a teacher, it appears desirable that future doctors are educated in didactic skills. At present, however, there are no formal opportunities for developing didactic skills at the majority of Dutch medical faculties. The main question of this study is: How do medical interns perceive the quality and quantity of their education in didactic skills? The Dutch Association for Medical Interns (LOCA) ran a national survey among 1,008 medical interns that measured the interns' self-assessed needs for training in didactic skills during medical school. Almost 80 % of the respondents argue that the mastery of didactic skills composes an essential competency for doctors, with the skill of providing adequate feedback considered to be the most important didactic quality for doctors. Of the respondents, 41 % wish to be educated in didactic skills, both during their medical undergraduate degree and during their subsequent training to become a resident. Teaching while being observed and receiving feedback in this setting is regarded as a particularly valuable didactic method by 74 % of the medical interns. Of the respondents, 82 % would invest time to follow training for the development of didactic skills if it was offered. Medical interns stress the importance of doctors' didactic skills during their clinical internships. Compared with current levels, most interns desire increased attention to the formal development of didactic skills during medical school. Considering the importance of didactic skills and the need for more extensive training, the LOCA advises medical faculties to include more formal didactic training in the medical curriculum.

  6. Geochemical and Hydrologic Controls of Copper-Rich Surface Waters in the Yerba Loca-Mapocho System

    NASA Astrophysics Data System (ADS)

    Pasten, P.; Montecinos, M.; Coquery, M.; Pizarro, G. E.; Abarca, M. I.; Arce, G. J.

    2015-12-01

    Andean watersheds in Northern and Central Chile are naturally enriched with metals, many of them associated with sulfide mineralizations related to copper mining districts. The natural and anthropogenic influx of toxic metals into drinking water sources poses a sustainability challenge for cities that need to provide safe water with the smallest footprint. This work presents our study of the transformations of copper in the Yerba Loca-Mapocho system. Our sampling campaign started from the headwaters at La Paloma Glacier and continued to the inlet of the San Enrique drinking water treatment plant, a system feeding municipalities in the eastern area of Santiago, Chile. Depending on the season, total copper concentrations reach as high as 22 mg/L in the upper sections and become diluted to <5 mg/L downstream. pH ranged from 3 to 5.6, while suspended solids ranged from <10 to 100 mg/L. We used The Geochemist's Workbench to assess copper speciation and to evaluate the thermodynamic controls on the formation and dissolution of solid phases. A sediment trap was used to concentrate suspended particulate matter, which was analyzed with ICP-MS, TXRF (total reflection X-ray fluorescence), and XRD (X-ray diffraction). Major elements detected in the precipitates were Al (200 g/kg), S (60 g/kg), and Cu (6 g/kg). Likely solid phases include hydrous amorphous phases of aluminum hydroxides and sulfates, and copper hydroxides/carbonates. Efforts are under way to find the optimal mixing ratios between the acidic stream and more alkaline streams to maximize attenuation of dissolved copper. The results of this research could be used to enhance in-stream natural attenuation of copper and reduce treatment needs at the drinking water facility. Acknowledgements to Fondecyt 1130936 and Conicyt Fondap 15110020.

  7. New reactor cavity cooling system having passive safety features using novel shape for HTGRs and VHTRs

    DOE PAGES

    Takamatsu, Kuniyoshi; Hu, Rui

    2014-11-27

    A new, highly efficient reactor cavity cooling system (RCCS) with passive safety features, requiring neither electricity nor mechanical drives, is proposed for high temperature gas-cooled reactors (HTGRs) and very high temperature reactors (VHTRs). The RCCS design consists of continuous closed regions: one is an ex-reactor pressure vessel (RPV) region and the other is a cooling region with heat transfer area to ambient air, assumed to be at 40°C. The RCCS uses a novel shape to efficiently remove the heat released from the RPV by radiation and natural convection. Employing air as the working fluid and the ambient air as the ultimate heat sink, the new RCCS design strongly reduces the possibility of losing the heat sink for decay heat removal. Therefore, HTGRs and VHTRs adopting the new RCCS design can avoid core melting due to fuel overheating. Simulation results from a commercial CFD code, STAR-CCM+, show that the temperature distribution of the RCCS is within the temperature limits of the structures, such as the maximum operating temperature of the RPV, 713.15 K (440°C), and that the heat released from the RPV could be removed safely, even during a loss of coolant accident (LOCA). Finally, given that the RCCS can remove 600 kW at the rated nominal state even during a LOCA, the safety review for building the HTTR could confirm that the temperature distribution of the HTTR remains within the temperature limits of the structures, securing structures and fuels after shutdown, because the large heat capacity of the graphite core can absorb heat from the fuel over a short period. Therefore, the capacity of the new RCCS design would be sufficient for decay heat removal.

  8. PANDA asymmetric-configuration passive decay heat removal test results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, O.; Dreier, J.; Aubert, C.

    1997-12-01

    PANDA is a large-scale, low-pressure test facility for investigating passive decay heat removal systems for the next generation of LWRs. In the first series of experiments, PANDA was used to examine the long-term LOCA response of the Passive Containment Cooling System (PCCS) for the General Electric (GE) Simplified Boiling Water Reactor (SBWR). The test objectives include concept demonstration and extension of the database available for qualification of containment codes. Also included is the study of the effects of nonuniform distributions of steam and noncondensable gases in the Dry-well (DW) and in the Suppression Chamber (SC). 3 refs., 9 figs.

  9. Summary on the depressurization from supercritical pressure conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, M.; Chen, Y.; Ammirable, L.

    When a fluid discharges from a high pressure and temperature system, a 'choking' or critical condition occurs, and the flow rate becomes independent of the downstream pressure. During a postulated loss of coolant accident (LOCA) of a water reactor, the break flow will be subject to this condition. An accurate estimation of the critical flow rate is important for the evaluation of reactor safety, because this flow rate controls the loss of coolant inventory and energy from the system, and thus has a significant effect on the accident consequences [1]. In the design of safety systems for a supercritical water reactor (SCWR), postulated LOCA transients are particularly important due to the lower coolant inventory compared to a typical PWR of the same power output. This lower coolant inventory would result in a faster transient response of the SCWR, and hence accurate prediction of the critical discharge is mandatory. Under potential two-phase conditions, critical flow is dominated by the vapor content or quality, which is closely related to the onset of vaporization and the interfacial interaction between phases [2]. This presents a major challenge for the estimation of the flow rate due to the lack of knowledge of those processes, especially under the conditions of interest for the SCWR. According to the limited data on supercritical fluids, critical flows at conditions above the pseudo-critical point seem to be fairly stable and consistent with subcritical homogeneous equilibrium model (HEM) predictions, while having a lower flow rate than those in the two-phase region. Thus the major difficulty in the prediction of the depressurization flow rates remains in the region where two phases co-exist at the top of the vapor dome. In this region, the flow rate is strongly affected by the nozzle geometry and tends to be unstable. Various models for this region have been developed with different assumptions, e.g. the HEM and Moody models [3] and the Henry-Fauske non-equilibrium model [4], and are currently used in subcritical-pressure reactor safety design [5]. It appears that some of these models could be reasonably extended above the thermodynamic pseudo-critical point. The more stable and lower discharge flow rates observed in conditions above the pseudo-critical point suggest that, even though SCWRs have a smaller coolant inventory, the safety implications of a LOCA and the subsequent depressurization may not be as severe as expected; this, however, needs to be confirmed by a rigorous evaluation of the particular event and further evaluation of the critical flow rate. This paper summarizes activities on critical flow models, experimental data, and numerical modeling of blowdown from supercritical pressure conditions under the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on 'Heat Transfer Behaviour and Thermo-hydraulics Code Testing for SCWRs'. (authors)
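    The defining property of critical flow noted in this record, that the discharge rate stops responding to downstream pressure once the flow chokes, can be sketched for the much simpler single-phase ideal-gas case (this is not the two-phase HEM, Moody, or Henry-Fauske model; the stagnation conditions, gas constant, and heat capacity ratio below are illustrative assumptions):

```python
import math

# Illustrative stagnation conditions and gas properties (assumed values):
# a steam-like ideal gas at 7 MPa, 500 K.
P0, T0 = 7.0e6, 500.0     # stagnation pressure [Pa] and temperature [K]
GAMMA, R = 1.4, 461.5     # heat capacity ratio [-], gas constant [J/(kg K)]

def mass_flux(p_ratio):
    """Mass flux G = rho*u [kg/(m^2 s)] for isentropic ideal-gas expansion
    from stagnation conditions down to back pressure p = p_ratio * P0."""
    rho0 = P0 / (R * T0)
    rho = rho0 * p_ratio ** (1.0 / GAMMA)          # isentropic density
    cp = GAMMA * R / (GAMMA - 1.0)
    # Energy balance: u^2/2 = cp*T0*(1 - (p/P0)^((gamma-1)/gamma))
    u = math.sqrt(2.0 * cp * T0 * (1.0 - p_ratio ** ((GAMMA - 1.0) / GAMMA)))
    return rho * u

# Scan back pressures: the flux peaks at the critical pressure ratio and
# cannot increase further no matter how low the downstream pressure drops.
ratios = [i / 1000.0 for i in range(1, 1000)]
peak = max(ratios, key=mass_flux)
critical = (2.0 / (GAMMA + 1.0)) ** (GAMMA / (GAMMA - 1.0))
print(f"numerical peak at p/P0 = {peak:.3f}, analytic critical ratio = {critical:.3f}")
```

The numerically found maximum coincides with the textbook critical pressure ratio (about 0.528 for gamma = 1.4); in the two-phase region of interest for SCWRs, the analogous maximization must be done over real-fluid properties, which is where the modeling difficulty described above arises.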

  10. Passive containment cooling system with drywell pressure regulation for boiling water reactor

    DOEpatents

    Hill, Paul R.

    1994-01-01

    A boiling water reactor having a regulating valve for placing the wetwell in flow communication with an intake duct of the passive containment cooling system. This subsystem can be adjusted to maintain the drywell pressure at (or slightly below or above) wetwell pressure after the initial reactor blowdown transient is over. This addition to the PCCS design has the benefit of eliminating or minimizing steam leakage from the drywell to the wetwell in the longer-term post-LOCA time period and also minimizes the temperature difference between drywell and wetwell. This in turn reduces the rate of long-term pressure buildup of the containment, thereby extending the time to reach the design pressure limit.

  11. Comparison of Standardized Test Scores from Traditional Classrooms and Those Using Problem-Based Learning

    ERIC Educational Resources Information Center

    Needham, Martha Elaine

    2010-01-01

    This research compares differences between standardized test scores in problem-based learning (PBL) classrooms and a traditional classroom for 6th grade students using a mixed-method, quasi-experimental and qualitative design. The research shows that problem-based learning is as effective as traditional teaching methods on standardized tests. The…

  12. Implicit time-integration method for simultaneous solution of a coupled non-linear system

    NASA Astrophysics Data System (ADS)

    Watson, Justin Kyle

    Historically, large physical problems have been divided into smaller problems based on the physics involved. Reactor safety analysis is no different. The problem of analyzing a nuclear reactor for design basis accidents is handled by a handful of computer codes, each solving a portion of the problem. The reactor thermal hydraulic response to an event is determined using a system code like the TRAC/RELAP Advanced Computational Engine (TRACE). The core power response to the same accident scenario is determined using a core physics code like the Purdue Advanced Core Simulator (PARCS). Containment response to the reactor depressurization in a Loss Of Coolant Accident (LOCA) type event is calculated by a separate code. Sub-channel analysis is performed with yet another computer code. This is just a sample of the computer codes used to solve the overall problem of nuclear reactor design basis accidents. Traditionally, each of these codes operates independently of the others, using only the global results from one calculation as boundary conditions for another. Industry's drive to uprate reactor power has motivated analysts to move from a conservative approach to design basis accidents toward a best-estimate method. To achieve a best-estimate calculation, efforts have been aimed at coupling the individual physics models to improve the accuracy of the analysis and reduce margins. The current coupling techniques are sequential in nature: during a calculation time step, data are passed between the two codes, and the individual codes solve their portion of the calculation and converge to a solution before the calculation is allowed to proceed to the next time step. This thesis presents a fully implicit method for simultaneously solving the neutron balance equations, heat conduction equations, and the constitutive fluid dynamics equations. It discusses the problems involved in coupling different physics phenomena within multi-physics codes and presents a solution to these problems. The thesis also outlines the basic concepts behind the nodal balance equations, heat transfer equations, and the thermal hydraulic equations, which are coupled to form a fully implicit nonlinear system of equations. Coupling separate physics models to solve a larger problem and improve the accuracy and efficiency of a calculation is not a new idea; however, implementing the coupling implicitly and solving the system simultaneously is. The application to reactor safety codes is also new; it has not been done with thermal hydraulics and neutronics codes on realistic applications in the past. The coupling technique described in this thesis is applicable to other similar coupled thermal hydraulic and core physics reactor safety codes. The technique is demonstrated using coupled input decks to show that the system is solved correctly, and is then verified using two derivative test problems based on international benchmarks: the OECD/NRC Three Mile Island (TMI) Main Steam Line Break (MSLB) problem (representative of pressurized water reactor analysis) and the OECD/NRC Peach Bottom (PB) Turbine Trip (TT) benchmark (representative of boiling water reactor analysis).
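    The simultaneous (fully implicit) approach described in this thesis can be sketched on a toy two-field problem: a power equation with temperature feedback coupled to a steady heat balance, solved as one nonlinear system with a single Newton iteration on the full Jacobian, rather than by passing results between two separate solvers each time step. The equations and coefficients below are invented for illustration and are not the TRACE/PARCS models:

```python
import math

# Toy coupled system (illustrative physics only):
#   f1(P, T) = P - P0*exp(-ALPHA*(T - T0)) = 0   # power with temperature feedback
#   f2(P, T) = K*(T - TC) - P = 0                # steady heat balance
P0, ALPHA, T0 = 1000.0, 0.01, 300.0   # reference power [W], feedback [1/K], ref. temp [K]
K, TC = 10.0, 300.0                   # conductance [W/K], coolant temperature [K]

def residual(P, T):
    return (P - P0 * math.exp(-ALPHA * (T - T0)),
            K * (T - TC) - P)

def newton(P, T, tol=1e-10, max_iter=50):
    """Solve both fields simultaneously with full 2x2 Newton steps."""
    for _ in range(max_iter):
        f1, f2 = residual(P, T)
        if max(abs(f1), abs(f2)) < tol:
            break
        # Analytic Jacobian of the coupled system
        j11, j12 = 1.0, ALPHA * P0 * math.exp(-ALPHA * (T - T0))
        j21, j22 = -1.0, K
        det = j11 * j22 - j12 * j21
        # Solve J * [dP, dT]^T = -[f1, f2]^T by Cramer's rule
        dP = (-f1 * j22 + f2 * j12) / det
        dT = (-f2 * j11 + f1 * j21) / det
        P, T = P + dP, T + dT
    return P, T

P, T = newton(500.0, 350.0)
print(f"converged: P = {P:.2f} W, T = {T:.2f} K")
```

Because both unknowns are updated from one Jacobian, the feedback between the fields is resolved within the iteration itself; a sequential scheme would instead freeze one field while updating the other and rely on outer iterations for consistency.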

  13. A Natural Fit: Problem-based Learning and Technology Standards.

    ERIC Educational Resources Information Center

    Sage, Sara M.

    2000-01-01

    Discusses the use of problem-based learning to meet technology standards. Highlights include technology as a tool for locating and organizing information; the Wolf Wars problem for elementary and secondary school students that provides resources, including Web sites, for information; Web-based problems; and technology as assessment and as a…

  14. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su'ud, Zaki; Anshari, Rio

    The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), with emphasis on the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completed, a cooling process, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to reactor core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation has been performed to calculate the pressure, water level, and temperature distribution in the reactors during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with full uncovery 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event was 20 kg/s, which could keep the core covered for about 73 hours, with full uncovery 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event was 15 kg/s, which could keep the core covered for about 37 hours, with full uncovery 40 hours later.

  15. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied to estimate the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: the phenomenological time and the operators' performance time. The sensitivity of each probability distribution in the human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in both the model itself and its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
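    The competing-random-variables idea in this record, where a human error occurs when the operators' performance time exceeds the available phenomenological time, can be sketched with a plain Monte Carlo estimate. The normal distributions and their parameters below are illustrative assumptions, not values from the study (which fit distributions via goodness-of-fit tests and used Latin Hypercube sampling):

```python
import random

random.seed(0)

# Illustrative distributions (minutes); assumed normals for the sketch only.
PHEN_MU, PHEN_SD = 60.0, 10.0   # phenomenological time: time available before core damage
PERF_MU, PERF_SD = 45.0, 15.0   # performance time: time operators need to act

def estimate_hep(n=200_000):
    """HEP = P(performance time > phenomenological time), by Monte Carlo."""
    failures = sum(
        random.gauss(PERF_MU, PERF_SD) > random.gauss(PHEN_MU, PHEN_SD)
        for _ in range(n)
    )
    return failures / n

hep = estimate_hep()
# For two independent normals the analytic value is
# Phi((PERF_MU - PHEN_MU) / sqrt(PERF_SD**2 + PHEN_SD**2)) ~= 0.20 here.
print(f"estimated HEP = {hep:.3f}")
```

With both times normal the answer is available in closed form, which makes this a useful sanity check; the value of the sampling approach is that it survives the switch to the skewed, fitted distributions the study actually used.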

  16. A coupled hydrodynamic-hydrochemical modeling for predicting mineral transport in a natural acid drainage system.

    NASA Astrophysics Data System (ADS)

    Zegers Risopatron, G., Sr.; Navarro, L.; Montserrat, S., Sr.; McPhee, J. P.; Niño, Y.

    2017-12-01

    The geochemistry of water and sediments, coupled with hydrodynamic transport in mountainous channels, is of particular interest in the central Chilean Andes due to the natural occurrence of acid waters. In this paper, we present a coupled transport and geochemical model to estimate and understand the transport processes and fate of minerals in the Yerba Loca Basin, located near Santiago, Chile. In the upper zone, water presents low pH (~3) and high concentrations of iron, aluminum, copper, manganese, and zinc. Acidity and minerals are the consequence of water-rock interactions in hydrothermal alteration zones, rich in sulphides and sulphates, covered by seasonal snow and glaciers. Downstream, as a consequence of neutral to alkaline lateral water contributions (pH >7) along the river, pH increases and the concentration of solutes decreases. The mineral transport model has three components: (i) a hydrodynamic model, where we use HEC-RAS to solve the 1D Saint-Venant equations; (ii) a sediment transport model to estimate erosion and sedimentation rates, which quantify mineral transfer between water and riverbed; and (iii) a solute transport model, based on the 1D OTIS model, which takes into account the temporal delay in solute transport typically observed in natural channels (transient storage). Hydrochemistry is solved using PHREEQC, a software package for speciation and batch reaction. Our results show that the correlation between mineral precipitation and dissolution according to pH values changes along the river. Based on pH measurements (and according to the literature), we inferred that the main minerals in the water system are brochantite, ferrihydrite, hydrobasaluminite, and schwertmannite. Results show that our model can predict the transport and fate of minerals and metals in the Yerba Loca Basin. Mineral dissolution and precipitation processes occur over limited ranges of pH values. When pH values increase, iron minerals (schwertmannite) are the first to precipitate (~2.5
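    The solute transport component described in this record, the OTIS-style 1D advection-dispersion equation (here without the transient-storage and reaction terms), can be sketched with an explicit upwind finite-difference step. The grid spacing, velocity, and dispersion coefficient are arbitrary illustrative choices, not Yerba Loca values:

```python
# Explicit step of dc/dt = -u*dc/dx + D*d2c/dx2 (no transient storage or
# reactions; all parameters are illustrative).
U, D = 1.0, 0.5    # velocity [m/s], dispersion coefficient [m^2/s]
DX, DT = 1.0, 0.1  # grid spacing [m], time step [s] (Courant number 0.1: stable)

def step(c):
    """Advance the concentration field one time step (ends held at zero)."""
    new = c[:]
    for i in range(1, len(c) - 1):
        adv = -U * (c[i] - c[i - 1]) / DX                      # upwind advection
        disp = D * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / DX**2  # central dispersion
        new[i] = c[i] + DT * (adv + disp)
    return new

# Initial solute pulse at cell 20 of a 100-cell reach
c = [0.0] * 100
c[20] = 1.0
for _ in range(100):
    c = step(c)

peak = c.index(max(c))
print(f"peak moved from cell 20 to cell {peak}; total mass = {sum(c):.6f}")
```

After 100 steps the pulse has been carried about U*DT*100 = 10 m downstream and spread by dispersion, while total mass is conserved; the full model would add the transient-storage exchange term and hand each reach's concentrations to the geochemical solver for speciation.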

  17. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

    The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), with emphasis on the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completed, a cooling process, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to reactor core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation has been performed to calculate the pressure, water level, and temperature distribution in the reactors during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with full uncovery 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event was 20 kg/s, which could keep the core covered for about 73 hours, with full uncovery 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event was 15 kg/s, which could keep the core covered for about 37 hours, with full uncovery 40 hours later.

  18. LOFT L2-3 blowdown experiment safety analyses D, E, and G; LOCA analyses H, K, K1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perryman, J.L.; Keeler, C.D.; Saukkoriipi, L.O.

    1978-12-01

Three calculations using conservative off-nominal conditions and evaluation model options were made using RELAP4/MOD5 for blowdown-refill and RELAP4/MOD6 for reflood for Loss-of-Fluid Test (LOFT) Experiment L2-3 to support the experiment safety analysis effort. The three analyses are as follows: Analysis D: loss of commercial power during Experiment L2-3; Analysis E: hot leg quick-opening blowdown valve (QOBV) does not open during Experiment L2-3; and Analysis G: cold leg QOBV does not open during Experiment L2-3. In addition, the results of three LOFT loss-of-coolant accident (LOCA) analyses using a power of 56.1 MW and a primary coolant system flow rate of 3.6 million lbm/hr are presented: Analysis H: intact loop 200% hot leg break, emergency core cooling (ECC) system B unavailable; Analysis K: pressurizer relief valve stuck in the open position, ECC system B unavailable; and Analysis K1: same as Analysis K, but using a primary coolant system flow rate of 1.92 million lbm/hr (L2-4 pre-LOCE flow rate). For Analysis D, the maximum cladding temperature reached was 1762°F, 22 sec into reflood. In Analyses E and G, the blowdowns were slower because one of the QOBVs did not function. The maximum cladding temperature reached in Analysis E was 1700°F, 64.7 sec into reflood; for Analysis G, it was 1300°F at the start of reflood. For Analysis H, the maximum cladding temperature reached was 1825°F, 0.01 sec into reflood. Analysis K was a very slow blowdown, and the cladding temperatures followed the saturation temperature of the system. The results of Analysis K1 were nearly identical to those of Analysis K; system depressurization was not affected by the primary coolant system flow rate.

  19. Standardized Definitions for Code Verification Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  20. ACHILLES: Heat Transfer in PWR Core During LOCA Reflood Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2013-11-01

1. NAME AND TITLE OF DATA LIBRARY ACHILLES - Heat Transfer in PWR Core During LOCA Reflood Phase. 2. NAME AND TITLE OF DATA RETRIEVAL PROGRAMS N/A 3. CONTRIBUTOR AEA Technology, Winfrith Technology Centre, Dorchester DT2 8DH United Kingdom through the OECD Nuclear Energy Agency Data Bank, Issy-les-Moulineaux, France. 4. DESCRIPTION OF TEST FACILITY The most important features of the Achilles rig were the shroud vessel, which contained the test section, and the downcomer. These may be thought of as representing the core barrel and the annular downcomer in the reactor pressure vessel. The test section comprised a cluster of 69 rods in a square array within a circular shroud vessel. The rod diameter and pitch (9.5 mm and 12.6 mm) were typical of PWR dimensions. The internal diameter of the shroud vessel was 128 mm. Each rod was electrically heated over a length of 3.66 m, which is typical of the nuclear heated length in a PWR fuel rod, and each contained 6 internal thermocouples. These were arranged in one of 8 groupings which concentrated the thermocouples in different axial zones. The spacer grids were at prototypic PWR locations. Each grid had two thermocouples attached to its trailing edge at radial locations. The axial power profile along the rods was an 11-step approximation to a "chopped cosine". The shroud vessel had 5 heating zones whose power could be independently controlled. 5. DESCRIPTION OF TESTS The Achilles experiments investigated the heat transfer in the core of a Pressurized Water Reactor during the re-flood phase of a postulated large break loss of coolant accident. The results provided data to validate codes and to improve modeling. Different types of experiments were carried out which included single phase cooling, re-flood under low flow conditions, level swell and re-flood under high flow conditions. Three series of experiments were performed. 
The first and the third used the same test section but the second used another test section, similar in all respects except that it contained a partial blockage formed by attaching sleeves (or "balloons") to some of the rods. 6. SOURCE AND SCOPE OF DATA Phenomena Tested - Heat transfer in the core of a PWR during the re-flood phase of a postulated large break LOCA. Test Designation - Achilles Rig. The programme includes the following types of experiments: - on an unballooned cluster: -- single phase air flow -- low pressure level swell -- low flooding rate re-flood -- high flooding rate re-flood - on a ballooned cluster containing 80% blockage formed by 16 balloon sleeves -- single phase air flow -- low flooding rate re-flood 7. DISCUSSION OF THE DATA RETRIEVAL PROGRAM N/A 8. DATA FORMAT AND COMPUTER Many Computers (M00019MNYCP00). 9. TYPICAL RUNNING TIME N/A 11. CONTENTS OF LIBRARY The ACHILLES package contains test data and associated data processing software as well as the documentation listed above. 12. DATE OF ABSTRACT November 2013. KEYWORDS: DATABASES, BENCHMARKS, HEAT TRANSFER, LOSS-OF-COOLANT ACCIDENT, PWR REACTORS, REFLOODING

  1. Youth Top Problems: using idiographic, consumer-guided assessment to identify treatment needs and to track change during psychotherapy.

    PubMed

    Weisz, John R; Chorpita, Bruce F; Frye, Alice; Ng, Mei Yi; Lau, Nancy; Bearman, Sarah Kate; Ugueto, Ana M; Langer, David A; Hoagwood, Kimberly E

    2011-06-01

    To complement standardized measurement of symptoms, we developed and tested an efficient strategy for identifying (before treatment) and repeatedly assessing (during treatment) the problems identified as most important by caregivers and youths in psychotherapy. A total of 178 outpatient-referred youths, 7-13 years of age, and their caregivers separately identified the 3 problems of greatest concern to them at pretreatment and then rated the severity of those problems weekly during treatment. The Top Problems measure thus formed was evaluated for (a) whether it added to the information obtained through empirically derived standardized measures (e.g., the Child Behavior Checklist [CBCL; Achenbach & Rescorla, 2001] and the Youth Self-Report [YSR; Achenbach & Rescorla, 2001]) and (b) whether it met conventional psychometric standards. The problems identified were significant and clinically relevant; most matched CBCL/YSR items while adding specificity. The top problems also complemented the information yield of the CBCL/YSR; for example, for 41% of caregivers and 79% of youths, the identified top problems did not correspond to any items of any narrowband scales in the clinical range. Evidence on test-retest reliability, convergent and discriminant validity, sensitivity to change, slope reliability, and the association of Top Problems slopes with standardized measure slopes supported the psychometric strength of the measure. The Top Problems measure appears to be a psychometrically sound, client-guided approach that complements empirically derived standardized assessment; the approach can help focus attention and treatment planning on the problems that youths and caregivers consider most important and can generate evidence on trajectories of change in those problems during treatment. (PsycINFO Database Record (c) 2011 APA, all rights reserved).

  2. Proficiency Standards and Cut-Scores for Language Proficiency Tests.

    ERIC Educational Resources Information Center

    Moy, Raymond H.

    The problem of standard setting on language proficiency tests is often approached by the use of norms derived from the group being tested, a process commonly known as "grading on the curve." One particular problem with this ad hoc method of standard setting is that it will usually result in a fluctuating standard dependent on the particular group…

  3. Automated Hypothesis Tests and Standard Errors for Nonstandard Problems with Description of Computer Package: A Draft.

    ERIC Educational Resources Information Center

    Lord, Frederic M.; Stocking, Martha

    A general Computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…

  4. When procedures discourage insight: epistemological consequences of prompting novice physics students to construct force diagrams

    NASA Astrophysics Data System (ADS)

    Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.

    2017-05-01

    One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to construct force diagrams. International Journal of Science Education, 32(14), 1829-1851] to test how cuing the first step in a standard framework affects undergraduate students' approaches and evaluation of solutions in physics problem solving. Specifically, prompting the construction of a standard diagram before problem solving increases the use of standard procedures, decreasing the use of a conceptual shortcut. Providing a diagram prompt also lowers students' ratings of informal approaches to similar problems. These results suggest that reminding students to follow typical problem-solving frameworks limits their views of what counts as good problem solving.

  5. Passive containment cooling system with drywell pressure regulation for boiling water reactor

    DOEpatents

    Hill, P.R.

    1994-12-27

    A boiling water reactor is described having a regulating valve for placing the wetwell in flow communication with an intake duct of the passive containment cooling system. This subsystem can be adjusted to maintain the drywell pressure at (or slightly below or above) wetwell pressure after the initial reactor blowdown transient is over. This addition to the PCCS design has the benefit of eliminating or minimizing steam leakage from the drywell to the wetwell in the longer-term post-LOCA time period and also minimizes the temperature difference between drywell and wetwell. This in turn reduces the rate of long-term pressure buildup of the containment, thereby extending the time to reach the design pressure limit. 4 figures.

  6. Intensive motivational interviewing for women with concurrent alcohol problems and methamphetamine dependence.

    PubMed

    Korcha, Rachael A; Polcin, Douglas L; Evans, Kristy; Bond, Jason C; Galloway, Gantt P

    2014-02-01

    Motivational interviewing (MI) for the treatment of alcohol and drug problems is typically conducted over 1 to 3 sessions. The current work evaluates an intensive 9-session version of MI (Intensive MI) compared to a standard single MI session (Standard MI) using 163 methamphetamine (MA) dependent individuals. The primary purpose of this paper is to report the unexpected finding that women with co-occurring alcohol problems in the Intensive MI condition reduced the severity of their alcohol problems significantly more than women in the Standard MI condition at the 6-month follow-up. Stronger perceived alliance with the therapist was inversely associated with alcohol problem severity scores. Findings indicate that Intensive MI is a beneficial treatment for alcohol problems among women with MA dependence. © 2013.

  7. Air Pollution over the States

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1972

    1972-01-01

    State plans for implementing air quality standards are evaluated together with problems in modeling procedures and enforcement. Monitoring networks, standards, air quality regions, and industrial problems are also discussed. (BL)

  8. SGML-Based Markup for Literary Texts: Two Problems and Some Solutions.

    ERIC Educational Resources Information Center

    Barnard, David; And Others

    1988-01-01

    Identifies the Standard Generalized Markup Language (SGML) as the best basis for a markup standard for encoding literary texts. Outlines solutions to problems using SGML and discusses the problem of maintaining multiple views of a document. Examines several ways of reducing the burden of markups. (GEA)

  9. RELAP5 Analyses of OECD/NEA ROSA-2 Project Experiments on Intermediate-Break LOCAs at Hot Leg or Cold Leg

    NASA Astrophysics Data System (ADS)

    Takeda, Takeshi; Maruyama, Yu; Watanabe, Tadashi; Nakamura, Hideo

Experiments simulating PWR intermediate-break loss-of-coolant accidents (IBLOCAs) with a 17% break at the hot leg or cold leg were conducted in the OECD/NEA ROSA-2 Project using the Large Scale Test Facility (LSTF). In the hot leg IBLOCA test, core uncovery started simultaneously with the liquid level drop in the crossover leg downflow-side before loop seal clearing (LSC), which was induced by steam condensation on accumulator coolant injected into the cold leg. Water remained on the upper core plate in the upper plenum due to counter-current flow limiting (CCFL) caused by significant upward steam flow from the core. In the cold leg IBLOCA test, core dryout took place due to a rapid liquid level drop in the core before LSC. Liquid accumulated in the upper plenum, the steam generator (SG) U-tube upflow-side, and the SG inlet plenum before the LSC due to CCFL by high-velocity vapor flow, enhancing the decrease in the core liquid level. RELAP5/MOD3.2.1.2 post-test analyses of the two LSTF experiments were performed employing the code's critical flow model with a discharge coefficient of 1.0. In the hot leg IBLOCA case, the cladding surface temperature of the simulated fuel rods was underpredicted due to overprediction of the core liquid level after core uncovery. In the cold leg IBLOCA case, the cladding surface temperature was also underpredicted, due to later core uncovery than in the experiment. These results suggest that the code still has problems in properly predicting the primary coolant distribution.

  10. Mission Mathematics: Linking Aerospace and the NCTM Standards, K-6.

    ERIC Educational Resources Information Center

    Hynes, Mary Ellen, Ed.

    This book is designed to present mathematical problems and tasks that focus on the National Council of Teachers of Mathematics (NCTM) curriculum and evaluation standards in the context of aerospace activities. It aims at actively engaging students in NCTM's four process standards: (1) problem solving; (2) mathematical reasoning; (3) communicating…

  11. Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations

    NASA Astrophysics Data System (ADS)

    Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans

    2017-01-01

Micromagnetic simulations are now a common tool for studying a wide range of magnetic phenomena, including ferromagnetic resonance (FMR). One technique for evaluating the reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well-defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite-difference and finite-element numerical methods, with the OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for the evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
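The Fourier-transform step this abstract describes can be illustrated without a micromagnetic simulator: FFT a ringdown signal and read the resonance peaks off the power spectrum. The damped two-mode signal below is a synthetic stand-in for OOMMF/Nmag output, with arbitrary assumed frequencies and decay times:

```python
import numpy as np

# Synthetic "ringdown": sum of two damped modes standing in for the
# spatially averaged magnetization response after a pulsed excitation.
# Frequencies, amplitudes, and decay time are illustrative assumptions.
dt = 5e-12                       # 5 ps sampling step
t = np.arange(4096) * dt
f1, f2 = 8.25e9, 11.25e9         # assumed mode frequencies (Hz)
signal = (1.0 * np.exp(-t / 2e-9) * np.sin(2 * np.pi * f1 * t)
          + 0.4 * np.exp(-t / 2e-9) * np.sin(2 * np.pi * f2 * t))

# FMR spectrum: power of the one-sided FFT of the time trace
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(t), dt)

peak = freqs[np.argmax(spectrum)]
print(f"dominant resonance: {peak / 1e9:.2f} GHz")
```

The frequency resolution is 1/(N*dt), so longer simulated time traces sharpen the resonance peaks; the same post-processing applies to real simulator output.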

  12. Addressing Beyond Standard Model physics using cosmology

    NASA Astrophysics Data System (ADS)

    Ghalsasi, Akshay

We have consensus models for both particle physics (the standard model) and cosmology (LambdaCDM). Given certain assumptions about the initial conditions of the universe, the marriage of the standard model (SM) of particle physics and LambdaCDM cosmology has been phenomenally successful in describing the universe we live in. However, it is quite clear that all is not well. The three biggest problems the SM faces today are baryogenesis, dark matter, and dark energy. These problems, along with the problem of neutrino masses, indicate the existence of physics beyond the SM. Evidence for baryogenesis, dark matter, and dark energy all comes from astrophysical and cosmological observations. Cosmology also provides the best (model-dependent) constraints on neutrino masses. In this thesis I will try to address the following problems: 1) the origin of dark energy (DE), using non-standard neutrino cosmology and exploring the effects of that cosmology on terrestrial and cosmological experiments; and 2) the matter-antimatter asymmetry of the universe.

  13. Incorporating the Common Core's Problem Solving Standard for Mathematical Practice into an Early Elementary Inclusive Classroom

    ERIC Educational Resources Information Center

    Fletcher, Nicole

    2014-01-01

    Mathematics curriculum designers and policy decision makers are beginning to recognize the importance of problem solving, even at the earliest stages of mathematics learning. The Common Core includes sense making and perseverance in solving problems in its standards for mathematical practice for students at all grade levels. Incorporating problem…

  14. Promoting Access to Common Core Mathematics for Students with Severe Disabilities through Mathematical Problem Solving

    ERIC Educational Resources Information Center

    Spooner, Fred; Saunders, Alicia; Root, Jenny; Brosh, Chelsi

    2017-01-01

    There is a need to teach the pivotal skill of mathematical problem solving to students with severe disabilities, moving beyond basic skills like computation to higher level thinking skills. Problem solving is emphasized as a Standard for Mathematical Practice in the Common Core State Standards across grade levels. This article describes a…

  15. Analyzing Multilevel Data: An Empirical Comparison of Parameter Estimates of Hierarchical Linear Modeling and Ordinary Least Squares Regression

    ERIC Educational Resources Information Center

    Rocconi, Louis M.

    2011-01-01

    Hierarchical linear models (HLM) solve the problems associated with the unit of analysis problem such as misestimated standard errors, heterogeneity of regression and aggregation bias by modeling all levels of interest simultaneously. Hierarchical linear modeling resolves the problem of misestimated standard errors by incorporating a unique random…
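The misestimated standard errors mentioned here are easy to demonstrate: with clustered data, naive OLS standard errors on a cluster-level predictor are too small. The sketch below uses simulated data and cluster-robust "sandwich" errors as a minimal stand-in for a full HLM fit; all names and parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, n_per = 30, 20
x_g = rng.normal(size=n_groups)               # cluster-level predictor
u_g = rng.normal(scale=1.0, size=n_groups)    # random intercepts
x = np.repeat(x_g, n_per)
y = 2.0 + 0.5 * x + np.repeat(u_g, n_per) + rng.normal(size=n_groups * n_per)

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Naive OLS standard errors (wrongly assume i.i.d. errors)
sigma2 = resid @ resid / (len(y) - 2)
se_naive = np.sqrt(np.diag(sigma2 * XtX_inv))

# Cluster-robust (CR0 sandwich) standard errors: sum scores per group
groups = np.repeat(np.arange(n_groups), n_per)
meat = np.zeros((2, 2))
for g in range(n_groups):
    s = X[groups == g].T @ resid[groups == g]
    meat += np.outer(s, s)
se_cluster = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

print("slope SE, naive OLS:", se_naive[1])
print("slope SE, cluster-robust:", se_cluster[1])
```

With strong intra-cluster correlation, the cluster-aware standard error is several times the naive one, which is exactly the inferential error HLM is designed to avoid.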

  16. [Management and orientation of a hand laceration].

    PubMed

    Masmejean, Emmanuel

    2013-11-01

Proper management and orientation of a hand laceration by the general physician is essential. Anatomical knowledge helps to judge, after examination, whether surgical exploration under local anesthesia is warranted. The stakes are both prognostic and economic. The conclusion identifies three clinical pictures: the simple superficial wound requiring only a clinical check at 2 days' follow-up, the dubious wound that needs to be referred to a specialized center, and the wound requiring care in an emergency hand unit. Extremely urgent wounds are devascularization, amputation, and high-pressure injection injuries. Bites and wounds over the path of a tendon require surgical exploration. The bandage should be as simple as possible in order to allow early motion. No antibiotic is given preventively except for bites, open fractures, and/or delayed treatment. Outpatient surgery under local anesthesia simplifies management.

  17. The Problem of Correspondence of Educational and Professional Standards (Results of Empirical Research)

    ERIC Educational Resources Information Center

    Piskunova, Elena; Sokolova, Irina; Kalimullin, Aydar

    2016-01-01

    In the article, the problem of correspondence of educational standards of higher pedagogical education and teacher professional standards in Russia is actualized. Modern understanding of the quality of vocational education suggests that in the process of education the student develops a set of competencies that will enable him or her to carry out…

  18. Status and analysis of test standard for on-board charger

    NASA Astrophysics Data System (ADS)

    Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan

    2018-05-01

This paper analyzes the test standards for on-board chargers (OBCs). During testing, we found several problems with the test methods and functional requirements, such as failure to follow the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.

  19. An information geometric approach to least squares minimization

    NASA Astrophysics Data System (ADS)

    Transtrum, Mark; Machta, Benjamin; Sethna, James

    2009-03-01

    Parameter estimation by nonlinear least squares minimization is a ubiquitous problem that has an elegant geometric interpretation: all possible parameter values induce a manifold embedded within the space of data. The minimization problem is then to find the point on the manifold closest to the origin. The standard algorithm for minimizing sums of squares, the Levenberg-Marquardt algorithm, also has geometric meaning. When the standard algorithm fails to efficiently find accurate fits to the data, geometric considerations suggest improvements. Problems involving large numbers of parameters, such as often arise in biological contexts, are notoriously difficult. We suggest an algorithm based on geodesic motion that may offer improvements over the standard algorithm for a certain class of problems.
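A minimal version of the standard Levenberg-Marquardt update this abstract refers to (damped Gauss-Newton with an accept/reject heuristic) can be sketched as follows. This is an illustrative implementation on a toy exponential fit, not the geodesic variant the authors propose:

```python
import numpy as np

def levenberg_marquardt(r, J, p0, iters=50, lam=1e-3):
    """Minimize sum(r(p)**2) with the classic LM damping heuristic:
    shrink lambda toward Gauss-Newton on accepted steps, grow it
    toward gradient descent on rejected ones."""
    p = np.asarray(p0, float)
    cost = np.sum(r(p) ** 2)
    for _ in range(iters):
        Jp, rp = J(p), r(p)
        A = Jp.T @ Jp
        # Marquardt scaling: damp with the diagonal of J^T J
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -Jp.T @ rp)
        p_new = p + step
        cost_new = np.sum(r(p_new) ** 2)
        if cost_new < cost:
            p, cost, lam = p_new, cost_new, lam * 0.3
        else:
            lam *= 2.0
    return p

# Toy problem: fit y = a * exp(-b * t) to noiseless synthetic data
t = np.linspace(0, 5, 40)
y = 2.0 * np.exp(-1.3 * t)
r = lambda p: p[0] * np.exp(-p[1] * t) - y        # residual vector
J = lambda p: np.column_stack([np.exp(-p[1] * t),
                               -p[0] * t * np.exp(-p[1] * t)])
p_fit = levenberg_marquardt(r, J, [1.0, 0.5])
print(p_fit)
```

In the geometric picture of the abstract, each accepted step moves the model point along the manifold of predicted data toward the observed data; the damping term controls how far the step trusts the local linearization.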

  20. Pupils' Visual Representations in Standard and Problematic Problem Solving in Mathematics: Their Role in the Breach of the Didactical Contract

    ERIC Educational Resources Information Center

    Deliyianni, Eleni; Monoyiou, Annita; Elia, Iliada; Georgiou, Chryso; Zannettou, Eleni

    2009-01-01

    This study investigated the modes of representations generated by kindergarteners and first graders while solving standard and problematic problems in mathematics. Furthermore, it examined the influence of pupils' visual representations on the breach of the didactical contract rules in problem solving. The sample of the study consisted of 38…

  1. Application of a Mixed Consequential Ethical Model to a Problem Regarding Test Standards.

    ERIC Educational Resources Information Center

    Busch, John Christian

    The work of the ethicist Charles Curran and the problem-solving strategy of the mixed consequentialist ethical model are applied to a traditional social science measurement problem--that of how to adjust a recommended standard in order to be fair to the test-taker and society. The focus is on criterion-referenced teacher certification tests.…

  2. Improved National Response to Climate Change: Aligning USGCRP reports and the U.S. Climate Resilience Toolkit

    NASA Astrophysics Data System (ADS)

    Lipschultz, F.; Dahlman, L. E.; Herring, D.; Fox, J. F.

    2017-12-01

    As part of an effort to coordinate production and distribution of scientific climate information across the U.S. Government, and to spur adaptation actions across the nation, the U.S. Global Change Research Program (USGCRP) has worked to better integrate the U.S. Climate Resilience Toolkit (CRT) and its Climate Explorer (CE) tool into USGCRP activities and products. Much of the initial CRT content was based on the Third National Climate Assessment (NCA3). The opportunity to integrate current development of NCA4—scheduled for release in late 2018—with CRT and CE can enhance all three projects and result in a useable and "living" NCA that is part of USGCRP's approach to sustained climate assessment. To coordinate this work, a USGCRP-led science team worked with CRT staff and CE developers to update the set of climate projections displayed in the CE tool. In concert with the USGCRP scenarios effort, the combined team selected the Localized Constructed Analogs (LOCA) dataset for the updated version of CE, based on its capabilities for capturing climate extremes and local climate variations. The team identified 28 variables from the LOCA dataset for display in the CE; many of these variables will also be used in USGCRP reports. In CRT engagements, communities with vulnerable assets have expressed a high value for the ability to integrate climate data available through the CE with data related to non-climate stressors in their locations. Moving forward, the teams intend to serve climate information needs at additional spatial scales by making NCA4 content available via CE's capability for dynamic interaction with climate-relevant datasets. This will permit users to customize the extent of data they access for decision-making, starting with the static NCA4 report. Additionally, NCA4 case studies and other content can be linked to more in-depth content within the CRT site. 
This capability will enable more frequent content updates than can be managed with quadrennial NCA reports. Overall, enhanced integration between USGCRP and CRT will provide consistent information for communities that are assessing their climate vulnerabilities or considering adaptation options.

  3. [Evaluation of the standard application of Delphi in the diagnosis of chronic obstructive pulmonary disease caused by occupational irritant chemicals].

    PubMed

    Zhao, L; Yan, Y J

    2017-11-20

Objective: To investigate the problems encountered in the application of the standard (hereinafter referred to as "the standard") for the diagnosis of chronic obstructive pulmonary disease (COPD) caused by occupational irritant chemicals, to provide a reference for the revision of the new standard, to reduce the number of missed occupational COPD diagnoses, and to remove workers suffering from chronic respiratory diseases due to long-term exposure to toxicants from the harmful working environment, slowing the progression of the disease. Methods: Using the Delphi expert research method, after review by senior experts, the problems encountered in the systematic evaluation of the standard GBZ 237-2011, "Diagnosis of occupational chronic obstructive pulmonary disease caused by irritant chemicals," were identified and expert advice was sought; the problems encountered during the clinical implementation of the standard promulgated in 2011 are presented. Results: The Delphi expert investigation found that experts agree on the content evaluation and implementation evaluation of the standard, but the operational evaluation of the standard is disputed. Based on clinical experience, the experts believe that the range of occupational irritant gases covered should be expanded, and that the handling of smoking history, length-of-service determination, and occupational contact history during diagnosis is operationally problematic. Conclusions: Since the promulgation in 2011 of the criteria for the diagnosis of chronic obstructive pulmonary disease caused by occupational irritant chemicals, there have been problems in the implementation process that have left many workers occupationally exposed to irritant gases suffering from occupational chronic respiratory diseases without a definitive diagnosis.

  4. Authentication: A Standard Problem or a Problem of Standards?

    PubMed

    Capes-Davis, Amanda; Neve, Richard M

    2016-06-01

    Reproducibility and transparency in biomedical sciences have been called into question, and scientists have been found wanting as a result. Putting aside deliberate fraud, there is evidence that a major contributor to lack of reproducibility is insufficient quality assurance of reagents used in preclinical research. Cell lines are widely used in biomedical research to understand fundamental biological processes and disease states, yet most researchers do not perform a simple, affordable test to authenticate these key resources. Here, we provide a synopsis of the problems we face and how standards can contribute to an achievable solution.

  5. A process for reaching standardization of word processing software for Sandia National Laboratories (Albuquerque) secretaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, S.R.

    1989-04-01

    In the summer of 1986, a number of problems being experienced by Sandia secretaries due to multiple word processing packages being used were brought to the attention of Sandia's upper management. This report discusses how these problems evolved, how management chose to correct the problem, and how standardization of word processing for Sandia secretaries was achieved. 11 refs.

  6. Dependability of technical items: Problems of standardization

    NASA Astrophysics Data System (ADS)

    Fedotova, G. A.; Voropai, N. I.; Kovalev, G. F.

    2016-12-01

This paper is concerned with problems that arose in the development of a new version of the Interstate Standard GOST 27.002, "Industrial product dependability. Terms and definitions." This Standard covers a wide range of technical items and is used in numerous regulations, specifications, and standard and technical documentation. The currently available State Standard GOST 27.002-89 was introduced in 1990. Its development involved scientists and experts from different technical areas; its draft was debated by different audiences and constantly refined, so it was a high-quality document. However, after 25 years of application it has become necessary to develop a new version of the Standard that reflects the current understanding of industrial dependability, accounting for the changes taking place in Russia in the production, management, and development of various technical systems and facilities. The development of a new version of the Standard makes it possible to generalize, on a terminological level, the knowledge and experience in the reliability of technical items accumulated over a quarter of a century in different industries and reliability research schools, and to account for domestic and foreign standardization experience. Working on the new version of the Standard, we faced a number of issues and problems in harmonization with the International Standard IEC 60050-192, caused first of all by different approaches to the use of terms and by differences in the mentalities of experts from different countries. The paper focuses on the problems related to the chapter "Maintenance, restoration and repair," which was difficult for the developers to harmonize both among experts and with the International Standard, mainly because of differences between the Russian concept and practice of maintenance and repair and foreign ones.

  7. Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.

    ERIC Educational Resources Information Center

    Schiano, Diane J.; And Others

    Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…

  8. Assembling Appliances Standards from a Basket of Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siderious, Hans-Paul; Meier, Alan

    2014-08-11

    Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats, displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.

  9. In-pile tests at Karlsruhe of LWR fuel-rod behavior during the heatup phase of a LOCA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karb, E.H.

    1980-01-01

    In order to investigate the influence of a nuclear environment on the mechanisms of fuel-rod failure, in-pile tests simulating the heatup phase of a loss-of-coolant accident in a pressurized-water reactor are being conducted with irradiated and unirradiated short-length single rods in the FR2 reactor at Kernforschungszentrum Karlsruhe (Karlsruhe Nuclear Research Center), Federal Republic of Germany, within the Project Nuclear Safety. With nearly 70% of the scheduled tests completed, no such influences have been found. The in-pile burst and deformation data are in good agreement with results from nonnuclear tests with electrically heated fuel-rod simulators. The phenomenon of pellet disintegration, which has been observed in all tests with previously irradiated rods, needs further investigation.

  10. 77 FR 9239 - California State Motor Vehicle and Nonroad Engine Pollution Control Standards; Truck Idling...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... Pollution Control Standards; Truck Idling Requirements; Notice of Decision AGENCY: Environmental Protection... to meet its serious air pollution problems. Likewise, EPA has consistently recognized that California... and high concentrations of automobiles, create serious pollution problems.'' \\37\\ Furthermore, no...

  11. Learning to Write about Mathematics

    ERIC Educational Resources Information Center

    Parker, Renee; Breyfogle, M. Lynn

    2011-01-01

    Beginning in third grade, Pennsylvania students are required to take the Pennsylvania State Standardized Assessment (PSSA), which presents multiple-choice mathematics questions and open-ended mathematics problems. Consistent with the Communication Standard of the National Council of Teachers of Mathematics, while solving the open-ended problems,…

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J E; Vassilevski, P S; Woodward, C S

    This paper provides extensions of an element agglomeration AMG method to nonlinear elliptic problems discretized by the finite element method on general unstructured meshes. The method constructs coarse discretization spaces and corresponding coarse nonlinear operators as well as their Jacobians. We introduce both standard (fairly quasi-uniformly coarsened) and non-standard (coarsened away) coarse meshes and respective finite element spaces. We use both kinds of spaces in FAS-type coarse subspace correction (or Schwarz) algorithms. Their performance is illustrated on a number of model problems. The coarsened-away spaces seem to perform better than the standard spaces for problems with nonlinearities in the principal part of the elliptic operator.
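    The FAS coarse-grid correction referred to above can be illustrated in a much simpler setting than the paper's element-agglomeration AMG. Below is a sketch of a two-grid FAS (full approximation scheme) cycle for the 1D model problem -u'' + u^3 = f with zero Dirichlet boundary conditions and nonlinear Gauss-Seidel smoothing; the operator, grids, and smoother are illustrative assumptions, not the authors' method:

    ```python
    import numpy as np

    def A(u, h):
        """Nonlinear operator A(u) = -u'' + u^3 with zero Dirichlet BCs."""
        up = np.concatenate(([0.0], u, [0.0]))
        return (-up[:-2] + 2 * up[1:-1] - up[2:]) / h**2 + u**3

    def gs_smooth(u, f, h, sweeps):
        """Nonlinear Gauss-Seidel: one scalar Newton update per point."""
        for _ in range(sweeps):
            for i in range(len(u)):
                left = u[i - 1] if i > 0 else 0.0
                right = u[i + 1] if i + 1 < len(u) else 0.0
                g = (2 * u[i] - left - right) / h**2 + u[i]**3 - f[i]
                u[i] -= g / (2 / h**2 + 3 * u[i]**2)
        return u

    def fas_cycle(u, f, h):
        """One FAS two-grid cycle with standard (uniform) coarsening."""
        n = len(u); nc = (n - 1) // 2; hc = 2 * h
        u = gs_smooth(u, f, h, 3)                 # pre-smoothing
        uc = u[1::2].copy()                       # inject to the coarse grid
        rc = (f - A(u, h))[1::2]                  # restricted residual
        fc = A(uc, hc) + rc                       # FAS coarse right-hand side
        vc = gs_smooth(uc.copy(), fc, hc, 100)    # (near-)exact coarse solve
        ec = vc - uc                              # coarse-grid correction
        e = np.zeros(n)                           # linear interpolation
        e[1::2] = ec
        e[0], e[-1] = ec[0] / 2, ec[-1] / 2
        e[2:-1:2] = (ec[:-1] + ec[1:]) / 2
        return gs_smooth(u + e, f, h, 3)          # post-smoothing
    ```

    The key FAS ingredient is that the coarse problem is posed for the full approximation (A_c applied to the restricted solution plus the restricted residual), not just for the error, which is what makes the scheme applicable to nonlinear operators.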

  13. [Development of a software standardizing optical density with operation settings related to several limitations].

    PubMed

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that can standardize optical density and normalize the procedures and results of standardization, in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to address the problems one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on the I-STOD method. It can be easily operated and can effectively standardize the test results of indirect ELISA.

  14. Solving standard traveling salesman problem and multiple traveling salesman problem by using branch-and-bound

    NASA Astrophysics Data System (ADS)

    Saad, Shakila; Wan Jaafar, Wan Nurhadani; Jamil, Siti Jasmida

    2013-04-01

    The standard Traveling Salesman Problem (TSP) is the classical single-salesman formulation, while the Multiple Traveling Salesman Problem (MTSP) is an extension of the TSP in which more than one salesman is involved. The objective is to find the least costly route a salesman can take if he wishes to visit each of a list of n cities exactly once and then return to the home city. There are a few methods that can be used to solve the MTSP. The objective of this research is to implement an exact method, the Branch-and-Bound (B&B) algorithm. Briefly, the idea of the B&B algorithm is to start with the associated Assignment Problem (AP). A Breadth-First Search (BFS) branching strategy is then applied to the TSP and the MTSP. Instances with 11 city nodes are solved for both problems, and the solutions are presented.
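    As a rough illustration of breadth-first branch-and-bound for the single-salesman TSP, the sketch below branches over partial tours with a BFS queue; note it substitutes a simple minimum-outgoing-edge lower bound for the assignment-problem relaxation described in the abstract:

    ```python
    from collections import deque

    def tsp_branch_and_bound(dist):
        """Breadth-first branch-and-bound for the TSP, pruning partial
        tours with a cheap lower bound: cost so far plus the cheapest
        outgoing edge of every city still to be left."""
        n = len(dist)
        min_out = [min(dist[i][j] for j in range(n) if j != i) for i in range(n)]
        best_cost, best_tour = float("inf"), None
        queue = deque([([0], 0.0)])          # partial tours rooted at city 0
        while queue:
            tour, cost = queue.popleft()
            if len(tour) == n:               # close the cycle back to city 0
                total = cost + dist[tour[-1]][0]
                if total < best_cost:
                    best_cost, best_tour = total, tour
                continue
            for city in range(n):
                if city in tour:
                    continue
                new_cost = cost + dist[tour[-1]][city]
                rest = [c for c in range(n) if c not in tour and c != city]
                bound = new_cost + min_out[city] + sum(min_out[c] for c in rest)
                if bound < best_cost:        # prune hopeless branches
                    queue.append((tour + [city], new_cost))
        return best_cost, best_tour
    ```

    Because the bound never overestimates the true completion cost, pruning preserves optimality; a tighter bound (such as the AP relaxation) simply prunes more of the tree.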

  15. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for a standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as a discrete optimization problem (a Generalized Traveling Salesman Problem with additional constraints, GTSP). The formalization of some constraints for these tasks is described. To solve the GTSP, we propose using Prof. Chentsov's mathematical model based on the concept of a megalopolis and dynamic programming.

  16. Boosting standard order sets utilization through clinical decision support.

    PubMed

    Li, Haomin; Zhang, Yinsheng; Cheng, Haixia; Lu, Xudong; Duan, Huilong

    2013-01-01

    Well-designed standard order sets have the potential to integrate and coordinate care by communicating best practices across multiple disciplines, levels of care, and services. However, several challenges limit the benefits expected from standard order sets. To boost standard order set utilization, a problem-oriented knowledge delivery solution was proposed in this study to facilitate access to standard order sets and evaluation of their treatment effects. In this solution, standard order sets were created along with diagnostic rule sets that can trigger a CDS-based reminder to help clinicians quickly discover hidden clinical problems and the corresponding standard order sets during ordering. Those rule sets also provide indicators for targeted evaluation of standard order sets during treatment. A prototype system was developed based on this solution and will be presented at Medinfo 2013.

  17. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  18. Research Problems Associated with Limiting the Applied Force in Vibration Tests and Conducting Base-Drive Modal Vibration Tests

    NASA Technical Reports Server (NTRS)

    Scharton, Terry D.

    1995-01-01

    The intent of this paper is to make a case for developing and conducting vibration tests which are both realistic and practical (a question of tailoring versus standards). Tests are essential for finding things overlooked in the analyses. The best test is often the most realistic test which can be conducted within the cost and budget constraints. Some standards are essential, but the author believes more in the individual's ingenuity to solve a specific problem than in the application of standards which reduce problems (and technology) to their lowest common denominator. Force limited vibration tests and base-drive modal tests are two examples of realistic, but practical testing approaches. Since both of these approaches are relatively new, a number of interesting research problems exist, and these are emphasized herein.

  19. The problem of epistemic jurisdiction in global governance: The case of sustainability standards for biofuels.

    PubMed

    Winickoff, David E; Mondou, Matthieu

    2017-02-01

    While there is ample scholarly work on regulatory science within the state, or single-sited global institutions, there is less on its operation within complex modes of global governance that are decentered, overlapping, multi-sectorial and multi-leveled. Using a co-productionist framework, this study identifies 'epistemic jurisdiction' - the power to produce or warrant technical knowledge for a given political community, topical arena or geographical territory - as a central problem for regulatory science in complex governance. We explore these dynamics in the arena of global sustainability standards for biofuels. We select three institutional fora as sites of inquiry: the European Union's Renewable Energy Directive, the Roundtable on Sustainable Biomaterials, and the International Organization for Standardization. These cases allow us to analyze how the co-production of sustainability science responds to problems of epistemic jurisdiction in the global regulatory order. First, different problems of epistemic jurisdiction beset different standard-setting bodies, and these problems shape both the content of regulatory science and the procedures designed to make it authoritative. Second, in order to produce global regulatory science, technical bodies must manage an array of conflicting imperatives - including scientific virtue, due process and the need to recruit adoptees to perpetuate the standard. At different levels of governance, standard drafters struggle to balance loyalties to country, to company or constituency and to the larger project of internationalization. Confronted with these sometimes conflicting pressures, actors across the standards system quite self-consciously maneuver to build or retain authority for their forum through a combination of scientific adjustment and political negotiation. 
Third, the evidentiary demands of regulatory science in global administrative spaces are deeply affected by 1) a market for standards, in which firms and states can choose the cheapest sustainability certification, and 2) the international trade regime, in which the long shadow of WTO law exerts a powerful disciplining function.

  20. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
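    The two calibration approaches can be contrasted on synthetic data; the linear instrument model, noise level, and standard values below are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x_std = np.linspace(0, 10, 12)            # known "gold standard" values
    y_obs = 2.0 + 0.5 * x_std + rng.normal(0, 0.05, x_std.size)

    # Classical approach: forward regression y = a + b*x, then invert it.
    b, a = np.polyfit(x_std, y_obs, 1)        # polyfit returns [slope, intercept]
    # Reverse regression: regress the standards directly on the observations.
    d, c = np.polyfit(y_obs, x_std, 1)

    y_new = 4.5                               # a new instrument reading
    x_inverse = (y_new - a) / b               # inverted forward fit
    x_reverse = c + d * y_new                 # direct reverse fit
    ```

    With a well-conditioned calibration (high correlation between standards and readings) the two estimates nearly coincide; the statistical differences the paper analyzes show up in the bias and variance of the estimators, not in a single prediction.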

  1. DICOMweb™: Background and Application of the Web Standard for Medical Imaging.

    PubMed

    Genereaux, Brad W; Dennison, Donald K; Ho, Kinson; Horn, Robert; Silver, Elliot Lewis; O'Donnell, Kevin; Kahn, Charles E

    2018-05-10

    This paper describes why and how DICOM, the standard that has been the basis for medical imaging interoperability around the world for several decades, has been extended into a full web technology-based standard, DICOMweb. At the turn of the century, healthcare embraced information technology, which created new problems and new opportunities for the medical imaging industry; at the same time, web technologies matured and began serving other domains well. This paper describes DICOMweb, how it extended the DICOM standard, and how DICOMweb can be applied to problems facing healthcare applications to address workflow and the changing healthcare climate.
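    As a small illustration of the web-based interface, the helper below builds a DICOMweb QIDO-RS study-search URL of the form defined in DICOM PS3.18 (`GET {base}/studies?...` with standard attribute keywords as query parameters); the server base URL is hypothetical:

    ```python
    from urllib.parse import urlencode

    def qido_studies_url(base_url, **params):
        """Build a DICOMweb QIDO-RS study-search URL (DICOM PS3.18).

        `base_url` is a hypothetical server root; the query keys are
        standard DICOM attribute keywords (e.g. PatientID, Modality)."""
        return f"{base_url}/studies?{urlencode(params)}"

    url = qido_studies_url("https://pacs.example.org/dicomweb",
                           PatientID="12345", Modality="CT")
    # The actual request would also send: Accept: application/dicom+json
    ```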

  2. [Problems Inherent in Attempting Standardization of Libraries.

    ERIC Educational Resources Information Center

    Port, Idelle

    In setting standards for a large and geographically dispersed library system, one must reconcile the many varying practices that affect what is being measured or discussed. The California State University and Colleges (CSUC) consists of 19 very distinct campuses. The problems and solutions of one type of CSUC library are not likely to be those of…

  3. Severe Accident Scoping Simulations of Accident Tolerant Fuel Concepts for BWRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb, Kevin R.

    2015-08-01

    Accident-tolerant fuels (ATFs) are fuels and/or cladding that, in comparison with the standard uranium dioxide Zircaloy system, can tolerate loss of active cooling in the core for a considerably longer time period while maintaining or improving the fuel performance during normal operations [1]. It is important to note that the currently used uranium dioxide Zircaloy fuel system tolerates design basis accidents (and anticipated operational occurrences and normal operation) as prescribed by the US Nuclear Regulatory Commission. Previously, preliminary simulations of the plant response have been performed under a range of accident scenarios using various ATF cladding concepts and fully ceramic microencapsulated fuel. Design basis loss of coolant accidents (LOCAs) and station blackout (SBO) severe accidents were analyzed at Oak Ridge National Laboratory (ORNL) for boiling water reactors (BWRs) [2]. Researchers have investigated the effects of thermal conductivity on design basis accidents [3], investigated silicon carbide (SiC) cladding [4], as well as the effects of ATF concepts on the late stage accident progression [5]. These preliminary analyses were performed to provide initial insight into the possible improvements that ATF concepts could provide and to identify issues with respect to modeling ATF concepts. More recently, preliminary analyses for a range of ATF concepts have been evaluated internationally for LOCA and severe accident scenarios for the Chinese CPR1000 [6] and the South Korean OPR-1000 [7] pressurized water reactors (PWRs). In addition to these scoping studies, a common methodology and set of performance metrics were developed to compare and support prioritizing ATF concepts [8]. A proposed ATF concept is based on iron-chromium-aluminum alloys (FeCrAl) [9]. With respect to enhancing accident tolerance, FeCrAl alloys have substantially slower oxidation kinetics compared to the zirconium alloys typically employed. During a severe accident, FeCrAl would tend to generate heat and hydrogen from oxidation at a slower rate compared to the zirconium-based alloys in use today. The previous study [2] of the FeCrAl ATF concept during station blackout (SBO) severe accident scenarios in BWRs was based on simulating short term SBO (STSBO), long term SBO (LTSBO), and modified SBO scenarios occurring in a BWR-4 reactor with MARK-I containment. The analysis indicated that FeCrAl had the potential to delay the onset of fuel failure by a few hours depending on the scenario, and it could delay lower head failure by several hours. The analysis demonstrated reduced in-vessel hydrogen production. However, the work was preliminary and was based on limited knowledge of material properties for FeCrAl. Limitations of the MELCOR code were identified for direct use in modeling ATF concepts. This effort used an older version of MELCOR (1.8.5). Since these analyses, the BWR model has been updated for use in MELCOR 1.8.6 [10], and more representative material properties for FeCrAl have been modeled. Sections 2-4 present updated analyses for the FeCrAl ATF concept response during severe accidents in a BWR. The purpose of the study is to estimate the potential gains afforded by the FeCrAl ATF concept during BWR SBO scenarios.

  4. Naturalness of Electroweak Symmetry Breaking

    NASA Astrophysics Data System (ADS)

    Espinosa, J. R.

    2007-02-01

    After revisiting the hierarchy problem of the Standard Model and its implications for the scale of New Physics, I consider the fine-tuning problem of electroweak symmetry breaking in two main scenarios beyond the Standard Model: SUSY and Little Higgs models. The main conclusions are that New Physics should appear within the reach of the LHC; that some SUSY models can solve the hierarchy problem with acceptable residual fine tuning; and, finally, that Little Higgs models generically suffer from large tunings, often hidden.

  5. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
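    The FIM-based comparison of sampling distributions can be sketched numerically: build the sensitivity matrix of a model with respect to its parameters, form the FIM, and read asymptotic standard errors off the inverse. The exponential-growth model, parameter values, and the two sampling grids below are illustrative assumptions, not the paper's examples:

    ```python
    import numpy as np

    def fim_standard_errors(t, a, b, sigma):
        """Asymptotic parameter standard errors for y(t) = a*exp(b*t),
        from the Fisher Information Matrix built out of sensitivities."""
        S = np.column_stack([np.exp(b * t),            # dy/da
                             a * t * np.exp(b * t)])   # dy/db
        fim = S.T @ S / sigma**2
        return np.sqrt(np.diag(np.linalg.inv(fim)))

    uniform = np.linspace(0.0, 2.0, 10)   # spread the samples out
    late    = np.linspace(1.5, 2.0, 10)   # cluster them near the end
    se_uniform = fim_standard_errors(uniform, a=1.0, b=1.0, sigma=0.1)
    se_late    = fim_standard_errors(late,    a=1.0, b=1.0, sigma=0.1)
    ```

    Clustering all samples in a short window makes the two sensitivity columns nearly collinear, so the FIM becomes ill-conditioned and the standard errors grow; this is exactly the kind of effect an optimal design criterion is meant to avoid.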

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Christopher

    In this talk, I review recent work on using a generalization of the Next-to-Minimal Supersymmetric Standard Model (NMSSM), called the Singlet-extended Minimal Supersymmetric Standard Model (SMSSM), to raise the mass of the Standard Model-like Higgs boson without requiring extremely heavy top squarks or large stop mixing. In so doing, this model solves the little hierarchy problem of the minimal model (MSSM), at the expense of leaving the μ-problem of the MSSM unresolved. This talk is based on work published in Refs. [1, 2, 3].

  7. Quantum annealing of the traveling-salesman problem.

    PubMed

    Martonák, Roman; Santoro, Giuseppe E; Tosatti, Erio

    2004-11-01

    We propose a path-integral Monte Carlo quantum annealing scheme for the symmetric traveling-salesman problem, based on a highly constrained Ising-like representation, and we compare its performance against standard thermal simulated annealing. The Monte Carlo moves implemented are standard, and consist in restructuring a tour by exchanging two links (two-opt moves). The quantum annealing scheme, even with a drastically simple form of kinetic energy, appears definitely superior to the classical one, when tested on a 1002-city instance of the standard TSPLIB.
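    A minimal sketch of the classical baseline the authors compare against, thermal simulated annealing with standard two-opt moves (the cooling schedule and parameters here are illustrative assumptions):

    ```python
    import math, random

    def two_opt_anneal(dist, n_steps=20000, t_hot=10.0, t_cold=0.01, seed=1):
        """Thermal simulated annealing for the symmetric TSP with standard
        two-opt moves: cut two links, reverse the segment between them,
        and accept the move by the Metropolis rule."""
        rng = random.Random(seed)
        n = len(dist)
        tour = list(range(n))
        rng.shuffle(tour)
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        for step in range(n_steps):
            temp = t_hot * (t_cold / t_hot) ** (step / n_steps)  # geometric cooling
            i, j = sorted(rng.sample(range(n), 2))
            if i == 0 and j == n - 1:          # reversing the whole tour is a no-op
                continue
            a, b = tour[i - 1], tour[i]        # links (a,b) and (c,d) become
            c, d = tour[j], tour[(j + 1) % n]  # (a,c) and (b,d) after the move
            delta = dist[a][c] + dist[b][d] - dist[a][b] - dist[c][d]
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                tour[i:j + 1] = reversed(tour[i:j + 1])
                cost += delta
        return cost, tour
    ```

    The quantum annealing scheme in the paper replaces the thermal acceptance rule with path-integral Monte Carlo over coupled replicas, but the two-opt move set is the same.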

  8. Does language ambiguity in clinical practice justify the introduction of standard terminology? An integrative review.

    PubMed

    Stallinga, Hillegonda A; ten Napel, Huib; Jansen, Gerard J; Geertzen, Jan H B; de Vries Robbé, Pieter F; Roodbol, Petrie F

    2015-02-01

    To research the use of ambiguous language in written information concerning patients' functioning and to identify problems resulting from the use of ambiguous language in clinical practice. Many projects that aimed to introduce standard terminology concerning patients' functioning in clinical practice are unsuccessful because standard terminology is rarely used in clinical practice. These projects mainly aim to improve communication by reducing ambiguous language. Considering their lack of success, the validity of the argument that language ambiguity is used in clinical practice is questioned. An integrative literature review. A systematic search of the MEDLINE (1950-2012) and CINAHL (1982-2012) databases was undertaken, including empirical and theoretical literature. The selected studies were critically appraised using a data assessment and extraction form. Seventeen of 767 papers were included in the review and synthesis. The use of ambiguous language in written information concerning patients' functioning was demonstrated. Problems resulting from the use of ambiguous language in clinical practice were not identified. However, several potential problems were suggested, including hindered clinical decision-making and limited research opportunities. The results of this review demonstrated the use of ambiguous language concerning patients' functioning, but health professionals in clinical practice did not experience this issue as a problem. This finding might explain why many projects aimed at introducing standard terminology concerning functioning in clinical practice to solve problems caused by ambiguous language are often unsuccessful. Language ambiguity alone is not a valid argument to justify the introduction of standard terminology. 
The introduction of standard terminology concerning patients' functioning will only be successful when clinical practice requires the aggregation and reuse of data from electronic patient records for different purposes, including multidisciplinary decision-making and research. © 2014 John Wiley & Sons Ltd.

  9. Flow-Field Measurements in the Windward Surface Shock Layer of Space Shuttle Orbiter Configurations at Mach Number 8

    DTIC Science & Technology

    1975-07-01

    ... pitot probes. Temperature probe TT2 was 0.010 in. in diameter and was used as the primary instrument. ... The lower pitot probe, PP1, was constructed of 0.020-in.-OD tubing tapered to 0.014 in. at the tip and had an inside ... in the model boundary layer. The other pitot probe, PP2, was located about 1.0 in. above PP1 and was constructed of ...

  10. Core-power and decay-time limits for disabled automatic-actuation of LOFT ECCS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, G.H.

    1978-06-05

    The Emergency Core Cooling System (ECCS) for the LOFT reactor may need to be disabled for modifications or repairs of hardware or instrumentation, or for component testing, during periods when the reactor system is hot and pressurized; it may therefore be desirable to be able to disable the ECCS without the necessity of cooling down and depressurizing the reactor. LTR 113-47 has shown that the LOFT ECCS can be safely bypassed or disabled when the total core power does not exceed 25 kW. A modified policy involves disabling the automatic actuation of the LOFT ECCS while still retaining the manual activation capability. Disabling of the automatic actuation can be safely employed, without subjecting the fuel cladding to unacceptable temperatures, once the LOFT power decays to 70 kW; this power level permits a maximum delay of 20 minutes following a LOCA for the manual actuation of the ECCS.

  11. SCORE-EVET: a computer code for the multidimensional transient thermal-hydraulic analysis of nuclear fuel rod arrays. [BWR; PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benedetti, R. L.; Lords, L. V.; Kiser, D. M.

    1978-02-01

    The SCORE-EVET code was developed to study multidimensional transient fluid flow in nuclear reactor fuel rod arrays. The conservation equations used were derived by volume averaging the transient compressible three-dimensional local continuum equations in Cartesian coordinates. No assumptions associated with subchannel flow have been incorporated into the derivation of the conservation equations. In addition to the three-dimensional fluid flow equations, the SCORE-EVET code contains: (a) a one-dimensional steady-state solution scheme to initialize the flow field, (b) steady-state and transient fuel rod conduction models, and (c) comprehensive correlation packages to describe fluid-to-fuel-rod interfacial energy and momentum exchange. Velocity and pressure boundary conditions can be specified as functions of time and space to model reactor transient conditions such as a hypothesized loss-of-coolant accident (LOCA) or flow blockage.

  12. Discrete element method study of fuel relocation and dispersal during loss-of-coolant accidents

    NASA Astrophysics Data System (ADS)

    Govers, K.; Verwerft, M.

    2016-09-01

    Fuel fragmentation, relocation and dispersal (FFRD) during LOCA transients continue to hold the attention of the nuclear safety community. The fine fragmentation observed at high burnup may, indeed, affect Emergency Core Cooling System performance: accumulation of fuel debris in the cladding ballooned zone leads to a redistribution of the temperature profile, while dispersal of debris might lead to coolant blockage or to debris circulation through the primary circuit. This work presents a contribution, by the discrete element method, towards a mechanistic description of the various stages of FFRD. The fuel fragments are described as a set of interacting particles behaving as a granular medium. The model shows qualitative and quantitative agreement with experimental observations, such as the packing efficiency in the balloon, which is shown to stabilize at about 55%. The model is then applied to study fuel dispersal, for which experimental parametric studies are both difficult and expensive.

  13. Effects of the air–steam mixture on the permeability of damaged concrete

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medjigbodo, Sonagnon; Darquennes, Aveline; Aubernon, Corentin

    Massive concrete structures such as the containments of nuclear power plants must maintain their tightness under all circumstances to prevent the escape of radioactive fission products into the environment. In the event of an accident such as a Loss of Coolant Accident (LOCA), the concrete wall is subjected to both hydric and mechanical loadings. A new experimental device reproducing these extreme conditions (water vapor transfer, 140 °C and 5 bars) was developed in the GeM Laboratory to determine the effect of the saturation degree, the mechanical loading, and the type of flowing fluid on the concrete transfer properties. The experimental tests show that these parameters significantly affect the concrete permeability and the gas leakage rate. Their evolution as a function of the mechanical loading is characterized by two phases that are directly related to the concrete microstructure and crack development.

  14. Simulation of German PKL refill/reflood experiment K9A using RELAP4/MOD7. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, M.T.; Davis, C.B.; Behling, S.R.

    This paper describes a RELAP4/MOD7 simulation of West Germany's Kraftwerk Union (KWU) Primary Coolant Loop (PKL) refill/reflood experiment K9A. RELAP4/MOD7, a best-estimate computer program for the calculation of thermal and hydraulic phenomena in a nuclear reactor or related system, is the latest version in the RELAP4 code development series. This study was the first major simulation using RELAP4/MOD7 since its release by the Idaho National Engineering Laboratory (INEL). The PKL facility is a reduced-scale (1:134) representation of a typical West German four-loop 1300 MW pressurized water reactor (PWR). The prototypical total volume-to-power ratio was maintained. The test facility was designed specifically for an experiment simulating the refill/reflood phase of a Loss-of-Coolant Accident (LOCA).

  15. Golden Ratio in a Coupled-Oscillator Problem

    ERIC Educational Resources Information Center

    Moorman, Crystal M.; Goff, John Eric

    2007-01-01

    The golden ratio appears in a classical mechanics coupled-oscillator problem that many undergraduates may not solve. Once the symmetry is broken in a more standard problem, the golden ratio appears. Several student exercises arise from the problem considered in this paper.
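    For one common version of the broken-symmetry problem (two equal masses m coupled by two identical springs k, one end fixed to a wall and the other end free; this configuration is an assumption here, since the paper's exact setup is not reproduced), the squared normal-mode frequencies in units of k/m are the eigenvalues of the scaled stiffness matrix, and they come out as the golden ratio squared and its reciprocal. A quick numerical check:

```python
import numpy as np

# Scaled stiffness matrix of the chain wall--spring--m--spring--m (free end):
#   m x1'' = -2k x1 + k x2,   m x2'' = k x1 - k x2
K = np.array([[2.0, -1.0],
              [-1.0, 1.0]])

# Squared normal-mode frequencies (in units of k/m), ascending order.
omega_sq = np.linalg.eigvalsh(K)

phi = (1 + np.sqrt(5)) / 2  # golden ratio
print(omega_sq)             # approximately [0.381966, 2.618034] = [1/phi^2, phi^2]
```

    The characteristic equation is λ² − 3λ + 1 = 0, whose roots (3 ± √5)/2 are exactly φ² and 1/φ².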

  16. Current Problems of Improving the Environmental Certification and Output Compliance Verification in the Context of Environmental Management in Kazakhstan

    ERIC Educational Resources Information Center

    Zhambaev, Yerzhan S.; Sagieva, Galia K.; Bazarbek, Bakhytzhan Zh.; Akkulov, Rustem T.

    2016-01-01

    The article discusses the issues of improving the activity of subjects of environmental management in accordance with international environmental standards and national environmental legislation. The article deals with the problem of ensuring the implementation of international environmental standards, the introduction of eco-management, and the…

  17. The Impact of a Standards Guided Equity and Problem Solving Institute on Participating Science Teachers and Their Students.

    ERIC Educational Resources Information Center

    Huber, Richard A.; Smith, Robert W.; Shotsberger, Paul G.

    This study examined the effect of a teacher enhancement project combining training on the National Science Education Standards, problem solving and equity education on middle school science teachers' attitudes and practices and, in turn, the attitudes of their students. Participating teachers reported changes in their instructional methods that…

  18. 40 CFR 61.346 - Standards: Individual drain systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS National Emission Standard for Benzene... of cracks, gaps, or other problems that could result in benzene emissions. (5) Except as provided in...

  19. 40 CFR 61.346 - Standards: Individual drain systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS National Emission Standard for Benzene... of cracks, gaps, or other problems that could result in benzene emissions. (5) Except as provided in...

  20. The effects of multi-disciplinary psycho-social care on socio-economic problems in cancer patients: a cluster-randomized trial.

    PubMed

    Singer, Susanne; Roick, Julia; Meixensberger, Jürgen; Schiefke, Franziska; Briest, Susanne; Dietz, Andreas; Papsdorf, Kirsten; Mössner, Joachim; Berg, Thomas; Stolzenburg, Jens-Uwe; Niederwieser, Dietger; Keller, Annette; Kersting, Anette; Danker, Helge

    2018-06-01

    We examined whether multi-disciplinary stepped psycho-social care decreases financial problems and improves return-to-work in cancer patients. In a university hospital, wards were randomly allocated to either stepped or standard care. Stepped care comprised screening for financial problems, consultation between doctor and patient, and the provision of social services. Outcomes were financial problems at the time of discharge and return-to-work in patients < 65 years old half a year after baseline. The analysis employed mixed-effect multivariate regression modeling. Thirteen wards were randomized and 1012 patients participated (n = 570 in stepped care and n = 442 in standard care). Those who reported financial problems at baseline were less likely to have financial problems at discharge when they had received stepped care (odds ratio (OR) 0.2, 95% confidence interval (CI) 0.1, 0.7; p = 0.01). There was no evidence for an effect of stepped care on financial problems in patients without such problems at baseline (OR 1.1, CI 0.5, 2.6; p = 0.82). There were 399 patients < 65 years old who were not retired at baseline. In this group, there was no evidence for an effect of stepped care on being employed half a year after baseline (OR 0.7, CI 0.3, 2.0; p = 0.52). Trial registration: NCT01859429. Conclusions: Financial problems can be avoided more effectively with multi-disciplinary stepped psycho-social care than with standard care in patients who have such problems.

  1. Hydrochemical simulation of a mountain basin under hydrological variability

    NASA Astrophysics Data System (ADS)

    Montserrat, S.; Trewhela, T. A.; Navarro, L.; Navarrete, A.; Lagos Zuniga, M. A.; Garcia, A.; Caraballo, M.; Niño, Y.; McPhee, J. P.

    2016-12-01

    Water quality and the understanding of hydrochemical phenomena in natural basins are especially relevant under hydrological uncertainty. Identifying the main variables that control a natural system, and finding a way to predict their behavior under variable scenarios, is essential to preserving these natural basins. This work presents an interdisciplinary model for the Yerba Loca watershed, a natural reserve basin in the Chilean central Andes. Based on different data sets, provided by public and private campaigns, a natural hydrochemical regime was identified. Yerba Loca is a natural reserve characterized by the presence of several glaciers and wide sediment deposits crossed by a small low-slope creek in the upper part of the basin, which leads to a high-slope narrow channel with fewer sediment deposits. Most relevant is the geological context around the glaciers, considering that most of them cover hydrothermal zones rich in both sulfides and sulfates, a situation commonly found in the Andes due to volcanic activity. Calcium-sulfate water with low pH (around 3) and high concentrations of iron, copper and zinc is found in the upper part of the basin in summer. These values can be attributed to glacial meltwater draining through the aforementioned country rocks, which provides most of the creek flow in the upper basin. This clearly contrasts with the creek outlet, located 18 km downstream, which shows near-neutral pH values and lower concentrations of these elements. The scope of the present research is to account for the sources of the different hydrological inlets (e.g., rainfall, snow and/or glacier melting) that, depending on their location, may interact with a variety of reactive minerals and generate acid rock drainage (ARD). The inlet water is modeled along the creek using the coupled codes HEC-RAS and PHREEQC, in order to characterize the water quality and to detect preferred sedimentation sections retaining precipitated minerals, mostly iron and aluminium hydroxysulfates, due to low-velocity flow in those areas. Validation of the results is done using several data sets that show seasonal cycles of outflow and chemical conditions at the basin outlet, responding to the same inflow and initial chemical data used for the simulation.

  2. Problematic Alcohol Use and Mild Intellectual Disability: Standardization of Pictorial Stimuli for an Alcohol Cue Reactivity Task

    ERIC Educational Resources Information Center

    van Duijvenbode, Neomi; Didden, Robert; Bloemsaat, Gijs; Engels, Rutger C. M. E.

    2012-01-01

    The present study focused on the first step in developing a cue reactivity task for studying cognitive biases in individuals with mild to borderline intellectual disability (ID) and alcohol use-related problems: the standardization of pictorial stimuli. Participants (N = 40), both with and without a history of alcohol use-related problems and…

  3. Looking beyond RtI Standard Treatment Approach: It's Not Too Late to Embrace the Problem-Solving Approach

    ERIC Educational Resources Information Center

    King, Diane; Coughlin, Patricia Kathleen

    2016-01-01

    There are two approaches for providing Tier 2 interventions within Response to Intervention (RtI): standard treatment protocol (STP) and the problem-solving approach (PSA). This article describes the multi-tiered RtI prevention model being implemented across the United States through an analysis of these two approaches in reading instruction. It…

  4. The "Pedagogy of the Oppressed": The Necessity of Dealing with Problems in Students' Lives

    ERIC Educational Resources Information Center

    Reynolds, Patricia R.

    2007-01-01

    Students have problems in their lives, but can teachers help them? Should teachers help? The No Child Left Behind (NCLB) act and its emphasis on standardized test results have forced school systems to produce high scores, and in turn school administrators pressure teachers to prepare students for taking standardized tests. Teachers may want to…

  5. Standardization of 237Np by the CIEMAT/NIST LSC tracer method

    PubMed

    Gunther

    2000-03-01

    The standardization of 237Np presents some difficulties: several groups of alpha, beta and gamma radiation, chemical problems with the daughter nuclide 233Pa, an incomplete radioactive equilibrium after sample preparation, high conversion of some gamma transitions. To solve the chemical problems, a sample composition involving the Ultima Gold AB scintillator and a high concentration of HCl is used. Standardization by the CIEMAT/NIST method and by pulse shape discrimination is described. The results agree within 0.1% with those obtained by two other methods.

  6. An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The details of the Abstract Syntax Notation One standard (ASN.1) and the Basic Encoding Rules standard (BER), which collectively solve the problem of data transfer across incompatible host environments, are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed, providing a quantitative analysis of the performance costs associated with the application of these standards. An evaluation is offered as to how well suited ASN.1 and BER are to solving the common data representation problem.
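    To give a flavor of what the Basic Encoding Rules specify, the sketch below encodes an ASN.1 INTEGER in BER: tag octet 0x02, a definite short-form length octet, and content octets holding the minimal two's-complement representation of the value. It is a simplified illustration (short-form lengths only), not the compiler described in the abstract:

```python
def ber_encode_integer(value: int) -> bytes:
    """Encode an int as a BER INTEGER (tag 0x02, short-form length).

    Content octets are the minimal two's-complement representation.
    """
    if value == 0:
        content = b"\x00"
    else:
        n = (value.bit_length() + 8) // 8  # enough octets, with room for the sign bit
        content = value.to_bytes(n, "big", signed=True)
        # defensively strip redundant leading octets while keeping the sign intact
        while len(content) > 1 and (
            (content[0] == 0x00 and content[1] < 0x80)
            or (content[0] == 0xFF and content[1] >= 0x80)
        ):
            content = content[1:]
    if len(content) > 127:
        raise ValueError("only short-form lengths are handled in this sketch")
    return bytes([0x02, len(content)]) + content

print(ber_encode_integer(5).hex())    # 020105
print(ber_encode_integer(300).hex())  # 0202012c
print(ber_encode_integer(-1).hex())   # 0201ff
```

    Note how 128 encodes as `02 02 00 80`: the leading zero octet keeps the value positive under two's complement, exactly the kind of detail a BER compiler automates.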

  7. Emission Standards for Particulates

    ERIC Educational Resources Information Center

    Walsh, George W.

    1974-01-01

    Promulgation of standards of performance under Section 111 and national emission standards for hazardous pollutants under Section 112 of the Clean Air Act is the responsibility of the Emission Standards and Engineering Division of the Environmental Protection Agency. The problems encountered and the bases used are examined. (Author/BT)

  8. Problem Solvers: Problem--Jesse's Train

    ERIC Educational Resources Information Center

    James, Julie; Steimle, Alice

    2014-01-01

    Persevering in problem solving and constructing and critiquing mathematical arguments are some of the mathematical practices included in the Common Core State Standards for Mathematics (CCSSI 2010). To solve unfamiliar problems, students must make sense of the situation and apply current knowledge. Teachers can present such opportunities by…

  9. Exploring creativity and critical thinking in traditional and innovative problem-based learning groups.

    PubMed

    Chan, Zenobia C Y

    2013-08-01

    To explore students' attitudes towards problem-based learning, creativity and critical thinking, and their relevance to nursing education and clinical practice. Critical thinking and creativity are crucial in nursing education. The teaching approach of problem-based learning can help to reduce the difficulties of nurturing problem-solving skills. However, there is little in the literature on how to improve the effectiveness of a problem-based learning lesson by designing appropriate and innovative activities such as composing songs, writing poems and using role plays. Exploratory qualitative study. A sample of 100 students participated in seven semi-structured focus groups, of which two were innovative groups and five were standard groups; the innovative groups adopted three activities in problem-based learning, namely composing songs, writing poems and performing role plays. The data were analysed using thematic analysis. Three themes were extracted from the conversations: 'students' perceptions of problem-based learning', 'students' perceptions of creative thinking' and 'students' perceptions of critical thinking'. Participants generally agreed that critical thinking is more important than creativity in problem-based learning and clinical practice. Participants in the innovative groups perceived a significantly closer relationship between critical thinking and nursing care, and between creativity and nursing care, than the standard groups. Both standard and innovative groups agreed that problem-based learning could significantly increase their critical thinking and problem-solving skills. Further, by composing songs, writing poems and using role plays, the innovative groups had significantly increased their awareness of the relationship among critical thinking, creativity and nursing care. Nursing educators should include more types of creative activities than are typically offered in conventional problem-based learning classes. The results could help nurse educators design an appropriate curriculum for preparing professional and ethical nurses for future clinical practice.

  10. Preservation of Digital Objects.

    ERIC Educational Resources Information Center

    Galloway, Patricia

    2004-01-01

    Presents a literature review that covers the following topics related to preservation of digital objects: practical examples; stakeholders; recordkeeping standards; genre-specific problems; trusted repository standards; preservation methods; preservation metadata standards; and future directions. (Contains 82 references.) (MES)

  11. California residential energy standards: problems and recommendations relating to implementation, enforcement, and design. [Thermal insulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-08-01

    Documents relevant to the development and implementation of the California energy insulation standards for new residential buildings were evaluated and a survey was conducted to determine problems encountered in the implementation, enforcement, and design aspects of the standards. The impact of the standards on enforcement agencies, designers, builders and developers, manufacturers and suppliers, consumers, and the building process in general is summarized. The impact on construction costs and energy savings varies considerably because of the wide variation in prior insulation practices and climatic conditions in California. The report concludes with a series of recommendations covering all levels of government and the building process. (MCW)

  12. Hierarchy problem and BSM physics

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Gautam

    2017-10-01

    The `hierarchy problem' plagues the Standard Model of particle physics. The source of this problem is our inability to answer the following question: Why is the Higgs mass so much below the GUT or Planck scale? A brief description of how `supersymmetry' and `composite Higgs' address this problem is given here.

  13. The Performance of Chinese Primary School Students on Realistic Arithmetic Word Problems

    ERIC Educational Resources Information Center

    Xin, Ziqiang; Lin, Chongde; Zhang, Li; Yan, Rong

    2007-01-01

    Compared with standard arithmetic word problems demanding only the direct use of number operations and computations, realistic problems are harder to solve because children need to incorporate "real-world" knowledge into their solutions. Using the realistic word problem testing materials developed by Verschaffel, De Corte, and Lasure…

  14. Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues

    ERIC Educational Resources Information Center

    Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.

    2010-01-01

    Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…

  15. Kindergarten Students Solving Mathematical Word Problems

    ERIC Educational Resources Information Center

    Johnson, Nickey Owen

    2013-01-01

    The purpose of this study was to explore problem solving with kindergarten students. This line of inquiry is highly significant given that Common Core State Standards emphasize deep, conceptual understanding in mathematics as well as problem solving in kindergarten. However, there is little research on problem solving with kindergarten students.…

  16. The Role of Expository Writing in Mathematical Problem Solving

    ERIC Educational Resources Information Center

    Craig, Tracy S.

    2016-01-01

    Mathematical problem-solving is notoriously difficult to teach in a standard university mathematics classroom. The project on which this article reports aimed to investigate the effect of the writing of explanatory strategies in the context of mathematical problem solving on problem-solving behaviour. This article serves to describe the…

  17. The Thinnest Path Problem

    DTIC Science & Technology

    2016-07-22

    their corresponding transmission powers. At first glance, one may wonder whether the thinnest path problem is simply a shortest path problem with the...nature of the shortest path problem. Another aspect that complicates the problem is the choice of the transmission power at each node (within a maximum...fixed transmission power at each node (in this case, the resulting hypergraph degenerates to a standard graph), the thinnest path problem is NP

  18. Problems of Technical Standards Teaching in the Context of the Globalization and Euro-Integration in Higher Education System of Ukraine

    ERIC Educational Resources Information Center

    Kornuta, Olena; Pryhorovska, Tetiana

    2015-01-01

    Globalization and Ukraine association with EU imply including Ukrainian universities into the world scientific space. The aim of this article is to analyze the problem of drawing standards teaching, based on the experience of Ivano-Frankivsk National Technical University of Oil and Gas (Ukraine) and to summarize the experience of post Soviet…

  19. Planning Model of Physics Learning In Senior High School To Develop Problem Solving Creativity Based On National Standard Of Education

    NASA Astrophysics Data System (ADS)

    Putra, A.; Masril, M.; Yurnetti, Y.

    2018-04-01

    One of the causes of students' low competence in high school physics learning is a process that has not developed students' creativity in problem solving. This indicates that teachers' learning plans are not in accordance with the National Education Standard. This study aims to produce a reconstructed model of physics learning that fulfills the competency standards, content standards, and assessment standards of the applicable curriculum. The development process follows: needs analysis, product design, product development, implementation, and product evaluation. The research process involved two peer reviewers, four expert judges, and two study groups of high school students in Padang. The qualitative and quantitative data were collected through documentation, observation, questionnaires, and tests. Up to the product development stage, the research has produced a physics learning plan model that meets content and construct validity in terms of the fulfillment of Basic Competence, Content Standards, Process Standards and Assessment Standards.

  20. A Five Stage Conceptual Model for Information Technology Standards.

    ERIC Educational Resources Information Center

    Cargill, Carl F.

    The advent of anticipatory and boundary layer standards used in information technology standardization has created a need for a new base level theory that can be used to anticipate the problems that will be encountered in standards planning, creation, and implementation. To meet this need, a five-level model of standards has been developed. The…

  1. The stage-value model: Implications for the changing standards of care.

    PubMed

    Görtz, Daniel Patrik; Commons, Michael Lamport

    2015-01-01

    The standard of care is a legal and professional notion against which doctors and other medical personnel are held liable. The standard of care changes as new scientific findings and technological innovations within medicine, pharmacology, nursing and public health are developed and adopted. This study consists of four parts. Part 1 describes the problem and gives concrete examples of its occurrence. The second part discusses the application of the Model of Hierarchical Complexity to the field, giving examples of how standards of care are understood at different behavioral developmental stages. It presents the solution to the problem of standards of care at Paradigmatic Stage 14. The solution at this stage is a deliberative, communicative process based around why certain norms should or should not apply in each specific case, by the use of "meta-norms". Part 3 proposes a Cross-Paradigmatic Stage 15 view of how the problem of changing standards of care can be solved. The proposed solution is to found the legal procedure in each case on well-established behavioral laws. We maintain that such a behavioristic, scientifically based justice would be much more proficient at effecting restorative legal interventions that create desired behaviors.

  2. The Virginia History Standards and the Cold War

    ERIC Educational Resources Information Center

    Altschuler, Glenn C.; Rauchway, Eric

    2002-01-01

    President George W. Bush's approach to education policy has earned him cautious plaudits from otherwise hostile critics, who see much to admire in the implementation of standards for education. However useful such standards for testing students' technical skills like arithmetic and reading, they create problems for less-standardized processes like…

  3. Education Technology Standards Self-Efficacy (ETSSE) Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Simsek, Omer; Yazar, Taha

    2016-01-01

    Problem Statement: The educational technology standards for teachers set by the International Society for Technology in Education (the ISTE Standards-T) represent an important framework for using technology effectively in teaching and learning processes. These standards are widely used by universities, educational institutions, and schools. The…

  4. The Federal Government and Information Technology Standards: Building the National Information Infrastructure.

    ERIC Educational Resources Information Center

    Radack, Shirley M.

    1994-01-01

    Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)

  5. A Harmonious Accounting Duo?

    ERIC Educational Resources Information Center

    Schapperle, Robert F.; Hardiman, Patrick F.

    1992-01-01

    Accountants have urged "harmonization" of standards between the Governmental Accounting Standards Board and the Financial Accounting Standards Board, recommending similar reporting of like transactions. However, varying display of similar accounting events does not necessarily indicate disharmony. The potential for problems because of…

  6. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, was developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability against this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP, which solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. The testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with the benchmark solutions.
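    The governing equations that TUNA-RP solves can be illustrated in miniature. The sketch below advances the one-dimensional nonlinear shallow water equations in conservation form with a Lax-Friedrichs step on periodic boundaries; it is a toy illustration only, since the actual model is two-dimensional with a wet-dry moving boundary algorithm:

```python
import numpy as np

def shallow_water_step(h, hu, dx, dt, g=9.81):
    """One Lax-Friedrichs step of the 1-D nonlinear shallow water equations,
    conservation form, periodic boundaries.

    U = (h, hu),  F(U) = (hu, hu^2/h + g h^2 / 2)
    """
    u = hu / h
    F1 = hu
    F2 = hu * u + 0.5 * g * h**2

    def lf(q, f):
        # U_i^{n+1} = (U_{i-1} + U_{i+1})/2 - dt/(2 dx) (F_{i+1} - F_{i-1})
        return 0.5 * (np.roll(q, 1) + np.roll(q, -1)) \
             - 0.5 * dt / dx * (np.roll(f, -1) - np.roll(f, 1))

    return lf(h, F1), lf(hu, F2)

# Small demo: a Gaussian hump of water spreading out over a flat bottom.
x = np.linspace(0.0, 10.0, 200, endpoint=False)
h = 1.0 + 0.1 * np.exp(-((x - 5.0) ** 2))
hu = np.zeros_like(x)
dx, dt = x[1] - x[0], 0.005   # dt chosen to satisfy the CFL condition
mass0 = h.sum()               # total mass, conserved by the scheme
for _ in range(100):
    h, hu = shallow_water_step(h, hu, dx, dt)
```

    With periodic boundaries the scheme conserves total mass to machine precision, which is the kind of sanity check that precedes benchmark validation against analytic solutions.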

  7. Unified heuristics to solve routing problem of reverse logistics in sustainable supply chain

    NASA Astrophysics Data System (ADS)

    Anbuudayasankar, S. P.; Ganesh, K.; Lenny Koh, S. C.; Mohandas, K.

    2010-03-01

    A reverse logistics problem, motivated by many real-life applications, is examined in which bottles/cans in which products are delivered from a processing depot to customers in one period are available for return to the depot in the following period. The picked-up bottles/cans must be accommodated in place of the delivery load. This problem is termed the simultaneous delivery and pick-up problem with constrained capacity (SDPC). We develop three unified heuristics, based on an extended branch-and-bound heuristic, a genetic algorithm and simulated annealing, to solve SDPC. These heuristics are also designed to solve the standard travelling salesman problem (TSP) and the TSP with simultaneous delivery and pick-up (TSDP). We tested the heuristics on standard, derived and randomly generated datasets of TSP, TSDP and SDPC and obtained satisfactory results with good convergence in reasonable time.
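    The simulated-annealing ingredient, applied to the standard TSP, can be sketched as follows. This is a generic 2-opt annealer; the move operator, cooling schedule and parameters are illustrative assumptions, not those of the paper:

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour under a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def anneal_tsp(dist, t0=10.0, cooling=0.995, iters=20000, seed=0):
    """Simulated annealing for the TSP with 2-opt (segment reversal) moves."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    best = tour[:]
    cur_len = best_len = tour_length(tour, dist)
    t = t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        cand_len = tour_length(cand, dist)
        # accept improvements always, uphill moves with Boltzmann probability
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling
    return best, best_len

# Demo: corners of a unit square listed in scrambled order; the optimal
# tour is the perimeter, of length 4.
pts = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0), (1.0, 0.0)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
best, best_len = anneal_tsp(dist)
```

    Extending such an annealer to TSDP/SDPC amounts to rejecting (or penalizing) candidate tours whose running delivery-plus-pickup load exceeds vehicle capacity.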

  8. Introductory Course Based on a Single Problem: Learning Nucleic Acid Biochemistry from AIDS Research

    ERIC Educational Resources Information Center

    Grover, Neena

    2004-01-01

    In departure from the standard approach of using several problems to cover specific topics in a class, I use a single problem to cover the contents of the entire semester-equivalent biochemistry classes. I have developed a problem-based service-learning (PBSL) problem on HIV/AIDS to cover nucleic acid concepts that are typically taught in the…

  9. The Role of Content Knowledge in Ill-Structured Problem Solving for High School Physics Students

    ERIC Educational Resources Information Center

    Milbourne, Jeff; Wiebe, Eric

    2018-01-01

    While Physics Education Research has a rich tradition of problem-solving scholarship, most of the work has focused on more traditional, well-defined problems. Less work has been done with ill-structured problems, problems that are better aligned with the engineering and design-based scenarios promoted by the Next Generation Science Standards. This…

  10. Dynamic simulation solves process control problem in Oman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-11-16

    A dynamic simulation study solved the process control problems for a Saih Rawl, Oman, gas compressor station operated by Petroleum Development of Oman (PDO). PDO encountered persistent compressor failure that caused frequent facility shutdowns, oil production deferment, and gas flaring. It commissioned MSE (Consultants) Ltd., U.K., to find a solution for the problem. Saih Rawl, about 40 km from Qarn Alam, produces oil and associated gas from a large number of low- and high-pressure wells. Oil and gas are separated in three separators. The oil is pumped to Qarn Alam for treatment and export. Associated gas is compressed in two parallel trains. Train K-1115 is a 350,000 standard cu m/day, four-stage reciprocating compressor driven by a fixed-speed electric motor. Train K-1120 is a 1 million standard cu m/day, four-stage centrifugal compressor driven by a variable-speed motor. The paper describes tripping and surging problems with the gas compressor and the control simplifications that solved the problem.

  11. Naturalness of Electroweak Symmetry Breaking while Waiting for the LHC

    NASA Astrophysics Data System (ADS)

    Espinosa, J. R.

    2007-06-01

    After revisiting the hierarchy problem of the Standard Model and its implications for the scale of New Physics, I consider the fine-tuning problem of electroweak symmetry breaking in several scenarios beyond the Standard Model: SUSY, Little Higgs and "improved naturalness" models. The main conclusions are that: New Physics should appear within the reach of the LHC; some SUSY models can solve the hierarchy problem with acceptable residual tuning; Little Higgs models generically suffer from large tunings, often hidden; and, finally, "improved naturalness" models do not generically improve the naturalness of the SM.

  12. Duality in non-linear programming

    NASA Astrophysics Data System (ADS)

    Jeyalakshmi, K.

    2018-04-01

    In this paper we consider duality and converse duality for a programming problem involving convex objective and constraint functions with finite-dimensional range. We do not assume any constraint qualification. The dual is presented by reducing the problem to a standard Lagrange multiplier problem.
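    The reduction to a standard Lagrange multiplier problem follows the familiar Lagrangian-dual pattern. In generic notation (the paper's own formulation is not reproduced here), for a convex primal the construction is:

```latex
% Primal problem (f, g convex):
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad g(x) \le 0

% Lagrangian with multiplier \lambda \ge 0:
L(x, \lambda) = f(x) + \lambda^{\top} g(x)

% Dual problem:
\max_{\lambda \ge 0} \; \theta(\lambda),
\qquad \theta(\lambda) = \inf_{x} L(x, \lambda)

% Weak duality holds for any feasible x and any \lambda \ge 0,
% without any constraint qualification:
\theta(\lambda) \le f(x)
```

    Constraint qualifications enter only when one wants strong duality (zero duality gap), which is why dispensing with them, as the paper does, is the notable point.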

  13. SCALE PROBLEMS IN REPORTING LANDSCAPE PATTERN AT THE REGIONAL SCALE

    EPA Science Inventory

    Remotely sensed data for Southeastern United States (Standard Federal Region 4) are used to examine the scale problems involved in reporting landscape pattern for a large, heterogeneous region. Frequency distributions of landscape indices illustrate problems associated with the g...

  14. Relationships between Problem Behaviors and Academic Achievement in Adolescents: The Unique Role of Attention Problems.

    ERIC Educational Resources Information Center

    Barriga, Alvaro Q.; Doran, Jeffrey W.; Newell, Stephanie B.; Morrison, Elizabeth M.; Barbetti, Victor; Robbins, Brent Dean

    2002-01-01

    This study examined relationships among eight teacher-reported problem behavior syndromes and standardized measures of academic achievement among 58 adolescents in an alternative school. Analysis suggested that the association between attention problems and academic achievement was primarily due to the inattention component of the syndrome rather than the…

  15. Listening Responsively

    ERIC Educational Resources Information Center

    Callahan, Kadian M.

    2011-01-01

    Standards documents, such as the Common Core State Standards for Mathematics and "Principles and Standards for School Mathematics", expect teachers to foster mathematics learning by engaging students in meaningful mathematical discourse to expose students to different ways of thinking about and solving problems and positively influence their…

  16. 42 CFR 493.1233 - Standard: Complaint investigations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing General Laboratory Systems § 493.1233 Standard: Complaint investigations. The laboratory must have a system in place to ensure that it documents all complaints and problems reported to the laboratory...

  17. Three Perspectives on Standards: Positivism, Panopticism, and Intersubjectivism

    ERIC Educational Resources Information Center

    Lee, Cheu-jey

    2010-01-01

    Perhaps no other words occur more frequently than standards in today's discourse on educational reform. There is much debate about standards. Instead of taking sides on the debate, this paper argues that the problem with standards does not lie so much in standards themselves as in how they are viewed by those who make them and those who are held…

  18. [The environment and health. Of the difficulty of reconciling environmental and health standards in cultural nature].

    PubMed

    Mittelstrass, J

    1989-09-15

    Scientific cultures, i.e. modern industrial societies, create their own environment. The expression denoting such a creation is a Kultur-Natur ('cultural nature') determined by environmental and health standards. These standards are neither natural laws nor can they be derived from nature. They are instead a part of human rationality. They also have an ethical dimension. The argument focuses on the following aspects: (scientific and technological) rationality as problem solver and problem producer, exploration of the concept of the Kultur-Natur, the status of environmental and health standards, presenting the case for the concept of rational ethics (Vernunftethik) against the concept of ecological ethics and the supplementation of a research imperative by an ethical imperative.

  19. Ethics and choosing appropriate means to an end: problems with coal mine and nuclear workplace safety.

    PubMed

    Shrader-Frechette, Kristin; Cooke, Roger

    2004-02-01

    A common problem in ethics is that people often desire an end but fail to take the means necessary to achieve it. Employers and employees may desire the safety end mandated by performance standards for pollution control, but they may fail to employ the means, specification standards, necessary to achieve this end. This article argues that current (de jure) performance standards, for lowering employee exposures to ionizing radiation, fail to promote de facto worker welfare, in part because employers and employees do not follow the necessary means (practices known as specification standards) to achieve the end (performance standards) of workplace safety. To support this conclusion, the article argues that (1) safety requires attention to specification, as well as performance, standards; (2) coal-mine specification standards may fail to promote performance standards; (3) nuclear workplace standards may do the same; (4) choosing appropriate means to the end of safety requires attention to the ways uncertainties and variations in exposure may mask violations of standards; and (5) correcting regulatory inattention to differences between de jure and de facto is necessary for achievement of ethical goals for safety.

  20. Usability of HL7 and SNOMED CT standards in Java Persistence API environment.

    PubMed

    Antal, Gábor; Végh, Ádám Zoltán; Bilicki, Vilmos

    2014-01-01

    Due to the need for an efficient way of communication between the different stakeholders of healthcare (e.g. doctors, pharmacists, hospitals, patients etc.), the possibility of integrating different healthcare systems arises. However, during the integration process several problems of heterogeneity might come up, which can turn integration into a difficult task. These problems motivated the development of healthcare information standards. The main goal of the HL7 family of standards is the standardization of communication between clinical systems and the unification of clinical document formats on the structural level. The SNOMED CT standard aims at the unification of healthcare terminology, thus providing a standard at the lexical level. The goal of this article is to introduce the usability of these two standards in a Java Persistence API (JPA) environment, and to examine how standard-based system components can be efficiently generated. First, we briefly introduce the structure of the standards and their advantages and disadvantages. Then, we present an architecture design method, which can help to eliminate the possible structural drawbacks of the standards and makes code-generating tools applicable for the automatic production of certain system components.

  1. Keep It in Proportion.

    ERIC Educational Resources Information Center

    Snider, Richard G.

    1985-01-01

    The ratio factors approach involves recognizing a given fraction, then multiplying so that units cancel. This approach, which is grounded in concrete operational thinking patterns, provides a standard for science ratio and proportion problems. Examples are included for unit conversions, mole problems, molarity, speed/density problems, and…
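The ratio-factors idea described in the abstract can be sketched in a few lines of Python. The helper function and the NaCl conversion below are illustrative, not taken from the article:

```python
# Each ratio factor is a fraction equal to one, chosen so that the
# unwanted unit cancels; the helper is an illustrative sketch.
def apply_factors(value, factors):
    """Multiply value by each (numerator, denominator) conversion factor."""
    for num, den in factors:
        value = value * num / den
    return value

# 2.50 mol NaCl -> grams -> kilograms:
#   2.50 mol * (58.44 g / 1 mol) * (1 kg / 1000 g) = 0.1461 kg
mass_kg = apply_factors(2.50, [(58.44, 1.0), (1.0, 1000.0)])
```

Writing each conversion as an explicit fraction is what lets students check, before multiplying, that every unwanted unit appears once in a numerator and once in a denominator.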

  2. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and gain practical experience with the difference between performing an external standardization and a standard addition.
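The calculation at the heart of a standard addition can be shown in a short sketch. The data below are hypothetical; the x-intercept extrapolation is the textbook procedure, not code from the article:

```python
# Hypothetical standard-addition data: equal-volume aliquots of the sample
# spiked with increasing known amounts of analyte, then measured.
added  = [0.0, 1.0, 2.0, 3.0, 4.0]       # added standard, ppm
signal = [0.20, 0.30, 0.40, 0.50, 0.60]  # instrument response, a.u.

# Ordinary least-squares fit of signal vs. added concentration.
n = len(added)
xbar, ybar = sum(added) / n, sum(signal) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(added, signal))
         / sum((x - xbar) ** 2 for x in added))
intercept = ybar - slope * xbar

# The sample concentration is the magnitude of the x-intercept:
c_sample = intercept / slope             # 2.0 ppm for these data
```

Because the standard is added to the sample itself, the fitted slope already includes any matrix effect, which is exactly the contrast with external standardization that the experiment teaches.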

  3. Faculty Perspectives on International Accounting Topics.

    ERIC Educational Resources Information Center

    Smith, L. Murphy; Salter, Stephen B.

    1996-01-01

    A survey of 63 professors specializing in international accounting identified the following topics as most important to incorporate into the curriculum: (1) foreign currency translation; (2) international accounting standards; (3) comparative standards and harmonizing of accounting standards; (4) reporting and disclosure problems of multinational…

  4. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR PROBLEM MANAGEMENT (G06)

    EPA Science Inventory

    The purpose of this SOP is to describe problem management, and to define a set of reporting actions to be taken in the event of a problem during any phase of the study. This procedure outlines the steps for making a problem known in order that it may be systematically resolved b...

  5. Neurons and the Process Standards

    ERIC Educational Resources Information Center

    Zambo, Ron; Zambo, Debby

    2011-01-01

    The classic Chickens and Pigs problem is considered to be an algebraic problem with two equations and two unknowns. In this article, the authors describe how third-grade teacher Maria is using it to develop a problem-based lesson because she is looking to her students' future needs. As Maria plans, she considers how a series of problems with the…

  6. Epistemic Beliefs about Justification Employed by Physics Students and Faculty in Two Different Problem Contexts

    ERIC Educational Resources Information Center

    Mercan, Fatih Caglayan

    2012-01-01

    This study examines the epistemic beliefs about justification employed by physics undergraduate and graduate students and faculty in the context of solving a standard classical physics problem and a frontier physics problem. Data were collected by a think-aloud problem solving session followed by a semi-structured interview conducted with 50…

  7. A Flipped Pedagogy for Expert Problem Solving

    NASA Astrophysics Data System (ADS)

    Pritchard, David

    The internet provides free learning opportunities for declarative (Wikipedia, YouTube) and procedural (Khan Academy, MOOCs) knowledge, challenging colleges to provide learning at a higher cognitive level. Our "Modeling Applied to Problem Solving" pedagogy for Newtonian Mechanics imparts strategic knowledge - how to systematically determine which concepts to apply and why. Declarative and procedural knowledge is learned online before class via an e-text, checkpoint questions, and homework on edX.org (see http://relate.mit.edu/physicscourse); it is organized into five Core Models. Instructors then coach students on simple "touchstone problems", novel exercises, and multi-concept problems - meanwhile exercising three of the four C's: communication, collaboration, and critical thinking and problem solving. Students showed a 1.2-standard-deviation improvement on the MIT final exam after three weeks of instruction, a significant positive shift in 7 of the 9 categories in the CLASS, and their grades improved by 0.5 standard deviation in their following physics course (Electricity and Magnetism).

  8. Iterative algorithms for a non-linear inverse problem in atmospheric lidar

    NASA Astrophysics Data System (ADS)

    Denevi, Giulia; Garbarino, Sara; Sorrentino, Alberto

    2017-08-01

    We consider the inverse problem of retrieving aerosol extinction coefficients from Raman lidar measurements. In this problem the unknown and the data are related through the exponential of a linear operator, the unknown is non-negative and the data follow the Poisson distribution. Standard methods work on the log-transformed data and solve the resulting linear inverse problem, but neglect to take into account the noise statistics. In this study we show that proper modelling of the noise distribution can substantially improve the quality of the reconstructed extinction profiles. To achieve this goal, we consider the non-linear inverse problem with a non-negativity constraint, and propose two iterative algorithms derived using the Karush-Kuhn-Tucker conditions. We validate the algorithms with synthetic and experimental data. As expected, the proposed algorithms outperform standard methods in terms of sensitivity to noise and reliability of the estimated profile.
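The problem structure described here (Poisson counts, exponential of a linear operator, non-negativity) admits a simple projected-gradient sketch. The synthetic forward model, step size, and iteration count below are illustrative assumptions, not the authors' KKT-derived algorithms:

```python
import numpy as np

# Synthetic sketch (not the authors' code): counts y_i ~ Poisson(b * exp(-(A x)_i)),
# with x >= 0 an extinction profile and A a two-way path-integral operator.
rng = np.random.default_rng(0)
n, dz, b = 30, 0.1, 1.0e4
A = 2.0 * dz * np.tril(np.ones((n, n)))       # cumulative two-way optical depth
x_true = np.abs(np.sin(np.linspace(0.0, 3.0, n)))
y = rng.poisson(b * np.exp(-A @ x_true)).astype(float)

def neg_loglik(x):
    """Poisson negative log-likelihood (up to a constant in y)."""
    mu = b * np.exp(-A @ x)
    return float(np.sum(mu - y * np.log(mu)))

# Projected gradient descent: the gradient is A^T (y - mu), and clipping
# to x >= 0 enforces the nonnegativity (KKT) condition at every step.
x = np.zeros(n)
eta = 1.0 / (b * np.sum(A**2))                # conservative 1/L step size
for _ in range(3000):
    mu = b * np.exp(-A @ x)
    x = np.maximum(0.0, x - eta * (A.T @ (y - mu)))
```

Fitting the Poisson likelihood directly, rather than least-squares on log-transformed data, is the point the abstract makes: the noise model enters through the gradient term A^T (y - mu).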

  9. 75 FR 22291 - Safety Standard for Toddler Beds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-28

    ... the next most commonly reported problems. However, only two injuries--one laceration and one ingestion--resulted from these problems. Product integrity issues, mostly integrity of the mattress-support, were the... assembly instructions because consumer misassembly has been a problem with similar products, such as cribs...

  10. MOTOR VEHICLE SAFETY: NHTSA’s Ability to Detect and Recall Defective Replacement Crash Parts Is Limited

    DTIC Science & Technology

    2001-01-01

    incorporate airbags, under the used vehicle provision. NHTSA has not developed such standards because it has not identified significant problems with occupant restraint systems that might incorporate airbags. Appendix I: Scope and Methodology; Appendix II: State Legislation Governing Aftermarket Crash Parts and Recycled Airbags; Figure 1: Replacement…

  11. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    NASA Astrophysics Data System (ADS)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

    The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by expert assessment, for the course of Animal Development. This is development research that produces a product in the form of a learning model, consisting of two sub-products: the syntax of the learning model and student worksheets. All of these products were standardized through expert validation. The research data are the validity levels of all sub-products, obtained using a questionnaire filled in by validators from various fields of expertise (field of study, learning strategy, Bahasa). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced; the sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheets.

  12. An improved random walk algorithm for the implicit Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keady, Kendra P., E-mail: keadyk@lanl.gov; Cleveland, Mathew A.

    In this work, we introduce a modified Implicit Monte Carlo (IMC) Random Walk (RW) algorithm, which increases simulation efficiency for multigroup radiative transfer problems with strongly frequency-dependent opacities. To date, the RW method has only been implemented in “fully-gray” form; that is, the multigroup IMC opacities are group-collapsed over the full frequency domain of the problem to obtain a gray diffusion problem for RW. This formulation works well for problems with large spatial cells and/or opacities that are weakly dependent on frequency; however, the efficiency of the RW method degrades when the spatial cells are thin or the opacities are a strong function of frequency. To address this inefficiency, we introduce a RW frequency group cutoff in each spatial cell, which divides the frequency domain into optically thick and optically thin components. In the modified algorithm, opacities for the RW diffusion problem are obtained by group-collapsing IMC opacities below the frequency group cutoff. Particles with frequencies above the cutoff are transported via standard IMC, while particles below the cutoff are eligible for RW. This greatly increases the total number of RW steps taken per IMC time-step, which in turn improves the efficiency of the simulation. We refer to this new method as Partially-Gray Random Walk (PGRW). We present numerical results for several multigroup radiative transfer problems, which show that the PGRW method is significantly more efficient than standard RW for several problems of interest. In general, PGRW decreases runtimes by a factor of ∼2–4 compared to standard RW, and a factor of ∼3–6 compared to standard IMC. While PGRW is slower than frequency-dependent Discrete Diffusion Monte Carlo (DDMC), it is also easier to adapt to unstructured meshes and can be used in spatial cells where DDMC is not applicable. This suggests that it may be optimal to employ both DDMC and PGRW in a single simulation.
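The per-cell cutoff described in the abstract might be sketched as follows. The function name, the optical-thickness threshold tau_min, and the weighted collapse are illustrative assumptions; the abstract only states that opacities below the cutoff are group-collapsed:

```python
# Illustrative per-cell cutoff selection (hypothetical helper): groups whose
# optical thickness sigma_g * dx exceeds tau_min are RW-eligible, and their
# opacities are collapsed into a single gray value by a weighted mean.
def pgrw_cutoff(sigma, weights, dx, tau_min=5.0):
    """Return (cutoff_index, gray_opacity) for one spatial cell.

    sigma   : per-group opacities, ordered so the optically thick groups
              come first (e.g. lowest frequencies for free-free opacities)
    weights : per-group collapse weights (e.g. normalized Planck emission)
    """
    thick = [g for g, s in enumerate(sigma) if s * dx >= tau_min]
    if not thick:
        return 0, 0.0            # nothing RW-eligible: pure standard IMC
    cutoff = max(thick) + 1      # groups [0, cutoff) are collapsed for RW
    w = sum(weights[:cutoff])
    gray = sum(s * wt for s, wt in zip(sigma[:cutoff], weights[:cutoff])) / w
    return cutoff, gray
```

Because the cutoff is chosen per cell, thin cells simply return a low (or zero) cutoff and fall back to standard IMC, which is the degradation case the abstract identifies for fully-gray RW.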

  13. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Organization for Standardization (ISO) is currently developing the Abstract Syntax Notation One standard (ASN.1) and the Basic Encoding Rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
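As a concrete taste of the encoding such a compiler automates, here is a minimal BER encoder for the ASN.1 INTEGER type, following X.690's primitive, definite-length form. The helper is illustrative, not the compiler described above, and handles short-form lengths only:

```python
# Minimal BER encoder for ASN.1 INTEGER (tag 0x02): short-form definite
# length, minimal big-endian two's-complement content octets.
def ber_encode_integer(value):
    # Over-allocate one octet so the sign bit always fits, ...
    n = max(1, (value.bit_length() + 8) // 8)
    content = value.to_bytes(n, "big", signed=True)
    # ... then strip redundant leading 0x00/0xFF octets, which BER forbids.
    while len(content) > 1 and (
        (content[0] == 0x00 and content[1] < 0x80)
        or (content[0] == 0xFF and content[1] >= 0x80)
    ):
        content = content[1:]
    return bytes([0x02, len(content)]) + content
```

For example, 300 encodes as the octets 02 02 01 2C: tag, length 2, and the two content octets 0x012C.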

  14. The SPH consistency problem and some astrophysical applications

    NASA Astrophysics Data System (ADS)

    Klapp, Jaime; Sigalotti, Leonardo; Rendon, Otto; Gabbasov, Ruslan; Torres, Ayax

    2017-11-01

    We discuss the SPH kernel and particle consistency problem and demonstrate that SPH has a limiting second-order convergence rate. We also present a solution to the SPH consistency problem. We present examples of how SPH implementations that are not mathematically consistent may lead to erroneous results. The new formalism has been implemented into the Gadget 2 code, including an improved scheme for the artificial viscosity. We present results for the "Standard Isothermal Test Case" of gravitational collapse and fragmentation of protostellar molecular cores that produce a very different evolution than with the standard SPH theory. A further application of accretion onto a black hole is presented.

  15. An assessment of RELAP5-3D using the Edwards-O'Brien Blowdown problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomlinson, E.T.; Aumiller, D.L.

    1999-07-01

    The RELAP5-3D (version bt) computer code was used to assess the United States Nuclear Regulatory Commission's Standard Problem 1 (Edwards-O'Brien Blowdown Test). The RELAP5-3D standard installation problem based on the Edwards-O'Brien Blowdown Test was modified to model the appropriate initial conditions and to represent the proper location of the instruments present in the experiment. The results obtained using the modified model are significantly different from the original calculation indicating the need to model accurately the experimental conditions if an accurate assessment of the calculational model is to be obtained.

  16. The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts.

    PubMed

    Coderre, Sylvain P; Harasym, Peter; Mandin, Henry; Fick, Gordon

    2004-11-05

    Pencil-and-paper examination formats, and specifically the standard, five-option multiple-choice question, have often been questioned as a means for assessing higher-order clinical reasoning or problem solving. This study firstly investigated whether two paper formats with differing numbers of alternatives (standard five-option and extended-matching questions) can test problem-solving abilities. Secondly, the impact of the number of alternatives on psychometrics and problem-solving strategies was examined. Think-aloud protocols were collected to determine the problem-solving strategy used by experts and non-experts in answering Gastroenterology questions, across the two pencil-and-paper formats. The two formats demonstrated equal ability in testing problem-solving abilities, while the number of alternatives did not significantly impact psychometrics or the problem-solving strategies utilized. These results support the notion that well-constructed multiple-choice questions can in fact test higher-order clinical reasoning. Furthermore, it can be concluded that in testing clinical reasoning, the question stem, or content, remains more important than the number of alternatives.

  17. A bottom-up approach to the strong CP problem

    NASA Astrophysics Data System (ADS)

    Diaz-Cruz, J. L.; Hollik, W. G.; Saldana-Salazar, U. J.

    2018-05-01

    The strong CP problem is one of many puzzles in the theoretical description of elementary particle physics that still lacks an explanation. While top-down solutions to that problem usually comprise new symmetries or fields or both, we want to present a rather bottom-up perspective. The main problem seems to be how to achieve small CP violation in the strong interactions despite the large CP violation in weak interactions. In this paper, we show that with minimal assumptions on the structure of mass (Yukawa) matrices, they do not contribute to the strong CP problem and thus we can provide a pathway to a solution of the strong CP problem within the structures of the Standard Model and no extension at the electroweak scale is needed. However, to address the flavor puzzle, models based on minimal SU(3) flavor groups leading to the proposed flavor matrices are favored. Though we refrain from an explicit UV completion of the Standard Model, we provide a simple requirement for such models not to show a strong CP problem by construction.

  18. Solving Capacitated Closed Vehicle Routing Problem with Time Windows (CCVRPTW) using BRKGA with local search

    NASA Astrophysics Data System (ADS)

    Prasetyo, H.; Alfatsani, M. A.; Fauza, G.

    2018-05-01

    The main issue in the vehicle routing problem (VRP) is finding the shortest route of product distribution from the depot to outlets to minimize the total cost of distribution. The Capacitated Closed Vehicle Routing Problem with Time Windows (CCVRPTW) is one of the variants of VRP that accommodates vehicle capacity and distribution period. Since CCVRPTW is an NP-hard problem, it requires an efficient and effective algorithm to solve it. This study aimed to develop a Biased Random Key Genetic Algorithm (BRKGA) combined with local search to solve CCVRPTW. The algorithm design was then coded in MATLAB. Using numerical tests, optimal algorithm parameters were set, and the algorithm was compared with the heuristic method and Standard BRKGA on a case study of soft drink distribution. Results showed that BRKGA combined with local search resulted in a lower total distribution cost than the heuristic method. Moreover, the developed algorithm was found to be successful in increasing the performance of Standard BRKGA.
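A core ingredient of any BRKGA is the decoder that maps a vector of random keys to a feasible solution, so that the genetic operators never see the problem's constraints. A minimal capacitated-routing decoder might look like this; the split-by-capacity rule is an illustrative choice, and the paper's actual decoder and time-window handling are not specified in the abstract:

```python
# Illustrative random-key decoder for a capacitated routing chromosome:
# customers are visited in increasing key order, and a new route starts
# whenever adding the next customer would exceed the vehicle capacity.
def decode(keys, demands, capacity):
    order = sorted(range(len(keys)), key=lambda i: keys[i])
    routes, route, load = [], [], 0
    for c in order:
        if load + demands[c] > capacity:
            routes.append(route)
            route, load = [], 0
        route.append(c)
        load += demands[c]
    if route:
        routes.append(route)
    return routes
```

In a BRKGA the crossover and mutation act only on the key vectors in [0, 1); all problem knowledge (capacity and, in this paper's case, time windows) lives inside the decoder, which is what makes the framework easy to hybridize with local search.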

  19. [Geriatric assessment. Development, status quo and perspectives].

    PubMed

    Lüttje, D; Varwig, D; Teigel, B; Gilhaus, B

    2011-08-01

    Multimorbidity is typical for geriatric patients. Problems not identified in time may lead to increased hospitalisation or prolonged hospital stay. Problems of multimorbidity are not covered by most guidelines or clinical pathways. The geriatric assessment supports standard clinical and technical assessment. Geriatric identification screening is basic for general practitioners and in emergency rooms to filter those patients bearing a special risk. Geriatric basic assessment covers most of the problems relevant for people in old age, revealing even problems that had so far been hidden. It permits to structure a comprehensive and holistic therapeutic approach and to evaluate the targets of treatment relevant for independent living and well-being. This results in reduction of morbidity and mortality. Assessment tools focusing on pain, nutrition and frailty should be added to the standardized geriatric basic assessment in Germany.

  20. Plasma equilibrium with fast ion orbit width, pressure anisotropy, and toroidal flow effects

    DOE PAGES

    Gorelenkov, Nikolai N.; Zakharov, Leonid E.

    2018-04-27

    Here, we formulate the problem of tokamak plasma equilibrium including the toroidal flow and fast ion (or energetic particle, EP) pressure anisotropy and the finite drift orbit width (FOW) effects. The problem is formulated via the standard Grad-Shafranov equation (GShE) amended by the solvability condition which imposes physical constraints on allowed spatial dependencies of the anisotropic pressure. The GShE problem employs the pressure coupling scheme and includes the dominant diagonal terms and non-diagonal corrections to the standard pressure tensor. The anisotropic tensor elements are obtained via the distribution function represented in the factorized form via the constants of motion. Considered effects on the plasma equilibrium are estimated analytically, if possible, to understand their importance for the GShE tokamak plasma problem.
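For reference, the standard static, isotropic Grad-Shafranov equation that such amended formulations start from can be written (with ψ the poloidal flux function and F = R B_φ) as:

```latex
\Delta^{*}\psi \;\equiv\; R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial \psi}{\partial R}\right)
\;+\; \frac{\partial^{2}\psi}{\partial Z^{2}}
\;=\; -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi}
```

The anisotropic, flowing-plasma version described in the abstract replaces the scalar pressure gradient on the right-hand side with pressure-tensor contributions, subject to the stated solvability condition.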

  1. Plasma equilibrium with fast ion orbit width, pressure anisotropy, and toroidal flow effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorelenkov, Nikolai N.; Zakharov, Leonid E.

    Here, we formulate the problem of tokamak plasma equilibrium including the toroidal flow and fast ion (or energetic particle, EP) pressure anisotropy and the finite drift orbit width (FOW) effects. The problem is formulated via the standard Grad-Shafranov equation (GShE) amended by the solvability condition which imposes physical constraints on allowed spatial dependencies of the anisotropic pressure. The GShE problem employs the pressure coupling scheme and includes the dominant diagonal terms and non-diagonal corrections to the standard pressure tensor. The anisotropic tensor elements are obtained via the distribution function represented in the factorized form via the constants of motion. Considered effects on the plasma equilibrium are estimated analytically, if possible, to understand their importance for the GShE tokamak plasma problem.

  2. NEW U.S. EPA STANDARDS AND PROBLEMS ASSOCIATED WITH MEASUREMENT OF POLLUTANTS: IMPLICATION FOR FILTER MANUFACTURERS

    EPA Science Inventory

    This presentation will describe the following items: important epidemiologic data establishing the need for new particulate matter standards, the size distribution of suspended particulate matter, epidemiologic data demonstrating the need for a fine particle standard, indicator a...

  3. USL/DBMS NASA/PC R and D project system design standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1984-01-01

    A set of system design standards intended to assure the completeness and quality of designs developed for PC research and development projects is established. The standards presented address the areas of problem definition, initial design plan, design specification, and re-evaluation.

  4. Measurement standards for interdisciplinary medical rehabilitation.

    PubMed

    Johnston, M V; Keith, R A; Hinderer, S R

    1992-12-01

    Rehabilitation must address problems inherent in the measurement of human function and health-related quality of life, as well as problems in diagnosis and measurement of impairment. This educational document presents an initial set of standards to be used as guidelines for development and use of measurement and evaluation procedures and instruments for interdisciplinary, health-related rehabilitation. Part I covers general measurement principles and technical standards, beginning with validity, the central consideration for use of measures. Subsequent sections focus on reliability and errors of measurement, norms and scaling, development of measures, and technical manuals and guides. Part II covers principles and standards for use of measures. General principles of application of measures in practice are discussed first, followed by standards to protect persons being measured and then by standards for administrative applications. Many explanations, examples, and references are provided to help professionals understand measurement principles. Improved measurement will ensure the basis of rehabilitation as a science and nourish its success as a clinical service.

  5. Review of USGS Open-file Report 95-525 ("Cartographic and digital standard for geologic map information") and plans for development of Federal draft standards for geologic map information

    USGS Publications Warehouse

    Soller, David R.

    1996-01-01

    This report summarizes a technical review of USGS Open-File Report 95-525, 'Cartographic and Digital Standard for Geologic Map Information' and OFR 95-526 (diskettes containing digital representations of the standard symbols). If you are considering the purchase or use of those documents, you should read this report first. For some purposes, OFR 95-525 (the printed document) will prove to be an excellent resource. However, technical review identified significant problems with the two documents that will be addressed by various Federal and State committees composed of geologists and cartographers, as noted below. Therefore, the 2-year review period noted in OFR 95-525 is no longer applicable. Until those problems are resolved and formal standards are issued, you may consult the following World-Wide Web (WWW) site which contains information about development of geologic map standards: URL: http://ncgmp.usgs.gov/ngmdbproject/home.html

  6. Gravitational Field as a Pressure Force from Logarithmic Lagrangians and Non-Standard Hamiltonians: The Case of Stellar Halo of Milky Way

    NASA Astrophysics Data System (ADS)

    El-Nabulsi, Rami Ahmad

    2018-03-01

    Recently, the notion of non-standard Lagrangians has been discussed widely in the literature in an attempt to explore the inverse variational problem of nonlinear differential equations. Different forms of non-standard Lagrangians have been introduced in the literature and have revealed nice mathematical and physical properties. One interesting form related to the inverse variational problem is the logarithmic Lagrangian, which has a number of motivating features related to the Liénard-type and Emden nonlinear differential equations. Such types of Lagrangians lead to nonlinear dynamics based on non-standard Hamiltonians. In this communication, we show that some new dynamical properties are obtained in stellar dynamics if standard Lagrangians are replaced by logarithmic Lagrangians and their corresponding non-standard Hamiltonians. One interesting consequence concerns the emergence of an extra pressure term, which is related to the gravitational field, suggesting that gravitation may act as a pressure in a strong gravitational field. The case of the stellar halo of the Milky Way is considered.

  7. Introduction to Problem Solving, Grades 6-8 [with CD-ROM]. The Math Process Standards, Grades 6-8 Series

    ERIC Educational Resources Information Center

    Schackow, Joy Bronston; O'Connell, Susan

    2008-01-01

    The National Council of Teachers of Mathematics' (NCTM's) Process Standards support teaching that helps students develop independent, effective mathematical thinking. The books in the Heinemann Math Process Standards Series give every middle grades math teacher the opportunity to explore each standard in depth. The series offers friendly,…

  8. Cataloguing Standards; The Report of the Canadian Task Group on Cataloguing Standards.

    ERIC Educational Resources Information Center

    National Library of Canada, Ottawa (Ontario).

    Following the recommendations of the National Conference on Cataloguing Standards held at the National Library of Canada in May 1970, a Canadian Task Group on Cataloguing Standards was set up to study and identify present deficiencies in the organizing and processing of Canadian material, and the cataloging problems of Canadian libraries, and to…

  9. SODA FOUNTAIN-LUNCHEONETTE EQUIPMENT AND APPURTENANCES. NATIONAL SANITATION FOUNDATION STANDARD NO. 1.

    ERIC Educational Resources Information Center

    National Sanitation Foundation, Ann Arbor, MI.

    THIS STANDARD OF SODA FOUNTAIN-LUNCHEONETTE EQUIPMENT IS THE FIRST IN A SERIES OF NATIONAL SANITATION FOUNDATION STANDARDS. THESE STANDARDS ARE ISSUED IN RECOGNITION OF THE LONG FELT NEED FOR A COMMON UNDERSTANDING OF THE PROBLEMS OF SANITATION INVOLVING INDUSTRIAL AND ADMINISTRATIVE HEALTH OFFICIALS WHOSE OBLIGATION IT IS TO ENFORCE REGULATIONS.…

  10. Effect of Causal Stories in Solving Mathematical Story Problems

    ERIC Educational Resources Information Center

    Smith, Glenn Gordon; Gerretson, Helen; Olkun, Sinan; Joutsenlahti, Jorma

    2010-01-01

    This study investigated whether infusing "causal" story elements into mathematical word problems improves student performance. In one experiment in the USA and a second in the USA, Finland, and Turkey, undergraduate elementary education majors worked word problems in three formats: 1) standard (minimal verbiage), 2) potential causation…

  11. Ethical Principles, Practices, and Problems in Higher Education.

    ERIC Educational Resources Information Center

    Baca, M. Carlota, Ed.; Stein, Ronald H., Ed.

    Eighteen professionals analyze the ethical principles, practices, and problems in institutions of higher learning by examining the major issues facing higher education today. Focusing on ethical standards and judgements that affect decision-making and problem-solving, the contributors review the rights and responsibilities of academic freedom,…

  12. Peer Victimization as a Mediator of the Relation between Facial Attractiveness and Internalizing Problems

    ERIC Educational Resources Information Center

    Rosen, Lisa H.; Underwood, Marion K.; Beron, Kurt J.

    2011-01-01

    This study examined the relations among facial attractiveness, peer victimization, and internalizing problems in early adolescence. We hypothesized that experiences of peer victimization would partially mediate the relationship between attractiveness and internalizing problems. Ratings of attractiveness were obtained from standardized photographs…

  13. Transport and mixing of a volume of fluid in a complex geometry

    NASA Astrophysics Data System (ADS)

    Gavelli, Filippo

    This work presents the results of an experimental investigation of an entire sequence of events leading to an unwanted injection of boron-depleted water into the core of a PWR. The study is subdivided into three tasks: the generation of a dilute volume in the primary system, its transport to the core, and the mixing encountered along the path. Experiments conducted at the University of Maryland (UM) facility show that, during a Small-Break LOCA transient, volumes of dilute coolant are segregated in the system by phase-separating energy transport from the core to the steam generators (Boiler Condenser Mode). Two motion-initiating mechanisms are considered: the resumption of natural circulation during the recovery of the primary liquid inventory, and the reactor coolant pump startup under BCM conditions. During the inventory recovery, various phenomena are observed that contribute to the mixing of the dilute volumes prior to the resumption of flow. The pump activation, in contrast, occurs in a stagnant system, so no mixing of the unborated liquid has occurred. Since an unmixed slug has the potential for a larger reactivity excursion than a partially mixed one, the pump-initiated flow resumption represents the worst-case scenario. The impulse-response method is applied, for the first time, to the problem of mixing in the downcomer. This allows the mixing to be expressed in terms of two parameters, the dispersion number and the residence time, which characterize the flow distribution in the complex annular geometry. Other important results are obtained from the analysis of the experimental data with this procedure. It is shown that the turbulence generated by the pump impeller has a significant impact on the overall mixing. The geometric discontinuities in the downcomer (in particular, the gap enlargement below the cold leg elevation) are also shown to be the cause of vortex structures that greatly enhance the mixing process.
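
    The impulse-response characterization mentioned above can be illustrated numerically: the residence time and (in a small-dispersion model) the dispersion number follow from the moments of a measured tracer response curve. A minimal sketch with hypothetical Gaussian tracer data, not the UM facility's measurements:

```python
import numpy as np

def rtd_parameters(t, c):
    """Estimate the mean residence time and the dispersion number from
    a tracer impulse-response curve c(t), using the moments of the
    residence-time distribution (small-dispersion approximation)."""
    area = np.trapz(c, t)                          # zeroth moment
    tau = np.trapz(t * c, t) / area                # mean residence time
    var = np.trapz((t - tau) ** 2 * c, t) / area   # variance of the RTD
    # Small-dispersion (closed-vessel) limit: sigma^2 / tau^2 = 2 D/uL
    dispersion_number = var / tau**2 / 2.0
    return tau, dispersion_number

# Hypothetical Gaussian tracer response centered at t = 10 s
t = np.linspace(0.0, 30.0, 3001)
c = np.exp(-0.5 * ((t - 10.0) / 1.5) ** 2)
tau, d = rtd_parameters(t, c)
print(tau, d)  # tau near 10 s, dispersion number near 0.011
```

    The moment method is generic; relating the variance to a dispersion number as above assumes the small-dispersion closed-vessel model, which may differ from the analysis actually used in the study.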

  14. Theory of wide-angle photometry from standard stars

    NASA Technical Reports Server (NTRS)

    Usher, Peter D.

    1989-01-01

    Wide-angle celestial structures, such as bright comet tails and nearby galaxies and clusters of galaxies, rely on photographic methods for quantified morphology and photometry, primarily because electronic devices with comparable resolution and sky coverage are beyond current technological capability. The problem of the photometry of extended structures, and how it may be overcome through calibration by photometric standard stars, is examined. The perfect properties of the ideal field of view are stated in the guise of a radiometric paraxial approximation, in the hope that fields of view of actual telescopes will conform. Fundamental radiometric concepts are worked through before the issue of atmospheric attenuation is addressed. The independence of observed atmospheric extinction and surface brightness leads off the quest for formal solutions to the problem of surface photometry. Methods and problems of solution are discussed. The spectre is confronted in the spirit of standard stars and shown to be chimerical in that light, provided certain rituals are adopted. After a brief discussion of Baker-Sampson polynomials and the vexing issue of saturation, a pursuit is made of the actual numbers to be expected in real cases. While the numbers crunched are gathered ex nihilo, they demonstrate the feasibility of Newton's method in the solution of this overdetermined, nonlinear, least-squares, multiparametric, photometric problem.
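
    The closing claim, that Newton's method can solve such an overdetermined nonlinear least-squares problem, can be sketched with a generic Gauss-Newton iteration on a toy exponential model (hypothetical data and model, not the paper's photometric formulation):

```python
import numpy as np

def gauss_newton(residual, jacobian, p0, n_iter=20):
    """Gauss-Newton iteration for an overdetermined nonlinear
    least-squares problem: minimize ||r(p)||^2, with more data
    points than parameters."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        # Solve the normal equations  J^T J dp = -J^T r
        dp = np.linalg.solve(J.T @ J, -J.T @ r)
        p = p + dp
    return p

# Toy model y = a * exp(b * x) with hypothetical noise-free data
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-1.3 * x)

residual = lambda p: p[0] * np.exp(p[1] * x) - y
jacobian = lambda p: np.column_stack([np.exp(p[1] * x),
                                      p[0] * x * np.exp(p[1] * x)])

p = gauss_newton(residual, jacobian, p0=[1.5, -1.0])
print(p)  # converges to the true parameters, approximately [2.0, -1.3]
```

    With 50 data points and 2 parameters the system is overdetermined; the normal-equations step is the standard workhorse for such fits when the residuals are small.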

  15. Improving data quality in the linked open data: a survey

    NASA Astrophysics Data System (ADS)

    Hadhiatma, A.

    2018-03-01

    The Linked Open Data (LOD) is a “web of data”, a different paradigm from the “web of documents” in common use today. However, the huge LOD still suffers from data quality problems such as incompleteness, inconsistency, and inaccuracy. Data quality problems relate to designing effective methods both to manage and to retrieve information at various data quality levels. Based on a review of papers and journals, addressing data quality requires standards that serve to (1) identify data quality problems, (2) assess data quality for a given context, and (3) correct data quality problems. However, most methods and strategies dealing with LOD data quality have not taken an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve LOD data quality in terms of incompleteness, inaccuracy, and inconsistency with respect to its schema and ontology, namely through ontology refinement. Moreover, ontology refinement here means not only improving data quality but also enriching the LOD. It therefore requires (1) a standard for data quality assessment and evaluation that is appropriate to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.

  16. The Future of Drought in the Southeastern U.S.: Projections from downscaled CMIP5 models

    NASA Astrophysics Data System (ADS)

    Keellings, D.; Engstrom, J.

    2017-12-01

    The Southeastern U.S. has been repeatedly impacted by severe droughts that have affected the environment and economy of the region. In this study, the ability of 32 downscaled CMIP5 models, bias corrected using localized constructed analogs (LOCA), to simulate historical observations of dry spells from 1950-2005 is assessed using Perkins skill scores and significance tests. The models generally simulate the distribution of dry days well, but there are significant differences between the ability of the best and worst performing models, particularly when it comes to the upper tail of the distribution. The best and worst performing models are then projected through 2099, using RCP 4.5 and 8.5, and estimates of 20 year return periods are compared. Only the higher skill models provide a good estimate of extreme dry spell lengths, with simulations of 20 year return values within ± 5 days of observed values across the region. Projected return values differ by model grouping, but all models exhibit significant increases.
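
    A 20-year return value of the kind compared above can be estimated from a series of annual maxima. A minimal sketch using a method-of-moments Gumbel fit on synthetic data (an illustrative choice of extreme-value model, not necessarily the study's method, and not its data):

```python
import numpy as np

def gumbel_return_level(annual_maxima, return_period):
    """Estimate the T-year return level of annual-maximum dry-spell
    length by fitting a Gumbel distribution with the method of moments."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi   # Gumbel scale
    mu = x.mean() - 0.5772156649 * beta           # Gumbel location
    p = 1.0 - 1.0 / return_period                 # non-exceedance probability
    return mu - beta * np.log(-np.log(p))

# Hypothetical annual maximum dry-spell lengths (days), 1950-2005
rng = np.random.default_rng(0)
maxima = rng.gumbel(loc=20.0, scale=5.0, size=56)
rl20 = gumbel_return_level(maxima, 20)
print(rl20)  # 20-year return level in days
```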

  17. Non-condensable gas effects in ROSA/AP600 small-break LOCA experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamura, Hideo; Kukita, Yutaka; Shaw, R.A.

    1996-06-01

    Integral experiments simulating the postulated accidents in the Westinghouse AP600 reactor have been conducted using the ROSA-V Large Scale Test Facility (LSTF). These experiments allowed the N₂ gas used for the pressurization of accumulator tanks to enter the primary system after the depletion of the tank water inventory. The gas migrated into the Passive Residual Heat Removal (PRHR) system heat exchanger tubes and into the Core Makeup Tanks (CMTs), and influenced the performance of these components, which are unique to the AP600 reactor. Specifically, the PRHR was disabled soon after the N₂ gas discharge in most of the experiments, although the core decay power was removed well by the steam discharge through the Automatic Depressurization System (ADS) after the PRHR was disabled. The N₂ gas ingress into the CMTs occurred in the experiments with relatively large breaks (≥ 2 inches in equivalent diameter), and contributed to a smooth draindown of the CMT inventory into the primary system.

  18. Nuclear reactor building

    DOEpatents

    Gou, P.F.; Townsend, H.E.; Barbanti, G.

    1994-04-05

    A reactor building for enclosing a nuclear reactor includes a containment vessel having a wetwell disposed therein. The wetwell includes inner and outer walls, a floor, and a roof defining a wetwell pool and a suppression chamber disposed thereabove. The wetwell and containment vessel define a drywell surrounding the reactor. A plurality of vents are disposed in the wetwell pool in flow communication with the drywell for channeling into the wetwell pool steam released in the drywell from the reactor during a LOCA for example, for condensing the steam. A shell is disposed inside the wetwell and extends into the wetwell pool to define a dry gap devoid of wetwell water and disposed in flow communication with the suppression chamber. In a preferred embodiment, the wetwell roof is in the form of a slab disposed on spaced apart support beams which define therebetween an auxiliary chamber. The dry gap, and additionally the auxiliary chamber, provide increased volume to the suppression chamber for improving pressure margin. 4 figures.

  19. Nuclear reactor building

    DOEpatents

    Gou, Perng-Fei; Townsend, Harold E.; Barbanti, Giancarlo

    1994-01-01

    A reactor building for enclosing a nuclear reactor includes a containment vessel having a wetwell disposed therein. The wetwell includes inner and outer walls, a floor, and a roof defining a wetwell pool and a suppression chamber disposed thereabove. The wetwell and containment vessel define a drywell surrounding the reactor. A plurality of vents are disposed in the wetwell pool in flow communication with the drywell for channeling into the wetwell pool steam released in the drywell from the reactor during a LOCA for example, for condensing the steam. A shell is disposed inside the wetwell and extends into the wetwell pool to define a dry gap devoid of wetwell water and disposed in flow communication with the suppression chamber. In a preferred embodiment, the wetwell roof is in the form of a slab disposed on spaced apart support beams which define therebetween an auxiliary chamber. The dry gap, and additionally the auxiliary chamber, provide increased volume to the suppression chamber for improving pressure margin.

  20. Methods for the mitigation of the chemical reactivity of beryllium in steam

    NASA Astrophysics Data System (ADS)

    Druyts, F.; Alves, E. C.; Wu, C. H.

    2004-08-01

    In the safety assessment of future fusion reactors, the reaction of beryllium with steam remains one of the main concerns. In case of a loss of coolant accident (LOCA), the use of beryllium in combination with pressurised water as coolant can lead to excessive hydrogen production due to the reaction Be + H 2O = BeO + H 2 + heat. Therefore, we started an R&D programme aimed at investigating mitigation methods for the beryllium/steam reaction. Beryllium samples were implanted with either calcium or aluminium ions in a 210 kV ion implanter at ITN Lisbon. The chemical reactivity of these samples in steam was measured at SCK • CEN in a dedicated experimental facility providing coupled thermogravimetry/mass spectrometry. In comparison to reference undoped material, the reactivity of doped beryllium after 30 min of exposure decreased with a factor 2 to 4. The mitigating effect was higher for calcium-doped than for aluminium-doped samples.

  1. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced ''best estimate'' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  2. Fabrication Control Plan for ORNL RH-LOCA ATF Test Specimens to be Irradiated in the ATR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, Kevin G.; Howard, Richard; Teague, Michael

    2014-06-01

    The purpose of this fabrication plan is (1) to summarize the design of a set of rodlets that will be fabricated and then irradiated in the Advanced Test Reactor (ATR) and (2) to provide requirements for fabrication and acceptance criteria for inspections of the Light Water Reactor (LWR) – Accident Tolerant Fuels (ATF) rodlet components. The functional and operational (F&OR) requirements for the ATF program are identified in the ATF Test Plan. The scope of this document only covers fabrication and inspections of rodlet components detailed in drawings 604496 and 604497. It does not cover the assembly of these items to form a completed test irradiation assembly or the inspection of the final assembly, which will be included in a separate INL final test assembly specification/inspection document. The controls support the requirements that the test irradiations must be performed safely and that subsequent examinations must provide valid results.

  3. The IRIS Spool-Type Reactor Coolant Pump

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kujawski, J.M.; Kitch, D.M.; Conway, L.E.

    2002-07-01

    IRIS (International Reactor Innovative and Secure) is a light water cooled, 335 MWe power reactor which is being designed by an international consortium as part of the US DOE NERI Program. IRIS features an integral reactor vessel that contains all the major reactor coolant system components including the reactor core, the coolant pumps, the steam generators and the pressurizer. This integral design approach eliminates the large coolant loop piping, and thus eliminates large loss-of-coolant accidents (LOCAs) as well as the individual component pressure vessels and supports. In addition, IRIS is being designed with a long-life core and enhanced safety to address the requirements defined by the US DOE for Generation IV reactors. One of the innovative features of the IRIS design is the adoption of a reactor coolant pump (called a 'spool' pump) which is completely contained inside the reactor vessel. Background, status and future developments of the IRIS spool pump are presented in this paper. (authors)

  4. 40 CFR 92.11 - Compliance with emission standards in extraordinary circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standards in extraordinary circumstances. The provisions of this section are intended to address problems... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Compliance with emission standards in extraordinary circumstances. 92.11 Section 92.11 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  5. 40 CFR 63.5910 - What reports must I submit and when?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production Notifications, Reports, and... period into those that are due to startup, shutdown, control equipment problems, process problems, other...

  6. 40 CFR 63.5910 - What reports must I submit and when?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production Notifications, Reports, and... period into those that are due to startup, shutdown, control equipment problems, process problems, other...

  7. 42 CFR 493.1451 - Standard: Technical supervisor responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... testing samples; and (vi) Assessment of problem solving skills; and (9) Evaluating and documenting the... analysis and reporting of test results; (5) Resolving technical problems and ensuring that remedial actions...

  8. Literacy, Numeracy, and Problem Solving in Technology-Rich Environments among U.S. Adults: Results from the Program for the International Assessment of Adult Competencies 2012. Appendix D: Standard Error Tables. First Look. NCES 2014-008

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2013

    2013-01-01

    This paper provides Appendix D, Standard Error tables, for the full report, entitled. "Literacy, Numeracy, and Problem Solving in Technology-Rich Environments among U.S. Adults: Results from the Program for the International Assessment of Adult Competencies 2012. First Look. NCES 2014-008." The full report presents results of the Program…

  9. Visual field defects may not affect safe driving.

    PubMed

    Dow, Jamie

    2011-10-01

    In Quebec a driver whose acquired visual field defect renders them ineligible for a driver's permit renewal may request an exemption from the visual field standard by demonstrating safe driving despite the defect. For safety reasons it was decided to attempt to identify predictors of failure on the road test in order to avoid placing driving evaluators in potentially dangerous situations when evaluating drivers with visual field defects. During a 4-month period in 2009 all requests for exemptions from the visual field standard were collected and analyzed. All available medical and visual field data were collated for 103 individuals, of whom 91 successfully completed the evaluation process and obtained a waiver. The collated data included age, sex, type of visual field defect, visual field characteristics, and concomitant medical problems. No single factor, or combination of factors, could predict failure of the road test. All 5 failures of the road test had cognitive problems but 6 of the successful drivers also had known cognitive problems. Thus, cognitive problems influence the risk of failure but do not predict certain failure. Most of the applicants for an exemption were able to complete the evaluation process successfully, thereby demonstrating safe driving despite their handicap. Consequently, jurisdictions that have visual field standards for their driving permit should implement procedures to evaluate drivers with visual field defects that render them unable to meet the standard but who wish to continue driving.

  10. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration

    PubMed Central

    Doss, Hani; Tan, Aixin

    2017-01-01

    In the classical biased sampling problem, we have k densities π1(·), …, πk(·), each known up to a normalizing constant, i.e. for l = 1, …, k, πl(·) = νl(·)/ml, where νl(·) is a known function and ml is an unknown constant. For each l, we have an iid sample from πl, and the problem is to estimate the ratios ml/ms for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the πl’s are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case. PMID:28706463

  11. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration.

    PubMed

    Doss, Hani; Tan, Aixin

    2014-09-01

    In the classical biased sampling problem, we have k densities π1(·), …, πk(·), each known up to a normalizing constant, i.e. for l = 1, …, k, πl(·) = νl(·)/ml, where νl(·) is a known function and ml is an unknown constant. For each l, we have an iid sample from πl, and the problem is to estimate the ratios ml/ms for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the πl's are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case.
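
    The ratio-estimation problem in the two records above can be illustrated in its simplest iid form (this is plain importance sampling, not the authors' regenerative Markov chain method): for two unnormalized densities ν1, ν2, the identity E_π1[ν2(X)/ν1(X)] = m2/m1 yields a sample-average estimate.

```python
import numpy as np

# Two densities known only up to normalizing constants:
# nu1 is an unnormalized N(0, 2^2), nu2 an unnormalized N(0, 1).
nu1 = lambda x: np.exp(-x**2 / (2.0 * 2.0**2))   # true m1 = 2 * sqrt(2*pi)
nu2 = lambda x: np.exp(-x**2 / 2.0)              # true m2 = sqrt(2*pi)

# Draw an iid sample from pi1 = nu1/m1 (i.e. N(0, 2^2)) and estimate
# m2/m1 = E_pi1[nu2(X)/nu1(X)] by a sample average.  Sampling from the
# wider density keeps the importance weights bounded.
rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=2.0, size=200_000)
ratio_hat = np.mean(nu2(x) / nu1(x))
print(ratio_hat)  # close to the true value m2/m1 = 0.5
```

    Sampling from the narrower density instead would give an unbounded weight function with infinite variance, which is one reason standard-error estimation for such ratios is delicate, as the abstracts note.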

  12. The Dreaded "Work" Problems Revisited: Connections through Problem Solving from Basic Fractions to Calculus

    ERIC Educational Resources Information Center

    Shore, Felice S.; Pascal, Matthew

    2008-01-01

    This article describes several distinct approaches taken by preservice elementary teachers to solving a classic rate problem. Their approaches incorporate a variety of mathematical concepts, ranging from proportions to infinite series, and illustrate the power of all five NCTM Process Standards. (Contains 8 figures.)

  13. Activities: Activities to Introduce Maxima-Minima Problems.

    ERIC Educational Resources Information Center

    Pleacher, David

    1991-01-01

    Presented are student activities that involve two standard problems from geometry and calculus--the volume of a box and the bank shot on a pool table. Problem solving is emphasized as a method of inquiry and application with descriptions of the results using graphical, numerical, and physical models. (JJK)

  14. The Problem of Faculty Relocation.

    ERIC Educational Resources Information Center

    Tabachnick, Stephen E.

    1992-01-01

    A faculty move to a new campus can be traumatic, but colleges and universities can take steps to lessen the strain. Solutions to faculty relocation problems should be a standard part of any hiring package, not left to chance and individual negotiation. Some problems are inexpensive and easy to solve. (MSE)

  15. Child and Family Predictors of Therapy Outcome for Children with Behavioral and Emotional Problems

    ERIC Educational Resources Information Center

    Hemphill, Sheryl A.; Littlefield, Lyn

    2006-01-01

    This study investigated the characteristics of 106 children primarily referred for externalizing behavior problems and their families, and assessed the prediction of treatment outcome following a standardized short-term, cognitive behavioral group program. "Exploring Together" comprised a children's group (anger management, problem-solving and…

  16. [Research progress on standards of commodity classes of Chinese materia medica and discussion on several key problems].

    PubMed

    Yang, Guang; Zeng, Yan; Guo, Lan-Ping; Huang, Lu-Qi; Jin, Yan; Zheng, Yu-Guang; Wang, Yong-Yan

    2014-05-01

    Standards of commodity classes of Chinese materia medica are an important way to solve the "lemons problem" of the traditional Chinese medicine market. Standards of commodity classes are also helpful for rebuilding market mechanisms that reward good quality with a high price. The previous edition of the commodity class standards for Chinese materia medica was issued 30 years ago and is no longer adapted to market demand. This article reviews research progress on standards of commodity classes of Chinese materia medica. It argues that biological activity is a better basis than chemical constituents for commodity class standards, and that the key to setting such standards is identifying the factors that distinguish good quality from bad. The article also discusses the scope of commodity classes of Chinese materia medica, and how to coordinate pharmacopoeia standards with commodity class standards. According to different demands, diverse standards can be used for commodity classes of Chinese materia medica, but efficacy is considered the most important index of a commodity standard. Decoction pieces can be included in standards of commodity classes of Chinese materia medica. The authors also formulated the standards of commodity classes of Notoginseng Radix as an example, and hope this study will have a positive, promotional effect on research related to the traditional Chinese medicine market.

  17. 40 CFR 63.4520 - What reports must I submit?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for Hazardous Air Pollutants for Surface Coating of Plastic Parts and Products Notifications... problems, process problems, other known causes, and other unknown causes. (xi) A summary of the total...

  18. 40 CFR 63.4520 - What reports must I submit?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards for Hazardous Air Pollutants for Surface Coating of Plastic Parts and Products Notifications... problems, process problems, other known causes, and other unknown causes. (xi) A summary of the total...

  19. Meeting the New AASL Standards for the 21st-Century Learner via Big6 Problem Solving

    ERIC Educational Resources Information Center

    Needham, Joyce

    2010-01-01

    "AASL Standards for the 21st-Century Learner." New standards for library media programs! What does it mean to practicing library media specialists? Does this mean they must abandon all the strategies, activities, and lessons they have developed based upon "Information Power's Information Literacy Standards for Student Learning" and create all new…

  20. User-generated quality standards for youth mental health in primary care: a participatory research design using mixed methods

    PubMed Central

    Graham, Tanya; Rose, Diana; Murray, Joanna; Ashworth, Mark; Tylee, André

    2014-01-01

    Objectives To develop user-generated quality standards for young people with mental health problems in primary care using a participatory research model. Methods 50 young people aged 16–25 from community settings and primary care participated in focus groups and interviews about their views and experiences of seeking help for mental health problems in primary care, cofacilitated by young service users and repeated to ensure respondent validation. A second group of young people, also aged 16–25, who had sought help for any mental health problem from primary or secondary care within the last 5 years, were trained as focus group cofacilitators (n=12); they developed the quality standards from the qualitative data and participated in four nominal groups (n=28). Results 46 quality standards were developed and ranked by young service users. Agreement was defined as 100% of scores within a two-point region. Group consensus existed for 16 quality standards representing the following aspects of primary care: better advertising and information (three); improved competence through mental health training and skill mix within the practice (two); alternatives to medication (three); improved referral protocol (three); and specific questions and reassurances (five). Alternatives to medication and specific questions and reassurances are aspects of quality which have not been previously reported. Conclusions We have demonstrated the feasibility of using participatory research methods to develop user-generated quality standards. The development of patient-generated quality standards may offer a more formal method of incorporating the views of service users into quality improvement initiatives. This method can be adapted for generating quality standards applicable to other patient groups. PMID:24920648

  1. Primary Discussion on Standardized Management of Purchasing Large Equipments for Measurement Technology Institution

    NASA Astrophysics Data System (ADS)

    Hu, Chang; Hu, Juanli; Zhou, Qi; Yang, Yue

    In view of the current situation and existing problems in purchasing equipment for measurement technology institutions, this paper analyzes the key factors that affect the standardization of equipment procurement and proposes a set of scientific, standardized solutions for equipment procurement based on actual work.

  2. Effect of Directed Study of Mathematics Vocabulary on Standardized Mathematics Assessment Questions

    ERIC Educational Resources Information Center

    Waite, Adel Marlane

    2017-01-01

    The problems under investigation included (a) Did a directed study of mathematics vocabulary significantly affect student performance levels on standardized mathematical questions? and (b) Did the strategies used in this study significantly affect student performance levels on standardized mathematical questions? The population consisted of…

  3. Choosing the Right Tool

    ERIC Educational Resources Information Center

    Boote, Stacy K.

    2016-01-01

    Students' success with fourth-grade content standards builds on mathematical knowledge learned in third grade and creates a conceptual foundation for division standards in subsequent grades that focus on the division algorithm. The division standards in fourth and fifth grade are similar; but in fourth grade, division problem divisors are only one…

  4. International Cooperation for a Single World Production Standard of High Definition Television.

    ERIC Educational Resources Information Center

    Hongcharu, Boonchai

    Broadcasters, television engineers and the production industry have encountered many problems with diverse television standards since the introduction of color television. With the advent of high definition television (HDTV), the chance to have a common production standard for international exchange of programs and technical information has…

  5. Supporting Mathematics Instruction through Community

    ERIC Educational Resources Information Center

    Amidon, Joel C.; Trevathan, Morgan L.

    2016-01-01

    Raising expectations is nothing new. Every iteration of standards elevates the expectations for what students should know and be able to do. The Common Core State Standards for Mathematics (CCSSM) is no exception, with standards for content and practice that move beyond memorization of traditional algorithms to "make sense of problems and…

  6. Planning Questions and Persevering in the Practices

    ERIC Educational Resources Information Center

    Gurl, Theresa J.; Fox, Ryan; Dabovic, Nikolina; Leavitt, Arielle Eager

    2016-01-01

    The implementation of the Common Core's Standards for Mathematical Practice can pose a challenge to all teachers of mathematics but especially to preservice teachers. These standards require teaching in a way that often differs from what preservice teachers have experienced as learners. Standard 1--"Make sense of problems and persevere in…

  7. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, in which an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by a standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to other areas of research that give rise to large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
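The projection step the abstract describes, truncating to a low-energy subspace and diagonalizing the smaller effective matrix, can be illustrated on a diagonally dominant symmetric matrix. This is a minimal sketch of the standard (non-iterative) projection, not the authors' generalized numerical scheme; the matrix size, subspace size, and noise scale are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 20

# Symmetric test matrix with a pronounced dominant diagonal,
# the regime the abstract targets.
V = rng.normal(scale=0.02, size=(n, n))
H = (V + V.T) / 2.0
H[np.diag_indices(n)] = np.sort(rng.uniform(0.0, 10.0, size=n))

# Project onto the k basis vectors with the smallest diagonal
# entries ("low-energy" states) and diagonalize the smaller
# effective model.
idx = np.argsort(np.diag(H))[:k]
H_eff = H[np.ix_(idx, idx)]

approx = np.linalg.eigvalsh(H_eff)[:5]   # lowest 5 of the effective model
exact = np.linalg.eigvalsh(H)[:5]        # lowest 5 of the full matrix
print(np.max(np.abs(approx - exact)))    # small when the diagonal dominates
```

By Cauchy interlacing, the projected eigenvalues are upper bounds on the exact ones; the paper's contribution is a numerical refinement that removes the residual truncation error.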

  8. Combinatorial algorithms for design of DNA arrays.

    PubMed

    Hannenhalli, Sridhar; Hubell, Earl; Lipshutz, Robert; Pevzner, Pavel A

    2002-01-01

    Optimal design of DNA arrays requires the development of algorithms with two-fold goals: reducing the effects caused by unintended illumination (the border length minimization problem) and reducing the complexity of masks (the mask decomposition problem). We describe algorithms that reduce the number of rectangles in mask decomposition by 20-30% compared to a standard array design, under the assumption that the arrangement of oligonucleotides on the array is fixed. This algorithm produces a provably optimal solution for all studied real instances of array design. We also address the difficult problem of finding an arrangement that minimizes the border length, introducing a new threading idea that significantly reduces the border length compared to standard designs.
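The border-length objective can be made concrete with a toy scoring function: sum, over all pairs of neighboring cells, the positions at which the two probes differ. This is an illustrative simplification (it assumes synchronous synthesis, where each mismatch contributes one border segment in some mask), not the paper's algorithm; the grids are invented examples:

```python
from itertools import product

def border_length(grid):
    """Total border length of a probe arrangement: for each pair of
    neighboring cells, count positions where the probes differ.
    Simplified model assuming synchronous synthesis."""
    rows, cols = len(grid), len(grid[0])
    total = 0
    for r, c in product(range(rows), range(cols)):
        for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
            rr, cc = r + dr, c + dc
            if rr < rows and cc < cols:
                total += sum(a != b for a, b in zip(grid[r][c], grid[rr][cc]))
    return total

# Tiny example: a 2x2 array of 4-mers; placing similar probes next to
# each other yields a much smaller border length.
good = [["ACGT", "ACGA"], ["ACCT", "ACCA"]]
bad  = [["ACGT", "TGCA"], ["CATG", "GTAC"]]
print(border_length(good), border_length(bad))  # → 4 16
```

Arrangement algorithms such as the threading idea in the paper search for placements that drive this score down.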

  9. Development of Finnish Elementary Pupils' Problem-Solving Skills in Mathematics

    ERIC Educational Resources Information Center

    Laine, Anu; Näveri, Liisa; Ahtee, Maija; Pehkonen, Erkki

    2014-01-01

    The purpose of this study is to determine how Finnish pupils' problem-solving skills develop from the 3rd to 5th grade. As research data, we use one non-standard problem from pre- and post-test material from a three-year follow-up study, in the area of Helsinki, Finland. The problems in both tests consisted of four questions related to each other.…

  10. Computational strategy for the solution of large strain nonlinear problems using the Wilkins explicit finite-difference approach

    NASA Technical Reports Server (NTRS)

    Hofmann, R.

    1980-01-01

    The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.

  11. The importance of production standard operating procedure in a family business company

    NASA Astrophysics Data System (ADS)

    Hongdiyanto, C.

    2017-12-01

    The plastic industry is a growing sector, so UD X, which operates in this business, has great potential to grow as well. The problem faced by this family business company is that no standard operating procedure is used, which leads to problems in the quality and quantity produced. This research aims to create production standard operating procedures for UD X. Semi-structured interviews were used to gather information from respondents to help create the SOPs. Four SOPs were created: a classifying SOP, a sorting SOP, a milling SOP, and a packing SOP. Having SOPs will improve the effectiveness of production because employees will already know how to work at each stage of the production process.

  12. The Posing of Arithmetic Problems by Mathematically Talented Students

    ERIC Educational Resources Information Center

    Espinoza González, Johan; Lupiáñez Gómez, José Luis; Segovia Alex, Isidoro

    2016-01-01

    Introduction: This paper analyzes the arithmetic problems posed by a group of mathematically talented students when given two problem-posing tasks, and compares these students' responses to those given by a standard group of public school students to the same tasks. Our analysis focuses on characterizing and identifying the differences between the…

  13. Following the Template: Transferring Modeling Skills to Nonstandard Problems

    ERIC Educational Resources Information Center

    Tyumeneva, Yu. A.; Goncharova, M. V.

    2017-01-01

    This study seeks to analyze how students apply a mathematical modeling skill that was previously learned by solving standard word problems to the solution of word problems with nonstandard contexts. During the course of an experiment involving 106 freshmen, we assessed how well they were able to transfer the mathematical modeling skill that is…

  14. An Introduction to Multilinear Formula Score Theory. Measurement Series 84-4.

    ERIC Educational Resources Information Center

    Levine, Michael V.

    Formula score theory (FST) associates each multiple choice test with a linear operator and expresses all of the real functions of item response theory as linear combinations of the operator's eigenfunctions. Hard measurement problems can then often be reformulated as easier, standard mathematical problems. For example, the problem of estimating…

  15. Pain as a Predictor of Sleep Problems in Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Tudor, Megan E.; Walsh, Caitlin E.; Mulder, Emile C.; Lerner, Matthew D.

    2015-01-01

    Evidence suggests that pain interferes with sleep in youth with developmental disabilities. This study examined the relationship between pain and sleep problems in a sample of youth with parent-reported autism spectrum disorder (N = 62). Mothers reported on standardized measures of pain and sleep problems. Youth demonstrated atypically high levels…

  16. Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Iatridou, Maria

    2010-01-01

    This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…

  17. Best Known Problem Solving Strategies in "High-Stakes" Assessments

    ERIC Educational Resources Information Center

    Hong, Dae S.

    2011-01-01

    In its mathematics standards, the National Council of Teachers of Mathematics (NCTM) states that problem solving is an integral part of all mathematics learning and that exposure to problem-solving strategies should be embedded across the curriculum. Furthermore, by high school, students should be able to use, decide on, and invent a wide range of strategies.…

  18. The Impact of Tutoring on Early Reading Achievement for Children with and without Attention Problems

    ERIC Educational Resources Information Center

    Rabiner, David L.; Malone, Patrick S.

    2004-01-01

    This study examined whether the benefits of reading tutoring in first grade were moderated by children's level of attention problems. Participants were 581 children from the intervention and control samples of Fast Track, a longitudinal multisite investigation of the development and prevention of conduct problems. Standardized reading achievement…

  19. A Statewide Case Management, Surveillance, and Outcome Evaluation System for Children with Special Health Care Needs

    PubMed Central

    Monsen, Karen A.; Elsbernd, Scott A.; Barnhart, Linda; Stock, Jacquie; Prock, Carla E.; Looman, Wendy S.; Nardella, Maria

    2013-01-01

    Objectives. To evaluate the feasibility of implementing a statewide children with special health care needs (CSHCN) program evaluation, case management, and surveillance system using a standardized instrument and protocol that operationalized the United States Health and Human Services CSHCN National Performance Measures. Methods. Public health nurses in local public health agencies in Washington State jointly developed and implemented the standardized system. The instrument was the Omaha System. Descriptive statistics were used for the analysis of standardized data. Results. From the sample of CSHCN visit reports (n = 127), 314 problems and 853 interventions were documented. The most common problem identified was growth and development followed by health care supervision, communication with community resources, caretaking/parenting, income, neglect, and abuse. The most common intervention category was surveillance (60%), followed by case management (24%) and teaching, guidance, and counseling (16%). On average, there were 2.7 interventions per problem and 6.7 interventions per visit. Conclusions. This study demonstrates the feasibility of an approach for statewide CSHCN program evaluation, case management, and surveillance system. Knowledge, behavior, and status ratings suggest that there are critical unmet needs in the Washington State CSHCN population for six major problems. PMID:23533804

  20. Usability evaluation of Laboratory and Radiology Information Systems integrated into a hospital information system.

    PubMed

    Nabovati, Ehsan; Vakili-Arki, Hasan; Eslami, Saeid; Khajouei, Reza

    2014-04-01

    This study was conducted to evaluate the usability of widely used laboratory and radiology information systems. Three usability experts independently evaluated the user interfaces of Laboratory and Radiology Information Systems using the heuristic evaluation method. They applied Nielsen's heuristics to identify and classify usability problems and Nielsen's severity rating to judge their severity. Overall, 116 unique heuristic violations were identified as usability problems. In terms of severity, 67% of problems were rated as major or catastrophic. Among the 10 heuristics, "consistency and standards" was violated most frequently. Moreover, the mean severity of problems concerning the "error prevention" and "help and documentation" heuristics was higher than that of the others. Despite the widespread use of specific healthcare information systems, they suffer from usability problems. Improving the usability of systems by following existing design standards and principles from the early phases of the system development life cycle is recommended. In particular, designers should build systems that inhibit the initiation of erroneous actions and provide sufficient guidance to users.

  1. A Formidable Foe is Sabotaging Your Results: What You Should Know about Biofilms and Wound Healing

    PubMed Central

    Barker, Jenny C; Khansa, Ibrahim; Gordillo, Gayle M

    2017-01-01

    Learning Objectives After reading this article, the participant should be able to: 1. Describe biofilm pathogenesis as it relates to problem wounds, 2. Understand the pre-clinical and clinical evidence implicating biofilm in problem wounds, 3. Explain the diagnostic and treatment challenges that biofilms create for problem wounds, 4. Demonstrate a basic understanding of emerging strategies aimed at counteracting these processes. Summary Biofilm represents a protected mode of growth for bacteria, allowing them to evade standard diagnostic techniques and avoid eradication by standard therapies. Though only recently discovered, biofilm has existed for millennia and complicates nearly every aspect of medicine. Biofilm impacts wound healing by allowing bacteria to evade immune responses, prolonging inflammation and disabling skin barrier function. It is important to understand why problem wounds persist despite state-of-the-art treatment, why they are difficult to accurately diagnose, and why they recur. The aim of this article is to focus on current gaps in knowledge related to problem wounds, specifically, biofilm infection. PMID:28445380

  2. Extensions of the standard model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramond, P.

    1983-01-01

    In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry, can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references.

  3. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77 De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68 Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
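The D- and E-optimal criteria compared in the abstract can be computed directly from a Fisher information matrix. Below is a minimal sketch for a straight-line regression model with unit-variance noise; the model, candidate sampling schedules, and function names are illustrative assumptions, and the paper's new SE-optimal criterion is not reproduced here:

```python
import numpy as np

def fisher_information(times):
    """FIM for fitting y = a + b*t with unit-variance noise: F = X^T X."""
    X = np.column_stack([np.ones_like(times), times])
    return X.T @ X

def d_criterion(F):
    """D-optimality: maximize det(F) (minimize volume of the
    confidence ellipsoid for the parameters)."""
    return np.linalg.det(F)

def e_criterion(F):
    """E-optimality: maximize the smallest eigenvalue of F (control
    the worst-conditioned parameter direction)."""
    return np.linalg.eigvalsh(F)[0]

# Three candidate sampling distributions of 5 times on [0, 1].
clustered = np.array([0.0, 0.1, 0.2, 0.9, 1.0])
spread    = np.linspace(0.0, 1.0, 5)
endpoints = np.array([0.0, 0.0, 0.5, 1.0, 1.0])

for name, t in [("clustered", clustered), ("spread", spread), ("endpoints", endpoints)]:
    F = fisher_information(t)
    print(name, round(d_criterion(F), 3), round(e_criterion(F), 3))
```

Standard errors for the estimates are proportional to the square roots of the diagonal of F⁻¹, which is what links these design criteria to the error comparisons in the paper.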

  4. Outreach pharmacy service in old age homes: a Hong Kong experience.

    PubMed

    Lau, Wai-Man; Chan, Kit; Yung, Tsz-Ho; Lee, Anna See-Wing

    2003-06-01

    To explore drug-related problems in old age homes in Hong Kong through an outreach pharmacy service. A standard form was used by outreach pharmacists to identify drug-related problems at old age homes. Homes were selected through random sampling, voluntary participation or adverse selection. Initial observation and assessment were performed in the first and second weeks. Appropriate advice and recommendations were given upon assessment and supplemented by a written report. Educational talks were provided to staff of the homes in addition to other drug information materials. At weeks 7 to 9, evaluations were carried out. Eighty-five homes were assessed and identified to have problems in the drug management system. These problems could generally be classified into physical storage (8.8%), quality of storage (19.2%), drug administration system (13.3%), documentation (16.4%), and drug knowledge of staff of homes (42.2%). Quality of drug storage was the most common problem found, followed by documentation and drug knowledge (73%, 50% and 44% of points assessed with problems, respectively). Apart from staff's lack of drug knowledge and unawareness of potential risks, the failure to meet minimal professional standards may be fundamentally related to a lack of professional input and inadequate legislation. Most homes demonstrated significant improvements upon simple interventions, moving from a majority of homes with more than 10 problems to a majority with fewer than 5 problems. Diverse problems in drug management are common in old age homes, which warrants attention and professional input. Simple interventions and education by pharmacists are shown to be effective in improving the quality of drug management and hence care for residents. While future financing of old age home service can be reviewed within the social context to provide incentives for improvement, review of regulatory policy with enforcement may be more fundamental and effective in upholding the service standard.

  5. Labor force participation and the influence of having back problems on income poverty in Australia.

    PubMed

    Schofield, Deborah J; Callander, Emily J; Shrestha, Rupendra N; Percival, Richard; Kelly, Simon J; Passey, Megan E

    2012-06-01

    Cross-sectional study of 45- to 64-year-old Australians. To assess the relationship between chronic back problems and being in income poverty among the older working-aged population. Older workers who leave the labor force due to chronic back problems have fragile economic situations and as such are likely to have poorer living standards. Poverty is one way of comparing the living standards of different individuals within society. The 2003 Survey of Disability, Ageing and Carers data were used, along with the 50% of the median equivalized income-unit income poverty line to identify those in poverty. Logistic regression models were used to look at the relationship between chronic back problems, labor force participation, and poverty. Regardless of labor force participation status (employed full-time, part-time, or not in the labor force at all), those with chronic back problems were significantly more likely to be in poverty. Those not in the labor force due to chronic back problems were significantly more likely to be in poverty than those in the labor force full-time with no chronic health condition (Odds ratio [OR]: 0.07, 95% CI: 0.07-0.07, P < 0.0001). Further, those employed part-time with no chronic health condition were 48% less likely to be in poverty (OR: 0.52, 95% CI: 0.51-0.53, P < 0.0001) than those also employed part-time but with chronic back problems. It was found that among those with back problems, those out of the labor force were significantly more likely to be in poverty than those employed part-time or full-time (OR: 0.44, 95% CI: 0.43-0.44, P < 0.0001; OR: 0.10, 95% CI: 0.10-0.10, P < 0.0001, respectively). This highlights the need to prevent and effectively treat chronic back problems, as these conditions are associated with reduced living standards.

  6. National Education Standards: Getting beneath the Surface. Policy Information Perspective

    ERIC Educational Resources Information Center

    Barton, Paul E.

    2009-01-01

    This report discusses issues involved in the debate over whether the United States should have national education standards, what must be considered in creating such standards, what problems must be addressed, and what trade-offs might be required among conflicting objectives. The first section provides a short summary of developments in education…

  7. Minority Language Standardisation and the Role of Users

    ERIC Educational Resources Information Center

    Lane, Pia

    2015-01-01

    Developing a standard for a minority language is not a neutral process; this has consequences for the status of the language and how the language users relate to the new standard. A potential inherent problem with standardisation is whether the language users themselves will accept and identify with the standard. When standardising minority…

  8. Beyond Standards: The Rest of the Agenda.

    ERIC Educational Resources Information Center

    Sobol, Thomas

    1997-01-01

    Argues that new high standards of curriculum content and student performance are important, but they alone are not enough. If traditional aspirations to make students wise and just are to be realized, It is necessary to move beyond standards to support teachers, provide necessary resources, nurture community, handle problems of race effectively,…

  9. The Problems of Educational Standards in the United States and Russia.

    ERIC Educational Resources Information Center

    Bespal'ko, V. P.

    1996-01-01

    Compares and contrasts the need for educational standards in the United States and Russia. Argues that both systems burden their students with an excess of peripheral and inconsequential material in order to satisfy outdated pedagogical objectives. Praises American efforts at creating national standards but questions their applicability to Russia.…

  10. Cost minimizing of cutting process for CNC thermal and water-jet machines

    NASA Astrophysics Data System (ADS)

    Tavaeva, Anastasia; Kurennov, Dmitry

    2015-11-01

    This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy with which the objective-function parameters of the optimization problem can be calculated is investigated. The paper shows that the working tool path speed is not a constant value: it depends on several parameters described in this paper. Relations are presented for the working tool path speed as a function of the number of NC program frames, the length of straight cuts, and the part configuration. Based on these results, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the extra restrictions of thermal cutting (choice of the piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard techniques, and assesses the effectiveness of their application. The paper closes by indicating future research directions.

  11. What's the Problem? Familiarity, Working Memory, and Transfer in a Problem-Solving Task.

    PubMed

    Kole, James A; Snyder, Hannah R; Brojde, Chandra L; Friend, Angela

    2015-01-01

    The contributions of familiarity and working memory to transfer were examined in the Tower of Hanoi task. Participants completed 3 different versions of the task: a standard 3-disk version, a clothing exchange task that included familiar semantic content, and a tea ceremony task that included unfamiliar semantic content. The constraints on moves were equivalent across tasks, and each could be solved with the same sequence of movements. Working memory demands were manipulated by the provision of a (static or dynamic) visual representation of the problem. Performance was equivalent for the standard Tower of Hanoi and clothing exchange tasks but worse for the tea ceremony task, and it decreased with increasing working memory demands. Furthermore, the standard Tower of Hanoi task and clothing exchange tasks independently, additively, and equivalently transferred to subsequent tasks, whereas the tea ceremony task did not. The results suggest that both familiarity and working memory demands determine overall level of performance, whereas familiarity influences transfer.
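The standard 3-disk Tower of Hanoi underlying all three isomorphic task versions has a minimal 7-move solution that can be generated recursively. A short sketch (illustrative code, not the study's materials):

```python
def hanoi(n, src="A", aux="B", dst="C"):
    """Minimal move sequence for the n-disk Tower of Hanoi
    (2**n - 1 moves): move n-1 disks aside, move the largest,
    then move the n-1 disks on top of it."""
    if n == 0:
        return []
    return (hanoi(n - 1, src, dst, aux)
            + [(src, dst)]
            + hanoi(n - 1, aux, src, dst))

moves = hanoi(3)
print(len(moves))   # → 7
print(moves[0])     # → ('A', 'C')
```

Because the clothing-exchange and tea-ceremony tasks share this constraint structure, the same move sequence solves all three; only the surface content (and hence familiarity) differs.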

  12. The inverse problem of estimating the gravitational time dilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gusev, A. V., E-mail: avg@sai.msu.ru; Litvinov, D. A.; Rudenko, V. N.

    2016-11-15

    Precise testing of the gravitational time dilation effect suggests comparing the clocks at points with different gravitational potentials. Such a configuration arises when radio frequency standards are installed at orbital and ground stations. The ground-based standard is accessible directly, while the spaceborne one is accessible only via the electromagnetic signal exchange. Reconstructing the current frequency of the spaceborne standard is an ill-posed inverse problem whose solution depends significantly on the characteristics of the stochastic electromagnetic background. The solution for Gaussian noise is known, but the nature of the standards themselves is associated with nonstationary fluctuations of a wide class of distributions. A solution is proposed for a background of flicker fluctuations with a spectrum (1/f)^γ, where 1 < γ < 3, and stationary increments. The results include formulas for the error in reconstructing the frequency of the spaceborne standard and numerical estimates for the accuracy of measuring the relativistic redshift effect.
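Flicker noise with a (1/f)^γ power spectrum, the background class assumed in the paper, can be synthesized by shaping white Gaussian noise in the frequency domain. This sketch uses the standard spectral-synthesis method, not the paper's reconstruction procedure; the series length and γ are arbitrary choices:

```python
import numpy as np

def flicker_noise(n, gamma, rng):
    """Generate a length-n series with power spectrum ~ (1/f)^gamma
    by scaling complex white noise so amplitude ~ f^(-gamma/2),
    then transforming back to the time domain."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-gamma / 2.0)  # zero out the DC component
    white = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    return np.fft.irfft(amp * white, n=n)

rng = np.random.default_rng(1)
x = flicker_noise(4096, gamma=2.0, rng=rng)  # gamma in the paper's 1..3 range
print(x.shape)
```

The resulting series concentrates power at low frequencies, which is what makes frequency reconstruction against such a background ill-posed.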

  13. The Health Care Financing Administration's new examination documentation criteria: minimum auditing standards for the neurologic examination to be used by Medicare and other payors. Report from the American Academy of Neurology Medical Economics and Management Subcommittee.

    PubMed

    Nuwer, M R; Sigsbee, B

    1998-02-01

    Medicare recently announced the adoption of minimum documentation criteria for the neurologic examination. These criteria are added to existing standards for the history and medical decision-making. These criteria will be used in compliance audits by Medicare and other payors. Given the current federal initiative to eliminate fraud in the Medicare program, all neurologists need to comply with these standards. These criteria are for documentation only. Neurologic standards of care require a more complex and diverse examination pertinent to the problem(s) under consideration. Further guidance as to the content of a neurologic evaluation is outlined in the article "Practice guidelines: Neurologic evaluation" (Neurology 1990; 40: 871). The level of history and examination required for specific services is defined in the American Medical Association current procedural terminology book. Documentation standards for examination of children are not yet defined.

  14. Standardized Tests as Outcome Measures for Evaluating Instructional Interventions in Mathematics and Science

    NASA Astrophysics Data System (ADS)

    Sussman, Joshua Michael

    This three-paper dissertation explores problems with the use of standardized tests as outcome measures for the evaluation of instructional interventions in mathematics and science. Investigators commonly use students' scores on standardized tests to evaluate the impact of instructional programs designed to improve student achievement. However, evidence suggests that the standardized tests may not measure, or may not measure well, the student learning caused by the interventions. This problem is a special case of a basic problem in applied measurement related to understanding whether a particular test provides accurate and useful information about the impact of an educational intervention. The three papers explore different aspects of the issue and highlight the potential benefits of (a) using particular research methods and of (b) implementing changes to educational policy that would strengthen efforts to reform instructional intervention in mathematics and science. The first paper investigates measurement problems related to the use of standardized tests in applied educational research. Analysis of the research projects funded by the Institute of Education Sciences (IES) Mathematics and Science Education Program permitted me to address three main research questions. One, how often are standardized tests used to evaluate new educational interventions? Two, do the tests appear to measure the same thing that the intervention teaches? Three, do investigators establish validity evidence for the specific uses of the test? The research documents potential problems and actual problems related to the use of standardized tests in leading applied research, and suggests changes to policy that would address measurement issues and improve the rigor of applied educational research. The second paper explores the practical consequences of misalignment between an outcome measure and an educational intervention in the context of summative evaluation.
Simulated evaluation data and a psychometric model of alignment grounded in item response modeling generate the results that address the following research question: how do differences between what a test measures and what an intervention teaches influence the results of an evaluation? The simulation derives a functional relationship between alignment, defined as the match between the test and the intervention, and treatment sensitivity, defined as the statistical power for detecting the impact of an intervention. The paper presents a new model of the effect of misalignment on the results of an evaluation and recommendations for outcome measure selection. The third paper documents the educational effectiveness of the Learning Mathematics through Representations (LMR) lesson sequence for students classified as English Learners (ELs). LMR is a research-based curricular unit designed to support upper elementary students' understandings of integers and fractions, areas considered foundational for the development of higher mathematics. The experimental evaluation contains a multilevel analysis of achievement data from two assessments: a standardized test and a researcher-developed assessment. The study coordinates the two sources of research data with a theoretical mechanism of action in order to rigorously document the effectiveness and educational equity of LMR for ELs using multiple sources of information.
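The second paper's central relationship, in which misalignment between what the test measures and what the intervention teaches dilutes treatment sensitivity, can be caricatured with a simulation where the detectable effect is scaled by an alignment fraction. This toy two-group z-test model is an assumption for illustration, not the dissertation's item-response-based psychometric model:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)

def power(alignment, effect=0.4, n=100, sims=2000, alpha=0.05):
    """Simulated power of a two-group comparison when the outcome
    measure only partially overlaps the intervention's content:
    the true effect is scaled by the alignment fraction."""
    crit = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided z test, known SD = 1
    hits = 0
    for _ in range(sims):
        ctrl = rng.normal(0.0, 1.0, n)
        trt = rng.normal(alignment * effect, 1.0, n)
        z = (trt.mean() - ctrl.mean()) / np.sqrt(2.0 / n)
        hits += abs(z) > crit
    return hits / sims

for a in (1.0, 0.5, 0.25):
    print(a, power(a))
```

Even this crude model shows power falling steeply as alignment drops, which is the practical argument for choosing outcome measures matched to the intervention.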

  15. Development and validation of a new method for the registration of overuse injuries in sports injury epidemiology: the Oslo Sports Trauma Research Centre (OSTRC) overuse injury questionnaire.

    PubMed

    Clarsen, Benjamin; Myklebust, Grethe; Bahr, Roald

    2013-05-01

    Current methods for injury registration in sports injury epidemiology studies may substantially underestimate the true burden of overuse injuries due to a reliance on time-loss injury definitions. To develop and validate a new method for the registration of overuse injuries in sports. A new method, including a new overuse injury questionnaire, was developed and validated in a 13-week prospective study of injuries among 313 athletes from five different sports: cross-country skiing, floorball, handball, road cycling and volleyball. All athletes completed a questionnaire by email each week to register problems in the knee, lower back and shoulder. Standard injury registration methods were also used to record all time-loss injuries that occurred during the study period. The new method recorded 419 overuse problems in the knee, lower back and shoulder during the 3-month study period. Of these, 142 were classified as substantial overuse problems, defined as those leading to moderate or severe reductions in sports performance or participation, or time loss. Each week, an average of 39% of athletes reported having overuse problems and 13% reported having substantial problems. In contrast, standard methods of injury registration registered only 40 overuse injuries located in the same anatomical areas, the majority of which were of minimal or mild severity. Standard injury surveillance methods capture only a small percentage of the overuse problems affecting the athletes, largely because few problems led to time loss from training or competition. The new method captured a more complete and nuanced picture of the burden of overuse injuries in this cohort.

  16. An investigation of dynamic-analysis methods for variable-geometry structures

    NASA Technical Reports Server (NTRS)

    Austin, F.

    1980-01-01

    Selected space structure configurations were reviewed in order to define dynamic analysis problems associated with variable geometry. The dynamics of a beam being constructed from a flexible base and the relocation of the completed beam by rotating the remote manipulator system about the shoulder joint were selected. Equations of motion were formulated in physical coordinates for both of these problems, and FORTRAN programs were developed to generate solutions by numerically integrating the equations. These solutions served as a standard of comparison to gauge the accuracy of approximate solution techniques that were developed and studied. Good control was achieved in both problems. Unstable control system coupling with the system flexibility did not occur. An approximate method was developed for each problem to enable the analyst to investigate variable geometry effects during a short time span using standard fixed geometry programs such as NASTRAN. The average angle and average length techniques are discussed.

  17. Impact of Early Intervention on Psychopathology, Crime, and Well-Being at Age 25

    PubMed Central

    2015-01-01

    Objective This randomized controlled trial tested the efficacy of early intervention to prevent adult psychopathology and improve well-being in early-starting conduct-problem children. Method Kindergarteners (N=9,594) in three cohorts (1991–1993) at 55 schools in four communities were screened for conduct problems, yielding 979 early starters. A total of 891 (91%) consented (51% African American, 47% European American; 69% boys). Children were randomly assigned by school cluster to a 10-year intervention or control. The intervention goal was to develop social competencies in children that would carry them throughout life, through social skills training, parent behavior-management training with home visiting, peer coaching, reading tutoring, and classroom social-emotional curricula. Manualization and supervision ensured program fidelity. Ninety-eight percent participated during grade 1, and 80% continued through grade 10. At age 25, arrest records were reviewed (N=817, 92%), and condition-blinded adults psychiatrically interviewed participants (N=702; 81% of living participants) and a peer (N=535) knowledgeable about the participant. Results Intent-to-treat logistic regression analyses indicated that 69% of participants in the control arm displayed at least one externalizing, internalizing, or substance abuse psychiatric problem (based on self- or peer interview) at age 25, in contrast with 59% of those assigned to intervention (odds ratio=0.59, CI=0.43–0.81; number needed to treat=8). This pattern also held for self-interviews, peer interviews, scores using an “and” rule for self- and peer reports, and separate tests for externalizing problems, internalizing problems, and substance abuse problems, as well as for each of three cohorts, four sites, male participants, female participants, African Americans, European Americans, moderate-risk, and high-risk subgroups. 
Intervention participants also received lower severity-weighted violent (standardized estimate=-0.37) and drug (standardized estimate=-0.43) crime conviction scores, lower risky sexual behavior scores (standardized estimate=-0.24), and higher well-being scores (standardized estimate=0.19). Conclusions This study provides evidence for the efficacy of early intervention in preventing adult psychopathology among high-risk early-starting conduct-problem children. PMID:25219348

  18. Problem Solving in Everyday Office Work--A Diary Study on Differences between Experts and Novices

    ERIC Educational Resources Information Center

    Rausch, Andreas; Schley, Thomas; Warwas, Julia

    2015-01-01

    Contemporary office work is becoming increasingly challenging as many routine tasks are automated or outsourced. The remaining problem solving activities may also offer potential for lifelong learning in the workplace. In this study, we analyzed problem solving in an office work setting using an Internet-based, semi-standardized diary to collect…

  19. Parental Divorce, Marital Conflict and Children's Behavior Problems: A Comparison of Adopted and Biological Children

    ERIC Educational Resources Information Center

    Amato, Paul R.; Cheadle, Jacob E.

    2008-01-01

    We used adopted and biological children from Waves 1 and 2 of the National Survey of Families and Households to study the links between parents' marital conflict, divorce and children's behavior problems. The standard family environment model assumes that marital conflict and divorce increase the risk of children's behavior problems. The passive…

  20. A Methodology for Validation of High Resolution Combat Models

    DTIC Science & Technology

    1988-06-01

    Contents address the teleological problem, the epistemological problem, and the uncertainty principle. "The Teleological Problem"--how a model by its nature formulates an explicit cause-and-effect relationship that excludes other... "experts" in establishing the standard for reality. Generalization from personal experience is often hampered by the parochial aspects of the

  1. The Effects of Differentiating Instruction by Learning Styles on Problem Solving in Cooperative Groups

    ERIC Educational Resources Information Center

    Westbrook, Amy F.

    2011-01-01

    It can be difficult to find adequate strategies when teaching problem solving in a standard based mathematics classroom. The purpose of this study was to improve students' problem solving skills and attitudes through differentiated instruction when working on lengthy performance tasks in cooperative groups. This action research studied for 15 days…

  2. Understanding Problem-Solving Errors by Students with Learning Disabilities in Standards-Based and Traditional Curricula

    ERIC Educational Resources Information Center

    Bouck, Emily C.; Bouck, Mary K.; Joshi, Gauri S.; Johnson, Linley

    2016-01-01

    Students with learning disabilities struggle with word problems in mathematics classes. Understanding the type of errors students make when working through such mathematical problems can further describe student performance and highlight student difficulties. Through the use of error codes, researchers analyzed the type of errors made by 14 sixth…

  3. Scale problems in reporting landscape pattern at the regional scale

    Treesearch

    R.V. O' Neill; C.T. Hunsaker; S.P. Timmins; B.L. Jackson; K.B. Jones; Kurt H. Riitters; James D. Wickham

    1996-01-01

    Remotely sensed data for Southeastern United States (Standard Federal Region 4) are used to examine the scale problems involved in reporting landscape pattern for a large, heterogeneous region. Frequency distribu-tions of landscape indices illustrate problems associated with the grain or resolution of the data. Grain should be 2 to 5 times smaller than the...

  4. Additive Relations Word Problems in the South African Curriculum and Assessment Policy Standard at Foundation Phase

    ERIC Educational Resources Information Center

    Roberts, Nicky

    2016-01-01

    Drawing on a literature review of classifications developed by each of Riley, Verschaffel and Carpenter and their respective research groups, a refined typology of additive relations word problems is proposed and then used as analytical tool to classify the additive relations word problems in South African Curriculum and Assessment Policy Standard…

  5. Progressing From Initially Ambiguous Functional Analyses: Three Case Examples

    PubMed Central

    Tiger, Jeffrey H.; Fisher, Wayne W.; Toussaint, Karen A.; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman (1982/1994). These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or otherwise unique combinations of environmental antecedents and consequences of behavior, which are unlikely to be detected using these standard assessment conditions. For these individuals, modifications to the standard test conditions or the inclusion of novel test conditions may result in clearer assessment outcomes. The current study provides three case examples of individuals whose functional analyses were initially undifferentiated; however, modifications to the standard conditions resulted in the identification of behavioral functions and the implementation of effective function-based treatments. PMID:19233611

  6. Self-aligned quadruple patterning-compliant placement

    NASA Astrophysics Data System (ADS)

    Nakajima, Fumiharu; Kodama, Chikaaki; Nakayama, Koichi; Nojima, Shigeki; Kotani, Toshiya

    2015-03-01

    Self-Aligned Quadruple Patterning (SAQP) will be one of the leading candidates for the sub-14nm node and beyond. However, compared with triple patterning, creating a feasible standard cell placement poses the following problems. (1) When coloring conflicts occur between two adjoining cells, they may not be solved easily, since an SAQP layout has stronger coloring constraints. (2) An SAQP layout cannot use stitches to solve coloring conflicts. In this paper, we present a framework for SAQP-aware standard cell placement that considers the above problems. When a standard cell is placed, the proposed method tries to solve coloring conflicts between two cells by exchanging two of the three colors. If some conflicts remain between adjoining cells, dummy space is inserted to satisfy the coloring constraints of SAQP. We show some examples to confirm the effectiveness of the proposed framework. To the best of our knowledge, this is the first framework for SAQP-aware standard cell placement.
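    The conflict-resolution idea described above (exchange two of the three colors in the incoming cell, and fall back to dummy space when no legal exchange exists) can be sketched as follows. This is a deliberate toy: "pinned" colors stand in for internal cell constraints that forbid a given exchange, and none of the data structures reflect the actual framework:

```python
def exchange(colors, x, y):
    """Globally swap colors x and y within one cell; a global swap is a
    color symmetry, so it preserves the cell's internal coloring legality."""
    return [y if c == x else x if c == y else c for c in colors]

def try_fix(prev_right, colors, pinned):
    """Recolor a cell so its left edge differs from the neighbor's right
    edge, using pairwise exchanges only among unpinned colors."""
    if prev_right is None or colors[0] != prev_right:
        return colors
    for x, y in (("A", "B"), ("A", "C"), ("B", "C")):
        if x in pinned or y in pinned:
            continue
        cand = exchange(colors, x, y)
        if cand[0] != prev_right:
            return cand
    return None  # no legal exchange: dummy space is needed instead

def place_row(cells):
    """Place cells left to right; cells are (edge colors, pinned colors)."""
    row, prev_right = [], None
    for colors, pinned in cells:
        fixed = try_fix(prev_right, colors, pinned)
        if fixed is None:
            row.append("DUMMY")   # inserted spacing breaks the adjacency
            fixed = colors
        row.append(fixed)
        prev_right = fixed[-1]
    return row
```

    For example, placing `(["B","C"], set())` after a cell ending in "B" is fixed by swapping A and B, while a fully pinned conflicting cell forces a "DUMMY" entry.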

  7. Using standardized patients versus video cases for representing clinical problems in problem-based learning.

    PubMed

    Yoon, Bo Young; Choi, Ikseon; Choi, Seokjin; Kim, Tae-Hee; Roh, Hyerin; Rhee, Byoung Doo; Lee, Jong-Tae

    2016-06-01

    The quality of problem representation is critical for developing students' problem-solving abilities in problem-based learning (PBL). This study investigates preclinical students' experience with standardized patients (SPs) as a problem representation method compared to using video cases in PBL. A cohort of 99 second-year preclinical students from Inje University College of Medicine (IUCM) responded to a Likert scale questionnaire on their learning experiences after they had experienced both video cases and SPs in PBL. The questionnaire consisted of 14 items with eight subcategories: problem identification, hypothesis generation, motivation, collaborative learning, reflective thinking, authenticity, patient-doctor communication, and attitude toward patients. The results reveal that using SPs led to the preclinical students having significantly positive experiences in boosting patient-doctor communication skills; the perceived authenticity of their clinical situations; development of proper attitudes toward patients; and motivation, reflective thinking, and collaborative learning when compared to using video cases. The SPs also provided more challenges than the video cases during problem identification and hypotheses generation. SPs are more effective than video cases in delivering higher levels of authenticity in clinical problems for PBL. The interaction with SPs engages preclinical students in deeper thinking and discussion; growth of communication skills; development of proper attitudes toward patients; and motivation. Considering the higher cost of SPs compared with video cases, SPs could be used most advantageously during the preclinical period in the IUCM curriculum.

  8. Stencils and problem partitionings: Their influence on the performance of multiple processor systems

    NASA Technical Reports Server (NTRS)

    Reed, D. A.; Adams, L. M.; Patrick, M. L.

    1986-01-01

    Given a discretization stencil, partitioning the problem domain is an important first step for the efficient solution of partial differential equations on multiple processor systems. Partitions are derived that minimize interprocessor communication when the number of processors is known a priori and each domain partition is assigned to a different processor. This partitioning technique uses the stencil structure to select appropriate partition shapes. For square problem domains, it is shown that non-standard partitions (e.g., hexagons) are frequently preferable to the standard square partitions for a variety of commonly used stencils. This investigation is concluded with a formalization of the relationship between partition shape, stencil structure, and architecture, allowing selection of optimal partitions for a variety of parallel systems.
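    The intuition behind non-square partitions can be seen from a continuous proxy: for a fixed block area (grid points per processor), the partition's perimeter approximates the data exchanged per step, and a regular hexagon has a shorter perimeter than a square of equal area. This is only a rough sketch with a hypothetical block size; actual communication counts depend on the stencil and grid:

```python
import math

def square_perimeter(area):
    return 4.0 * math.sqrt(area)

def hexagon_perimeter(area):
    # regular hexagon of side a: area = (3 * sqrt(3) / 2) * a**2
    a = math.sqrt(2.0 * area / (3.0 * math.sqrt(3.0)))
    return 6.0 * a

points_per_proc = 4096  # hypothetical partition size
sq = square_perimeter(points_per_proc)
hx = hexagon_perimeter(points_per_proc)
ratio = hx / sq         # about 0.93: roughly 7% less boundary to exchange
```

    The same perimeter-per-area reasoning is what makes hexagonal partitions attractive for stencils whose communication pattern matches the hexagon's edges.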

  9. International aspects of problems associated with the use of psychoactive drugs.

    PubMed

    Chruściel, T L

    1976-01-01

    Problems of terminology, use and consumption, advertising, effectiveness and appropriate information and education on psychoactive drugs are outlined and advantages of international collaboration in attempts to establish standards for controlled clinical trials in psychopharmacology are discussed.

  10. Using General Education Student Data to Calibrate a Mathematics Alternate Assessment Based on Modified Academic Achievement Standards

    ERIC Educational Resources Information Center

    Jung, Eunju

    2012-01-01

    The U.S. Department of Education released regulations governing the development of alternate assessments for students with persistent learning problems who are eligible for Modified Academic Achievement Standards (MAAS) in 2007. To date, state regular assessments or alternate assessments based on Alternate Academic Achievement Standards have not…

  11. Proficiency Standards and Cut-Scores for Language Proficiency Tests.

    ERIC Educational Resources Information Center

    Moy, Raymond H.

    1984-01-01

    Discusses the problems associated with "grading on a curve," the approach often used for standard setting on language proficiency tests. Proposes four main steps presented in the setting of a non-arbitrary cut-score. These steps not only establish a proficiency standard checked by external criteria, but also check to see that the test covers the…

  12. Adapting to Change: Teacher Perceptions of Implementing the Common Core State Standards

    ERIC Educational Resources Information Center

    Burks, Brooke A.; Beziat, Tara L. R.; Danley, Sheree; Davis, Kashara; Lowery, Holly; Lucas, Jessica

    2015-01-01

    The current research study looked at secondary teachers' (grades 6-12) perceptions of their preparedness to implement the Common Core State Standards as well as their feelings about the training they have or have not received related to implementing the standards. The problem: Many conflicting views exist among teachers, parents, and others…

  13. An Examination of the Statistical Problem-Solving Process as a Potential Means for Developing an Understanding of Argumentation

    ERIC Educational Resources Information Center

    Smith Baum, Brittany Deshae

    2017-01-01

    As part of the recent history of the mathematics curriculum, reasoning and argument have been emphasized throughout mathematics curriculum standards. Specifically, as part of the Common Core State Standards for Mathematics, the Standards for Mathematical Practice were presented, which included the expectation that students develop arguments and…

  14. Lapses in Education Policy Formulation Processes in Nigeria: Implications for the Standard of Education

    ERIC Educational Resources Information Center

    Oyedeji, Samson Oyelola

    2015-01-01

    Nigeria's Education Policy is quite laudable, yet her investments in education are not too rewarding considering the deteriorating educational standards. The poor performance of the education sector in Nigeria, which is evident in the falling standard and poor quality of education, has become very worrisome. What is the problem? Is the…

  15. Contracting for Computer Software in Standardized Computer Languages

    PubMed Central

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  16. Contingency management treatment in cocaine using methadone maintained patients with and without legal problems.

    PubMed

    Ginley, Meredith K; Rash, Carla J; Olmstead, Todd A; Petry, Nancy M

    2017-11-01

    Legal difficulties and cocaine use are prevalent in methadone maintenance patients, and they are related to one another, as well as to poor response to methadone treatment. Contingency management (CM) is efficacious for decreasing cocaine use, but the relation of CM treatment to criminal activities has rarely been studied. This study evaluated whether baseline legal problems are related to subsequent substance use and illegal activities for cocaine using methadone maintained patients and whether CM differentially improves outcomes depending on baseline legal problems. Using data from four randomized CM trials (N=323), we compared methadone maintained patients with legal problems at the start of study participation to those without initial legal problems. Overall, the addition of CM to standard methadone care improved substance use outcomes regardless of initial legal problems. Endorsement of legal problems within 30 days of study initiation was associated with a reduced proportion of negative samples submitted during the 12-week treatment period. A significant interaction effect of baseline legal problems and treatment condition was present for subsequent self-reports of illegal activities. Those with baseline legal problems who were assigned to CM had reduced self-reports of reengagement in illegal activity throughout a six-month follow-up compared to their counterparts randomized to standard care. Adding CM to methadone treatment improves substance use outcomes and reduces subsequent illegal activity in cocaine-using methadone patients with legal problems. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Solving the transient water age distribution problem in environmental flow systems

    NASA Astrophysics Data System (ADS)

    Cornaton, F. J.

    2011-12-01

    The temporal evolution of groundwater age and its frequency distributions can display important changes as flow regimes vary due to the natural change in climate and hydrologic conditions and/or to human induced pressures on the resource to satisfy the water demand. Groundwater age being nowadays frequently used to investigate reservoir properties and recharge conditions, special attention needs to be put on the way this property is characterized, be it using isotopic methods, multiple tracer techniques, or mathematical modelling. Steady-state age frequency distributions can be modelled using standard numerical techniques, since the general balance equation describing age transport under steady-state flow conditions is exactly equivalent to a standard advection-dispersion equation. The time-dependent problem is however described by an extended transport operator that incorporates an additional coordinate for water age. The consequence is that numerical solutions can hardly be achieved, especially for real 3-D applications over large time periods of interest. The absence of any robust method has thus left the quantitative hydrogeology community dodging the issue of transience. Novel algorithms for solving the age distribution problem under time-varying flow regimes are presented and, for some specific configurations, extended to the problem of generalized component exposure time. The solution strategy is based on the combination of the Laplace Transform technique applied to the age (or exposure time) coordinate with standard time-marching schemes. The method is well-suited for groundwater problems with possible density-dependency of fluid flow (e.g. coupled flow and heat/salt concentration problems), but is also significant for the homogeneous flow (compressible) problem. 
The approach is validated using 1-D analytical solutions and exercised on some demonstration problems that are relevant to topical issues in groundwater age, including analysis of transfer times in the vadose zone, aquifer-aquitard interactions and the induction of transient age distributions when a well pump is started.
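    In the steady-state limit mentioned above, mean age is transported like a solute with a unit source term: each unit of travel time adds one unit of age. A minimal 1-D, advection-only sketch with first-order upwinding illustrates this (numbers are hypothetical; the paper's contribution is the much harder transient, Laplace-transformed case):

```python
# Steady 1-D mean-age equation with pure advection: v * da/dx = 1,
# a(0) = 0 at the recharge boundary. Exact solution: a(x) = x / v.
n, L, v = 100, 1.0, 0.25          # cells, domain length, pore velocity
dx = L / n
age = [0.0] * (n + 1)
for i in range(1, n + 1):
    # upwind discretization: v * (a_i - a_{i-1}) / dx = 1
    age[i] = age[i - 1] + dx / v
outlet_age = age[-1]              # matches the travel time L / v
```

    For pure advection the upwind scheme reproduces the exact linear age profile; dispersion and transience are where the real numerical difficulty lies.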

  18. Regularizing cosmological singularities by varying physical constants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dąbrowski, Mariusz P.; Marosek, Konrad, E-mail: mpdabfz@wmf.univ.szczecin.pl, E-mail: k.marosek@wmf.univ.szczecin.pl

    2013-02-01

    Varying physical constant cosmologies were claimed to solve standard cosmological problems such as the horizon, the flatness and the Λ-problem. In this paper, we suggest yet another possible application of these theories: solving the singularity problem. By specifying some examples we show that various cosmological singularities may be regularized provided the physical constants evolve in time in an appropriate way.

  19. Solving the "Rural School Problem": New State Aid, Standards, and Supervision of Local Schools, 1900-1933

    ERIC Educational Resources Information Center

    Steffes, Tracy L.

    2008-01-01

    In 1918, Minnesota county superintendent Julius Arp argued that the greatest educational problem facing the American people was the Rural School Problem, saying: "There is no defect more glaring today than the inequality that exists between the educational facilities of the urban and rural communities. Rural education in the United States has…

  20. Progress and Problems in Reforming Public Language Examinations in Europe: Cameos from the Baltic States, Greece, Hungary, Poland, Slovenia, France and Germany

    ERIC Educational Resources Information Center

    Eckes, Thomas; Ellis, Melanie; Kalnberzina, Vita; Pizorn, Karmen; Springer, Claude; Szollas, Krisztina; Tsagari, Constance

    2005-01-01

    Contributions from seven European countries pinpoint major projects, problems, and prospects of reforming public language assessment procedures. Each country has faced unique problems in the reform process, yet there have also been several common themes emerging, such as a focus on multilingualism, communicative skills, standardization, reference…

  1. Incorporation of epidemiological findings into radiation protection standards.

    PubMed

    Goldsmith, J R

    In standard setting there is a tendency to use data from experimental studies in preference to findings from epidemiological studies. Yet the epidemiological studies are usually the first and at times the only source of data on such critical effects as cancer, reproductive failure, and chronic cardiac and cardiovascular disease in exposed humans. A critique of the protection offered by current and proposed standards for ionizing and non-ionizing radiation illustrates some of the problems. Similar problems occur with water and air pollutants and with occupational exposures of many types. The following sorts of problems were noted: (a) Consideration of both thermal and non-thermal effects especially of non-ionizing radiation. (b) Interpretation of non-significant results as equivalent to no effect. (c) Accepting author's interpretation of a study, rather than examining its data independently for evidence of hazard. (d) Discounting data on unanticipated effects because of poor fit to preconceptions. (e) Dependence on threshold assumptions and demonstrations of dose-response relationships. (f) Choice of insensitive epidemiological indicators and procedures. (g) Consideration of each study separately, rather than giving weight to the conjunction of evidence from all available studies. These problems may be minimized by greater involvement of epidemiologists and their professional organizations in decisions about health protection.

  2. Impact of lightning strikes on hospital functions.

    PubMed

    Mortelmans, Luc J M; Van Springel, Gert L J; Van Boxstael, Sam; Herrijgers, Jan; Hoflacks, Stefaan

    2009-01-01

    Two regional hospitals were struck by lightning during a one-month period. The first hospital, which had 236 beds, suffered a direct strike to the building. This resulted in a direct spread of the power peak and temporary failure of the standard power supply. The principal problems, after restoring the standard power supply, were with the fire alarm system and peripheral network connections in the digital radiology systems. No direct impact on the hardware could be found. Restarting the servers resolved all problems. The second hospital, which had 436 beds, had a lightning strike on the premises and mainly experienced problems due to induction. All affected installations had a cable connection from outside in one way or another. The power supplies never were endangered. The main problem was the failure of different communication systems (telephone, radio, intercom, fire alarm system). Also, the electronic entrance control went out. During the days after the lightning strike, multiple software problems became apparent, as well as failures of the network connections controlling the technical support systems. There are very few ways to prepare for induction problems. The use of fiber-optic networks can limit damage. To the knowledge of the authors, these are the first cases of lightning striking hospitals in the medical literature.

  3. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE PAGES

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.; ...

    2018-03-26

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST), along with input from other members of the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.
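    The sensitivity to the choice of time integrator can be seen even on a scalar test problem: the same step size yields very different accuracy under forward Euler versus classical RK4. This is a generic sketch, not the benchmark's phase field equations:

```python
import math

def euler(f, y, t, h, steps):
    """Forward Euler: first-order accurate in the step size h."""
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y, t, h, steps):
    """Classical Runge-Kutta: fourth-order accurate in h."""
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

decay = lambda t, y: -10.0 * y       # linear decay test equation
exact = math.exp(-10.0)              # y' = -10 y, y(0) = 1, at t = 1
err_euler = abs(euler(decay, 1.0, 0.0, 0.01, 100) - exact)
err_rk4 = abs(rk4(decay, 1.0, 0.0, 0.01, 100) - exact)
```

    At the same step size the RK4 error is orders of magnitude smaller, which is why benchmark problems that pin down the integrator's influence on the computed microstructure are useful.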

  4. A problem-solving task specialized for functional neuroimaging: validation of the Scarborough adaptation of the Tower of London (S-TOL) using near-infrared spectroscopy

    PubMed Central

    Ruocco, Anthony C.; Rodrigo, Achala H.; Lam, Jaeger; Di Domenico, Stefano I.; Graves, Bryanna; Ayaz, Hasan

    2014-01-01

    Problem-solving is an executive function subserved by a network of neural structures of which the dorsolateral prefrontal cortex (DLPFC) is central. Whereas several studies have evaluated the role of the DLPFC in problem-solving, few standardized tasks have been developed specifically for use with functional neuroimaging. The current study adapted a measure with established validity for the assessment of problem-solving abilities to design a test more suitable for functional neuroimaging protocols. The Scarborough adaptation of the Tower of London (S-TOL) was administered to 38 healthy adults while hemodynamic oxygenation of the PFC was measured using 16-channel continuous-wave functional near-infrared spectroscopy (fNIRS). Compared to a baseline condition, problems that required two or three steps to achieve a goal configuration were associated with higher activation in the left DLPFC and deactivation in the medial PFC. Individuals scoring higher in trait deliberation showed consistently higher activation in the left DLPFC regardless of task difficulty, whereas individuals lower in this trait displayed less activation when solving simple problems. Based on these results, the S-TOL may serve as a standardized task to evaluate problem-solving abilities in functional neuroimaging studies. PMID:24734017

  5. Eigensensitivity analysis of rotating clamped uniform beams with the asymptotic numerical method

    NASA Astrophysics Data System (ADS)

    Bekhoucha, F.; Rechak, S.; Cadou, J. M.

    2016-12-01

    In this paper, free vibrations of a rotating clamped Euler-Bernoulli beam with uniform cross section are studied using a continuation method, namely the asymptotic numerical method. The governing equations of motion are derived using Lagrange's method. The kinetic and strain energy expressions are derived from the Rayleigh-Ritz method using a set of hybrid variables and based on a linear deflection assumption. The derived equations are transformed into two eigenvalue problems: the first is a linear gyroscopic eigenvalue problem and represents the coupled lagging and stretch motions through gyroscopic terms, while the second is a standard eigenvalue problem and corresponds to the flapping motion. These two eigenvalue problems are transformed into two functionals treated by the continuation method, the asymptotic numerical method. A new method is proposed for the solution of the linear gyroscopic system, based on an augmented system that transforms the original problem to a standard form with real symmetric matrices. Using techniques for resolving these singular problems within the continuation method, evolution curves of the natural frequencies against dimensionless angular velocity are determined. At high angular velocity, some singular points, due to the linear elastic assumption, are computed. Numerical tests of convergence are conducted and the obtained results are compared to exact values. Results obtained by continuation are compared to those computed with the discrete eigenvalue problem.
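    For the gyroscopic part, even a generic first-order (state-space) linearization reduces M q'' + G q' + K q = 0 to a standard eigenvalue problem; the abstract's contribution is an augmented system with real symmetric matrices, which this sketch does not reproduce. All matrices below are hypothetical:

```python
import numpy as np

# Hypothetical 2-DOF gyroscopic system: M q'' + G q' + K q = 0 with
# symmetric M, K and skew-symmetric G (Coriolis coupling at spin rate w).
w = 1.0
M = np.eye(2)
G = w * np.array([[0.0, -2.0], [2.0, 0.0]])
K = np.diag([4.0, 9.0]) - w**2 * np.eye(2)   # centrifugal softening

# State-space linearization z = [q, q']: z' = A z.
Z, I = np.zeros((2, 2)), np.eye(2)
A = np.block([[Z, I],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, G)]])
eigs = np.linalg.eigvals(A)
freqs = np.sort(np.abs(eigs.imag))   # natural frequencies, in +/- pairs
```

    For this undamped, stable system the eigenvalues are purely imaginary conjugate pairs, so the imaginary parts give the whirl frequencies directly; the cost of this generic linearization is that A is neither symmetric nor sparse, which is what a symmetric augmented formulation avoids.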

  6. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.

    We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST), along with input from other members of the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  7. Supported employment: cost-effectiveness across six European sites

    PubMed Central

    Knapp, Martin; Patel, Anita; Curran, Claire; Latimer, Eric; Catty, Jocelyn; Becker, Thomas; Drake, Robert E; Fioritti, Angelo; Kilian, Reinhold; Lauber, Christoph; Rössler, Wulf; Tomov, Toma; van Busschbach, Jooske; Comas-Herrera, Adelina; White, Sarah; Wiersma, Durk; Burns, Tom

    2013-01-01

    A high proportion of people with severe mental health problems are unemployed but would like to work. Individual Placement and Support (IPS) offers a promising approach to establishing people in paid employment. In a randomized controlled trial across six European countries, we investigated the economic case for IPS for people with severe mental health problems compared to standard vocational rehabilitation. Individuals (n=312) were randomized to receive either IPS or standard vocational services and followed for 18 months. Service use and outcome data were collected. Cost-effectiveness analysis was conducted with two primary outcomes: additional days worked in competitive settings and additional percentage of individuals who worked at least 1 day. Analyses distinguished country effects. A partial cost-benefit analysis was also conducted. IPS produced better outcomes than alternative vocational services at lower cost overall to the health and social care systems. This pattern also held in disaggregated analyses for five of the six European sites. The inclusion of imputed values for missing cost data supported these findings. IPS would be viewed as more cost-effective than standard vocational services. Further analysis demonstrated cost-benefit arguments for IPS. Compared to standard vocational rehabilitation services, IPS is, therefore, probably cost-saving and almost certainly more cost-effective as a way to help people with severe mental health problems into competitive employment. PMID:23471803
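
    The cost-effectiveness comparison at the heart of such a study can be summarized by an incremental cost-effectiveness ratio (ICER). A minimal sketch, with invented per-participant figures rather than the trial's data:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (here, per additional day worked in competitive settings)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented figures, not the trial's data: the new service (IPS) costs less
# and produces more days in competitive work than the standard service.
value = icer(cost_new=9500.0, cost_old=11000.0,
             effect_new=90.0, effect_old=60.0)
```

    A negative ICER combined with a positive effect difference means the new service dominates: better outcomes at lower cost, which is the qualitative pattern the study reports for IPS.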

  8. Evaluation of the Effects of Hidden Node Problems in IEEE 802.15.7 Uplink Performance

    PubMed Central

    Ley-Bosch, Carlos; Alonso-González, Itziar; Sánchez-Rodríguez, David; Ramírez-Casañas, Carlos

    2016-01-01

    In the last few years, the increasing use of LEDs in illumination systems has been accompanied by the emergence of Visible Light Communication (VLC) technologies, in which data are transmitted through the visible band of the electromagnetic spectrum. In 2011, the Institute of Electrical and Electronics Engineers (IEEE) published the IEEE 802.15.7 standard for Wireless Personal Area Networks based on VLC. Due to limitations in the coverage of the transmitted signal, wireless networks can suffer from the hidden node problem, which arises when there are nodes in the network whose transmissions are not detected by other nodes. This problem can cause significant degradation in communications made by means of the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) access control method used in IEEE 802.15.7. This research work evaluates the effects of the hidden node problem on the performance of the IEEE 802.15.7 standard. We implement a simulator and analyze VLC performance in terms of parameters such as end-to-end goodput and message loss rate. As part of this research work, a solution to the hidden node problem is proposed, based on the use of the idle patterns defined in the standard. Idle patterns are sent by the network coordinator node to inform the other nodes that there is an ongoing transmission. The validity of the proposed solution is demonstrated with simulation results. PMID:26861352
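
    The mechanism behind the reported degradation can be sketched with a toy Monte Carlo model (illustrative only; the study's own simulator models the full IEEE 802.15.7 MAC). Two senders that cannot hear each other transmit to a common receiver, so carrier sensing never defers and overlapping slots are lost:

```python
import random

def hidden_node_loss_rate(p=0.2, n_slots=200_000, seed=42):
    """Per-slot Monte Carlo sketch: two senders hidden from each other
    transmit to a common receiver with probability p in each slot.
    Because neither can sense the other's carrier, CSMA/CA cannot
    prevent overlap, and a slot in which both transmit loses both frames."""
    rng = random.Random(seed)
    sent = lost = 0
    for _ in range(n_slots):
        a = rng.random() < p
        b = rng.random() < p
        sent += a + b
        if a and b:
            lost += 2
    return lost / sent

rate = hidden_node_loss_rate()   # close to p: a frame is lost whenever the
                                 # hidden peer also transmits in that slot
```

    The per-frame loss rate approaches the peer's transmission probability, which is why hidden nodes hurt exactly when traffic grows and why a coordinator-driven busy indication (such as the standard's idle patterns) helps.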

  9. Evaluation of the Effects of Hidden Node Problems in IEEE 802.15.7 Uplink Performance.

    PubMed

    Ley-Bosch, Carlos; Alonso-González, Itziar; Sánchez-Rodríguez, David; Ramírez-Casañas, Carlos

    2016-02-06

    In the last few years, the increasing use of LEDs in illumination systems has been accompanied by the emergence of Visible Light Communication (VLC) technologies, in which data are transmitted through the visible band of the electromagnetic spectrum. In 2011, the Institute of Electrical and Electronics Engineers (IEEE) published the IEEE 802.15.7 standard for Wireless Personal Area Networks based on VLC. Due to limitations in the coverage of the transmitted signal, wireless networks can suffer from the hidden node problem, which arises when there are nodes in the network whose transmissions are not detected by other nodes. This problem can cause significant degradation in communications made by means of the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) access control method used in IEEE 802.15.7. This research work evaluates the effects of the hidden node problem on the performance of the IEEE 802.15.7 standard. We implement a simulator and analyze VLC performance in terms of parameters such as end-to-end goodput and message loss rate. As part of this research work, a solution to the hidden node problem is proposed, based on the use of the idle patterns defined in the standard. Idle patterns are sent by the network coordinator node to inform the other nodes that there is an ongoing transmission. The validity of the proposed solution is demonstrated with simulation results.

  10. Focusing on Main Street's Problems from Secluded Laboratory Retreats

    ERIC Educational Resources Information Center

    Kushner, Lawrence M.

    1973-01-01

    A report on the National Bureau of Standards is presented. It provides national measurement standards for some 40 physical quantities related through the laws of physics to the basic six - length, time, mass, temperature, electric current, and luminous intensity. (DF)

  11. Evaluation of Standardized Instruments for Use in Universal Screening of Very Early School-Age Children: Suitability, Technical Adequacy, and Usability

    ERIC Educational Resources Information Center

    Miles, Sandra; Fulbrook, Paul; Mainwaring-Mägi, Debra

    2018-01-01

    Universal screening of very early school-age children (age 4-7 years) is important for early identification of learning problems that may require enhanced learning opportunity. In this context, use of standardized instruments is critical to obtain valid, reliable, and comparable assessment outcomes. A wide variety of standardized instruments is…

  12. Cracks in Continuing Education's Mirror and a Fix To Correct Its Distorted Internal and External Image.

    ERIC Educational Resources Information Center

    Loch, John R.

    2003-01-01

    Outlines problems in continuing higher education, suggesting that it lacks (1) a standard name; (2) a unified voice on national issues; (3) a standard set of roles and functions; (4) a standard title for the chief administrative officer; (5) an accreditation body and process; and (6) resolution of the centralization/decentralization issue. (SK)

  13. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    ERIC Educational Resources Information Center

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  14. Some Practical Solutions to Standard-Setting Problems: The Georgia Teacher Certification Test Experience.

    ERIC Educational Resources Information Center

    Cramer, Stephen E.

    A standard-setting procedure was developed for the Georgia Teacher Certification Testing Program as tests in 30 teaching fields were revised. A list of important characteristics of a standard-setting procedure was derived, drawing on the work of R. A. Berk (1986). The best method was found to be a highly formalized judgmental, empirical Angoff…

  15. Basic Materials for Electromagnetic Field Standards

    DTIC Science & Technology

    2003-03-04

    Stepanov. “Problem of population electromagnetic safety”. International Medical Congress “New technologies in medicine. National and international...Rubtcova N.B. Harmonization options of EMF standards: proposals of the Russian national committee on non-ionizing radiation protection (RNCNIRP). 3rd...international and national EMF standards of different countries as well as to evaluate the population health danger of electromagnetic fields of

  16. Learning to Love the Questions: How Essential Questions Promote Creativity and Deep Learning

    ERIC Educational Resources Information Center

    Wilhelm, Jeffrey D.

    2014-01-01

    Educators know that creativity and innovation involve questioning and the capacity to frame topics as problems to be solved. They know that we are living in a time of a new generation of standards, including the Common Core State Standards (CCSS). In the U.S., compliance with these standards requires that educators encourage students to ask…

  17. Impact of Gadget Based Learning of Grammar in English at Standard II

    ERIC Educational Resources Information Center

    Singaravelu, G.

    2014-01-01

    The study examines the impact of Gadget Based Learning of English Grammar at standard II. The objectives of the study are to find out the learning problems of the students of standard II in learning English Grammar in Shri Vani Vilas Middle School and to find whether there is any significant difference in achievement mean score between pre test of…

  18. Better Serving the Children of Our Servicemen and Women: How the Common Core Improves Education for Military-Connected Children

    ERIC Educational Resources Information Center

    Center for American Progress, 2014

    2014-01-01

    States across the country have always established their own academic standards, curricula, and achievement goals. What students are expected to know and be able to do often differs from state to state. Additionally, states with low standards may leave students unprepared for higher standards in other states. This inconsistency creates problems for…

  19. A Phenomenological Study on the Lived Experience of First and Second Year Teachers in Standards-Based Grading Districts

    ERIC Educational Resources Information Center

    Battistone, William A., Jr.

    2017-01-01

    Problem: There is an existing cycle of questionable grading practices at the K-12 level. As a result, districts continue to search for innovative methods of evaluating and reporting student progress. One result of this effort has been the adoption of a standards-based grading approach. Research concerning standards-based grading implementation has…

  20. Numerical Solution of Time-Dependent Problems with a Fractional-Power Elliptic Operator

    NASA Astrophysics Data System (ADS)

    Vabishchevich, P. N.

    2018-03-01

    A time-dependent problem in a bounded domain for a fractional diffusion equation is considered. The first-order evolution equation involves a fractional-power second-order elliptic operator with Robin boundary conditions. A finite-element spatial approximation with an additive approximation of the operator of the problem is used. The time approximation is based on a vector scheme. The transition to a new time level is ensured by solving a sequence of standard elliptic boundary value problems. Numerical results obtained for a two-dimensional model problem are presented.
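
    The key idea, realizing a fractional-power operator through a sequence of standard solves, can be sketched with the Balakrishnan integral A^{-α} = (sin(πα)/π) ∫₀^∞ t^{-α}(tI + A)^{-1} dt. In the toy sketch below, a small symmetric positive definite matrix stands in for the discretized elliptic operator, and the quadrature is a generic trapezoid rule after the substitution t = e^y (not the paper's vector scheme); each quadrature node costs one ordinary shifted linear solve:

```python
import numpy as np

def frac_inv_apply(A, b, alpha, ymin=-14.0, ymax=14.0, n=561):
    """Approximate A^{-alpha} b by trapezoid quadrature of the Balakrishnan
    integral after the substitution t = e^y; every node is one standard
    shifted solve (t I + A) x = b."""
    ys = np.linspace(ymin, ymax, n)
    h = ys[1] - ys[0]
    I = np.eye(A.shape[0])
    acc = np.zeros_like(b, dtype=float)
    for y in ys:
        t = np.exp(y)
        acc += t ** (1.0 - alpha) * np.linalg.solve(t * I + A, b)  # dt = e^y dy
    return (np.sin(np.pi * alpha) / np.pi) * h * acc

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])            # toy SPD stand-in for the elliptic operator
b = np.array([1.0, 0.5])

w, V = np.linalg.eigh(A)               # eigen-based reference for A^{-1/2} b
exact = V @ ((w ** -0.5) * (V.T @ b))
approx = frac_inv_apply(A, b, alpha=0.5)
```

    The quadrature reproduces the eigendecomposition reference to a few parts per thousand here; in a PDE setting the same structure lets each time level reuse an ordinary elliptic solver.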

  1. [Principles and Methods for Formulating National Standards of "Regulations of Acupuncture-needle Manipulating Techniques"].

    PubMed

    Gang, Wei-juan; Wang, Xin; Wang, Fang; Dong, Guo-feng; Wu, Xiao-dong

    2015-08-01

    The national standard of "Regulations of Acupuncture-needle Manipulating Techniques" is one of the national Criteria of Acupuncturology, for which a total of 22 items have already been established. In the process of formulation, a series of common and specific problems was encountered. In the present paper, the authors expound these problems from three aspects, namely principles for formulation, methods for formulating criteria, and considerations about some problems. The formulating principles include the selection and regulation of principles for technique classification and technique-related key factors. The main methods for formulating criteria are 1) taking the literature as the theoretical foundation, 2) taking clinical practice as the supporting evidence, and 3) taking the expounded suggestions or conclusions through peer review.

  2. Finding the strong CP problem at the LHC

    NASA Astrophysics Data System (ADS)

    D'Agnolo, Raffaele Tito; Hook, Anson

    2016-11-01

    We show that a class of parity based solutions to the strong CP problem predicts new colored particles with mass at the TeV scale, due to constraints from Planck suppressed operators. The new particles are copies of the Standard Model quarks and leptons. The new quarks can be produced at the LHC and are either collider stable or decay into Standard Model quarks through a Higgs, a W or a Z boson. We discuss some simple but generic predictions of the models for the LHC and find signatures not related to the traditional solutions of the hierarchy problem. We thus provide alternative motivation for new physics searches at the weak scale. We also briefly discuss the cosmological history of these models and how to obtain successful baryogenesis.

  3. Related Rates and the Speed of Light.

    ERIC Educational Resources Information Center

    Althoen, S. C.; Weidner, J. F.

    1985-01-01

    Standard calculus textbooks often include a related rates problem involving light cast onto a straight line by a revolving light source. Mathematical aspects to these problems (both in the solution and in the method by which that solution is obtained) are examined. (JN)
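
    The standard version of the problem: a beacon a distance d from a straight wall rotates at ω rad/s, the spot position along the wall is x(t) = d·tan(ωt), so the related rate is dx/dt = ωd·sec²(ωt). A quick numerical check of that formula with made-up numbers:

```python
import math

d, omega = 2.0, 0.5                  # wall distance (m), rotation rate (rad/s); made up

def spot(t):
    return d * math.tan(omega * t)   # spot position along the wall

t0 = 0.3
analytic = omega * d / math.cos(omega * t0) ** 2       # dx/dt = omega*d*sec^2(omega*t)
h = 1e-6
numeric = (spot(t0 + h) - spot(t0 - h)) / (2.0 * h)    # central-difference check
```

    The sec² factor is the geometric heart of the problem: the spot accelerates without bound as the beam approaches parallel to the wall.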

  4. 45 CFR Appendix A to Part 1211 - Standards for Examiners

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., professional, investigative, or technical work which has demonstrated the possession of: (i) The personal... problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; Interpret and apply regulations and other complex written material; Communicate...

  5. 45 CFR Appendix A to Part 1211 - Standards for Examiners

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., professional, investigative, or technical work which has demonstrated the possession of: (i) The personal... problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; Interpret and apply regulations and other complex written material; Communicate...

  6. Quest for Quality.

    ERIC Educational Resources Information Center

    Wilson, Richard B.; Schmoker, Mike

    1992-01-01

    Unlike traditional school management, Toyota of America recognizes thinking employees and emphasizes problems and measurable approaches to improvement. Instead of meeting to discuss short-term goals, specific problems, and concrete successes, school leaders often alienate staff by leading year-end discussions of standardized test score data.…

  7. What Are the Signs of Alzheimer's Disease? | NIH MedlinePlus the Magazine

    MedlinePlus

    ... in behavior and personality Conduct tests of memory, problem solving, attention, counting, and language Carry out standard medical ... over and over having trouble paying bills or solving simple math problems getting lost losing things or putting them in ...

  8. The Four Billion Dollar Lunch

    ERIC Educational Resources Information Center

    Sautter, R. Craig

    1978-01-01

    Discusses problems with the National School Lunch Program, including the high proportion of food thrown away by students, problems with food preparation, nutritional standards, and competition from junk foods. Suggestions for nutrition education are offered and organizations and books for further reference are listed. (JMB)

  9. New Dental Accreditation Standard on Critical Thinking: A Call for Learning Models, Outcomes, Assessments.

    PubMed

    Johnsen, David C; Williams, John N; Baughman, Pauletta Gay; Roesch, Darren M; Feldman, Cecile A

    2015-10-01

    This opinion article applauds the recent introduction of a new dental accreditation standard addressing critical thinking and problem-solving, but expresses a need for additional means for dental schools to demonstrate they are meeting the new standard because articulated outcomes, learning models, and assessments of competence are still being developed. Validated, research-based learning models are needed to define reference points against which schools can design and assess the education they provide to their students. This article presents one possible learning model for this purpose and calls for national experts from within and outside dental education to develop models that will help schools define outcomes and assess performance in educating their students to become practitioners who are effective critical thinkers and problem-solvers.

  10. Digital combined instrument transformer for automated electric power supply control systems of mining companies

    NASA Astrophysics Data System (ADS)

    Topolsky, D. V.; Gonenko, T. V.; Khatsevskiy, V. F.

    2017-10-01

    The present paper discusses ways to solve the problem of enhancing operating efficiency of automated electric power supply control systems of mining companies. According to the authors, one of the ways to solve this problem is intellectualization of the electric power supply control system equipment. To enhance efficiency of electric power supply control and electricity metering, it is proposed to use specially designed digital combined instrument current and voltage transformers. This equipment conforms to IEC 61850 international standard and is adapted for integration into the digital substation structure. Tests were performed to check conformity of an experimental prototype of the digital combined instrument current and voltage transformer with IEC 61850 standard. The test results have shown that the considered equipment meets the requirements of the standard.

  11. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
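
    The size penalty of verbose XML, and how much of it compression recovers, can be illustrated in a few lines. The payload below is a made-up XML-like observation list, not real SWE/O&M markup:

```python
import gzip
import json

readings = [round(20 + i * 0.1, 1) for i in range(100)]

# Verbose XML-style serialization of the same 100 readings...
xml = "".join(
    f"<om:Observation><om:result uom='Cel'>{r}</om:result></om:Observation>"
    for r in readings
)
compact = json.dumps(readings)   # ...versus a compact alternative format

sizes = {
    "xml": len(xml.encode()),
    "xml+gzip": len(gzip.compress(xml.encode())),
    "json": len(compact.encode()),
}
```

    The highly repetitive tag structure compresses extremely well, which is why the study's comparison of uncompressed and compressed formats matters on bandwidth-limited mobile links.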

  12. On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images

    NASA Astrophysics Data System (ADS)

    Eid, Ahmed; Farag, Aly

    2005-12-01

    The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.
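
    The image-reprojection methodology scores a reconstruction by projecting estimated 3D points back into the image and measuring the pixel distance to the original detections, which reduces a 3D comparison to a 2D one. A minimal pinhole-camera sketch (the intrinsics and points are invented):

```python
import numpy as np

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])   # invented intrinsics (fx, fy, cx, cy)

def project(K, X):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    x = K @ X
    return x[:2] / x[2]

X_true = np.array([0.10, -0.05, 2.0])              # "ground truth" 3D point
X_est = X_true + np.array([0.002, 0.001, -0.004])  # reconstructed 3D point
err = np.linalg.norm(project(K, X_true) - project(K, X_est))  # pixels
```

    Averaging this error over many points and views gives a scalar score that can be compared across otherwise very different techniques such as stereo and space carving.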

  13. Standard Model—axion—seesaw—Higgs portal inflation. Five problems of particle physics and cosmology solved in one stroke

    NASA Astrophysics Data System (ADS)

    Ballesteros, Guillermo; Redondo, Javier; Ringwald, Andreas; Tamarit, Carlos

    2017-08-01

    We present a minimal extension of the Standard Model (SM) providing a consistent picture of particle physics from the electroweak scale to the Planck scale and of cosmology from inflation until today. Three right-handed neutrinos Ni, a new color triplet Q and a complex SM-singlet scalar σ, whose vacuum expectation value vσ ~ 1011 GeV breaks lepton number and a Peccei-Quinn symmetry simultaneously, are added to the SM. At low energies, the model reduces to the SM, augmented by seesaw generated neutrino masses and mixing, plus the axion. The latter solves the strong CP problem and accounts for the cold dark matter in the Universe. The inflaton is a mixture of σ and the SM Higgs, and reheating of the Universe after inflation proceeds via the Higgs portal. Baryogenesis occurs via thermal leptogenesis. Thus, five fundamental problems of particle physics and cosmology are solved at one stroke in this unified Standard Model—axion—seesaw—Higgs portal inflation (SMASH) model. It can be probed decisively by upcoming cosmic microwave background and axion dark matter experiments.

  14. Using standardized patients versus video cases for representing clinical problems in problem-based learning

    PubMed Central

    2016-01-01

    Purpose: The quality of problem representation is critical for developing students’ problem-solving abilities in problem-based learning (PBL). This study investigates preclinical students’ experience with standardized patients (SPs) as a problem representation method compared to using video cases in PBL. Methods: A cohort of 99 second-year preclinical students from Inje University College of Medicine (IUCM) responded to a Likert scale questionnaire on their learning experiences after they had experienced both video cases and SPs in PBL. The questionnaire consisted of 14 items with eight subcategories: problem identification, hypothesis generation, motivation, collaborative learning, reflective thinking, authenticity, patient-doctor communication, and attitude toward patients. Results: The results reveal that using SPs led to the preclinical students having significantly positive experiences in boosting patient-doctor communication skills; the perceived authenticity of their clinical situations; development of proper attitudes toward patients; and motivation, reflective thinking, and collaborative learning when compared to using video cases. The SPs also provided more challenges than the video cases during problem identification and hypotheses generation. Conclusion: SPs are more effective than video cases in delivering higher levels of authenticity in clinical problems for PBL. The interaction with SPs engages preclinical students in deeper thinking and discussion; growth of communication skills; development of proper attitudes toward patients; and motivation. Considering the higher cost of SPs compared with video cases, SPs could be used most advantageously during the preclinical period in the IUCM curriculum. PMID:26923094

  15. Clothes washer standards in China -- The problem of water and energy trade-offs in establishing efficiency standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biermayer, Peter J.; Lin, Jiang

    2004-05-19

    Currently the sales of clothes washers in China consist of several general varieties. Some use more energy (with or without including hot water energy use) and some use more water. Both energy and water are in short supply in China. This poses the question - how do you trade off water versus energy in establishing efficiency standards? This paper discusses how China dealt with this situation and how it established minimum efficiency standards for clothes washers.

  16. The standardization of urine particle counting in medical laboratories--a Polish experience with the EQA programme.

    PubMed

    Cwiklińska, Agnieszka; Kąkol, Judyta; Kuchta, Agnieszka; Kortas-Stempak, Barbara; Pacanis, Anastasis; Rogulski, Jerzy; Wróblewska, Małgorzata

    2012-02-01

    Given the common problems with the standardization of urine particle counting methods and the great variability in the results obtained by Polish laboratories under the international Labquality External Quality Assessment (EQA), we initiated educational recovery activities. Detailed instructions on how to perform the standardized examination were sent to EQA participants, as was a questionnaire form that enabled information to be gathered with respect to the procedures being applied. Laboratory results were grouped according to the method declared on the EQA 'Result' form or according to a manual examination procedure established on the basis of the questionnaire. The between-laboratory CVs for leukocyte and erythrocyte counts were calculated for each group and compared using the Mann-Whitney test. Significantly lower between-laboratory CVs (p = 0.03) were achieved for leukocyte counting among the laboratories that analysed control specimens in accordance with standardized procedures as compared with those which used non-standardized procedures. We also observed visibly lower variability for erythrocyte counting. Unfortunately, despite our activities, only a few of the Polish laboratories applied the standardized examination procedures, and only 29% of the results could be considered standardized (16% - manual methods, 13% - automated systems). The standardization of urine particle counting methods continues to be a significant problem in medical laboratories and requires further recovery activities, which can be conducted using the EQA scheme.
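
    The between-laboratory comparison rests on the coefficient of variation, CV = standard deviation / mean, computed per method group. A minimal sketch with invented counts (not the EQA data):

```python
import statistics

def cv_percent(values):
    """Between-laboratory coefficient of variation, CV = sd / mean, in %."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Invented leukocyte counts (cells/uL) reported for one control specimen:
standardized = [18.0, 20.0, 19.5, 21.0, 20.5]       # standardized procedure
non_standardized = [12.0, 25.0, 18.0, 30.0, 9.0]    # ad hoc procedures
```

    In the study, such per-group CVs were then compared with the Mann-Whitney test; the tighter spread of the standardized group is what drives the reported p = 0.03.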

  17. STEM Gives Meaning to Mathematics

    ERIC Educational Resources Information Center

    Hefty, Lukas J.

    2015-01-01

    The National Council of Teachers of Mathematics' (NCTM's) "Principles and Standards for School Mathematics" (2000) outlines five Process Standards that are essential for developing deep understanding of mathematics: (1) Problem Solving; (2) Reasoning and Proof; (3) Communication; (4) Connections; and (5) Representation. The Common Core…

  18. "Standard" versus "Dialect" in Bilingual Education: An Old Problem in a New Context

    ERIC Educational Resources Information Center

    Fishman, Joshua A.

    1977-01-01

    A survey discussion of the question of standard languages versus dialects in education observes practice and conditions in America and Europe with attention to the definition of dialect. Responsibilities of the bilingual education teacher are outlined. (CHK)

  19. AN ELECTRIFYING NEW SOLUTION TO AN OLD PROBLEM?

    EPA Science Inventory

    The adverse health effects of particles have been linked to many factors, including particle size. The U.S. Environmental Protection Agency (EPA) first issued National Ambient Air Quality Standards (NAAQS) for particulate matter (PM) in 1971, amended the standards in 1987 for part...

  20. OSI: Will It Ever See the Light of Day?

    ERIC Educational Resources Information Center

    Moloney, Peter

    1997-01-01

    Examines issues of viability and necessity regarding the Open System Interconnections (OSI) reference service model with a view on future developments. Discusses problems with the standards; conformance testing; OSI bureaucracy; standardized communications; security; the transport level; applications; the stakeholders (communications providers,…

  1. Education on Trial. Strategies for the Future.

    ERIC Educational Resources Information Center

    Johnston, William J., Ed.

    Problems and opportunities in educational reform at all educational levels are considered in this collection of 18 articles. Titles and authors are as follows: Introduction (William J. Johnston); "Evidence of Decline in Educational Standards" (Philip N. Marcus); "Standards--by What Criteria?" (Francis Keppel); "Educational…

  2. Identification of Gambling Problems in Primary Care: Properties of the NODS-CLiP Screening Tool.

    PubMed

    Cowlishaw, Sean; McCambridge, Jim; Kessler, David

    2018-06-25

    There are several brief screening tools for gambling that possess promising psychometric properties, but have uncertain utility in generalist healthcare environments which prioritize prevention and brief interventions. This study describes an examination of the National Opinion Research Centre Diagnostic and Statistical Manual of Mental Disorders Screen for Gambling Problems (NODS-CLiP), in comparison with the Problem Gambling Severity Index (PGSI), when used to operationalize gambling problems across a spectrum of severity. Data were obtained from 1058 primary care attendees recruited from 11 practices in England who completed various measures including the NODS-CLiP and PGSI. The performance of the former was defined by estimates of sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs), when PGSI indicators of problem gambling (5+) and any gambling problems (1+), respectively, were used as reference standards. The NODS-CLiP demonstrated perfect sensitivity for problem gambling, along with high specificity and a high NPV, but a low PPV. There was much lower sensitivity when the indicator of any gambling problems was the reference standard, with capture rates indicating only 20% of patients exhibiting low to moderate severity gambling problems (PGSI 1-4) were identified by the NODS-CLiP. The NODS-CLiP performs well when identifying severe cases of problem gambling, but lacks sensitivity for less severe problems and may be unsuitable for settings which prioritize prevention and brief interventions. There is a need for screening measures which are sensitive across the full spectrum of risk and severity, and can support initiatives for improving identification and responses to gambling problems in healthcare settings such as primary care.
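
    The reported pattern of perfect sensitivity with high specificity and NPV but a low PPV follows directly from the standard 2x2 screening definitions when the condition is rare. A sketch with illustrative counts (not the study's data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Textbook 2x2 accuracy summary for a screening instrument."""
    sens = tp / (tp + fn)          # sensitivity: proportion of cases detected
    spec = tn / (tn + fp)          # specificity: proportion of non-cases passed
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sens, spec, ppv, npv

# Illustrative counts for a rare condition (1% prevalence in 1000 patients):
# every true case is flagged, yet most positives are still false alarms.
sens, spec, ppv, npv = screening_metrics(tp=10, fp=30, fn=0, tn=960)
```

    Even with no missed cases and 97% specificity, only a quarter of positive screens are true cases here, which is the low-PPV behavior the study describes.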

  3. International aerospace standards - An overview

    NASA Astrophysics Data System (ADS)

    Mason, J. L.

    1983-10-01

    Factors to be considered in adopting and extending international standards in the U.S. aerospace industry are reviewed. Cost-related advantages and disadvantages of standardization are weighed, and further obstacles are identified in the English/metric rivalry and the pacing of metrification. The problem of standard duplication is examined, and the issue of revenues from the sale of copyrighted documents describing standards is addressed. It is recommended that international metric-system standards be introduced, with proper timing, wherever possible, and that prompt negotiations be undertaken to prevent or resolve document-sales disagreements. The continuation of English-system standards for safety-related cockpit readouts and communications systems is suggested.

  4. An Experimental Copyright Moratorium: Study of a Proposed Solution to the Copyright Photocopying Problem. Final Report to the American Society for Testing and Materials (ASTM).

    ERIC Educational Resources Information Center

    Heilprin, Laurence B.

    The Committee to Investigate Copyright Problems (CICP), a non-profit organization dedicated to resolving the conflict known as the "copyright photocopying problem" was joined by the American Society for Testing and Materials (ASTM), a large national publisher of technical and scientific standards, in a plan to simulate a long-proposed…

  5. The Language Factor in Elementary Mathematics Assessments: Computational Skills and Applied Problem Solving in a Multidimensional IRT Framework

    ERIC Educational Resources Information Center

    Hickendorff, Marian

    2013-01-01

    The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…

  6. ACCESS: Design and Sub-System Performance

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary Elizabeth; Morris, Matthew J.; McCandliss, Stephan R.; Rauscher, Bernard J.; Kimble, Randy A.; Kruk, Jeffrey W.; Pelton, Russell; Mott, D. Brent; Wen, Hiting; Foltz, Roger; et al.

    2012-01-01

    Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. ACCESS, "Absolute Color Calibration Experiment for Standard Stars", is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35-1.7 micrometer bandpass.

  7. Cladding burst behavior of Fe-based alloys under LOCA

    DOE PAGES

    Terrani, Kurt A.; Dryepondt, Sebastien N.; Pint, Bruce A.; ...

    2015-12-17

    Burst behavior of austenitic and ferritic Fe-based alloy tubes has been examined under a simulated large break loss of coolant accident. Specifically, type 304 stainless steel (304SS) and oxidation resistant FeCrAl tubes were studied alongside Zircaloy-2 and Zircaloy-4 that are considered reference fuel cladding materials. Following the burst test, characterization of the cladding materials was carried out to gain insights regarding the integral burst behavior. Given the widespread availability of a comprehensive set of thermo-mechanical data at elevated temperatures for 304SS, a modeling framework was implemented to simulate the various processes that affect burst behavior in this Fe-based alloy. The most important conclusion is that cladding ballooning due to creep is negligible for Fe-based alloys. Thus, unlike Zr-based alloys, cladding cross-sectional area remains largely unchanged up to the point of burst. Furthermore, for a given rod internal pressure, the temperature onset of burst in Fe-based alloys appears to be simply a function of the alloy's ultimate tensile strength, particularly at high rod internal pressures.

  8. SPES-2, an experimental program to support the AP600 development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarantini, M.; Medich, C.

    1995-09-01

    In support of the development of the AP600 reactor, ENEA, ENEL, ANSALDO and Westinghouse have signed a research agreement. In the framework of this agreement a complex Full Height Full Pressure (FHFP) integral system testing program has been planned on the SPES-2 facility. The main purposes of this paper are to report the status of the test program and to describe the hot pre-operational tests performed and the complete test matrix, giving all the necessary references to the work already published. Two identical small-break LOCA transients, performed with the Pressurizer to Core Make-up Tank (PRZ-CMT) balance line (Test S00203) and without the PRZ-CMT balance line (Test S00303), are then compared to show how the SPES-2 facility can contribute to confirming the new AP600 reactor design choices and can give useful indications to designers. Although the detailed analysis of test data has not been completed, some considerations on the analytical tools utilized and on the SPES-2 capability to simulate the reference plant are then drawn.

  9. Experimental and statistical study on fracture boundary of non-irradiated Zircaloy-4 cladding tube under LOCA conditions

    NASA Astrophysics Data System (ADS)

    Narukawa, Takafumi; Yamaguchi, Akira; Jang, Sunghyon; Amaya, Masaki

    2018-02-01

    For estimating the fracture probability of fuel cladding tubes under loss-of-coolant accident conditions of light-water reactors, laboratory-scale integral thermal shock tests were conducted on non-irradiated Zircaloy-4 cladding tube specimens. The obtained binary data with respect to fracture or non-fracture of the cladding tube specimens were then analyzed statistically. A method to obtain the fracture probability curve as a function of equivalent cladding reacted (ECR) was proposed using Bayesian inference for generalized linear models: probit, logit, and log-probit models. Model selection was then performed in terms of physical characteristics and two information criteria, the widely applicable information criterion (WAIC) and the widely applicable Bayesian information criterion (WBIC). As a result, it was clarified that the log-probit model was the best among the three models for estimating the fracture probability, in terms of prediction accuracy for both the next data to be obtained and the true model. Using the log-probit model, it was shown that 20% ECR corresponded to a 5% fracture probability at a 95% confidence level for the cladding tube specimens.
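
    As a sketch of the log-probit form described above, the fracture probability can be written P(fracture) = Phi(a + b*ln(ECR)), with Phi the standard normal CDF. The coefficients below are hypothetical, chosen only so that 20% ECR falls near the 5% probability level mentioned in the abstract; they are not the study's fitted values:

```python
# Log-probit dose-response sketch: P(fracture) = Phi(a + b * ln(ECR)).
# Coefficients a and b are hypothetical illustrations, not the study's fit.
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fracture_probability(ecr_percent, a=-13.33, b=3.9):
    """Log-probit model: fracture probability as a function of ECR (%)."""
    return phi(a + b * math.log(ecr_percent))

for ecr in (10, 20, 30, 40):
    print(f"ECR {ecr:2d}% -> P(fracture) = {fracture_probability(ecr):.3f}")
```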

  10. Reliability enhancement of APR + diverse protection system regarding common cause failures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oh, Y. G.; Kim, Y. M.; Yim, H. S.

    2012-07-01

    The Advanced Power Reactor Plus (APR +) nuclear power plant design has been developed on the basis of the APR1400 (Advanced Power Reactor 1400 MWe) to further enhance safety and economics. For the mitigation of Anticipated Transients Without Scram (ATWS) as well as Common Cause Failures (CCF) within the Plant Protection System (PPS) and the Emergency Safety Feature - Component Control System (ESF-CCS), several design improvement features have been implemented for the Diverse Protection System (DPS) of the APR + plant. As compared to the APR1400 DPS design, the APR + DPS has been designed to provide the Safety Injection Actuation Signal (SIAS) considering a large-break LOCA concurrent with the CCF. Additionally, several design improvement features, such as a channel structure with redundant processing modules and changes of system communication methods and auto-system test methods, are introduced to enhance the functional reliability of the DPS. Therefore, it is expected that the APR + DPS can provide enhanced safety and reliability regarding possible CCF in the safety-grade I&C systems as well as the DPS itself. (authors)

  11. Post Quench Ductility Evaluation of Zircaloy-4 and Select Iron Alloys under Design Basis and Extended LOCA Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Yong; Keiser, James R; Terrani, Kurt A

    2014-01-01

    Oxidation experiments were conducted at 1200 C in flowing steam with tubing specimens of Zircaloy-4, 317 and 347 stainless steels, and the commercial FeCrAl alloy APMT. The purpose was to determine the oxidation behavior and post-quench ductility of these alloys under postulated loss-of-coolant accident conditions. The parabolic rate constant for Zircaloy-4 tubing samples at 1200 C was determined to be k = 2.173 × 10⁻⁷ g²/cm⁴/s, in excellent agreement with the Cathcart-Pawel correlation. The APMT alloy experienced the slowest oxidation rate among all materials examined in this work. The ductility of post-quenched samples was evaluated by ring compression tests at 135 C. For Zircaloy-4, the ductile-to-brittle transition occurs at an equivalent cladding reacted (ECR) of 19.3%. SS-347 was still ductile after being oxidized for 2400 s (CP-ECR 50%), but the maximum load was reduced significantly owing to the metal layer thickness reduction. No ductility decrease was observed for the post-quenched APMT samples oxidized for up to four hours.
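
    A parabolic rate constant of this kind enters through the rate law w² = k·t, so the oxidation weight gain grows with the square root of exposure time. Below is a minimal sketch using the value quoted for Zircaloy-4; this is the bare parabolic law only, not the full temperature-dependent Cathcart-Pawel correlation:

```python
# Parabolic oxidation law: weight gain obeys w^2 = k * t, so w(t) = sqrt(k * t).
import math

K_ZRY4_1200C = 2.173e-7  # g^2/cm^4/s, the value reported in the record

def weight_gain(k, t_seconds):
    """Oxidation weight gain (g/cm^2) after t seconds under parabolic kinetics."""
    return math.sqrt(k * t_seconds)

# Weight gain after selected steam-exposure times: quadrupling t doubles w.
for t in (100, 400, 1600):
    w = weight_gain(K_ZRY4_1200C, t)
    print(f"t = {t:4d} s -> w = {w * 1000:.2f} mg/cm^2")
```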

  12. Experimental study of phase separation in dividing two phase flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian Yong; Yang Zhilin; Xu Jijun

    1996-12-31

    An experimental study of phase separation of air-water two-phase bubbly and slug flow in a horizontal T-junction is carried out. The influences of the inlet mass quality X1, mass extraction rate G3/G1, and fraction of extracted liquid QL3/QL1 on phase separation characteristics are analyzed. For the first time, the authors have found and defined a pulsating run effect in the visual experiments, which shows that under certain conditions the downstream flow of the T-junction strongly affects the phase redistribution at the junction, and they first point out that the downstream geometric condition is very important to the study of the phase separation phenomenon of two-phase flow in a T-junction. This kind of phenomenon has many applications in the energy, power, petroleum, and chemical industries, such as the loss of coolant accident (LOCA) caused by a small break in a horizontal coolant pipe in a nuclear reactor, and the flip-flop effect in natural gas transportation pipeline systems.

  13. Final Report on ITER Task Agreement 81-08

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard L. Moore

    As part of an ITER Implementing Task Agreement (ITA) between the ITER US Participant Team (PT) and the ITER International Team (IT), the INL Fusion Safety Program was tasked to provide the ITER IT with upgrades to the fusion version of the MELCOR 1.8.5 code, including a beryllium dust oxidation model. The purpose of this model is to allow the ITER IT to investigate hydrogen production from beryllium dust layers on hot surfaces inside the ITER vacuum vessel (VV) during in-vessel loss-of-cooling accidents (LOCAs). Also included in the ITER ITA was a task to construct a RELAP5/ATHENA model of the ITER divertor cooling loop to model the draining of the loop during a large ex-vessel pipe break followed by an in-vessel divertor break, and to compare the results to a similar MELCOR model developed by the ITER IT. This report, which is the final report for this agreement, documents the completion of the work scope under this ITER TA, designated as TA 81-08.

  14. Core characterization of the new CABRI Water Loop Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritter, G.; Rodiac, F.; Beretz, D.

    2011-07-01

    The CABRI experimental reactor is located at the Cadarache nuclear research center, southern France. It is operated by the Atomic Energy Commission (CEA) and devoted to IRSN (Institut de Radioprotection et de Surete Nucleaire) safety programmes. It has been successfully operated during the last 30 years, advancing knowledge of FBR and LWR fuel behaviour during Reactivity Insertion Accident (RIA) and Loss Of Coolant Accident (LOCA) transients in the frame of IPSN (Institut de Protection et de Surete Nucleaire) and now IRSN programmes devoted to reactor safety. This operation was interrupted in 2003 to allow for a whole-facility renewal programme for the needs of the CABRI International Programme (CIP) carried out by IRSN under the OECD umbrella. The principle of operation of the facility is based on the control of ³He, a major gaseous neutron absorber, in the core geometry. The purpose of this paper is to illustrate how several dosimetric devices have been set up to better characterize the core during the upcoming commissioning campaign. It presents the schemes and tools dedicated to core characterization. (authors)

  15. Design optimisation of a nanofluid injection system for LOCA events in a nuclear power plant

    NASA Astrophysics Data System (ADS)

    Călimănescu, I.; Stan, L. C.; Velcea, D. D.

    2016-08-01

    Safety in a nuclear power plant (NPP) encompasses its capacity to ensure a heat sink, that is, the ability of its systems to reject heat from the reactor to the environment. Nanofluids, having good heat-transfer properties, are recommended for such applications. The paper addresses the following scenario: given a safety injection tank (SIT) and a nanofluid injection tank filled with a 10% alumina-water nanofluid, how should the connecting point between the SIT and nanofluid tank pipes, and the pressures inside each tank, be selected to maximize the density of nanoparticles leaving the tanks toward the cold leg? In conclusion, the greatest influence on the rate of nanofluid discharge into the ECCS is the pressure inside the SIT, followed in order by the injection pipe diameter and the pressure inside the nanofluid tank. The optimum balance of these three design parameters may be reached following the procedure shown in this paper.

  16. Core cooling under accident conditions at the high-flux beam reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, P.; Cheng, L.; Fauske, H.

    The High-Flux Beam Reactor (HFBR) at Brookhaven National Laboratory (BNL) is cooled and moderated by heavy water and contains ²³⁵U in the form of narrow-channel, parallel-plate-type fuel elements. During normal operation, the flow direction is downward through the core. This flow direction is maintained at a reduced flow rate during routine shutdown and on loss of commercial power by means of redundant pumps and power supplies. However, in certain accident scenarios, e.g. loss-of-coolant accidents (LOCAs), all forced-flow cooling is lost. Although there was experimental evidence during the reactor design period (1958-1963) that the heat removal capacity in the fully developed natural circulation cooling mode was relatively high, it was not possible to make a confident prediction of the heat removal capacity during the transition from downflow to natural circulation. Accordingly, a test program was initiated using an electrically heated section to simulate the fuel channel and a cooling loop to simulate the balance of the primary cooling system.

  17. Mechanism-based modeling of solute strengthening: application to thermal creep in Zr alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tome, Carlos; Wen, Wei; Capolungo, Laurent

    2017-08-01

    This report focuses on the development of a physics-based thermal creep model aiming to predict the behavior of Zr alloys under reactor accident conditions. The current models used for this kind of simulation are mostly empirical in nature, generally based on fits to experimental steady-state creep rates under different temperature and stress conditions, which has the following limitations. First, reactor accident conditions, such as RIA and LOCA, usually take place in short times and involve only the primary, not the steady-state, creep stage. Moreover, the empirical models cannot cover the conditions from normal operation to accident environments. For example, Kombaiah and Murty [1,2] recently reported a transition between the low (n~4) and high (n~9) power law creep regimes in Zr alloys depending on the applied stress. Capturing such behavior requires an accurate description of the mechanisms involved in the process. Therefore, a mechanism-based model that accounts for the evolution of microstructure with time is more appropriate and reliable for this kind of simulation.

  18. All Prime Contract Awards by State or Country, Place and Contractor. Part 5. (Adobe, Colorado-Washington, DC)

    DTIC Science & Technology

    1989-01-01


  19. Transient analysis of ”2 inch Direct Vessel Injection line break” in SPES-2 facility by using TRACE code

    NASA Astrophysics Data System (ADS)

    D'Amico, S.; Lombardo, C.; Moscato, I.; Polidori, M.; Vella, G.

    2015-11-01

    In the past few decades a great deal of theoretical and experimental research has been carried out to understand the physical phenomena characterizing nuclear accidents. In particular, after the Three Mile Island accident, several reactors have been designed to handle LOCA events successfully. This paper presents a comparison between experimental and numerical results obtained for the “2 inch Direct Vessel Injection line break” in SPES-2. This facility is an integral test facility built in Piacenza at the SIET laboratories, simulating the primary circuit, the relevant parts of the secondary circuits, and the passive safety systems typical of the AP600 nuclear power plant. The numerical analysis presented here was performed by using the TRACE and CATHARE thermal-hydraulic codes with the purpose of evaluating their prediction capability. The main results show that the TRACE model predicts the overall behaviour of the plant during the transient well; in particular, it is able to simulate the principal thermal-hydraulic phenomena related to all passive safety systems. The performance of the presented CATHARE noding has suggested some possible improvements to the model.

  20. Multilayer (TiN, TiAlN) ceramic coatings for nuclear fuel cladding

    NASA Astrophysics Data System (ADS)

    Alat, Ece; Motta, Arthur T.; Comstock, Robert J.; Partezana, Jonna M.; Wolfe, Douglas E.

    2016-09-01

    In an attempt to develop an accident-tolerant fuel (ATF) that can delay the deleterious consequences of loss-of-coolant-accidents (LOCA), multilayer coatings were deposited onto ZIRLO® coupon substrates by cathodic arc physical vapor deposition (CA-PVD). Coatings were composed of alternating TiN (top) and Ti1-xAlxN (2-layer, 4-layer, 8-layer and 16-layer) layers. The minimum TiN top coating thickness and coating architecture were optimized for good corrosion and oxidation resistance. Corrosion tests were performed in static pure water at 360 °C and 18.7 MPa for up to 90 days. The optimized coatings showed no spallation/delamination and had a maximum of 6 mg/dm2 weight gain, which is 6 times smaller than that of a control sample of uncoated ZIRLO® which showed a weight gain of 40.2 mg/dm2. The optimized architecture features a ∼1 μm TiN top layer to prevent boehmite phase formation during corrosion and a TiN/TiAlN 8-layer architecture which provides the best corrosion performance.

  1. Problem solving and decisionmaking: An integration

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    An attempt was made to redress a critical fault of decisionmaking and problem solving research: the lack of a standard method to classify problem or decision states or conditions. A basic model was identified and expanded to indicate a possible taxonomy of conditions which may be used in reviewing previous research or for systematically pursuing new research designs. A generalization of the basic conditions was then made to indicate that the conditions are essentially the same for both concepts, problem solving and decisionmaking.

  2. Mental Health Problems in Adults with Williams Syndrome

    ERIC Educational Resources Information Center

    Stinton, Chris; Elison, Sarah; Howlin, Patricia

    2010-01-01

    Although many researchers have investigated emotional and behavioral difficulties in individuals with Williams syndrome, few have used standardized diagnostic assessments. We examined mental health problems in 92 adults with Williams syndrome using the Psychiatric Assessment Schedule for Adults with Developmental Disabilities--PAS-ADD (Moss,…

  3. College Basketball on the Line.

    ERIC Educational Resources Information Center

    Suggs, Welch

    1999-01-01

    The National Collegiate Athletic Association (NCAA) has convened a working group to address problems in recruiting, gambling, academic standards, and other corrupt practices in college basketball programs. Such problems are neither new nor unique to basketball, and changing college sports has proven to be difficult. Recommendations are anticipated…

  4. Fostering Perseverance

    ERIC Educational Resources Information Center

    Lewis, Jennifer M.; Özgün-Koca, S. Asli

    2016-01-01

    Sustaining engagement with a mathematics task is not a novel suggestion for effective mathematics teaching. "Principles and Standards for School Mathematics" (2000) specified that "students need to know that a challenging problem will take some time and that perseverance is an important aspect of the problem-solving process and of…

  5. Behaviour of 4- to 5-year-old nondisabled ELBW children: Outcomes following group-based physiotherapy intervention.

    PubMed

    Brown, L; Burns, Y R; Watter, P; Gray, P H; Gibbons, K S

    2018-03-01

    Extreme prematurity or extremely low birth weight (ELBW) can adversely affect behaviour. Nondisabled ELBW children are at risk of behavioural problems, which may become a particular concern after commencement of formal education. This study explored the frequency of behavioural and emotional problems amongst nondisabled ELBW children at 4 to 5 years of age and whether intervention had a positive influence on behaviour. The relationship between behaviour, gender, and other areas of performance at 5 years was explored. Fifty 4-year-old children (born <28 weeks gestation or birth weight <1,000 g) with minimal/mild motor impairment were randomly allocated to intervention (n = 24) or standard care (n = 26). Intervention comprised six weekly group-based physiotherapy sessions and a home programme. Standard care was best-practice advice. The Child Behavior Checklist (CBCL) for preschool children was completed at baseline and at 1-year post-baseline. Other measures at follow-up included the Movement Assessment Battery for Children Second Edition, Beery Visual-Motor Integration Test 5th Edition, and Peabody Picture Vocabulary Test 4th Edition. The whole cohort improved on CBCL total problems score between baseline (mean 50.0, SD 11.1) and 1-year follow-up (mean 45.2, SD 10.3), p = .004. There were no significant differences between groups over time on CBCL internalizing, externalizing, or total problems scores. The intervention group showed a mean difference in total problems score of -3.8 (CI [1.5, 9.1]) between times, with standard care group values being -4.4 (CI [1.6, 7.1]). Males had higher total problems scores than females (p = .026), although still performed within the "normal" range. CBCL scores did not correlate with other scores. The behaviour of nondisabled ELBW children was within the "normal" range at 4 to 5 years, and both intervention and standard care may have contributed to improved behavioural outcomes.
Behaviour was not related to performance in other developmental domains. © 2017 John Wiley & Sons Ltd.

  6. Design optimization of steel frames using an enhanced firefly algorithm

    NASA Astrophysics Data System (ADS)

    Carbas, Serdar

    2016-12-01

    Mathematical modelling of real-world-sized steel frames under the Load and Resistance Factor Design-American Institute of Steel Construction (LRFD-AISC) steel design code provisions, where the steel profiles for the members are selected from a table of steel sections, turns out to be a discrete nonlinear programming problem. Finding the optimum design of such design optimization problems using classical optimization techniques is difficult. Metaheuristic algorithms provide an alternative way of solving such problems. The firefly algorithm (FFA) belongs to the swarm intelligence group of metaheuristics. The standard FFA has the drawback of being caught up in local optima in large-sized steel frame design problems. This study attempts to enhance the performance of the FFA by suggesting two new expressions for the attractiveness and randomness parameters of the algorithm. Two real-world-sized design examples are designed by the enhanced FFA and its performance is compared with standard FFA as well as with particle swarm and cuckoo search algorithms.
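
    For context, the standard FFA move that the study enhances combines an attractiveness term β0·exp(-γr²), pulling each firefly toward brighter (better) ones, with a randomness term α. Below is a minimal sketch on a toy continuous problem (the sphere function) with hypothetical parameter values; the paper's discrete frame-design formulation and its enhanced expressions are not reproduced here:

```python
# Minimal sketch of the standard firefly algorithm (FFA) on a toy problem.
# Parameter values (beta0, gamma, alpha) are illustrative, not the paper's.
import math
import random

def sphere(x):
    """Toy objective: minimize sum of squares (optimum 0 at the origin)."""
    return sum(v * v for v in x)

def firefly_step(pop, beta0=1.0, gamma=1.0, alpha=0.2):
    """One generation: each firefly moves toward every brighter one."""
    fitness = [sphere(x) for x in pop]
    new_pop = [list(x) for x in pop]
    for i, xi in enumerate(pop):
        for j, xj in enumerate(pop):
            if fitness[j] < fitness[i]:  # firefly j is "brighter" (lower cost)
                r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
                beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                for d in range(len(xi)):
                    new_pop[i][d] += beta * (xj[d] - xi[d]) + alpha * (random.random() - 0.5)
    return new_pop

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
initial_best = min(sphere(x) for x in pop)
for _ in range(50):
    pop = firefly_step(pop)
best = min(sphere(x) for x in pop)
print(f"best value: {initial_best:.3f} -> {best:.3f}")
```

Note that the current best firefly has no brighter neighbour and therefore does not move, so the best objective value never worsens between generations in this basic form.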

  7. Randomized trial of intensive motivational interviewing for methamphetamine dependence.

    PubMed

    Polcin, Douglas L; Bond, Jason; Korcha, Rachael; Nayak, Madhabika B; Galloway, Gantt P; Evans, Kristy

    2014-01-01

    An intensive, 9-session motivational interviewing (IMI) intervention was assessed using a randomized clinical trial of 217 methamphetamine (MA) dependent individuals. IMI was compared with a single standard session of MI (SMI) combined with eight nutrition education sessions. Interventions were delivered weekly over 2 months. All study participants also received standard outpatient group treatment three times per week. Both study groups showed significant decreases in MA use and Addiction Severity Index drug scores, but there were no significant differences between the two groups. However, reductions in Addiction Severity Index psychiatric severity scores and days of psychiatric problems during the past 30 days were found for clients in the IMI group but not the SMI group. SMI may be equally beneficial to IMI in reducing MA use and problem severity, but IMI may help alleviate co-occurring psychiatric problems that are unaffected by shorter MI interventions. Additional studies are needed to assess the problems, populations, and contexts for which IMI is effective.

  8. Reduced Risk-Taking After Prior Losses in Pathological Gamblers Under Treatment and Healthy Control Group but not in Problem Gamblers.

    PubMed

    Bonini, Nicolao; Grecucci, Alessandro; Nicolè, Manuel; Savadori, Lucia

    2018-06-01

    A group of pathological gamblers and a group of problem gamblers (i.e., gamblers at risk of becoming pathological) were compared to healthy controls on their risk-taking propensity after prior losses. Each participant played both the Balloon Analogue Risk Taking task (BART) and a modified version of the same task, where individuals face five repeated predetermined early losses at the onset of the game. No significant difference in risk-taking was found between groups on the standard BART task, while significant differences emerged when comparing behaviors in the two tasks: both pathological gamblers and controls reduced their risk-taking tendency after prior losses in the modified BART compared to the standard BART, whereas problem gamblers showed no reduction in risk-taking after prior losses. We interpret these results as a sign of a reduced sensitivity to negative feedback in problem gamblers which might contribute to explain their loss-chasing tendency.

  9. The problem of natural funnel asymmetries: a simulation analysis of meta-analysis in macroeconomics.

    PubMed

    Callot, Laurent; Paldam, Martin

    2011-06-01

    Effect sizes in macroeconomics are estimated by regressions on data published by statistical agencies. Funnel plots are a representation of the distribution of the resulting regression coefficients. They are normally much wider than predicted by the t-ratios of the coefficients and often asymmetric. The standard method of meta-analysts in economics assumes that the asymmetries are due to publication bias causing censoring and adjusts the average accordingly. The paper shows that some funnel asymmetries may be 'natural', occurring without censoring. We investigate such asymmetries by simulating funnels by pairs of data generating processes (DGPs) and estimating models (EMs), in which the EM has the problem that it disregards a property of the DGP. The problems are data dependency, structural breaks, non-normal residuals, non-linearity, and omitted variables. We show that some of these problems generate funnel asymmetries. When they do, the standard method often fails. Copyright © 2011 John Wiley & Sons, Ltd.
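
    The paper's DGP/EM simulation design can be miniaturized as follows: a data generating process with a structural break is estimated by a model that ignores the break, and pooling many such estimates yields a funnel of coefficients wider than the within-sample noise alone would predict, with no censoring applied. All numbers below are illustrative, not the paper's configurations:

```python
# Toy DGP/EM pair: the DGP has a structural break in the slope; the estimating
# model (simple OLS) ignores it. Pooling many estimates over varying sample
# sizes mimics collecting a funnel of published coefficients.
import random
import statistics

random.seed(1)

def estimate_once(n, beta_early=0.2, beta_late=0.8):
    """OLS slope on data whose true slope shifts halfway through the sample."""
    xs, ys = [], []
    for t in range(n):
        beta = beta_early if t < n // 2 else beta_late
        x = random.gauss(0, 1)
        xs.append(x)
        ys.append(beta * x + random.gauss(0, 1))
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

estimates = [estimate_once(n=random.choice([20, 50, 200])) for _ in range(500)]
print(f"mean = {statistics.fmean(estimates):.2f}, "
      f"stdev = {statistics.stdev(estimates):.2f}")
```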

  10. The efficacy of problem-solving treatments after deliberate self-harm: meta-analysis of randomized controlled trials with respect to depression, hopelessness and improvement in problems.

    PubMed

    Townsend, E; Hawton, K; Altman, D G; Arensman, E; Gunnell, D; Hazell, P; House, A; Van Heeringen, K

    2001-08-01

    Brief problem-solving therapy is regarded as a pragmatic treatment for deliberate self-harm (DSH) patients. A recent meta-analysis of randomized controlled trials (RCTs) evaluating this approach indicated a trend towards reduced repetition of DSH but the pooled odds ratio was not statistically significant. We have now examined other important outcomes using this procedure, namely depression, hopelessness and improvement in problems. Six trials in which problem-solving therapy was compared with control treatment were identified from an extensive literature review of RCTs of treatments for DSH patients. Data concerning depression, hopelessness and improvement in problems were extracted. Where relevant statistical data (e.g. standard deviations) were missing these were imputed using various statistical methods. Results were pooled using meta-analytical procedures. At follow-up, patients who were offered problem-solving therapy had significantly greater improvement in scores for depression (standardized mean difference = -0.36; 95% CI -0.61 to -0.11) and hopelessness (weighted mean difference = -3.2; 95% CI -4.0 to -2.41), and significantly more reported improvement in their problems (odds ratio = 2.31; 95% CI 1.29 to 4.13), than patients who were in the control treatment groups. Problem-solving therapy for DSH patients appears to produce better results than control treatment with regard to improvement in depression, hopelessness and problems. It is desirable that this finding is confirmed in a large trial, which will also allow adequate testing of the impact of this treatment on repetition of DSH.
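
    Pooled estimates of the kind quoted above are typically produced by inverse-variance (fixed-effect) pooling. A hedged sketch with invented (smd, variance) pairs, not the trial data from this meta-analysis:

```python
# Hedged sketch of inverse-variance (fixed-effect) pooling of standardized
# mean differences. The (smd, variance) pairs are invented for illustration.
import math

def pool_fixed_effect(effects):
    """effects: list of (estimate, variance) pairs. Returns (pooled, 95% CI)."""
    weights = [1.0 / v for _, v in effects]           # inverse-variance weights
    pooled = sum(w * e for (e, _), w in zip(effects, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

smds = [(-0.45, 0.04), (-0.30, 0.02), (-0.35, 0.05)]  # hypothetical trials
pooled, (lo, hi) = pool_fixed_effect(smds)
print(f"pooled SMD = {pooled:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```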

  11. Plantar pressure cartography reconstruction from 3 sensors.

    PubMed

    Abou Ghaida, Hussein; Mottet, Serge; Goujon, Jean-Marc

    2014-01-01

    Foot problem diagnosis is often made by using pressure mapping systems, which are unfortunately confined to laboratories. In the context of e-health and telemedicine for home monitoring of patients having foot problems, our focus is to present an acceptable system for daily use. We developed an ambulatory instrumented insole using 3 pressure sensors to visualize plantar pressure cartographies. We show that a standard insole with fixed sensor positions could be used for different foot sizes. The results show an average error measured at each pixel of 0.01 daN, with a standard deviation of 0.005 daN.

  12. Workmanship Challenges for NASA Mission Hardware

    NASA Technical Reports Server (NTRS)

    Plante, Jeannette

    2010-01-01

    This slide presentation reviews several challenges in workmanship for NASA mission hardware development. Several standards for NASA workmanship exist that are required for all programs, projects, contracts and subcontracts. These standards contain our best-known methods for avoiding past assembly problems and defects. These best practices may not be available if suppliers are used who are not compliant with them. Compliance includes having certified operators and inspectors. Some examples of problems that have occurred from the lack of requirements flow-down to contractors are reviewed. The presentation contains a detailed example of the challenge regarding The Packaging "Design" Dilemma.

  13. Screening for problem gambling within mental health services: a comparison of the classification accuracy of brief instruments.

    PubMed

    Dowling, Nicki A; Merkouris, Stephanie S; Manning, Victorian; Volberg, Rachel; Lee, Stuart J; Rodda, Simone N; Lubman, Dan I

    2018-06-01

    Despite the over-representation of people with gambling problems in mental health populations, there is limited information available to guide the selection of brief screening instruments within mental health services. The primary aim was to compare the classification accuracy of nine brief problem gambling screening instruments (two to five items) with a reference standard among patients accessing mental health services. The classification accuracy of nine brief screening instruments was compared with multiple cut-off scores on a reference standard. Eight mental health services in Victoria, Australia. A total of 837 patients were recruited consecutively between June 2015 and January 2016. The brief screening instruments were the Lie/Bet Questionnaire, Brief Problem Gambling Screen (BPGS) (two- to five-item versions), NODS-CLiP, NODS-CLiP2, Brief Biosocial Gambling Screen (BBGS) and NODS-PERC. The Problem Gambling Severity Index (PGSI) was the reference standard. The five-item BPGS was the only instrument displaying satisfactory classification accuracy in detecting any level of gambling problem (low-risk, moderate-risk or problem gambling) (sensitivity = 0.803, specificity = 0.982, diagnostic efficiency = 0.943). Several shorter instruments adequately detected both problem and moderate-risk, but not low-risk, gambling: two three-item instruments (NODS-CLiP, three-item BPGS) and two four-item instruments (NODS-PERC, four-item BPGS) (sensitivity = 0.854-0.966, specificity = 0.901-0.954, diagnostic efficiency = 0.908-0.941). The four-item instruments, however, did not provide any considerable advantage over the three-item instruments. Similarly, the very brief (two-item) instruments (Lie/Bet and two-item BPGS) adequately detected problem gambling (sensitivity = 0.811-0.868, specificity = 0.938-0.943, diagnostic efficiency = 0.933-0.934), but not moderate-risk or low-risk gambling. 
The optimal brief screening instrument for mental health services wanting to screen for any level of gambling problem is the five-item Brief Problem Gambling Screen (BPGS). Services wanting to employ a shorter instrument or to screen only for more severe gambling problems (moderate-risk/problem gambling) can employ the NODS-CLiP or the three-item BPGS. Services that are only able to accommodate a very brief instrument can employ the Lie/Bet Questionnaire or the two-item BPGS. © 2017 Society for the Study of Addiction.
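The accuracy figures reported above are standard confusion-matrix quantities computed against the reference standard. A minimal sketch, with illustrative counts rather than the study's data:

```python
def accuracy_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and diagnostic efficiency of a
    screening instrument against a reference standard."""
    sensitivity = tp / (tp + fn)    # proportion of true cases detected
    specificity = tn / (tn + fp)    # proportion of non-cases screened out
    efficiency = (tp + tn) / (tp + fp + tn + fn)  # overall agreement
    return sensitivity, specificity, efficiency

# Hypothetical counts for one instrument, not the study data:
sens, spec, eff = accuracy_metrics(tp=40, fp=10, tn=90, fn=8)
```

High specificity matters particularly in screening settings like this one, where most patients do not have gambling problems and false positives consume scarce clinical follow-up time.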

  14. Implementation of a Standards-Based Grading Model: A Study of Parent and Teacher Perceptions of Success

    ERIC Educational Resources Information Center

    Wheeler, Amber D.

    2017-01-01

    The purpose of this study is to explore the perceptions of parents and teachers regarding the success of a standards-based grading initiative in meeting its goals. Furthermore, findings from this study will be used to inform decisions made in future grade level implementations. Standards-based grading meets all criteria for a problem of practice.…

  15. Neglecting the Importance of the Decision Making and Care Regimes of Personal Support Workers: A Critique of Standardization of Care Planning through the RAI/MDS

    ERIC Educational Resources Information Center

    Kontos, Pia C.; Miller, Karen-Lee; Mitchell, Gail J.

    2010-01-01

    Purpose: The Resident Assessment Instrument-Minimum Data Set (RAI/MDS) is an interdisciplinary standardized process that informs care plan development in nursing homes. This standardized process has failed to consistently result in individualized care planning, which may suggest problems with content and planning integrity. We examined the…

  16. A Descriptive Case Study of Writing Standards-Based Individualized Education Plan Goals via Problem-Based Learning in a Virtual World

    ERIC Educational Resources Information Center

    Blair, Peter J.

    2017-01-01

    The goal of this study was to examine the professional development experiences of two contrastive participants while they were creating standards-based individualized education plan (IEP) goals using a virtual world called TeacherSim. Two specific focuses of the study were on how special educators engaged with the task of creating standards-based…

  17. Materials and Process Specifications and Standards

    DTIC Science & Technology

    1977-11-01

    Integrity Requirements; Fracture Control ... 5.9.3 Some Special Problems in Electronic Materials Specifications; 5.9.3.1 Thermal Stresses ... fatigue and fracture and by defining human engineering concepts. Conform to OSHA regulations such as toxicity, noise levels, etc. Develop ... Standardization Society of the Valves and Fittings Industry ... 4.6.2.4 Other Organizations: There are a number of standards-making organizations that cannot

  18. "It's Not My Problem": The Growth of Non-Standard Work and Its Impact on Vocational Education and Training in Australia.

    ERIC Educational Resources Information Center

    Hall, Richard; Bretherton, Tanya; Buchanan, John

    A study investigated implications of the increase in non-standard forms of employment (casual work, working through labor-hire companies, and work that is outsourced) for vocational education and training (VET) in Australia. Data sources were published statistics on growth of non-standard work; research on reasons for the growth and the business…

  19. Can Performance-Related Learning Outcomes Have Standards?

    ERIC Educational Resources Information Center

    Brockmann, Michaela; Clarke, Linda; Winch, Christopher

    2008-01-01

    Purpose: This paper aims to explain the distinction between educational standards and learning outcomes and to indicate the problems that potentially arise when a learning outcomes approach is applied to a qualification meta-framework like the European Qualification Framework, or indeed to national qualification frameworks.…

  20. Testing and the Testing Industry: A Third View.

    ERIC Educational Resources Information Center

    Williams, John D.

    Different viewpoints regarding educational testing are described. While some people advocate continuing reliance upon standardized tests, others favor the discontinuation of such achievement and intelligence tests. The author recommends a moderate view somewhere between these two extremes. Problems associated with standardized testing in the…

  1. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  2. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  3. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  4. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  5. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  6. Generating Linear Equations Based on Quantitative Reasoning

    ERIC Educational Resources Information Center

    Lee, Mi Yeon

    2017-01-01

    The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…

  7. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programing of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling, to achieve sound results, 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty.Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10- Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0 The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  8. Detecting effects of the indicated prevention Programme for Externalizing Problem behaviour (PEP) on child symptoms, parenting, and parental quality of life in a randomized controlled trial.

    PubMed

    Hanisch, Charlotte; Freund-Braier, Inez; Hautmann, Christopher; Jänen, Nicola; Plück, Julia; Brix, Gabriele; Eichelberger, Ilka; Döpfner, Manfred

    2010-01-01

    Behavioural parent training is effective in improving child disruptive behavioural problems in preschool children by increasing parenting competence. The indicated Prevention Programme for Externalizing Problem behaviour (PEP) is a group training programme for parents and kindergarten teachers of children aged 3-6 years with externalizing behavioural problems. To evaluate the effects of PEP on child problem behaviour, parenting practices, parent-child interactions, and parental quality of life. Parents and kindergarten teachers of 155 children were randomly assigned to an intervention group (n = 91) and a nontreated control group (n = 64). They rated children's problem behaviour before and after PEP training; parents also reported on their parenting practices and quality of life. Standardized play situations were video-taped and rated for parent-child interactions, e.g. parental warmth. In the intention to treat analysis, mothers of the intervention group described less disruptive child behaviour and better parenting strategies, and showed more parental warmth during a standardized parent-child interaction. Dosage analyses confirmed these results for parents who attended at least five training sessions. Children were also rated to show less behaviour problems by their kindergarten teachers. Training effects were especially positive for parents who attended at least half of the training sessions. CBCL: Child Behaviour Checklist; CII: Coder Impressions Inventory; DASS: Depression anxiety Stress Scale; HSQ: Home-situation Questionnaire; LSS: Life Satisfaction Scale; OBDT: observed behaviour during the test; PCL: Problem Checklist; PEP: prevention programme for externalizing problem behaviour; PPC: Parent Problem Checklist; PPS: Parent Practices Scale; PS: Parenting Scale; PSBC: Problem Setting and Behaviour checklist; QJPS: Questionnaire on Judging Parental Strains; SEFS: Self-Efficacy Scale; SSC: Social Support Scale; TRF: Caregiver-Teacher Report Form.

  9. Constructed-Response Problems

    ERIC Educational Resources Information Center

    Swinford, Ashleigh

    2016-01-01

    With rigor outlined in state and Common Core standards and the addition of constructed-response test items to most state tests, math constructed-response questions have become increasingly popular in today's classroom. Although constructed-response problems can present a challenge for students, they do offer a glimpse of students' learning through…

  10. Anticipation Guides: Reading for Mathematics Understanding

    ERIC Educational Resources Information Center

    Adams, Anne E.; Pegg, Jerine; Case, Melissa

    2015-01-01

    With the acceptance by many states of the Common Core State Standards for Mathematics, new emphasis is being placed on students' ability to engage in mathematical practices such as understanding problems (including word problems), reading and critiquing arguments, and making explicit use of definitions (CCSSI 2010). Engaging students in…

  11. The Real World of the Beginning Teacher.

    ERIC Educational Resources Information Center

    National Education Association, Washington, DC. National Commission on Teacher Education and Professional Standards.

    Problems and goals of beginning teachers are the subject of these speeches presented by both experienced and beginning teachers at the 1965 national conference of the National Commission on Teacher Education and Professional Standards. The problems include the differences between teacher expectations and encounters, unrealistic teaching and…

  12. 45 CFR Appendix A to Part 1210 - Standard for Examiners

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... of: (i) The personal attributes essential to the effective performance of the duties of an Examiner... causes of complex problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; —Interpret and apply regulations and other complex written material...

  13. 45 CFR Appendix A to Part 1210 - Standard for Examiners

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of: (i) The personal attributes essential to the effective performance of the duties of an Examiner... causes of complex problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; —Interpret and apply regulations and other complex written material...

  14. The Cake Contest

    ERIC Educational Resources Information Center

    Haberern, Colleen

    2016-01-01

    With the adoption of the Common Core State Standards for Mathematics (CCSSM), many teachers are changing their classroom structure from teacher-directed to student-centered. When the author began designing and using problem-based tasks she saw a drastic improvement in student engagement and problem-solving skills. The author describes the Cake…

  15. Mastery Multiplied

    ERIC Educational Resources Information Center

    Shumway, Jessica F.; Kyriopoulos, Joan

    2014-01-01

    Being able to find the correct answer to a math problem does not always indicate solid mathematics mastery. A student who knows how to apply the basic algorithms can correctly solve problems without understanding the relationships between numbers or why the algorithms work. The Common Core standards require that students actually understand…

  16. [The status and current problems of the radiation protection support for Naval personnel].

    PubMed

    Sharaevskiĭ, G Iu; Murin, M B; Belikov, A D; Petrov, O I

    1999-07-01

    The article focuses on radiation protection problems for Navy personnel dealing with nuclear and radioactive waste, since existing standards have become obsolete in light of new technologies in materials development that endanger the environment and people's health.

  17. The Behavioural Profile of Psychiatric Disorders in Persons with Intellectual Disability

    ERIC Educational Resources Information Center

    Kishore, M. T.; Nizamie, S. H.; Nizamie, A.

    2005-01-01

    Background: Problems associated with psychiatric diagnoses could be minimized by identifying behavioural clusters of specific psychiatric disorders. Methods: Sixty persons with intellectual disability (ID) and behavioural problems, aged 12-55 years, were assessed with standardized Indian tools for intelligence and adaptive behaviour. Clinical…

  18. Unravelling the confusion caused by GASB, FASB accounting rules.

    PubMed

    Duis, T E

    1994-11-01

    Separate GASB and FASB accounting and financial reporting rules for governmental healthcare providers are producing confusion. Among other problems, they reduce the usefulness of aggregated data about the healthcare industry. This article addresses the inconsistencies of the various reporting standards and identified problems they can cause.

  19. Using information technology for an improved pharmaceutical care delivery in developing countries. Study case: Benin.

    PubMed

    Edoh, Thierry Oscar; Teege, Gunnar

    2011-10-01

    One of the problems in health care in developing countries is the poor accessibility of medicines in pharmacies for patients. Since this is mainly due to a lack of organization and information, it should be possible to improve the situation by introducing information and communication technology. However, for several reasons, standard solutions are not applicable here. In this paper, we describe a case study in Benin, a West African developing country. We identify the problem and the existing obstacles to applying standard e-commerce solutions. We develop an adapted system approach and describe a practical test which has shown that the approach has the potential of actually improving pharmaceutical care delivery. Finally, we consider the security aspects of the system and propose an organizational solution for some specific security problems.

  20. Toward the automated analysis of plasma physics problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynick, H.E.

    1989-04-01

    A program (CALC) is described, which carries out nontrivial plasma physics calculations in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base, and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decompositions of the full problem into subproblems, and other simplifications in form, which render the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.

  1. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    PubMed

    Ng, Lauren C; Kanyanganzi, Frederick; Munyanah, Morris; Mushashi, Christine; Betancourt, Theresa S

    2014-01-01

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.
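The ROC discrimination analysis mentioned above reduces to estimating the probability that a randomly chosen case scores higher than a randomly chosen non-case (the Mann-Whitney formulation of the AUC). A minimal sketch with made-up scores:

```python
def roc_auc(case_scores, noncase_scores):
    """AUC as the probability that a case outscores a non-case;
    ties count as half a win."""
    wins = 0.0
    for c in case_scores:
        for n in noncase_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5
    return wins / (len(case_scores) * len(noncase_scores))

# Hypothetical scale scores for diagnosed cases vs. non-cases:
auc = roc_auc([7, 9, 6, 8], [3, 5, 6, 2])
```

An AUC of 0.5 means the scale discriminates no better than chance; values near 1.0 indicate near-perfect separation of cases from non-cases.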

  2. The effectiveness of the Stop Now and Plan (SNAP) program for boys at risk for violence and delinquency.

    PubMed

    Burke, Jeffrey D; Loeber, Rolf

    2015-02-01

    Among the available treatments for disruptive behavior problems, a need remains for additional service options to reduce antisocial behavior and prevent further development along delinquent and violent pathways. The Stop Now and Plan (SNAP) Program is an intervention for antisocial behavior among boys between 6 and 11 years of age. This paper describes a randomized controlled treatment effectiveness study of SNAP versus standard behavioral health services. The treatment program was delivered to youth with aggressive, rule-breaking, or antisocial behavior in excess of clinical criterion levels. Outcomes were measured at 3, 9, and 15 months from baseline. Youth in the SNAP condition showed significantly greater reductions in aggression, conduct problems, and overall externalizing behavior, as well as in counts of oppositional defiant disorder and attention deficit hyperactivity disorder symptoms. Additional benefits for SNAP were observed on measures of depression and anxiety. Further analyses indicated that the SNAP program was more effective among those with a higher severity of initial behavioral problems. At 1-year follow-up, treatment benefits for SNAP were maintained on some outcome measures (aggression, ADHD and ODD, depression and anxiety) but not others. Although overall juvenile justice system contact was not significantly different, youth in SNAP had significantly fewer charges against them relative to those receiving standard services. The SNAP Program, when contrasted with standard services alone, was associated with greater, clinically meaningful reductions in targeted behaviors. It may be particularly effective for youth with more severe behavioral problems and may result in improvements in internalizing problems as well.

  3. Evaluation of fluoride levels in bottled water and their contribution to health and teeth problems in the United Arab Emirates.

    PubMed

    Abouleish, Mohamed Yehia Z

    2016-10-01

    Fluoride is needed for good health, yet if ingested at higher levels it may lead to health problems. Fluoride can be obtained from different sources, with drinking water being a major contributor. In the United Arab Emirates (UAE), bottled water is the major source of drinking water. The aim of this research is to measure fluoride levels in different bottled water brands sold in the UAE, to determine whether fluoride contributes to better health or to health problems. The results were compared to international and local standards. Fluoride was present in seven of 23 brands. One brand exhibited high fluoride levels that exceeded all standards, suggesting it may pose health problems. Other brands were either below or above standards, suggesting a contribution either to better health or to health problems, depending on the amount ingested. A risk assessment suggested a potential for non-cancer effects from some brands. The results were compared to fluoride levels in bottled water sold in the UAE and neighboring countries (e.g. Saudi Arabia, Qatar, Kuwait, and Bahrain) over 24 years, to reflect on changes in fluoride levels in bottled water in this region. The research points to the need for stricter regulations requiring careful fluoride monitoring, and for new regulations requiring that fluoride levels be listed on bottled water labels, internationally and regionally. The research will have local and global health impact, as bottled water sold in the UAE and neighboring countries is produced locally and imported from countries such as Switzerland, the USA, France, Italy, New Zealand, and Fiji.

  4. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper focuses on the proposition of a portfolio selection problem considering an investor's subjectivity, and on sensitivity analysis for changes in that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and the subjectivity represented by fuzzy numbers, it is not well-defined. Therefore, by introducing the Sharpe ratio, one of the most important performance measures for portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using sensitivity analysis for fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.
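The Sharpe ratio used in the transformation above is the excess expected return per unit of return standard deviation. A minimal sketch with illustrative return data:

```python
import statistics

def sharpe_ratio(returns, risk_free=0.0):
    """Sharpe ratio: mean excess return over the risk-free rate,
    divided by the sample standard deviation of excess returns."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Hypothetical period returns and risk-free rate:
ratio = sharpe_ratio([0.04, 0.01, 0.03, 0.05, 0.02], risk_free=0.01)
```

Maximizing the Sharpe ratio turns the two-objective trade-off (return vs. risk) into a single scalar criterion, which is what makes the reformulation into a standard programming problem possible.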

  5. Optimal control of a harmonic oscillator: Economic interpretations

    NASA Astrophysics Data System (ADS)

    Janová, Jitka; Hampel, David

    2013-10-01

    Optimal control is a popular technique for modelling and solving dynamic decision problems in economics. The standard interpretation of the criterion function and Lagrange multipliers in the profit maximization problem is well known. Using a particular example, we aim at a deeper understanding of the possible economic interpretations of further mathematical and solution features of the optimal control problem: we focus on the solution of the optimal control problem for a harmonic oscillator serving as a model of the Phillips business cycle. We discuss the economic interpretations of the arising mathematical objects with respect to the well-known reasoning for these in other problems.
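A generic problem of the kind described can be stated as follows, in illustrative notation (not the paper's own formulation): a discounted objective is maximized subject to harmonic-oscillator dynamics, with the costate on the dynamics carrying the usual shadow-price interpretation.

```latex
% Illustrative optimal control problem with harmonic-oscillator dynamics:
\max_{u(\cdot)} \int_0^T e^{-\rho t}\, \pi\bigl(x(t), u(t)\bigr)\, dt
\qquad \text{s.t.} \qquad
\ddot{x}(t) + \omega^2 x(t) = u(t),
\quad x(0) = x_0,\ \dot{x}(0) = v_0.
```

Here $\rho$ is the discount rate, $\omega$ the oscillator frequency, and $u$ the control (e.g., a policy instrument acting on the cycle); the Lagrange multipliers attached to the dynamics play the role of shadow prices of the state.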

  6. Technical Report, Onondaga Lake, New York, Main Report

    DTIC Science & Technology

    1992-01-01

    growth. Section 3 of this report will expand upon the specific water quality problems. EXISTING CONDITIONS ... Table V - Comparison of Current... This technical report on Onondaga Lake, New York has compiled existing data to determine which water quality and environmental enhancements are... bacteria is a problem during storm events, causing contravention of the State swimming standards. The source of the problem has been identified as the

  7. The associations of indoor environment and psychosocial factors on the subjective evaluation of Indoor Air Quality among lower secondary school students: a multilevel analysis.

    PubMed

    Finell, E; Haverinen-Shaughnessy, U; Tolvanen, A; Laaksonen, S; Karvonen, S; Sund, R; Saaristo, V; Luopa, P; Ståhl, T; Putus, T; Pekkanen, J

    2017-03-01

    Subjective evaluation of Indoor Air Quality (subjective IAQ) reflects both building-related and psychosocial factors, but their associations have rarely been studied other than on the individual level in occupational settings and their interactions have not been assessed. Therefore, we studied whether schools' observed indoor air problems and psychosocial factors are associated with subjective IAQ and their potential interactions. The analysis was performed with a nationwide sample (N = 195 schools/26946 students) using multilevel modeling. Two datasets were merged: (i) survey data from students, including information on schools' psychosocial environment and subjective IAQ, and (ii) data from school principals, including information on observed indoor air problems. On the student level, school-related stress, poor teacher-student relationship, and whether the student did not easily receive help from school personnel, were significantly associated with poor subjective IAQ. On the school level, observed indoor air problem (standardized β = -0.43) and poor teacher-student relationship (standardized β = -0.22) were significant predictors of poor subjective IAQ. In addition, school-related stress was associated with poor subjective IAQ, but only in schools without observed indoor air problem (standardized β = -0.44). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Adaptation of interoperability standards for cross domain usage

    NASA Astrophysics Data System (ADS)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economic, ecological, cultural, as well as historical influences. Most of the time, information is produced and stored digitally, and one of the biggest challenges is to extract relevant, readable information applicable to a specific problem from a large data stock at the right time. These challenges of enabling data sharing across national, organizational, and system borders are known to other domains (e.g., ecology or medicine) as well. Solutions such as specific standards have been developed for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is making civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapting standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  9. Health Complaints Associated with Poor Rental Housing Conditions in Arkansas: The Only State without a Landlord’s Implied Warranty of Habitability

    PubMed Central

    Bachelder, Ashley E.; Stewart, M. Kate; Felix, Holly C.; Sealy, Neil

    2016-01-01

    Arkansas is the only U.S. state that does not have a landlord’s implied warranty of habitability, meaning tenants have a requirement for maintaining their rental properties at certain habitability standards, but landlords are not legally required to contribute to those minimum health and safety standards. This project assessed the possibility that this lack of landlord responsibility affects tenants’ perceived health. Using surveys and interviews, we collected self-reported data on the prevalence and description of problems faced by renters who needed household repairs from their landlords. Of almost 1,000 renters, one-third of them had experienced a problem with their landlord making needed repairs; and one-quarter of those had a health issue they attributed to their housing conditions. Common issues included problems with plumbing, heating, or cooling systems, and pest or rodent control. Reported health problems included elevated stress levels, breathing problems, headaches, high blood pressure, and bites or infections. Hispanic respondents and those with less than a high school education were both significantly more likely to report problems with their landlords not making repairs as requested. These data suggest that the lack of landlord requirements may negatively impact the condition of rental properties and, therefore, may negatively impact the health of Arkansas renters. PMID:27933288

  11. Problems and methods of calculating the Legendre functions of arbitrary degree and order

    NASA Astrophysics Data System (ADS)

    Novikova, Elena; Dmitrenko, Alexander

    2016-12-01

    The known standard recursion methods for computing the fully normalized associated Legendre functions do not give the necessary precision under the IEEE 754-2008 floating-point standard, which creates problems of underflow and overflow. Analysis of the computation of the Legendre functions shows that underflow is not dangerous by itself. The main problem, which generates gross errors in the calculations, is the effect of "absolute zero": once it appears in a forward column recursion, an "absolute zero" turns every value multiplied by it into zero, regardless of whether a zero result of the multiplication is correct or not. Three methods of calculating the Legendre functions that remove the effect of "absolute zero" from the calculations are discussed here. These methods are also of interest because they have almost no limit on the maximum degree of the Legendre functions. It is shown that the numerical accuracy of the three methods is the same, but the CPU time of the Fukushima method is minimal; the Fukushima method is therefore the best. Its main advantage is computational speed, which is an important factor when calculating as many Legendre functions as the 2,401,336 required for EGM2008.
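    The "absolute zero" effect described above is easy to reproduce. The sketch below (illustrative only; it is none of the three methods the record discusses) contrasts the sectoral seed of the forward column recursion computed naively, which underflows to an exact zero at high degree and small colatitude, with the same seed accumulated as a base-10 logarithm. The degree 2190 and the colatitude 0.05 rad are assumed, illustrative values (2190 is EGM2008's maximum degree).

```python
import math

def sectoral_naive(m, theta):
    """Fully normalized sectoral term P_mm(cos theta), the seed of the
    forward column recursion.  For large m and small sin(theta) the
    running product underflows to exactly 0.0 -- the "absolute zero"
    that then zeroes every later value multiplied by it."""
    s = math.sin(theta)
    p = 1.0  # P_00
    for k in range(1, m + 1):
        f = math.sqrt(3.0) if k == 1 else math.sqrt((2 * k + 1) / (2 * k))
        p *= f * s
    return p

def sectoral_log10(m, theta):
    """The same seed accumulated as a base-10 logarithm, so magnitudes
    far below the IEEE 754 double underflow threshold (~1e-308) stay
    representable; the actual value is 10**result."""
    s = math.sin(theta)
    lg = 0.0
    for k in range(1, m + 1):
        f = math.sqrt(3.0) if k == 1 else math.sqrt((2 * k + 1) / (2 * k))
        lg += math.log10(f * s)
    return lg
```

    For m = 2190 and theta = 0.05 the naive product is exactly zero while the log form reports a finite magnitude thousands of decades below the underflow threshold; extended-exponent representations such as Fukushima's X-numbers achieve the same effect while staying in ordinary arithmetic.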

  12. Cognitive, emotive, and cognitive-behavioral correlates of suicidal ideation among Chinese adolescents in Hong Kong.

    PubMed

    Kwok, Sylvia Lai Yuk Ching; Shek, Daniel Tan Lei

    2010-03-05

    Utilizing Daniel Goleman's theory of emotional competence, Beck's cognitive theory, and Rudd's cognitive-behavioral theory of suicidality, the relationships between hopelessness (cognitive component), social problem solving (cognitive-behavioral component), emotional competence (emotive component), and adolescent suicidal ideation were examined. Based on the responses of 5,557 Secondary 1 to Secondary 4 students from 42 secondary schools in Hong Kong, results showed that suicidal ideation was positively related to adolescent hopelessness, but negatively related to emotional competence and social problem solving. While standard regression analyses showed that all the above variables were significant predictors of suicidal ideation, hierarchical regression analyses showed that hopelessness was the most important predictor of suicidal ideation, followed by social problem solving and emotional competence. Further regression analyses found that all four subscales of emotional competence, i.e., empathy, social skills, self-management of emotions, and utilization of emotions, were important predictors of male adolescent suicidal ideation. However, the subscale of social skills was not a significant predictor of female adolescent suicidal ideation. Standard regression analysis also revealed that all three subscales of social problem solving, i.e., negative problem orientation, rational problem solving, and impulsiveness/carelessness style, were important predictors of suicidal ideation. Theoretical and practice implications of the findings are discussed.

  13. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is at the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between these various methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.

  14. Epistemic Beliefs about Justification Employed by Physics Students and Faculty in Two Different Problem Contexts

    NASA Astrophysics Data System (ADS)

    Çağlayan Mercan, Fatih

    2012-06-01

    This study examines the epistemic beliefs about justification employed by physics undergraduate and graduate students and faculty in the context of solving a standard classical physics problem and a frontier physics problem. Data were collected by a think-aloud problem solving session followed by a semi-structured interview conducted with 50 participants: 10 each at the freshman, senior, masters, PhD, and faculty levels. Seven modes of justification were identified and used for exploring the relationships between each justification mode and problem context, and expertise level. The data showed that justification modes were not mutually exclusive and many respondents combined different modes in their responses in both problem contexts. Success in solving the standard classical physics problem was not related to any of the justification modes and was independent of expertise level. The strength of the association across the problem contexts for the authoritative, rational, and empirical justification modes fell in the medium range, and for the modeling justification mode fell in the large range, of practical significance. Expertise level was not related to the empirical and religious justification modes. The strength of the association between expertise level and the authoritative, rational, experiential, and relativistic justification modes fell in the medium range, and the modeling justification mode fell in the large range, of practical significance. The results support the importance of context for epistemic beliefs about justification and are discussed in terms of the implications for teaching and learning science.

  15. Bootstrap Estimates of Standard Errors in Generalizability Theory

    ERIC Educational Resources Information Center

    Tong, Ye; Brennan, Robert L.

    2007-01-01

    Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…

  16. Software database creation for investment property measurement according to international standards

    NASA Astrophysics Data System (ADS)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national and enterprise levels. The need to create the software for investment property measurement according to International Accounting Standards was substantiated. The necessary software functions and the processes were described.

  17. 76 FR 38431 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-30

    ... Commission's minimum performance standards regarding registered transfer agents, and (2) to assure that issuers are aware of certain problems and poor performances with respect to the transfer agents that are... failure to comply with the Commission's minimum performance standards then the issuer will be unable to...

  18. Variations on an Historical Case Study

    ERIC Educational Resources Information Center

    Field, Patrick

    2006-01-01

    The National Inquiry Standard for Science Education Preparation requires science teachers to introduce students to scientific inquiry to solve problems by various methods, including active learning in a collaborative environment. In order for science teachers to comply with this inquiry standard, activities must be designed for students to…

  19. Qualification Journey in Teacher Training: Case in Northern Cyprus

    ERIC Educational Resources Information Center

    Erden, Hale

    2016-01-01

    Problem Statement: The identification of professional teaching standards has great value on initial teacher training, hiring teachers, assessing teacher performance, as well as planning and organizing teacher professional development. In Northern Cyprus there are not any identified professional teaching standards. This study aimed at filling this…

  20. Implementing the Curriculum and Evaluation Standards: First-Year Algebra.

    ERIC Educational Resources Information Center

    Kysh, Judith

    1991-01-01

    Described is an alternative first year algebra program developed to bridge the gap between the NCTM's Curriculum and Evaluation Standards and institutional demands of schools. Increased attention is given to graphing as a context for algebra, calculator use, solving "memorable problems," and incorporating geometry concepts, while…

  1. Early Identification of At-Risk LPN-to-RN Students

    ERIC Educational Resources Information Center

    Hawthorne, Lisa K.

    2013-01-01

    Nurse education programs are implementing standardized assessments without evaluating their effectiveness. Graduates of associate degree nursing programs continue to be unsuccessful with licensure examinations, despite standardized testing and stronger admission criteria. This problem is also prevalent for LPN-to-RN education programs due to a…

  2. The Best of Both Worlds

    ERIC Educational Resources Information Center

    Schneider, Jack; Feldman, Joe; French, Dan

    2016-01-01

    Relying on teachers' assessments for the information currently provided by standardized test scores would save instructional time, better capture the true abilities of diverse students, and reduce the problem of teaching to the test. A California high school is implementing standards-based reporting, ensuring that teacher-issued grades function as…

  3. 42 CFR 493.1233 - Standard: Complaint investigations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Complaint investigations. 493.1233 Section 493.1233 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... a system in place to ensure that it documents all complaints and problems reported to the laboratory...

  4. 40 CFR 171.5 - Standards for certification of private applicators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Standards for certification of private applicators. 171.5 Section 171.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... practical knowledge of the pest problems and pest control practices associated with his agricultural...

  5. A Math-Box Tale

    ERIC Educational Resources Information Center

    Nelson, Catherine J.

    2012-01-01

    The author is a strong proponent of incorporating the Content and Process Standards (NCTM 2000) into the teaching of mathematics. For candidates in her methods course, she models research-based best practices anchored in the Standards. Her students use manipulatives, engage in problem-solving activities, listen to children's literature, and use…

  6. Gaussian-input Gaussian mixture model for representing density maps and atomic models.

    PubMed

    Kawabata, Takeshi

    2018-07-01

    A new Gaussian mixture model (GMM) has been developed for better representation of both atomic models and electron microscopy 3D density maps. The standard GMM algorithm employs an EM algorithm to determine the parameters and accepts a set of weighted 3D points corresponding to voxel or atomic centers. Although the standard algorithm works reasonably well, it has three problems. First, it ignores the size (voxel width or atomic radius) of the input, and thus it can lead to a GMM with a smaller spread than the input. Second, the algorithm has a singularity problem: it sometimes stops the iterative procedure because of a Gaussian function with almost zero variance. Third, a map with a large number of voxels requires a long computation time for conversion to a GMM. To solve these problems, we have introduced a Gaussian-input GMM algorithm, which treats the input atoms or voxels as a set of Gaussian functions; the standard EM algorithm for GMMs was extended to optimize the new model. The new GMM has a radius of gyration identical to that of the input and does not stop suddenly because of the singularity problem. For fast computation, we have introduced down-sampled Gaussian functions (DSG), obtained by merging neighboring voxels into an anisotropic Gaussian function; this provides a GMM with thousands of Gaussian functions in a short computation time. We have also introduced a DSG-input GMM, i.e., the Gaussian-input GMM with the DSG as input. This new algorithm is much faster than the standard algorithm. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
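    As a point of comparison, here is a minimal, hypothetical sketch of the standard weighted-point EM fit that the abstract contrasts with, not the author's code: isotropic covariances, a deterministic farthest-point seeding of my own choosing, and a variance floor as one crude guard against the singularity problem. Note that the input sizes (atomic radii or voxel widths) are ignored, which is exactly the first problem the Gaussian-input GMM addresses.

```python
import numpy as np

def farthest_point_init(points, k):
    """Deterministic seeding: start from the first point, then repeatedly
    add the point farthest from all centers chosen so far."""
    centers = [points[0]]
    for _ in range(k - 1):
        d2 = ((points[:, None, :] - np.asarray(centers)[None]) ** 2).sum(-1)
        centers.append(points[int(np.argmax(d2.min(axis=1)))])
    return np.array(centers)

def em_gmm(points, weights, k, n_iter=50, var_floor=1e-6):
    """Standard weighted-point EM for an isotropic k-component GMM.
    Each input is a weighted center; input sizes are ignored, so the
    fitted spread can be narrower than the true density."""
    n, d = points.shape
    w = weights / weights.sum()
    mu = farthest_point_init(points, k)
    var = np.full(k, points.var(axis=0).mean())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities under isotropic Gaussians (log-space).
        d2 = ((points[:, None, :] - mu[None]) ** 2).sum(-1)
        log_r = np.log(pi) - 0.5 * d2 / var - 0.5 * d * np.log(2 * np.pi * var)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step, folding in the input weights; the variance floor keeps
        # a component from collapsing to (near) zero variance.
        rw = r * w[:, None]
        nk = rw.sum(axis=0)
        pi = nk
        mu = (rw.T @ points) / nk[:, None]
        d2 = ((points[:, None, :] - mu[None]) ** 2).sum(-1)
        var = np.maximum((rw * d2).sum(axis=0) / (d * nk), var_floor)
    return pi, mu, var
```

    On two well-separated synthetic clusters this recovers the cluster centers; with broad voxels as input, the same routine would underestimate the spread, motivating the Gaussian-input extension.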

  7. What consequences should result from failure to meet internal standards?

    PubMed

    Schramm, J

    1997-01-01

    This paper approaches a difficult problem: how to deal with a resident who has failed to meet the internal standards of a residency training program. First, the problems of defining a standard are addressed, along with the associated problems of its reproducibility, documentation, teaching, updating, and internal variability within the same teaching program. The question of what constitutes a failure to meet the standard then needs to be answered. The results of a survey of residents' attitudes are quoted, as are some responses to a survey among the chiefs of teaching programs. Regarding attitudes toward handling breaches of a standard, the basic message was that residents want to be told when they are not performing adequately. Both parties want the collaboration of senior staff members on this topic. Whereas residents want to re-train, practice, and talk, they do not want sanctions; chiefs, however, want much less re-training, practicing, and talking, but earlier sanctions. The difficult point of dealing with a true failure is discussed in light of the German legal situation and the actual options for handling such a case. Before the point of discontinuing a resident's training is reached, it needs to be agreed what would be a classic situation of failure in which both the chiefs responsible for training and the residents agree that training is better discontinued. The author describes his experience with the real course of events in 7 cases he witnessed over 22 years.

  8. Development of a time-dependent Monte Carlo method and application to the CANDU-6 reactor [Developpement d'une methode de Monte Carlo dependante du temps et application au reacteur de type CANDU-6]

    NASA Astrophysics Data System (ADS)

    Mahjoub, Mehdi

    Solving the Boltzmann equation remains an important step in predicting the behavior of a nuclear reactor. Unfortunately, solving this equation is still a challenge for a complex geometry (a reactor) just as for a simple one (a cell). Thus, to predict the behavior of a nuclear reactor, a two-step calculation scheme is needed. The first step consists of obtaining the nuclear parameters of a reactor cell after a homogenization and condensation step. The second step is a diffusion calculation for the whole reactor that uses the results of the first step while simplifying the reactor geometry to a set of homogeneous cells surrounded by reflector. During transients (accidents), these two steps are insufficient to predict the behavior of the reactor. Since solving the time-dependent form of the Boltzmann equation remains a major challenge for all types of geometries, another calculation scheme is needed. To circumvent this difficulty, the adiabatic hypothesis is used; it takes the concrete form of a four-step calculation scheme. The first and second steps remain the same for nominal reactor conditions. The third step amounts to obtaining the new nuclear properties of the cell following the perturbation, which are then used in the fourth step in a new reactor calculation to obtain the effect of the perturbation on the reactor. This project aims to verify this hypothesis, and a new calculation scheme was therefore defined. The first stage of the project was to create a new code capable of solving the time-dependent Boltzmann equation by the stochastic Monte Carlo method, in order to obtain cross sections that evolve in time. This code was used to simulate a LOCA accident in a CANDU-6 nuclear reactor. The time-dependent cross sections were then used in a space-time diffusion calculation for a CANDU-6 reactor undergoing a LOCA affecting half of the core, in order to observe its behavior during all phases of the perturbation. In the development phase, we chose the OpenMC code, developed at MIT, as the initial development platform. The introduction and treatment of delayed neutrons during the simulation was a major challenge to overcome. It is important to note that the developed Monte Carlo code can be used at large scale to simulate all types of nuclear reactors, provided the computational resources are available.

  9. Comparing implementations of penalized weighted least-squares sinogram restoration.

    PubMed

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-11-01

    A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are always and inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration is still significant in the standard-dose regime since it can still outperform standard approaches and it allows for correction of effects that are not usually modeled in standard CT preprocessing. The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) A direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem. 
For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors' previous penalized-likelihood implementation. Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes.
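    The quadratic structure that makes PWLS amenable to conjugate-gradient acceleration can be sketched generically. The toy below is not the authors' preconditioned, sparsity-exploiting implementation: it forms the system matrix densely and assumes a simple first-difference roughness penalty of my own choosing, then solves the PWLS normal equations with plain conjugate gradient.

```python
import numpy as np

def first_difference(n):
    """Roughness operator D whose rows compute x[i+1] - x[i]."""
    return (np.eye(n, k=1) - np.eye(n))[:-1]

def pwls_cg(A, y, w, beta, n_iter=200, tol=1e-16):
    """Minimize the quadratic PWLS objective
        J(x) = (y - A x)^T W (y - A x) + beta * ||D x||^2,  W = diag(w),
    by conjugate gradient on the normal equations
        (A^T W A + beta D^T D) x = A^T W y."""
    n = A.shape[1]
    D = first_difference(n)
    H = A.T @ (w[:, None] * A) + beta * (D.T @ D)  # SPD system matrix
    b = A.T @ (w * y)
    x = np.zeros(n)
    r = b - H @ x
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if rs_new < tol * (b @ b):  # relative residual stopping rule
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

    In the real sinogram problem H is never formed explicitly; sparse operators and a preconditioner stand in for the dense products above, which is where the reported speedups come from.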

  10. Issues Involved in Developing Ada Real-Time Systems

    DTIC Science & Technology

    1989-02-15

    expensive modifications to the compiler or Ada runtime system to fit a particular application. Whether we can solve the problems of programming real-time systems in...lock in solutions to problems that are not yet well understood in standards as rigorous as the Ada language. Moreover, real-time systems typically have

  11. On Present State of Teaching Russian Language in Russia

    ERIC Educational Resources Information Center

    Tekucheva, Irina V.; Gromova, Liliya Y.

    2016-01-01

    The article discusses the current state of teaching Russian language, discovers the nature of philological education, outlines the main problems of the implementation of the standard in school practice, analyzes the problems of formation of universal educational actions within the context of the implementation of cognitive-communicative approach,…

  12. College Students' Alcohol-Related Problems: An Autophotographic Approach

    ERIC Educational Resources Information Center

    Casey, Patrick F.; Dollinger, Stephen J.

    2007-01-01

    This study related standard self-report measures to an innovative approach (the autophotographic essay) as a way to provide insight into patterns of alcohol consumption and associated problem behaviors. College students (N = 135) completed self-report measures of alcohol consumption and created autophotographic essays of identity coded for alcohol…

  13. Group Mirrors to Support Interaction Regulation in Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Jermann, Patrick; Dillenbourg, Pierre

    2008-01-01

    Two experimental studies test the effect of group mirrors upon quantitative and qualitative aspects of participation in collaborative problem solving. Mirroring tools consist of a graphical representation of the group's actions which is dynamically updated and displayed to the collaborators. In addition, metacognitive tools display a standard for…

  14. 20 CFR 632.23 - Termination and corrective action of a CAP and/or Master Plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... substantiates serious management, fiscal and/or performance problems, information from the Inspector General or gained through incident reports of poor performance, serious administrative problems and/or inability to... termination: (1) Poor performance and inability to meet Federal standards related to such debt collection...

  15. The Locker Problem: An Open and Shut Case

    ERIC Educational Resources Information Center

    Kimani, Patrick M.; Olanoff, Dana; Masingila, Joanna O.

    2016-01-01

    This article discusses how teaching via problem solving helps enact the Mathematics Teaching Practices and supports students' learning and development of the Standards for Mathematical Practice. This approach involves selecting and implementing mathematical tasks that serve as vehicles for meeting the learning goals for the lesson. For the lesson…

  16. On the numerical treatment of Coulomb forces in scattering problems

    NASA Astrophysics Data System (ADS)

    Randazzo, J. M.; Ancarani, L. U.; Colavecchia, F. D.; Gasaneo, G.; Frapiccini, A. L.

    2012-11-01

    We investigate the limiting procedures to obtain Coulomb interactions from short-range potentials. The application of standard techniques used for the two-body case (exponential and sharp cutoff) to the three-body break-up problem is illustrated numerically by considering the Temkin-Poet (TP) model of e-H processes.
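    The two-body limiting idea can be illustrated with a hypothetical toy far simpler than the Temkin-Poet computation: screen the Coulomb potential exponentially, V_R(r) = -e^(-r/R)/r in atomic units, solve the s-wave radial equation on a grid, and watch a bound-state observable (the ground-state energy) approach the pure-Coulomb hydrogen value of -0.5 hartree as the cutoff radius R grows. Grid parameters and R values are assumptions chosen for illustration.

```python
import numpy as np

def ground_state_energy(R, rmax=40.0, n=600):
    """Ground-state energy (hartree) for V(r) = -exp(-r/R)/r from a
    finite-difference s-wave radial Schroedinger equation
        -u''/2 + V u = E u,  u(0) = u(rmax) = 0.
    As R -> infinity the screened potential tends to the Coulomb -1/r
    and the energy tends to the hydrogen value -0.5."""
    h = rmax / (n + 1)
    r = h * np.arange(1, n + 1)          # interior grid points
    V = -np.exp(-r / R) / r              # exponentially screened Coulomb
    main = 1.0 / h**2 + V                # diagonal: kinetic + potential
    off = np.full(n - 1, -0.5 / h**2)    # off-diagonal kinetic terms
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[0]      # lowest eigenvalue
```

    Weaker screening binds more strongly: ground_state_energy(10) lies well above ground_state_energy(1000), which is already within a few millihartree of -0.5. A sharp cutoff (zeroing V beyond R) converges in the same manner, which is the other limiting procedure the record mentions.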

  17. The Problem of Underqualified Teachers: A Sociological Perspective

    ERIC Educational Resources Information Center

    Ingersoll, Richard M.

    2005-01-01

    Few educational problems have received more attention than has the failure to ensure that the nation's classrooms are staffed by qualified teachers. Many states have pushed for more-rigorous preservice teacher education, training, and certification standards. Moreover, a host of recruitment initiatives have attempted to increase the supply of…

  18. Problem Space Matters: The Development of Creativity and Intelligence in Primary School Children

    ERIC Educational Resources Information Center

    Welter, Marisete Maria; Jaarsveld, Saskia; Lachmann, Thomas

    2017-01-01

    Previous research showed that in primary school, children's intelligence develops continually, but creativity develops more irregularly. In this study, the development of intelligence, measured traditionally, i.e., operating within well-defined problem spaces (Standard Progressive Matrices) was compared with the development of intelligence…

  19. Evaluation of Undergraduate Teaching at Institutions of Higher Education in China: Problems and Reform

    ERIC Educational Resources Information Center

    Yukun, Chen

    2009-01-01

    This paper reviews the achievements of the first cycle of undergraduate teaching evaluation at institutions of higher education in China. Existing problems are identified, and suggestions are made for corresponding reforms for improving the standard and quality of China's undergraduate teaching evaluation.

  20. Protocol Analysis of Aptitude Differences in Figural Analogy Problem Representation.

    ERIC Educational Resources Information Center

    Schiano, Diane J.

    Individual differences in performance on figural analogy tests are usually attributed to quantitative differences in processing parameters rather than to qualitative differences in the formation and use of representations. Yet aptitude-related differences in categorizing standardized figural analogy problems between high and low scorers have been…
