Sample records for VHTR standard problem

  1. Next Generation Nuclear Plant Methods Research and Development Technical Program Plan -- PLN-2498

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2008-09-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  2. Next Generation Nuclear Plant Methods Technical Program Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2010-12-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  3. Next Generation Nuclear Plant Methods Technical Program Plan -- PLN-2498

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2010-09-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  4. Investigation of Abnormal Heat Transfer and Flow in a VHTR Reactor Core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawaji, Masahiro; Valentin, Francisco I.; Artoun, Narbeh

    2015-12-21

    The main objective of this project was to identify and characterize the conditions under which abnormal heat transfer phenomena would occur in a Very High Temperature Reactor (VHTR) with a prismatic core. High pressure/high temperature experiments have been conducted to obtain data that could be used for validation of VHTR design and safety analysis codes. The focus of these experiments was on the generation of benchmark data for design and off-design heat transfer for forced, mixed and natural circulation in a VHTR core. In particular, a flow laminarization phenomenon was intensely investigated since it could give rise to hot spots in the VHTR core.

  5. Evaluation of RANS and LES models for Natural Convection in High-Aspect-Ratio Parallel Plate Channels

    NASA Astrophysics Data System (ADS)

    Fradeneck, Austen; Kimber, Mark

    2017-11-01

    The present study evaluates the effectiveness of current RANS and LES models in simulating natural convection in high-aspect-ratio parallel plate channels. The geometry under consideration is based on a simplification of the coolant and bypass channels in the very high temperature gas reactor (VHTR). Two thermal conditions are considered, asymmetric and symmetric wall heating, with an applied heat flux chosen to match Rayleigh numbers experienced in the VHTR during a loss of flow accident (LOFA). RANS models are compared to analogous high-fidelity LES simulations. Preliminary results demonstrate the efficacy of the low-Reynolds-number k-ε formulations and their enhancements over the standard form, as well as the Reynolds stress transport model, in calculating the turbulence production due to buoyancy and the overall mean flow variables.
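
    Matching a loss-of-flow Rayleigh number through an applied heat flux, as described above, can be sketched with the flux-based (modified) Rayleigh number. This formulation and all property values below are illustrative assumptions, not details taken from the study:

```python
# Sketch: flux-based (modified) Rayleigh number for a channel wall with an
# applied heat flux, Ra* = g * beta * q'' * H^4 / (k * nu * alpha).
# All property values below are illustrative placeholders, not VHTR data.

def modified_rayleigh(q_flux, height, g=9.81, beta=3.3e-3,
                      k=0.29, nu=1.0e-4, alpha=1.5e-4):
    """Ra* for wall heat flux q'' (W/m^2) over channel height H (m)."""
    return g * beta * q_flux * height**4 / (k * nu * alpha)

# Choosing q'' sets Ra*; an experiment would tune q'' until Ra* matches
# the accident condition of interest.
ra_star = modified_rayleigh(q_flux=500.0, height=0.5)
```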

  6. Evaluation of the DRAGON code for VHTR design analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) A DRAGON workshop held to discuss the code capabilities for modeling the VHTR.

  7. NGNP Data Management and Analysis System Modeling Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cynthia D. Gentillon

    2009-09-01

    Projects for the very-high-temperature reactor (VHTR) program provide data in support of Nuclear Regulatory Commission licensing of the VHTR. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high temperature and high fluence environments. In addition, thermal-hydraulic experiments are conducted to validate codes used to assess reactor safety. The VHTR Program has established the NGNP Data Management and Analysis System (NDMAS) to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and identifying relationships among the measured quantities that contribute to their understanding.

  8. Analysis of supercritical CO{sub 2} cycle control strategies and dynamic response for Generation IV Reactors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moisseytsev, A.; Sienicki, J. J.

    2011-04-12

    The analysis of specific control strategies and dynamic behavior of the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle has been extended to the two reactor types selected for continued development under the Generation IV Nuclear Energy Systems Initiative; namely, the Very High Temperature Reactor (VHTR) and the Sodium-Cooled Fast Reactor (SFR). Direct application of the standard S-CO{sub 2} recompression cycle to the VHTR was found to be challenging because of the mismatch in the temperature drop of the He gaseous reactor coolant through the He-to-CO{sub 2} reactor heat exchanger (RHX) versus the temperature rise of the CO{sub 2} through the RHX. The reference VHTR features a large temperature drop of 450 C between the assumed core outlet and inlet temperatures of 850 and 400 C, respectively. This large temperature difference is an essential feature of the VHTR, enabling a lower He flow rate that reduces the required core velocities and pressure drop. In contrast, the standard recompression S-CO{sub 2} cycle naturally operates with a temperature rise through the RHX of about 150 C, reflecting the temperature drop as the CO{sub 2} expands from 20 MPa to 7.4 MPa in the turbine and the fact that the cycle is highly recuperated, such that the CO{sub 2} entering the RHX is effectively preheated. Because of this mismatch, direct application of the standard recompression cycle results in a relatively poor cycle efficiency of 44.9%. However, two approaches have been identified by which the S-CO{sub 2} cycle can be successfully adapted to the VHTR and the benefits of the S-CO{sub 2} cycle, especially a significant gain in cycle efficiency, can be realized. The first approach involves the use of three separate cascaded S-CO{sub 2} cycles. Each S-CO{sub 2} cycle is coupled to the VHTR through its own He-to-CO{sub 2} RHX, in which the He temperature is reduced by 150 C.
    The three cycles have efficiencies of 54, 50, and 44%, respectively, resulting in a net cycle efficiency of 49.3%. The other approach involves reducing the minimum cycle pressure significantly below the critical pressure such that the temperature drop in the turbine is increased, while the minimum cycle temperature is maintained above the critical temperature to prevent the formation of a liquid phase. The latter approach also involves the addition of a precooler and a third compressor before the main compressor to retain the benefits of compression near the critical point with the main compressor. For a minimum cycle pressure of 1 MPa, a cycle efficiency of 49.5% is achieved. Either approach opens the door to applying the S-CO{sub 2} cycle to the VHTR. In contrast, the SFR system typically has a core outlet-inlet temperature difference of about 150 C, such that the standard recompression cycle is ideally suited for direct application to the SFR. The ANL Plant Dynamics Code has been modified for application to the VHTR and SFR when the reactor side dynamic behavior is calculated with another system level computer code such as SAS4A/SASSYS-1 in the SFR case. The key modification involves modeling heat exchange in the RHX, accepting time dependent tabular input from the reactor code, and generating time dependent tabular input to the reactor code such that both the reactor and S-CO{sub 2} cycle sides can be calculated in a convergent iterative scheme. This approach retains the modeling benefits provided by the detailed reactor system level code and can be applied to any reactor system type incorporating a S-CO{sub 2} cycle. This approach was applied to the particular calculation of a scram scenario for a SFR in which the main and intermediate sodium pumps are not tripped and the generator is not disconnected from the electrical grid, in order to enhance heat removal from the reactor system and thereby increase the cooldown rate of the Na-to-CO{sub 2} RHX.
    The reactor side is calculated with SAS4A/SASSYS-1 while the S-CO{sub 2} cycle is calculated with the Plant Dynamics Code, with a number of iterations over a timescale of 500 seconds. It is found that the RHX undergoes a maximum cooldown rate of approximately -0.3 C/s. The Plant Dynamics Code was also modified to decrease its running time by replacing the compressible flow form of the momentum equation with an incompressible flow equation for use inside the cooler and recuperators, where the CO{sub 2} has a compressibility similar to that of a liquid. Appendices provide a quasi-static control strategy for a SFR as well as the self-adaptive linear function fitting algorithm developed to produce the tabular data for input to the reactor code and Plant Dynamics Code from the detailed output of the other code.
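
    The cascaded-cycle arithmetic above can be checked with a small sketch: if each of the three cycles absorbs an equal share of the reactor heat (each RHX drops the helium temperature by the same 150 C), the net efficiency is the heat-weighted mean of the individual efficiencies. The equal-share reading is an interpretation of the abstract, not a statement from the report:

```python
# Sketch: net efficiency of cascaded cycles, each absorbing a share q_i of
# the total reactor heat: eta_net = sum(q_i * eta_i) / sum(q_i).

def net_cascade_efficiency(etas, heat_fractions=None):
    """Heat-weighted net efficiency of cascaded power cycles."""
    if heat_fractions is None:
        # Equal 150 C helium temperature drops imply equal heat shares.
        heat_fractions = [1.0 / len(etas)] * len(etas)
    return sum(q * e for q, e in zip(heat_fractions, etas))

# The three cycle efficiencies quoted in the abstract:
eta_net = net_cascade_efficiency([0.54, 0.50, 0.44])  # ~0.493, matching 49.3%
```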

  9. Total hemispherical emissivity of very high temperature reactor (VHTR) candidate materials: Hastelloy X, Haynes 230, and Alloy 617

    NASA Astrophysics Data System (ADS)

    Maynard, Raymond K.

    An experimental system was constructed in accordance with the standard ASTM C835-06 to measure the total hemispherical emissivity of structural materials of interest in Very High Temperature Reactor (VHTR) systems. The system was tested with 304 stainless steel as well as with oxidized and un-oxidized nickel, and good reproducibility and agreement with the literature data were found. Emissivity of Hastelloy X was measured under different conditions that included: (i) "as received" (original sample) from the supplier; (ii) with increased surface roughness; (iii) oxidized; and (iv) graphite coated. Measurements were made over a wide range of temperatures. Hastelloy X, as received from the supplier, was cleaned before additional roughening of the surface and coating with graphite. The emissivity of the original samples (cleaned after receipt) varied from around 0.18 to 0.28 in the temperature range of 473 K to 1498 K. The apparent emissivity increased only slightly as the roughness of the surface increased (without corrections for the increased surface area due to the increased surface roughness). When Hastelloy X was coated with graphite or oxidized, however, its emissivity was observed to increase substantially. With a deposited graphite layer on the Hastelloy, emissivity increased from 0.2 to 0.53 at 473 K and from 0.25 to 0.6 at 1473 K; a finding that has strong favorable safety implications in terms of decay heat removal in post-accident VHTR environments. Although initial oxidation of Hastelloy X increased the emissivity, prolonged oxidation did not increase it significantly. However, as there is some oxidation of Hastelloy X used in the construction of VHTRs, this represents an essentially neutral finding in terms of the safety implications in post-accident VHTR environments.
    The total hemispherical emissivity of Haynes 230 alloy, which is regarded as a leading candidate material for heat exchangers in VHTR systems, was measured under various surface conditions. The emissivity of the as-received Haynes 230 sample increased from 0.178 at 600 K to 0.235 at 1375 K. The emissivity increased significantly when the surface was roughened, oxidized in air, or coated with graphite dust, compared to the as-received material. The total hemispherical emissivity of Alloy 617 was measured as a function of temperature. The total emissivity increased from about 0.2 at 600 K to about 0.35 at 1275 K.
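
    To see why the reported emissivity gains matter for decay heat removal, a gray-body Stefan-Boltzmann sketch (not part of the measurement program above) relates emissivity directly to radiant flux:

```python
# Sketch: gray-body radiant flux q = eps * sigma * T^4. Illustrates the
# effect of the graphite-coated Hastelloy X emissivity gain reported above
# (0.25 -> 0.6 at 1473 K): radiant heat rejection scales linearly with eps.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_flux(emissivity, temp_k):
    """Hemispherical radiant flux from a gray surface, in W/m^2."""
    return emissivity * SIGMA * temp_k**4

# At 1473 K the coated surface radiates 0.6/0.25 = 2.4x the heat flux.
gain = radiant_flux(0.60, 1473.0) / radiant_flux(0.25, 1473.0)
```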

  10. Emissivity of Candidate Materials for VHTR Applications: Role of Oxidation and Surface Modification Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sridharan, Kumar; Allen, Todd; Anderson, Mark

    The Generation IV (GEN IV) Nuclear Energy Systems Initiative was instituted by the Department of Energy (DOE) with the goal of researching and developing technologies and materials necessary for various types of future reactors. These GEN IV reactors will employ advanced fuel cycles, passive safety systems, and other innovative systems, leading to significant differences between these future reactors and current water-cooled reactors. The leading candidate for the Next Generation Nuclear Plant (NGNP) to be built at Idaho National Lab (INL) in the United States is the Very High Temperature Reactor (VHTR). Due to the high operating temperatures of the VHTR, the Reactor Pressure Vessel (RPV) will partially rely on heat transfer by radiation for cooling. Heat expulsion by radiation will become all the more important during high-temperature excursions in off-normal accident scenarios. Radiant power is dictated by emissivity, a material property. The NGNP Materials Research and Development Program Plan [1] has identified emissivity and the effects of high-temperature oxide formation on emissivity as an area of research towards the development of the VHTR.

  11. FY2012 summary of tasks completed on PROTEUS-thermal work.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.H.; Smith, M.A.

    2012-06-06

    PROTEUS is a suite of neutronics codes, both old and new, that can be used within the SHARP codes being developed under the NEAMS program. Discussion here is focused on updates and verification and validation activities of the SHARP neutronics code, DeCART, for application to thermal reactor analysis. As part of the development of SHARP tools, the different versions of the DeCART code created for PWR, BWR, and VHTR analysis were integrated. Verification and validation tests for the integrated version were started, and the generation of cross section libraries based on the subgroup method was revisited for the targeted reactor types. The DeCART code has been reorganized in preparation for an efficient integration of the different versions for PWR, BWR, and VHTR analysis. In DeCART, the old-fashioned common blocks and header files have been replaced by advanced memory structures. However, the changing of variable names was minimized in order to limit problems with the code integration. Since the remaining stability problems of DeCART were mostly caused by the CMFD methodology and modules, significant work was performed to determine whether they could be replaced by more stable methods and routines. The cross section library is a key element in obtaining accurate solutions. Thus, the procedure for generating cross section libraries was revisited to provide libraries tailored for the targeted reactor types. To improve accuracy in the cross section library, an attempt was made to replace the CENTRM code by the MCNP Monte Carlo code as a tool for obtaining reference resonance integrals. The use of the Monte Carlo code allows us to minimize problems or approximations that CENTRM introduces, since the accuracy of the subgroup data is limited by that of the reference solutions.
    The use of MCNP requires an additional set of libraries without resonance cross sections so that reference calculations can be performed for a unit cell in which only one isotope of interest includes resonance cross sections, among the isotopes in the composition. The OECD MHTGR-350 benchmark core was simulated using DeCART as the initial focus of the verification/validation efforts. Among the benchmark problems, Exercise 1 of Phase 1 is a steady-state benchmark case for the neutronics calculation, for which block-wise cross sections were provided in 26 energy groups. This type of problem was designed for a homogenized geometry solver like DIF3D rather than the high-fidelity code DeCART. Instead of the homogenized block cross sections given in the benchmark, the VHTR-specific 238-group ENDF/B-VII.0 library of DeCART was directly used for preliminary calculations. Initial results showed that the multiplication factors of a fuel pin and a fuel block with or without a control rod hole were off by 6, -362, and -183 pcm Δk from comparable MCNP solutions, respectively. The 2-D and 3-D one-third core calculations were also conducted for the all-rods-out (ARO) and all-rods-in (ARI) configurations, producing reasonable results. Figure 1 illustrates the intermediate (1.5 eV - 17 keV) and thermal (below 1.5 eV) group flux distributions. As seen in VHTR cores with annular fuels, the intermediate group fluxes are relatively high in the fuel region, but the thermal group fluxes are higher in the inner and outer graphite reflector regions than in the fuel region. To support the current project, a new three-year I-NERI collaboration involving ANL and KAERI was started in November 2011, focused on performing in-depth verification and validation of high-fidelity multi-physics simulation codes for LWR and VHTR.
    The work scope includes generating improved cross section libraries for the targeted reactor types, developing benchmark models for verification and validation of the neutronics code with or without thermo-fluid feedback, and performing detailed comparisons of predicted reactor parameters against both Monte Carlo solutions and experimental measurements. The following list summarizes the work conducted so far for PROTEUS-Thermal Tasks: (1) Unification of the different versions of DeCART was initiated, and code modernization was conducted at the same time to make the unification efficient; (2) Regeneration of cross section libraries was attempted for the targeted reactor types, and the procedure for generating cross section libraries was updated by replacing CENTRM with MCNP for reference resonance integrals; (3) The MHTGR-350 benchmark core was simulated using DeCART with the VHTR-specific 238-group ENDF/B-VII.0 library, and MCNP calculations were performed for comparison; and (4) Benchmark problems for PWR and BWR analysis were prepared for the DeCART verification/validation effort. In the coming months, the work listed above will be completed. Cross section libraries will be generated with optimized group structures for specific reactor types.
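
    The code-to-code differences quoted above are expressed in pcm (1 pcm = 1e-5 in the multiplication factor). A minimal sketch of that conversion, with made-up k-eff values rather than the benchmark results:

```python
# Sketch: expressing a multiplication-factor difference in pcm (1e-5 in k),
# read here as (k_code - k_ref) * 1e5. The k-eff values below are made-up
# placeholders, not the MHTGR-350 benchmark results.

def delta_k_pcm(k_code, k_ref):
    """Difference between two multiplication factors, in pcm."""
    return (k_code - k_ref) * 1.0e5

# Example: a DeCART-style result 0.00362 below its MCNP reference.
dk = delta_k_pcm(1.06438, 1.06800)  # about -362 pcm
```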

  12. Three-dimensional NDE of VHTR core components via simulation-based testing. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guzina, Bojan; Kunerth, Dennis

    2014-09-30

    A next generation, simulation-driven-and-enabled testing platform is developed for the 3D detection and characterization of defects and damage in nuclear graphite and composite structures in Very High Temperature Reactors (VHTRs). The proposed work addresses the critical need for the development of high-fidelity Non-Destructive Examination (NDE) technologies for as-manufactured and replaceable in-service VHTR components. Centered around the novel use of elastic (sonic and ultrasonic) waves, this project deploys a robust, non-iterative inverse solution for the 3D defect reconstruction together with a non-contact, laser-based approach to the measurement of experimental waveforms in VHTR core components. In particular, this research (1) deploys three-dimensional Scanning Laser Doppler Vibrometry (3D SLDV) as a means to accurately and remotely measure 3D displacement waveforms over the accessible surface of a VHTR core component excited by a mechanical vibratory source; (2) implements a powerful new inverse technique, based on the concept of Topological Sensitivity (TS), for non-iterative elastic waveform tomography of internal defects that permits robust 3D detection, reconstruction and characterization of discrete damage (e.g. holes and fractures) in nuclear graphite from limited-aperture NDE measurements; (3) implements a state-of-the-art computational (finite element) model that accurately simulates elastic wave propagation in 3D blocks of nuclear graphite; (4) integrates the SLDV testing methodology with the TS imaging algorithm into a non-contact, high-fidelity NDE platform for the 3D reconstruction and characterization of defects and damage in VHTR core components; and (5) applies the proposed methodology to VHTR core component samples (both two- and three-dimensional) with a priori induced, discrete damage in the form of holes and fractures.
    Overall, the newly established SLDV-TS testing platform represents a next-generation NDE tool that surpasses existing techniques for the 3D ultrasonic imaging of material damage from non-contact, limited-aperture waveform measurements. Outlook: the next stage in the development of this technology includes items such as (a) non-contact generation of mechanical vibrations in VHTR components via thermal expansion created by a high-intensity laser; (b) development and incorporation of the Synthetic Aperture Focusing Technique (SAFT) for elevating the accuracy of 3D imaging in highly noisy environments with minimal accessible surface; (c) further analytical and computational developments to facilitate the reconstruction of diffuse damage (e.g. microcracks) in nuclear graphite, as such damage leads to the dispersion of elastic waves; (d) the concept of model updating for accurate tracking of the evolution of material damage via periodic inspections; (e) adoption of the Bayesian framework to obtain information on the certainty of obtained images; and (f) optimization of the computational scheme toward real-time, model-based imaging of damage in VHTR core components.

  13. INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Javier Ortensi; Sonat Sen

    2013-09-01

    The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes.
    A final OECD/NEA comparison report will compare the Phase I and III results of all international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.

  14. The use of a very high temperature nuclear reactor in the manufacture of synthetic fuels

    NASA Technical Reports Server (NTRS)

    Farbman, G. H.; Brecher, L. E.

    1976-01-01

    The three parts of a program directed toward creating a cost-effective nuclear hydrogen production system are described. The discussion covers the development of a very high temperature nuclear reactor (VHTR) as a nuclear heat and power source capable of producing the high temperature needed for hydrogen production and other processes; the development of a hydrogen generation process based on water decomposition, which can utilize the outputs of the VHTR and be integrated with many different ultimate hydrogen consuming processes; and the evaluation of the process applications of the nuclear hydrogen systems to assess the merits and potential payoffs. It is shown that the use of the VHTR for the manufacture of synthetic fuels appears to have a very high probability of making a positive contribution to meeting the nation's energy needs in the future.

  15. Failure Predictions for VHTR Core Components using a Probabilistic Continuum Damage Mechanics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fok, Alex

    2013-10-30

    The proposed work addresses the key research need for the development of constitutive models and overall failure models for graphite and high temperature structural materials, with the long-term goal being to maximize the design life of the Next Generation Nuclear Plant (NGNP). To this end, the capability of a Continuum Damage Mechanics (CDM) model, which has been used successfully for modeling fracture of virgin graphite, will be extended as a predictive and design tool for the core components of the very high-temperature reactor (VHTR). Specifically, irradiation and environmental effects pertinent to the VHTR will be incorporated into the model to allow fracture of graphite and ceramic components under in-reactor conditions to be modeled explicitly using the finite element method. The model uses a combined stress-based and fracture mechanics-based failure criterion, so it can simulate both the initiation and propagation of cracks. Modern imaging techniques, such as x-ray computed tomography and digital image correlation, will be used during material testing to help define the baseline material damage parameters. Monte Carlo analysis will be performed to address inherent variations in material properties, the aim being to reduce the arbitrariness and uncertainties associated with the current statistical approach. The results can potentially contribute to the current development of American Society of Mechanical Engineers (ASME) codes for the design and construction of VHTR core components.
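
    The Monte Carlo treatment of property scatter described above can be illustrated with a toy stress-versus-strength reliability check; the normal-distribution assumption and all numbers below are illustrative, not graphite data:

```python
# Sketch: toy Monte Carlo failure probability from material-property scatter.
# Strength is sampled from an assumed normal distribution and compared with a
# fixed applied stress; parameters are illustrative placeholders only.
import random

def failure_probability(mean_strength, sd_strength, applied_stress,
                        n_samples=100_000, seed=42):
    """Fraction of sampled strengths that fall below the applied stress."""
    rng = random.Random(seed)
    failures = sum(rng.gauss(mean_strength, sd_strength) < applied_stress
                   for _ in range(n_samples))
    return failures / n_samples

# Applied stress 2 standard deviations below mean strength -> P_f near 2.3%.
p_f = failure_probability(mean_strength=25.0, sd_strength=2.5,
                          applied_stress=20.0)
```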

  16. Experimental investigation and CFD analysis on cross flow in the core of PMR200

    DOE PAGES

    Lee, Jeong-Hun; Yoon, Su-Jong; Cho, Hyoung-Kyu; ...

    2015-04-16

    The Prismatic Modular Reactor (PMR) is one of the major Very High Temperature Reactor (VHTR) concepts; it consists of hexagonal prismatic fuel blocks and reflector blocks made of nuclear-grade graphite. However, the shape of the graphite blocks can be changed by neutron damage during reactor operation, and the shape change can create gaps between the blocks, inducing bypass flow. In the VHTR core, two types of gaps can form: a vertical gap and a horizontal gap, called the bypass gap and the cross gap, respectively. The cross gap complicates the flow field in the reactor core by connecting the coolant channel to the bypass gap, and it can lead to a loss of effective coolant flow in the fuel blocks. Thus, a cross flow experimental facility was constructed to investigate cross flow phenomena in the core of the VHTR, and a series of experiments was carried out under varying flow rates and gap sizes. The results of the experiments were compared with CFD (Computational Fluid Dynamics) analysis results in order to verify the CFD prediction capability for cross flow phenomena. Fairly good agreement was seen between experimental results and CFD predictions, and the local characteristics of the cross flow were discussed in detail. Based on the calculation results, the pressure loss coefficient across the cross gap was evaluated, which is necessary for the thermo-fluid analysis of the VHTR core using a lumped parameter code.
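The closing step described above, evaluating a pressure loss coefficient across the cross gap, is conventionally done by normalizing the measured gap pressure drop by the dynamic pressure. A minimal sketch; the function name and all numerical values are illustrative assumptions, not values from the experiment:

```python
def loss_coefficient(dp_pa, rho_kg_m3, v_m_s):
    """Dimensionless pressure loss coefficient, K = dp / (0.5 * rho * v**2)."""
    return dp_pa / (0.5 * rho_kg_m3 * v_m_s**2)

# Illustrative values only (not from the experiment): 120 Pa drop,
# helium at ~5.36 kg/m^3, 2 m/s gap velocity.
K = loss_coefficient(dp_pa=120.0, rho_kg_m3=5.36, v_m_s=2.0)
print(round(K, 2))  # 11.19
```

A lumped parameter code would then apply such a K value to each cross-gap junction rather than resolving the local flow field.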

  17. Experimental and CFD Studies of Coolant Flow Mixing within Scaled Models of the Upper and Lower Plenums of NGNP Gas-Cooled Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, Yassin; Anand, Nk

    2016-03-30

    A 1/16th-scale VHTR experimental model was constructed and preliminary tests were performed in this study. To produce benchmark data for CFD validation in the future, the facility was first run at partial operation with five pipes being heated. PIV was performed to extract the velocity vector field for three adjacent naturally convective jets at statistically steady state. A small recirculation zone was found between the pipes, and the jets entered the merging zone at 3 cm from the pipe outlet but diverged as the flow approached the top of the test geometry. Turbulence analysis shows the turbulence intensity peaked at 41-45% as the jets mixed. A sensitivity analysis confirmed that 1000 frames were sufficient to measure statistically steady state. The results were then validated by extracting the flow rate from the PIV jet velocity profile and comparing it with an analytic flow rate and an ultrasonic flowmeter; for Tests 1 and 2, each flow rate lies within the uncertainty of the other two methods. This test facility can be used for further analysis of naturally convective mixing, and can eventually produce benchmark data for CFD validation of the VHTR during a PCC or DCC accident scenario. Next, a PTV study of 3000 images (1500 image pairs) was used to quantify the velocity field in the upper plenum. A sensitivity analysis confirmed that 1500 frames were sufficient to precisely estimate the flow. Subsequently, three Y-lines (3, 9, and 15 cm) from the pipe outlet were extracted to examine the output differences between 50 and 1500 frames. The average velocity field and the standard deviation error accrued in the three different tests were calculated to assess repeatability. The error varied from 1 to 14%, depending on Y-elevation, and decreased as the flow moved farther from the outlet pipe. In addition, turbulent intensity was calculated and found to be high near the outlet.
Reynolds stresses and turbulent intensity were used to validate the data by comparison with benchmark data; the experimental data showed the same pattern as the benchmark data. A turbulent single buoyant jet study was performed for the case of LOFC in the upper plenum of the scaled VHTR. Time-averaged profiles show that 3000 frames of images were sufficient for statistics up to second order. Self-similarity is an important feature of jets, since self-similar jet behavior is independent of Reynolds number and a sole function of geometry. Self-similarity profiles were well observed in the axial velocity and velocity magnitude regardless of z/D, whereas the radial velocity did not show any similarity pattern. The normal components of the Reynolds stresses have self-similarity within the expected range. The study shows that large vortices were observed close to the dome wall, indicating that the geometry of the VHTR has a significant impact on its safety and performance. Near the dome surface, large vortices were shown to inhibit the flows, resulting in reduced axial jet velocity; the vortices that develop subsequently reduce the Reynolds stresses and the impact on the integrity of the VHTR upper plenum surface. Multiple-jet configurations, including two, three, and five jets, were also investigated.

  18. Aqueous alteration of VHTR fuels particles under simulated geological conditions

    NASA Astrophysics Data System (ADS)

    Ait Chaou, Abdelouahed; Abdelouas, Abdesselam; Karakurt, Gökhan; Grambow, Bernd

    2014-05-01

    Very High Temperature Reactor (VHTR) fuels consist of bistructural-isotropic (BISO) or tristructural-isotropic (TRISO) coated particles embedded in a graphite matrix. Management of the spent fuel generated during VHTR operation would most likely be through deep geological disposal. In this framework we investigated the alteration of BISO (with pyrolytic carbon) and TRISO (with SiC) particles under geological conditions simulated by temperatures of 50 and 90 °C and in the presence of synthetic groundwater. Solid state analyses (scanning electron microscopy (SEM), micro-Raman spectroscopy, electron probe microanalysis (EPMA) and X-ray photoelectron spectroscopy (XPS)) and solution analyses (ICP-MS, ion chromatography (IC)) showed oxidation of both pyrolytic carbon and SiC at 90 °C. Under air this led to the formation of SiO2 and a clay-like Mg-silicate, while under reducing conditions (H2/N2 atmosphere) SiC and pyrolytic carbon were highly stable after a few months of alteration. At 50 °C, in the presence and absence of air, the alteration of the coatings was minor. In conclusion, given the high stability of the coatings under reducing conditions, disposal of HTR fuel in reducing deep geological environments may constitute a viable solution for its long-term management.

  19. Scaling and design analyses of a scaled-down, high-temperature test facility for experimental investigation of the initial stages of a VHTR air-ingress accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arcilesi, David J.; Ham, Tae Kyu; Kim, In Hun

    2015-07-01

    A critical event in the safety analysis of the very high-temperature gas-cooled reactor (VHTR) is an air-ingress accident. This accident is initiated, in its worst case scenario, by a double-ended guillotine break of the coaxial cross vessel, which leads to a rapid reactor vessel depressurization. In a VHTR, the reactor vessel is located within a reactor cavity that is filled with air during normal operating conditions. Following the vessel depressurization, the dominant mode of ingress of an air–helium mixture into the reactor vessel will either be molecular diffusion or density-driven stratified flow. The mode of ingress is hypothesized to depend largely on the break conditions of the cross vessel. Since the time scales of these two ingress phenomena differ by orders of magnitude, it is imperative to understand under which conditions each of these mechanisms will dominate in the air ingress process. Computer models have been developed to analyze this type of accident scenario. There are, however, limited experimental data available to understand the phenomenology of the air-ingress accident and to validate these models. Therefore, there is a need to design and construct a scaled-down experimental test facility to simulate the air-ingress accident scenarios and to collect experimental data. The current paper focuses on the analyses performed for the design and operation of a 1/8th geometric scale (by height and diameter), high-temperature test facility. A geometric scaling analysis for the VHTR, a time scale analysis of the air-ingress phenomenon, a transient depressurization analysis of the reactor vessel, a hydraulic similarity analysis of the test facility, a heat transfer characterization of the hot plenum, a power scaling analysis for the reactor system, and a design analysis of the containment vessel are discussed.
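The orders-of-magnitude separation between the two ingress time scales mentioned above can be sketched with standard estimates: a molecular-diffusion time of order L²/D versus a gravity-current (lock-exchange) time of order L/√(g′H). All numbers below are illustrative assumptions, not values from the paper:

```python
import math

def diffusion_time(length_m, diff_coeff_m2_s):
    """Characteristic molecular-diffusion time, tau ~ L**2 / D."""
    return length_m**2 / diff_coeff_m2_s

def gravity_current_time(length_m, g_prime_m_s2, height_m):
    """Characteristic density-driven (lock-exchange) time, tau ~ L / sqrt(g' * H)."""
    return length_m / math.sqrt(g_prime_m_s2 * height_m)

# Illustrative assumptions: 1 m ingress path, D ~ 1e-4 m^2/s for an
# air-helium pair, reduced gravity g' ~ 8 m/s^2, 0.5 m duct height.
t_diff = diffusion_time(1.0, 1e-4)             # ~1e4 s
t_strat = gravity_current_time(1.0, 8.0, 0.5)  # ~0.5 s
print(t_diff / t_strat)  # several orders of magnitude apart
```

Which regime dominates in practice depends on the break geometry, which is exactly why the time scale analysis in the paper is tied to the break conditions.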

  20. Master Curve and Conventional Fracture Toughness of Modified 9Cr-1Mo Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji-Hyun, Yoon; Sung-Ho, Kim; Bong-Sang, Lee

    2006-07-01

    Modified 9Cr-1Mo steel is a primary candidate material for the reactor pressure vessel of the Very High Temperature Gas-Cooled Reactor (VHTR) in the Korean Nuclear Hydrogen Development and Demonstration (NHDD) program. In this study, the T0 reference temperature, J-R fracture resistance and Charpy impact properties were evaluated for commercial Grade 91 steel as preliminary tests for the selection of the RPV material for the VHTR. The fracture toughness of the modified 9Cr-1Mo steel was compared with that of SA508-Gr.3. The objective of this study was to obtain pre-irradiation fracture toughness properties of modified 9Cr-1Mo steel as reference data for the radiation effects investigation. The results are as follows. Charpy impact properties of the modified 9Cr-1Mo steel were similar to those of SA508-Gr.3. T0 reference temperatures were measured as -67.7 °C and -72.4 °C from the tests with standard PCVN (pre-cracked Charpy V-notch) and half-sized PCVN specimens, respectively, which were similar to results for SA508-Gr.3. The KJc values of modified 9Cr-1Mo as a function of test temperature are successfully expressed with the Master Curve. The J-R fracture resistance of modified 9Cr-1Mo steel at room temperature was almost the same as that of SA508-Gr.3; on the other hand, it was slightly higher at an elevated temperature. (authors)

  1. Fracture toughness and the master curve for modified 9Cr-1Mo steel

    NASA Astrophysics Data System (ADS)

    Yoon, Ji-Hyun; Yoon, Eui-Pak

    2006-12-01

    Modified 9Cr-1Mo steel is a primary candidate material for the reactor pressure vessel of a Very High Temperature Gas-Cooled Reactor (VHTR) in the Korean Nuclear Hydrogen Development and Demonstration (NHDD) program. In this study, the T0 reference temperature, J-R fracture resistance and Charpy impact properties were evaluated for commercial Grade 91 steel as part of the preliminary testing for the selection of the RPV material for the VHTR. The fracture toughness of the modified 9Cr-1Mo steel was compared with that of SA508-Gr.3. The objective of this study was to obtain the pre-irradiation fracture toughness properties of the modified 9Cr-1Mo steel as reference data for an investigation of radiation effects. Charpy impact properties of the modified 9Cr-1Mo steel were similar to those of SA508-Gr.3. T0 reference temperatures were measured as -67.7 and -72.4°C from the tests with standard PCVN (pre-cracked Charpy V-notch) and half-sized PCVN specimens respectively, which were similar to the results for SA508-Gr.3. The KJc values of the modified 9Cr-1Mo steel as a function of test temperature are successfully expressed by the Master Curve. The J-R fracture resistance of the modified 9Cr-1Mo steel at room temperature was nearly identical to that of SA508-Gr.3; in contrast, it was slightly higher at an elevated temperature.
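For context, the Master Curve referred to in both of these records is standardized in ASTM E1921, where the median fracture toughness for 1T-size specimens is K_Jc(med) = 30 + 70·exp[0.019(T − T0)] MPa·√m, with temperatures in °C. A small sketch using the T0 value reported above:

```python
import math

def kjc_median(T_c, T0_c):
    """ASTM E1921 Master Curve median toughness (1T size), in MPa*sqrt(m).

    K_Jc(med) = 30 + 70 * exp(0.019 * (T - T0)), T and T0 in deg C.
    """
    return 30.0 + 70.0 * math.exp(0.019 * (T_c - T0_c))

# At T = T0 the median K_Jc is 100 MPa*sqrt(m) by construction.
print(kjc_median(-67.7, -67.7))  # 100.0
```

Expressing the measured K_Jc data this way means a single fitted parameter, T0, positions the entire toughness-versus-temperature curve.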

  2. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic uncertainty and sensitivity analysis approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of its commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  3. Nuclear driven water decomposition plant for hydrogen production

    NASA Technical Reports Server (NTRS)

    Parker, G. H.; Brecher, L. E.; Farbman, G. H.

    1976-01-01

    The conceptual design of a hydrogen production plant using a very-high-temperature nuclear reactor (VHTR) to energize a hybrid electrolytic-thermochemical system for water decomposition has been prepared. A graphite-moderated helium-cooled VHTR is used to produce 1850 F gas for electric power generation and 1600 F process heat for the water-decomposition process which uses sulfur compounds and promises performance superior to normal water electrolysis or other published thermochemical processes. The combined cycle operates at an overall thermal efficiency in excess of 45%, and the overall economics of hydrogen production by this plant have been evaluated predicated on a consistent set of economic ground rules. The conceptual design and evaluation efforts have indicated that development of this type of nuclear-driven water-decomposition plant will permit large-scale economic generation of hydrogen in the 1990s.

  4. Comparison between the Strength Levels of Baseline Nuclear-Grade Graphite and Graphite Irradiated in AGC-2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, Mark Christopher

    2015-07-01

    This report details the initial comparison of mechanical strength properties between the cylindrical nuclear-grade graphite specimens irradiated in the second Advanced Graphite Creep (AGC-2) experiment and the established baseline (unirradiated) mechanical properties compiled in the Baseline Graphite Characterization program. The overall comparative analysis will describe the development of an appropriate test protocol for irradiated specimens and the execution of the mechanical tests on the AGC-2 sample population, and will further discuss the data in terms of developing an accurate irradiated property distribution from the limited amount of irradiated data by leveraging the considerably larger property datasets being captured in the Baseline Graphite Characterization program. Integrating information on the inherent variability in nuclear-grade graphite with more complete datasets is one of the goals of the VHTR Graphite Materials program. Between “sister” specimens (specimens with the same geometry machined from the same sub-block of graphite from which the irradiated AGC specimens were extracted) and the Baseline datasets, a comprehensive body of data will exist that can provide both a direct and an indirect indication of the full irradiated property distributions that can be expected of irradiated nuclear-grade graphite in service in a VHTR system. While the most critical data will remain the actual irradiated property measurements, expansion of these data into accurate distributions based on the inherent variability in graphite properties will be a crucial step in qualifying graphite for nuclear use as a structural material in a VHTR environment.

  5. Experimental and numerical investigations of high temperature gas heat transfer and flow in a VHTR reactor core

    NASA Astrophysics Data System (ADS)

    Valentin Rodriguez, Francisco Ivan

    High pressure/high temperature forced and natural convection experiments have been conducted in support of the development of a Very High Temperature Reactor (VHTR) with a prismatic core. VHTRs are designed to withstand accidents without nuclear fuel meltdown by relying on passive safety mechanisms, a product of advanced reactor designs including the use of inert gases such as helium as coolants. The present experiments utilize a high temperature/high pressure gas flow test facility constructed for forced and natural circulation experiments. This work examines fundamental aspects of high temperature gas heat transfer applied to VHTR operational and accident scenarios. Two different types of experiments, forced convection and natural circulation, were conducted under high pressure and high temperature conditions using three different gases: air, nitrogen and helium. The experimental data were analyzed to obtain heat transfer coefficient data in the form of Nusselt numbers as a function of Reynolds, Grashof and Prandtl numbers. This work also examines the flow laminarization phenomenon (turbulent flows displaying much lower heat transfer parameters than expected due to intense heating conditions) in detail for a full range of Reynolds numbers, including laminar, transition and turbulent flows under forced convection, and its impact on heat transfer. This phenomenon could give rise to deterioration in convection heat transfer and the occurrence of hot spots in the reactor core. The forced and mixed convection data analyzed indicated the occurrence of the flow laminarization phenomenon due to the buoyancy and acceleration effects induced by strong heating. Turbulence parameters were also measured using a hot wire anemometer in forced convection experiments to confirm the existence of the flow laminarization phenomenon. In particular, these results demonstrated the influence of pressure on the delayed transition between laminar and turbulent flow.
The heat dissipating capabilities of helium flow due to natural circulation in the system, at both high and low pressure, were also examined. These experimental results are useful for the development and validation of VHTR design and safety analysis codes. Numerical simulations were performed using a multiphysics computer code, COMSOL, and agreed with the measured graphite temperatures in both the heated and cooled channels to within 5%. Finally, new correlations have been proposed describing the thermal-hydraulic phenomena in buoyancy-driven flows in both heated and cooled channels.
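As a sketch of how flow laminarization is typically detected in data like these, a measured Nusselt number can be compared against a standard turbulent forced-convection baseline such as the Dittus-Boelter correlation; a measured value far below that baseline signals heat-transfer deterioration. The 50% threshold and the numbers below are illustrative assumptions, not the author's criterion:

```python
def nu_dittus_boelter(re, pr):
    """Dittus-Boelter turbulent forced-convection correlation (heating):
    Nu = 0.023 * Re**0.8 * Pr**0.4."""
    return 0.023 * re**0.8 * pr**0.4

def laminarization_suspected(nu_measured, re, pr, threshold=0.5):
    """Flag possible heat-transfer deterioration: measured Nu well below
    the fully turbulent baseline (threshold fraction is an assumption)."""
    return nu_measured < threshold * nu_dittus_boelter(re, pr)

# Illustrative: a nominally turbulent run (Re = 10_000, Pr = 0.66 for helium)
baseline = nu_dittus_boelter(10_000, 0.66)      # ~31
print(laminarization_suspected(nu_measured=12.0, re=10_000, pr=0.66))  # True
```

In strongly heated gas flows, buoyancy and acceleration effects can suppress turbulence so the measured Nu drifts toward laminar values even at nominally turbulent Reynolds numbers, which is the deterioration mechanism the abstract describes.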

  6. HyPEP FY06 Report: Models and Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOE report

    2006-09-01

    The Department of Energy envisions the next generation very high-temperature gas-cooled reactor (VHTR) as a single-purpose or dual-purpose facility that produces hydrogen and electricity. The Ministry of Science and Technology (MOST) of the Republic of Korea also selected the VHTR for the Nuclear Hydrogen Development and Demonstration (NHDD) Project. This research project aims at developing a user-friendly program for evaluating and optimizing cycle efficiencies of producing hydrogen and electricity in a Very-High-Temperature Reactor (VHTR). Systems for producing electricity and hydrogen are complex, and the calculations associated with optimizing these systems are intensive, involving a large number of operating parameter variations and many different system configurations. This research project will produce the HyPEP computer model, which is specifically designed to be an easy-to-use and fast running tool for evaluating nuclear hydrogen and electricity production facilities. The model accommodates flexible system layouts, and its cost models will make HyPEP well suited for system optimization. Specific activities of this research are designed to develop the HyPEP model into a working tool, including (a) identifying major systems and components for modeling, (b) establishing system operating parameters and calculation scope, (c) establishing the overall calculation scheme, (d) developing component models, (e) developing cost and optimization models, and (f) verifying and validating the program. Once the HyPEP model is fully developed and validated, it will be used to execute calculations on candidate system configurations. The FY-06 report includes a description of the reference designs, the methods used in this study, and the models and computational strategies developed during the first-year effort.
Results from computer codes such as HYSYS and GASS/PASS-H, used by Idaho National Laboratory and Argonne National Laboratory respectively, will be benchmarked against HyPEP results in the following years.

  7. Optimizing Neutron Thermal Scattering Effects in very High Temperature Reactors. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawari, Ayman; Ougouag, Abderrafi

    2014-07-08

    This project aims to develop a holistic understanding of the phenomenon of neutron thermalization in the VHTR. Neutron thermalization is dependent on the type and structure of the moderating material. The fact that the moderator (and reflector) in the VHTR is a solid material will introduce new and interesting considerations that do not apply in other (e.g. light water) reactors. The moderator structure is expected to undergo radiation induced changes as the irradiation (or burnup) history progresses. In this case, the induced changes in structure will have a direct impact on many properties including the neutronic behavior. This can be easily anticipated if one recognizes the dependence of neutron thermalization on the scattering law of the moderator. For the pebble bed reactor, it is anticipated that the moderating behavior can be tailored, e.g. using moderators that consist of composite materials, which could allow improved optimization of the moderator-to-fuel ratio.

  8. Next generation fuel irradiation capability in the High Flux Reactor Petten

    NASA Astrophysics Data System (ADS)

    Fütterer, Michael A.; D'Agata, Elio; Laurie, Mathias; Marmier, Alain; Scaffidi-Argentina, Francesco; Raison, Philippe; Bakker, Klaas; de Groot, Sander; Klaassen, Frodo

    2009-07-01

    This paper describes selected equipment and expertise on fuel irradiation testing at the High Flux Reactor (HFR) in Petten, The Netherlands. The reactor went critical in 1961 and holds an operating license up to at least 2015. While the HFR initially focused on Light Water Reactor fuel and materials, it has also played a decisive role since the 1970s in the German High Temperature Reactor (HTR) development program and in a variety of tests related to fast reactor development in Europe. Tests were carried out for next generation fuel and materials, in particular for Very High Temperature Reactor (V/HTR) fuel, fuel for closed fuel cycles (the U-Pu and Th-U fuel cycles) and transmutation, as well as for other innovative fuel types. The HFR constitutes a significant European infrastructure tool for the development of next generation reactors. Experimental facilities addressed include V/HTR fuel tests, a coated particle irradiation rig, and tests on fast reactor, transmutation and thorium fuel. The rationales for these tests are given, results are provided and further work is outlined.

  9. ICP-MS measurement of iodine diffusion in IG-110 graphite for HTGR/VHTR

    NASA Astrophysics Data System (ADS)

    Carter, L. M.; Brockman, J. D.; Robertson, J. D.; Loyalka, S. K.

    2016-05-01

    Graphite functions as a structural material and as a barrier to fission product release in HTGR/VHTR designs, so elucidation of transport parameters for fission products in reactor-grade graphite is required for reactor source term calculations. We measured iodine diffusion in spheres of IG-110 graphite using a release method based on Fickian diffusion kinetics. Two sources of iodine were loaded into the graphite spheres: molecular iodine (I2) and cesium iodide (CsI). Measurements of the diffusion coefficient were made over a temperature range of 873-1293 K. We obtained the following Arrhenius expressions for iodine diffusion: D_I (CsI infused) = (6 × 10^-12 m^2/s) exp(-30,000 J/mol / RT) and D_I (I2 infused) = (4 × 10^-10 m^2/s) exp(-11,000 J/mol / RT). The results indicate that iodine diffusion in IG-110 graphite is not well described by Fickian diffusion kinetics. To our knowledge, these are the first measurements of iodine diffusion in IG-110 graphite.
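Assuming the standard Arrhenius form D = D0·exp(−Ea/RT) for the two expressions above (the sign of the first activation energy is garbled in the record and is taken as negative here, per the usual Arrhenius convention), the coefficients can be evaluated at a temperature inside the measured 873-1293 K range:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def d_arrhenius(d0_m2_s, ea_j_mol, T_k):
    """Arrhenius diffusion coefficient, D = D0 * exp(-Ea / (R*T))."""
    return d0_m2_s * math.exp(-ea_j_mol / (R * T_k))

# Pre-exponentials and activation energies as reconstructed from the abstract;
# the evaluation temperature (1100 K) is an arbitrary in-range choice.
d_csi = d_arrhenius(6e-12, 30_000, 1100)  # CsI-infused spheres
d_i2  = d_arrhenius(4e-10, 11_000, 1100)  # I2-infused spheres
print(d_i2 > d_csi)  # True: I2-infused iodine diffuses faster at 1100 K
```

The two-orders-of-magnitude gap between the fitted pre-exponentials is one reason the authors conclude a single Fickian description does not capture the transport.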

  10. In Situ Measurements of Spectral Emissivity of Materials for Very High Temperature Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. Cao; S. J. Weber; S. O. Martin

    2011-08-01

    An experimental facility for in situ measurements of the high-temperature spectral emissivity of materials in environments of interest to the gas-cooled very high temperature reactor (VHTR) has been developed. The facility can measure the emissivities of seven materials in a single experiment, thereby improving measurement accuracy in the presence of even minor systematic variations in temperature and environment. The system consists of a cylindrical silicon carbide (SiC) block with seven sample cavities and a deep blackbody cavity, a detailed optical system, and a Fourier transform infrared spectrometer. The reliability of the facility has been confirmed by comparing measured spectral emissivities of SiC, boron nitride, and alumina (Al2O3) at 600°C against those reported in the literature. The spectral emissivities of two candidate alloys for the VHTR, INCONEL® alloy 617 (INCONEL is a registered trademark of the Special Metals Corporation group of companies) and SA508 steel, were measured in air at 700°C.

  11. Creep of A508/533 Pressure Vessel Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Wright

    2014-08-01

    Evaluation of potential Reactor Pressure Vessel (RPV) steels has been carried out as part of the pre-conceptual Very High Temperature Reactor (VHTR) design studies. These design studies have generally focused on the American Society of Mechanical Engineers (ASME) Code status of the steels, temperature limits, and allowable stresses. Initially, three candidate materials were identified by this process: the conventional light water reactor (LWR) RPV steels A508 and A533, 2¼Cr-1Mo in the annealed condition, and Grade 91 steel. The low strength of 2¼Cr-1Mo at elevated temperature has eliminated this steel from serious consideration as the VHTR RPV candidate material. Discussions with the very few vendors that can potentially produce large forgings for nuclear pressure vessels indicate a strong preference for conventional LWR steels. This preference is based in part on extensive experience with forging these steels for nuclear components. It is also based on the inability to cast large ingots of Grade 91 steel, due to segregation during ingot solidification, which restricts the possible mass of forged components and increases the amount of welding required for completion of the RPV. Grade 91 steel is also prone to weld cracking and must be post-weld heat treated to ensure adequate high-temperature strength. There are also questions about the ability to produce, and, very importantly, to verify the through-thickness properties of thick sections of Grade 91 material. The availability of large components, ease of fabrication, and nuclear service experience with the A508 and A533 steels strongly favor their use in the RPV for the VHTR. Lowering the VHTR gas outlet temperature to 750°C, from the 950 to 1000°C proposed in early concept studies, further strengthens the justification for this material selection.
This steel is allowed in the ASME Boiler and Pressure Vessel Code for nuclear service up to 371°C (700°F); certain excursions above that temperature are allowed by Code Case N-499-2 (now incorporated as an appendix to Section III Division 5 of the Code). This Code Case was developed with a rather sparse data set and focused primarily on rolled plate material (A533 specification). Confirmatory tests of the creep behavior of both A508 and A533, designed to extend the database and build higher confidence in the structural integrity of the VHTR RPV during off-normal conditions, are described here. A number of creep-rupture tests were carried out at temperatures above the 371°C (700°F) Code limit; longer-term tests designed to evaluate minimum creep behavior are ongoing. A limited amount of rupture testing was also carried out on welded material. All of the rupture data from the current experiments are compared to historical values from the testing carried out to develop Code Case N-499-2. The A508/533 base metal tested here fits well with the rupture behavior reported from the historical testing; the presence of weldments significantly reduces the time to rupture. The primary purpose of this report is to summarize and record the experimental results in a single document.
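Creep-rupture data of the kind described here are often collapsed onto a single time-temperature axis using the Larson-Miller parameter, LMP = T(C + log10 t_r). This is offered as a common illustration of how rupture data from different temperatures are compared, not necessarily the correlation used in the report:

```python
import math

def larson_miller(T_k, t_rupture_h, C=20.0):
    """Larson-Miller parameter, LMP = T * (C + log10(t_r)).

    T_k in kelvin, t_rupture_h in hours; C = 20 is a common default for steels.
    """
    return T_k * (C + math.log10(t_rupture_h))

# Illustrative only: a 100 h rupture test at 550 C (823.15 K).
print(round(larson_miller(823.15, 100.0), 1))  # 18109.3
```

Equal LMP values identify test conditions of comparable creep severity, which is how short, hot laboratory tests are used to bound long, cooler service excursions.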

  12. Utilization of Minor Actinides as a Fuel Component for Ultra-Long Life Bhr Configurations: Designs, Advantages and Limitations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Pavel V. Tsvetkov

    2009-05-20

    This project assessed the advantages and limitations of using minor actinides as a fuel component to achieve ultra-long life Very High Temperature Reactor (VHTR) configurations. Researchers considered and compared the capabilities of pebble-bed and prismatic core designs with advanced actinide fuels to achieve ultra-long operation without refueling. Since both core designs permit flexibility in component configuration, fuel utilization, and fuel management, it is possible to improve the fissile properties of minor actinides by neutron spectrum shifting through configuration adjustments. The project studied advanced actinide fuels, which could reduce the long-term radiotoxicity and heat load of high-level waste sent to a geologic repository and enable recovery of the energy contained in spent fuel. The ultra-long core life autonomous approach may reduce the technical need for additional repositories and could improve the marketability of the Generation IV VHTR by allowing worldwide deployment, including in remote regions and regions with limited industrial resources. Utilization of minor actinides in nuclear reactors also facilitates the development of new fuel cycles toward sustainable nuclear energy scenarios.

  13. Effect of Reacting Surface Density on the Overall Graphite Oxidation Rate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang H. Oh; Eung Kim; Jong Lim

    2009-05-01

    Graphite oxidation in an air-ingress accident is presently a very important safety issue for the very high temperature gas-cooled reactor (VHTR), the concept selected for the next generation nuclear plant (NGNP), because mechanical degradation of the supporting graphite in the lower plenum of the VHTR could lead to core collapse unless countermeasures are carefully taken. The oxidation of graphite is known to be affected by various factors, including temperature, pressure, oxygen concentration, type of graphite, graphite shape and size, and flow distribution. However, our recent study reveals that the internal pore characteristics play a very important role in the overall graphite oxidation rate. One of the main issues regarding graphite oxidation is the potential core collapse that may follow the degradation of graphite mechanical strength. In analyzing this phenomenon, it is very important to understand the relationship between the degree of oxidation and strength degradation. In addition, the change of the oxidation rate with the degree of oxidation, characterized by burn-off (the ratio of the oxidized graphite density to the original density), should be quantified, because graphite strength degradation follows the decrease in graphite density, which strongly affects oxidation rates and patterns. Because the density change is proportional to the internal pore surface area, these quantities should be measured in advance. To address these issues, the following experiments were performed: (1) fracture tests on oxidized graphite and validation of the previous correlations, (2) measurements of the change of oxidation rate with graphite density and associated data collection, and (3) measurement of the BET surface area of the graphite. The experiments were performed using H451 (Great Lakes Carbon Corporation) and IG-110 (Toyo Tanso Co., Ltd.) graphite.
    These graphite grades were chosen because their chemical and mechanical characteristics are well documented in previous investigations, so the published data were readily accessible for applying and validating our new methodologies. This paper presents preliminary results of compressive strength vs. burn-off and surface area density vs. burn-off, which can be used for nuclear graphite selection for the NGNP.
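    The burn-off measure and the temperature dependence of the oxidation rate described above can be sketched in a few lines. This is a minimal illustration: the pre-exponential factor and activation energy are assumed round numbers, not fitted values for H451 or IG-110, and burn-off is taken in its usual sense as the fractional density (mass) loss.

```python
import math

R_GAS = 8.314  # J/(mol*K), universal gas constant

def burn_off(rho_oxidized, rho_original):
    """Burn-off as the fractional density (mass) loss of the graphite."""
    return (rho_original - rho_oxidized) / rho_original

def arrhenius_rate(temp_k, pre_exp, activation_energy):
    """Kinetic-regime oxidation rate, k = A * exp(-Ea / (R T))."""
    return pre_exp * math.exp(-activation_energy / (R_GAS * temp_k))

# Illustrative numbers only (not measured values for H451 or IG-110):
b = burn_off(rho_oxidized=1.44, rho_original=1.80)
print(f"burn-off = {b:.2f}")  # 20% density loss
k_low = arrhenius_rate(873.0, pre_exp=1.0e6, activation_energy=2.0e5)
k_high = arrhenius_rate(973.0, pre_exp=1.0e6, activation_energy=2.0e5)
print(k_high > k_low)  # oxidation accelerates with temperature
```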

  14. Very High-Temperature Reactor (VHTR) Proliferation Resistance and Physical Protection (PR&PP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moses, David Lewis

    2011-10-01

    This report documents the detailed background information compiled to support the preparation of a much shorter white paper on the design features and fuel cycles of Very High-Temperature Reactors (VHTRs), including the proposed Next-Generation Nuclear Plant (NGNP), to identify the important proliferation resistance and physical protection (PR&PP) aspects of the proposed concepts. The shorter white paper derived from this report was prepared for the Department of Energy Office of Nuclear Science and Technology for the Generation IV International Forum (GIF) VHTR Systems Steering Committee (SSC) as input to the GIF Proliferation Resistance and Physical Protection Working Group (PR&PPWG) (http://www.gen-4.org/Technology/horizontal/proliferation.htm). The short white paper was edited by the GIF VHTR SSC to address their concerns and thus may differ from the information presented in this supporting report. The GIF PR&PPWG will use the derived white paper, along with white papers on the six alternative Generation IV design concepts (http://www.gen-4.org/Technology/systems/index.htm), to employ an evaluation methodology that can be applied from the earliest stages of design and will evolve with it. This methodology will guide system designers, program policy makers, and external stakeholders in evaluating the response of each system, determining each system's resistance to proliferation threats and robustness against sabotage and terrorism threats, and thereby guide future international cooperation on ensuring safeguards in the deployment of the Generation IV systems. The format and content of this report are those specified in a template prepared by the GIF PR&PPWG.
    Other than the level of detail, the key exception to the specified template format is the addition of Appendix C, which documents the history and status of coated-particle fuel reprocessing technologies; these technologies have yet to be deployed commercially and have only been demonstrated at laboratory scale.

  15. A Distributed Fiber Optic Sensor Network for Online 3-D Temperature and Neutron Fluence Mapping in a VHTR Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsvetkov, Pavel; Dickerson, Bryan; French, Joseph

    2014-04-30

    Robust sensing technologies that allow real-time 3-D in-core performance monitoring are of paramount importance for established LWRs, enhancing their reliability and annual availability and thereby further supporting their economic competitiveness through predictive assessment of in-core conditions.

  16. Low Cycle Fatigue and Creep-Fatigue Behavior of Alloy 617 at High Temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabet, Celine; Carroll, Laura; Wright, Richard

    Alloy 617 is the leading candidate material for the intermediate heat exchanger (IHX) of the Very High Temperature Nuclear Reactor (VHTR), expected to have an outlet temperature as high as 950 degrees C. Acceptance of Alloy 617 into Section III of the ASME Code for nuclear construction requires a detailed understanding of its creep-fatigue behavior. Initial creep-fatigue work on Alloy 617 suggests an increasingly dominant role of the environment with increasing temperature and/or hold time, evidenced through changes in the creep-fatigue crack growth mechanism(s) and failure life. Continuous cycle fatigue and creep-fatigue testing of Alloy 617 was conducted at 950 degrees C and 0.3% and 0.6% total strain in air to simulate damage modes expected in a VHTR application. Continuous cycle specimens exhibited transgranular cracking. Intergranular cracking was observed in the creep-fatigue specimens, although evidence of grain boundary cavitation was not. Despite the absence of grain boundary cavitation to accelerate crack propagation, the addition of a hold time at peak tensile strain was detrimental to cycle life. This suggests that creep-fatigue interaction may occur by a different mechanism or that the environment may be partially responsible for accelerating failure.
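    The loading described above, strain-controlled cycling with a hold at peak tensile strain, can be sketched as a trapezoidal waveform. The strain rate and hold time below are illustrative assumptions; the abstract specifies only the total strain ranges.

```python
def strain_cycle(total_strain, strain_rate, hold_s):
    """Key points (time, strain) of one fully reversed (R = -1, assumed)
    strain-controlled cycle with a hold at peak tensile strain."""
    amp = total_strain / 2.0   # strain amplitude
    ramp = amp / strain_rate   # time to ramp through one half-amplitude
    times = [0.0, ramp, ramp + hold_s, 3 * ramp + hold_s, 4 * ramp + hold_s]
    strains = [0.0, amp, amp, -amp, 0.0]
    return times, strains

# 0.6% total strain at 1e-3 1/s with a 180 s tensile hold (assumed values):
t, eps = strain_cycle(0.006, 1.0e-3, 180.0)
print(max(eps), min(eps))  # peak strains of +/- 0.3%
print(t[2] - t[1])         # time held at peak tensile strain: 180.0 s
```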

  17. CFD Analyses of Air-Ingress Accident for VHTRs

    NASA Astrophysics Data System (ADS)

    Ham, Tae Kyu

    The Very High Temperature Reactor (VHTR) is one of six proposed Generation-IV concepts for the next generation of nuclear power plants. The VHTR is advantageous because it operates at very high temperatures, enabling highly efficient electricity generation and hydrogen production. A critical safety event for the VHTR is a loss-of-coolant accident. This accident is initiated, in its worst-case scenario, by a double-ended guillotine break of the cross vessel that connects the reactor vessel and the power conversion unit. Following depressurization, the air (i.e., the air and helium mixture) in the reactor cavity could enter the reactor core, causing an air-ingress event. In the event of air ingress into the reactor core, the high-temperature in-core graphite structures will react chemically with the air and could lose their structural integrity. We designed a 1/8th scaled-down test facility to develop an experimental database for studying the mechanisms involved in the air-ingress phenomenon. The current research focuses on analysis of the air-ingress phenomenon using the computational fluid dynamics (CFD) tool ANSYS FLUENT to better understand the phenomenon. The anticipated key steps in the air-ingress scenario for a guillotine break of the VHTR cross vessel are: (1) depressurization; (2) density-driven stratified flow; (3) local hot plenum natural circulation; (4) diffusion into the reactor core; and (5) global natural circulation. The OSU air-ingress test facility, however, covers the period from depressurization through local hot plenum natural circulation. Prior to beginning the CFD simulations for the OSU air-ingress test facility, benchmark studies of the mechanisms related to the air-ingress accident were performed to select appropriate physical models for the accident analysis.
    In addition, preliminary experiments were performed with a simplified 1/30th scaled-down acrylic set-up to understand the air-ingress mechanism and to support the CFD simulation in the analysis of the phenomenon. Previous air-ingress studies simulated the depressurization process using simple assumptions or 1-D system code results. However, recent studies found flow oscillations near the end of the depressurization that could influence the next stage of the air-ingress accident. Therefore, CFD simulations were performed to examine the air-ingress mechanisms from depressurization through the initiation of local natural circulation. In addition to the double-ended guillotine break scenario, other scenarios can lead to an air-ingress event, such as a partial break of the cross vessel with various break locations, orientations, and shapes. These additional situations were also investigated. The simulation results for the OSU test facility showed that the helium coolant discharged from the reactor vessel during depressurization mixes with the air in the containment. This process lowers the density of the gas mixture in the containment and slows the density-driven air-ingress flow, because the density-driven flow is established by the difference in gas density between the reactor vessel and the containment. In addition, for simulations with various initial and boundary conditions, the total accumulated air in the containment collapsed to within a 10% standard deviation by (1) scaling with the density ratio and viscosity ratio of the gas species between the containment and the reactor vessel and (2) scaling with the ratio of the air mole fraction and gas temperature to the reference value. Replacing the gas mixture in the reactor cavity with a gas heavier than air slowed the air-ingress speed.
    Based on this understanding of the air-ingress phenomena for the GT-MHR air-ingress scenario, several measures to mitigate the air-ingress accident are proposed. The CFD results are used to plan the experimental strategy and apparatus installation so as to obtain the best results when conducting an experiment. The generated CFD solutions will be validated against the OSU air-ingress experimental results. (Abstract shortened by UMI.)
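    The density-difference argument above can be illustrated with a minimal ideal-gas sketch: diluting the cavity air with discharged helium lowers the cavity-side density and thus the driving head for the stratified ingress flow. The pressure, temperature, and mole fractions below are assumed round numbers, not conditions from the OSU facility.

```python
R_GAS = 8.314              # J/(mol*K)
M_HE, M_AIR = 4.0e-3, 28.97e-3   # molar masses, kg/mol

def mixture_density(x_air, pressure_pa, temp_k):
    """Ideal-gas density of an air/helium mixture with air mole fraction x_air."""
    m_mix = x_air * M_AIR + (1.0 - x_air) * M_HE
    return pressure_pa * m_mix / (R_GAS * temp_k)

# Helium discharged into the cavity dilutes the air and lowers the
# cavity-side density, weakening the density-driven ingress flow:
rho_pure_air = mixture_density(1.0, 101325.0, 300.0)
rho_diluted = mixture_density(0.6, 101325.0, 300.0)
print(rho_diluted < rho_pure_air)  # True
```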

  18. Methods Data Qualification Interim Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Sam Alessi; Tami Grimmett; Leng Vang

    The overall goal of the Next Generation Nuclear Plant (NGNP) Data Management and Analysis System (NDMAS) is to maintain data provenance for all NGNP data, including the Methods component. Multiple means are available to access data stored in NDMAS. A web portal environment allows users to access data, view the results of qualification tests, and view graphs and charts of various attributes of the data. NDMAS also has methods for managing the data output from VHTR simulation models and the data generated from experiments designed to verify and validate the simulation codes. These simulation models represent the outcome of mathematical representations of VHTR components and systems. The methods data management approaches described herein handle data arising from experiments, simulations, and external sources, for the main purpose of facilitating parameter estimation and model verification and validation (V&V). A model integration environment called ModelCenter is used to automate storing the data from simulation model runs in the NDMAS repository. This approach does not adversely change the way computational scientists conduct their work. The method is to be used mainly to store the results of model runs that need to be preserved for auditing purposes or for display on the NDMAS web portal. This interim report describes the current development of NDMAS for Methods data and discusses the data, and its qualification, that are currently part of NDMAS.

  19. Bypass flow computations on the LOFA transient in a VHTR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tung, Yu-Hsin; Johnson, Richard W.; Ferng, Yuh-Ming

    2014-01-01

    Bypass flow in the prismatic gas-cooled very high temperature reactor (VHTR) is not intentionally designed to occur, but is present in the gaps between graphite blocks. Previous studies of bypass flow in the core indicated that the cooling provided by flow in the bypass gaps had a significant effect on temperature and flow distributions under normal operating conditions. However, flow and heat transport in the core change significantly after a Loss of Flow Accident (LOFA). This study examines the effect and role of the bypass flow after a LOFA on the temperature and flow distributions and on the heat transported out of the core by natural convection of the coolant, for a 1/12 symmetric section of the active core composed of images and mirror images of two sub-region models. The two sub-region models, 9 x 1/12 and 15 x 1/12 symmetric sectors of the active core, are employed as the CFD flow models, using computational grid systems of 70.2 million and 117 million nodes, respectively. It is concluded that the effect of bypass flow is significant for the initial conditions and the beginning of the LOFA, but the bypass flow has little effect after a long period of time in the transient computation of natural circulation.

  20. Investigations of the Application of CFD to Flow Expected in the Lower Plenum of the Prismatic VHTR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard W.Johnson; Tara Gallaway; Donna P. Guillen

    2006-09-01

    The Generation IV (Gen IV) very high temperature reactor (VHTR) will be either a prismatic (block) or a pebble bed design. A prismatic VHTR reference design, based on the General Atomics Gas Turbine-Modular Helium Reactor (GT-MHR) [General Atomics, 1996], has been developed for preliminary analysis purposes [MacDonald, et al., 2003]. The numerical simulation studies reported herein are based on this reference design. In the lower plenum of the prismatic reference design, the flow is introduced by dozens of turbulent jets from the core above. The jet flow encounters rows of columns that support the core. The flow from the core must turn ninety degrees and flow toward the exit duct as it passes through the forest of support columns. Because of the radial variation of the power density in the core, the jets enter the lower plenum at various temperatures. This raises some concerns, including that local hot spots may occur in the lower plenum. These may have a deleterious effect on the materials present and may also impose a temperature variation on the flow entering the power conversion machinery, which could cause problems with its operation. In the past, systems analysis codes have been used to model flow in nuclear reactor systems. It is recognized, however, that such codes cannot model the local physics of the flow well enough to analyze local mixing and temperature variations. This has led to the determination that computational fluid dynamic (CFD) codes be used, which are generally regarded as capable of accurately simulating local flow physics. Accurate flow modeling involves determining the modeling strategies needed to obtain accurate analyses.
    These include determining the fineness of the grid needed, the required iterative convergence tolerance, the numerical discretization method to use, and the turbulence model and wall treatment to employ. It also involves validating the computer code and turbulence model against a series of separate and combined flow phenomena and selecting the data used for the validation. This report describes progress made in identifying proper modeling strategies for simulating the lower plenum flow under the task entitled “CFD software validation of jets in crossflow,” which was designed to investigate issues pertaining to the validation process. The flow phenomenon chosen for investigation is flow in a staggered tube bank, because preliminary simulations show it to be the location of the highest turbulence intensity in the lower plenum. Numerical simulations were previously obtained assuming that the flow is steady. Various turbulence models were employed, along with strategies to reduce numerical error, to allow appropriate comparisons of the results. It was determined that the sophisticated Reynolds stress model (RSM) provided the best results. It was later determined that the flow is unsteady: circulating eddies grow behind each tube and ‘peel off’ alternately from its top and bottom. Additional calculations show that the mean velocity is well predicted when the flow is modeled as unsteady. The results for U are clearly superior for the unsteady computations; the unsteady computations for the turbulence stress are similar to those for the steady calculations, showing the same trends.

  1. High temperature corrosion of a nickel base alloy by helium impurities

    NASA Astrophysics Data System (ADS)

    Rouillard, F.; Cabet, C.; Wolski, K.; Terlain, A.; Tabarant, M.; Pijolat, M.; Valdivieso, F.

    2007-05-01

    The high temperature corrosion properties of Haynes 230 were investigated in a purpose-designed facility under a typical very high temperature reactor (VHTR) impure helium medium. The study focused on the formation of the surface oxide scale and its stability at about 1223 K. The alloy developed a Mn/Cr-rich oxide layer on its surface under impure helium at 1173 K. Nevertheless, a deleterious reaction that destroys the chromium oxide was evidenced above a critical temperature, TA. The reagents and products of this reaction were investigated.

  2. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady-state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport method, with the capability to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  3. Experimental study of forced convection heat transfer during upward and downward flow of helium at high pressure and high temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Francisco Valentin; Narbeh Artoun; Masahiro Kawaji

    2015-08-01

    Fundamental high-pressure/high-temperature forced convection experiments have been conducted in support of the development of a Very High Temperature Reactor (VHTR) with a prismatic core. The experiments utilize a high-temperature/high-pressure gas flow test facility constructed for forced convection and natural circulation experiments. The test section has a single 16.8 mm ID flow channel in a 2.7 m long, 108 mm OD graphite column with four 2.3 kW electric heater rods placed symmetrically around the flow channel. This experimental study presents the role of buoyancy forces in enhancing or reducing convection heat transfer for helium at pressures up to 70 bar and temperatures up to 873 K. Wall temperatures have been compared among 10 cases covering inlet Reynolds numbers ranging from 500 to 3,000. Downward flows display higher and lower wall temperatures in the upstream and downstream regions, respectively, than the upward flow cases due to the influence of buoyancy forces. In the entrance region, convection heat transfer is reduced due to buoyancy, leading to higher wall temperatures, while in the downstream region buoyancy-induced mixing causes higher convection heat transfer and lower wall temperatures. However, these influences diminish as the Reynolds number increases. This experimental study is of specific interest to VHTR design and the validation of safety analysis codes.
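    For orientation, the quoted inlet Reynolds numbers can be estimated from the stated channel geometry with an ideal-gas helium density. The velocity and viscosity below are assumed order-of-magnitude values, not measured conditions from the facility.

```python
def reynolds(rho, velocity, diameter, viscosity):
    """Pipe-flow Reynolds number, Re = rho * u * D / mu."""
    return rho * velocity * diameter / viscosity

# Rough helium properties at ~70 bar, ~873 K (illustrative assumptions):
rho = 7.0e6 * 4.0e-3 / (8.314 * 873.0)  # ideal-gas density, kg/m^3 (~3.9)
mu = 4.0e-5                             # Pa*s, assumed order of magnitude
d = 16.8e-3                             # m, test-section channel ID
re = reynolds(rho, velocity=0.5, diameter=d, viscosity=mu)
print(round(re))  # roughly 800, within the tested Re = 500-3,000 range
```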

  4. Parametric Study on the Tensile Properties of Ni-Based Alloy for a VHTR

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Jin; Jung, Su Jin; Mun, Byung Hak; Kim, Sung Woo; Lim, Yun Soo

    2015-01-01

    A very high-temperature reactor (VHTR) has been studied among generation IV nuclear power plants owing to its many advantages, such as high electric efficiency and massive hydrogen production. The material used for the heat exchanger should sustain structural integrity throughout its life even though it is exposed to a harsh environment at 1223 K (950 °C) in an impure helium coolant. Therefore, an enhancement of the material performance at high temperature gives a margin in determining the operating temperature and lifetime. This work is an effort to find an optimum combination of alloying elements and processing parameters to improve the material performance. The tensile properties and microstructure of nickel-based alloys fabricated in a laboratory were evaluated as a function of heat treatment, cold working, and grain boundary strengthener using a tension test at 1223 K (950 °C), scanning electron microscopy, and transmission electron microscopy. Elongation to rupture was increased by additional heat treatment, and by cold working followed by additional heat treatment, in the temperature range from 1293 K to 1383 K (1020 °C to 1110 °C), implying that intergranular carbide contributes to grain boundary strengthening. The temperature at which the grain boundary is strengthened by carbide decoration was higher for a cold-worked specimen, which is explained by the difference in carbide stability and carbide formation kinetics between non-cold-worked and cold-worked specimens. Zr and Hf scavenged harmful elements, causing an increase in ductility.

  5. NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cynthia D. Gentillon

    2011-09-01

    Projects for the Very High Temperature Reactor (VHTR) Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. The NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory has been established to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities. The capabilities are described from the perspective of NDMAS users, starting with those who just view experimental data and analytical results on the INL NDMAS web portal. Web display and delivery capabilities are described in detail, and the current web pages that show Advanced Gas Reactor, Advanced Graphite Capsule, and High Temperature Materials test results are itemized. Capabilities available to NDMAS developers are more extensive and are described using a second series of examples. Much of the data analysis effort focuses on understanding how thermocouple measurements relate to simulated temperatures and other experimental parameters. Statistical control charts and correlation monitoring provide an ongoing assessment of instrument accuracy. Data analysis capabilities are virtually unlimited for those who use the NDMAS web data download capabilities and the analysis software of their choice. Overall, the NDMAS provides convenient data analysis and web delivery capabilities for studying a very large and rapidly increasing database of well-documented, pedigreed data.
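    The statistical control charts mentioned above can be sketched with simple Shewhart-style limits around a baseline. The baseline thermocouple readings and the 3-sigma rule below are illustrative assumptions, not NDMAS's actual monitoring algorithm.

```python
import statistics

def control_limits(samples, n_sigma=3.0):
    """Shewhart-style control limits (mean +/- n_sigma * s) for monitoring
    an instrument channel such as a thermocouple."""
    mean = statistics.mean(samples)
    s = statistics.stdev(samples)
    return mean - n_sigma * s, mean + n_sigma * s

def out_of_control(samples, new_reading):
    """Flag a reading outside the control limits of the baseline sample."""
    lo, hi = control_limits(samples)
    return not (lo <= new_reading <= hi)

# Hypothetical baseline readings (degrees C) from a stable channel:
baseline = [850.2, 849.8, 850.5, 850.1, 849.9, 850.3, 850.0, 849.7]
print(out_of_control(baseline, 850.2))  # False: within limits
print(out_of_control(baseline, 862.0))  # True: possible instrument drift
```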

  6. 3D thermal modeling of TRISO fuel coupled with neutronic simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Jianwei; Uddin, Rizwan

    2010-01-01

    The Very High Temperature Gas Reactor (VHTR) is widely considered one of the top candidates identified in the Next Generation Nuclear Plant (NGNP) Technology Roadmap under the U.S. Department of Energy's Generation IV program. The TRISO particle is a common element among different VHTR designs, and its performance is critical to the safety and reliability of the whole reactor. A TRISO particle experiences complex thermo-mechanical changes during reactor operation under high temperature and high burnup conditions. TRISO fuel performance analysis requires evaluating these changes on the micro scale. Since most of these changes are temperature dependent, 3D thermal modeling of TRISO fuel is a crucial step of the whole analysis package. In this paper, a 3D numerical thermal model was developed to calculate the temperature distribution inside a TRISO particle and a pebble under different scenarios. 3D simulation is required because pebbles and TRISO particles, being randomly packed together, are always subject to asymmetric thermal conditions. The numerical model was developed using the finite difference method and was benchmarked against 1D analytical results and results reported in the literature. Monte Carlo models were set up to calculate the radial power density profile. A complex convective boundary condition was applied on the pebble outer surface. Three reactors were simulated using this model to calculate temperature distributions under different power levels. Two asymmetric boundary conditions were applied to the pebble to test the 3D capabilities. A gas bubble was hypothesized inside the TRISO kernel, and a 3D simulation was also carried out for this scenario. Physically intuitive results were obtained and are reported in this paper.
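    The benchmarking step described above, checking a finite-difference solution against a 1D analytic result, can be sketched for the simplest case: steady radial conduction in a uniformly heated sphere, where the analytic profile is T(r) = Ts + q(R^2 - r^2)/(6k). The kernel radius, conductivity, power density, and surface temperature below are assumed illustrative values, not parameters from the paper.

```python
def solve_sphere_fd(n, radius, k, q, t_surf):
    """Steady 1-D radial conduction in a sphere with uniform volumetric
    heating q [W/m^3]: finite-volume discretization solved with the
    Thomas (tridiagonal) algorithm. Returns temperatures at nodes
    r_i = i * dr for i = 0..n, with T_n fixed at t_surf."""
    dr = radius / n
    rp = lambda i: (i + 0.5) * dr  # face radius r_{i+1/2}
    a = [0.0] * n; b = [0.0] * n; c = [0.0] * n; d = [0.0] * n
    # Center node: symmetry, no inner face; balance over the half-cell.
    b[0], c[0] = -rp(0) ** 2, rp(0) ** 2
    d[0] = -q * dr * rp(0) ** 3 / (3.0 * k)
    for i in range(1, n):
        a[i] = rp(i - 1) ** 2
        c[i] = rp(i) ** 2
        b[i] = -(a[i] + c[i])
        d[i] = -q * dr * (rp(i) ** 3 - rp(i - 1) ** 3) / (3.0 * k)
    d[n - 1] -= c[n - 1] * t_surf  # fold in the fixed surface temperature
    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    t = [0.0] * (n + 1)
    t[n] = t_surf
    t[n - 1] = d[n - 1] / b[n - 1]
    for i in range(n - 2, -1, -1):
        t[i] = (d[i] - c[i] * t[i + 1]) / b[i]
    return t

# Illustrative check against the analytic center temperature:
R_KERNEL, K, Q, TSURF = 2.5e-4, 3.5, 5.0e8, 1200.0  # assumed values
t = solve_sphere_fd(40, R_KERNEL, K, Q, TSURF)
t_center_exact = TSURF + Q * R_KERNEL ** 2 / (6.0 * K)
print(abs(t[0] - t_center_exact) < 1e-3)  # True: matches the 1-D analytic result
```

    For a uniform heat source the finite-volume face fluxes are exact for the parabolic profile, so the agreement is essentially to machine precision; with a nonuniform (e.g., Monte Carlo) power profile the scheme would converge at second order instead.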

  7. ICP-MS measurement of diffusion coefficients of Cs in NBG-18 graphite

    NASA Astrophysics Data System (ADS)

    Carter, L. M.; Brockman, J. D.; Robertson, J. D.; Loyalka, S. K.

    2015-11-01

    Graphite is used in the HTGR/VHTR as a moderator, and it also functions as a barrier to fission product release. Therefore, an elucidation of the transport of fission products in reactor-grade graphite is required. We have measured diffusion coefficients of Cs in graphite NBG-18 using the release method, wherein we infused spheres of NBG-18 with Cs and measured the release rates in the temperature range of 1090-1395 K. These appear to be the first reported values of Cs diffusion coefficients in NBG-18. The values are lower than those reported for other graphites in the literature.
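    The release method described above is commonly analyzed with an Arrhenius diffusion coefficient, D(T) = D0 exp(-Q/(R T)), together with a short-time (Booth-type) expression for fractional release from a sphere. The sketch below uses that standard framework with purely illustrative parameters; the actual fitted NBG-18 values are not reproduced here.

```python
import math

R_GAS = 8.314  # J/(mol*K)

def diffusion_coefficient(temp_k, d0, q_act):
    """Arrhenius form D(T) = D0 * exp(-Q / (R T)) typically fitted to
    release-method data."""
    return d0 * math.exp(-q_act / (R_GAS * temp_k))

def fractional_release(d, t, a):
    """Short-time fractional release from a sphere of radius a,
    F ~ (6 / sqrt(pi)) * sqrt(D t / a^2), valid for F << 1 (Booth model)."""
    return 6.0 / math.sqrt(math.pi) * math.sqrt(d * t / a ** 2)

# Illustrative parameters only (not the fitted NBG-18 values):
d_lo = diffusion_coefficient(1090.0, d0=1.0e-6, q_act=2.5e5)
d_hi = diffusion_coefficient(1395.0, d0=1.0e-6, q_act=2.5e5)
print(d_hi > d_lo)  # release is faster at the top of the 1090-1395 K range
```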

  8. Modeling Fission Product Sorption in Graphite Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szlufarska, Izabela; Morgan, Dane; Allen, Todd

    2013-04-08

    The goal of this project is to determine changes in adsorption and desorption of fission products to/from nuclear-grade graphite in response to a changing chemical environment. First, the project team will employ first-principles calculations and thermodynamic analysis to predict the stability of fission products on graphite in the presence of structural defects commonly observed in very high-temperature reactor (VHTR) graphites. Desorption rates will be determined as a function of partial pressure of oxygen and iodine, relative humidity, and temperature. The team will then carry out experimental characterization to determine the statistical distribution of structural features. This structural information will yield distributions of binding sites to be used as input for a sorption model. Sorption isotherms calculated under this project will contribute to understanding of the physical bases of the source terms used in higher-level codes that model fission product transport and retention in graphite. The project will include the following tasks: perform structural characterization of the VHTR graphite to determine crystallographic phases, defect structures and their distribution, volume fraction of coke, and the amount of sp2 versus sp3 bonding, with this information guiding the ab initio modeling and serving as input for the sorptivity models; perform ab initio calculations of binding energies to determine the stability of fission products on the different sorption sites present in nuclear graphite microstructures, using density functional theory (DFT) methods to calculate binding energies in vacuum and in oxidizing environments, as well as the stability of iodine complexes with fission products on graphite sorption sites; and model graphite sorption isotherms to quantify the concentration of fission products in graphite.
    The binding energies will be combined with a Langmuir isotherm statistical model to predict the sorbed concentration of fission products on each type of graphite site. The model will include multiple simultaneous adsorbing species, which will allow for competitive adsorption effects between different fission product species and O and OH (for modeling accident conditions).
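    The competitive Langmuir model described above has a simple closed form: for species i, the fractional coverage of a site type is theta_i = K_i p_i / (1 + sum_j K_j p_j), so raising one species' partial pressure displaces the others. The equilibrium constants and partial pressures below are hypothetical illustrations, not values from the project.

```python
def langmuir_coverage(k_eq, pressures):
    """Competitive Langmuir isotherm: fractional coverage of species i is
    theta_i = K_i * p_i / (1 + sum_j K_j * p_j).
    k_eq and pressures are dicts keyed by species name (consistent units)."""
    denom = 1.0 + sum(k_eq[s] * pressures[s] for s in pressures)
    return {s: k_eq[s] * pressures[s] / denom for s in pressures}

# Hypothetical site-specific equilibrium constants and partial pressures:
k_eq = {"Cs": 50.0, "I": 20.0, "O2": 5.0}
p = {"Cs": 1.0e-3, "I": 5.0e-4, "O2": 1.0e-2}
theta = langmuir_coverage(k_eq, p)
print(sum(theta.values()) < 1.0)  # True: total coverage cannot exceed 1
# Raising the O2 partial pressure displaces Cs from the shared sites:
p_ox = dict(p, O2=1.0e-1)
print(langmuir_coverage(k_eq, p_ox)["Cs"] < theta["Cs"])  # True
```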

  9. Using Directional Diffusion Coefficients for Nonlinear Diffusion Acceleration of the First Order SN Equations in Near-Void Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Hammer, Hans; Lou, Jijie

    2016-11-01

    The common definition of the diffusion coefficient as the inverse of three times the transport cross section is not compatible with voids. Morel introduced a non-local tensor diffusion coefficient that remains finite in voids [1]. It can be obtained by solving an auxiliary transport problem without scattering or fission. Larsen and Trahan successfully applied this diffusion coefficient to enhance the accuracy of diffusion solutions of very high temperature reactor (VHTR) problems that feature large, optically thin channels in the z-direction [2]. They demonstrated that a significant reduction of error can be achieved, in particular in the optically thin region. Along the same line of thought, non-local diffusion tensors have been applied to modeling the TREAT reactor, confirming the findings of Larsen and Trahan [3]. Previous work of the authors introduced a flexible Nonlinear Diffusion Acceleration (NDA) method for the first-order SN equations discretized with the discontinuous finite element method (DFEM) [4], [5], [6]. This NDA method uses a scalar diffusion coefficient in the low-order system that is obtained as the flux-weighted average of the inverse transport cross section. Hence, it suffers from very large and potentially unbounded diffusion coefficients in the low-order problem. However, it was noted that the choice of the diffusion coefficient does not influence the consistency of the method at convergence, and hence the diffusion coefficient is essentially a free parameter. The choice of the diffusion coefficient does, however, affect the convergence behavior of the nonlinear diffusion iterations. Within this work we use Morel’s non-local diffusion coefficient in the aforementioned NDA formulation in lieu of the flux-weighted inverse of three times the transport cross section. The goal of this paper is to demonstrate that significant enhancement of the spectral properties of NDA can be achieved in near-void regions.
    For testing the spectral properties of the NDA with non-local diffusion coefficients, the periodic horizontal interface problem is used [7]. This problem consists of alternating stripes of optically thin and thick materials, both of which feature scattering ratios close to unity.
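    The void incompatibility noted above is easy to see numerically: the classical coefficient D = 1/(3 Sigma_tr), and hence its flux-weighted average used in the low-order NDA system, grows without bound as the transport cross section vanishes. A minimal sketch, with arbitrary illustrative cross sections and fluxes:

```python
def classical_diffusion(sigma_tr):
    """Classical definition D = 1 / (3 * Sigma_tr); unbounded as Sigma_tr -> 0."""
    return 1.0 / (3.0 * sigma_tr)

def flux_weighted_diffusion(fluxes, sigma_trs):
    """Flux-weighted average of 1/(3 Sigma_tr) over a set of cells, as in the
    low-order NDA coefficient described in the text."""
    num = sum(phi * classical_diffusion(s) for phi, s in zip(fluxes, sigma_trs))
    return num / sum(fluxes)

# A single near-void cell (Sigma_tr = 1e-6) dominates the averaged coefficient:
d_dense = flux_weighted_diffusion([1.0, 1.0], [1.0, 1.0])
d_void = flux_weighted_diffusion([1.0, 1.0], [1.0, 1.0e-6])
print(d_dense)         # 1/3 in a uniform unit medium
print(d_void > 1.0e5)  # True: the near-void cell blows the coefficient up
```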

  10. Materials, Turbomachinery and Heat Exchangers for Supercritical CO2 Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark; Nellis, Greg; Corradini, Michael

    2012-10-19

    The objective of this project is to produce the necessary data to evaluate the performance of the supercritical carbon dioxide cycle. The activities include a study of materials compatibility of various alloys at high temperatures, the heat transfer and pressure drop in compact heat exchanger units, and turbomachinery issues, primarily leakage rates through dynamic seals. This experimental work will serve as a test bed for model development and design calculations, and will help define further tests necessary to develop high-efficiency power conversion cycles for use on a variety of reactor designs, including the sodium fast reactor (SFR) and very high-temperature gas reactor (VHTR). The research will be broken into three separate tasks. The first task deals with the analysis of materials related to the high-temperature S-CO{sub 2} Brayton cycle. The most taxing materials issues with regard to the cycle are associated with the high temperatures in the reactor side heat exchanger and in the high-temperature turbine. The system could experience pressures as high as 20 MPa and temperatures as high as 650°C. The second task deals with optimization of the heat exchangers required by the S-CO{sub 2} cycle; the S-CO{sub 2} flow passages in these heat exchangers are required whether the cycle is coupled with a VHTR or an SFR. At least three heat exchangers will be required: the pre-cooler before compression, the recuperator, and the heat exchanger that interfaces with the reactor coolant. Each of these heat exchangers is unique and must be optimized separately. The most challenging heat exchanger is likely the pre-cooler, as there is only about a 40°C temperature change but it operates close to the CO{sub 2} critical point, therefore inducing substantial changes in properties. The proposed research will focus on this most challenging component.
The third task examines seal leakage through various dynamic seal designs under the conditions expected in the S-CO{sub 2} cycle, including supercritical, choked, and two-phase flow conditions.
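The near-critical property variation is why a single log-mean temperature difference cannot size the pre-cooler. A minimal sketch of the segmented approach such designs use, with a purely hypothetical cp(T) curve standing in for real CO2 property data (a real analysis would pull properties from a library such as REFPROP or CoolProp):

```python
# Illustrative sketch: segmented duty calculation for a near-critical pre-cooler.
# cp_co2_mock is NOT real property data; it is a hypothetical curve that merely
# peaks near the CO2 critical temperature (~31 C) to mimic the real behavior.

def cp_co2_mock(t_c):
    """Hypothetical specific heat [kJ/(kg K)] peaking near 31 C."""
    return 1.2 + 8.0 / (1.0 + ((t_c - 31.0) / 3.0) ** 2)

def segmented_duty(t_in_c, t_out_c, m_dot_kg_s, n_seg=40):
    """Integrate the heat duty over small temperature segments so the sharp
    cp variation near the critical point is captured, instead of assuming
    one constant cp across the whole ~40 C cooldown."""
    dt = (t_in_c - t_out_c) / n_seg
    q, t = 0.0, t_in_c
    for _ in range(n_seg):
        t_mid = t - dt / 2.0
        q += m_dot_kg_s * cp_co2_mock(t_mid) * dt  # kW contributed by this segment
        t -= dt
    return q

q_total = segmented_duty(t_in_c=71.0, t_out_c=31.0, m_dot_kg_s=1.0)
print(f"total duty: {q_total:.1f} kW")
```

The per-segment duty varies severalfold across the cooldown even though each segment spans the same 1 °C, which is the design difficulty the abstract points to.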

  11. Advanced reactors and associated fuel cycle facilities: safety and environmental impacts.

    PubMed

    Hill, R N; Nutt, W M; Laidler, J J

    2011-01-01

    The safety and environmental impacts of new technology and fuel cycle approaches being considered in current U.S. nuclear research programs are contrasted to conventional technology options in this paper. Two advanced reactor technologies, the sodium-cooled fast reactor (SFR) and the very high temperature gas-cooled reactor (VHTR), are being developed. In general, the new reactor technologies exploit inherent features for enhanced safety performance. A key distinction of advanced fuel cycles is spent fuel recycle facilities and new waste forms. In this paper, the performance of existing fuel cycle facilities and applicable regulatory limits are reviewed. Technology options to improve recycle efficiency, restrict emissions, and/or improve safety are identified. For a closed fuel cycle, potential benefits in waste management are significant, and key waste form technology alternatives are described.

  12. AGC-2 Graphite Pre-irradiation Data Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Swank; Joseph Lord; David Rohrbaugh

    2010-08-01

    The NGNP Graphite R&D program is currently establishing the safe operating envelope of graphite core components for a Very High Temperature Reactor (VHTR) design. The program is generating quantitative data necessary for predicting the behavior and operating performance of the new nuclear graphite grades. To determine the in-service behavior of the graphite for pebble bed and prismatic designs, the Advanced Graphite Creep (AGC) experiment is underway. This experiment is examining the properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences and compressive loads. Each experiment consists of over 400 graphite specimens that are characterized prior to irradiation and following irradiation. Six experiments are planned, with the first, AGC-1, currently being irradiated in the Advanced Test Reactor (ATR) and pre-irradiation characterization of the second, AGC-2, completed. This data package establishes the readiness of 512 specimens for assembly into the AGC-2 capsule.

  13. Creep-Fatigue Behavior of Alloy 617 at 850 and 950°C, Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carroll, L.; Carroll, M.

    Alloy 617 is the leading candidate material for an Intermediate Heat Exchanger (IHX) of the Very High Temperature Reactor (VHTR). To evaluate the behavior of this material in the expected service conditions, strain-controlled cyclic tests including hold times up to 9000 s at maximum tensile strain were conducted at 850 and 950°C. At both temperatures, the fatigue resistance decreased when a hold time was added at peak tensile strain. The magnitude of this effect depended on the specific mechanisms and whether they resulted in a change in fracture mode from transgranular in pure fatigue to intergranular in creep-fatigue for a particular temperature and strain range combination. Increases in the tensile hold duration beyond an initial value were not detrimental to the creep-fatigue resistance at 950°C but did continue to degrade the lifetimes at 850°C.

  14. Investigation on the Core Bypass Flow in a Very High Temperature Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, Yassin

    2013-10-22

    Uncertainties associated with the core bypass flow are some of the key issues that directly influence the coolant mass flow distribution and magnitude, and thus the operational core temperature profiles, in the very high-temperature reactor (VHTR). Designers will attempt to configure the core geometry so the core cooling flow rate magnitude and distribution conform to the design values. The objective of this project is to study the bypass flow both experimentally and computationally. Researchers will develop experimental data using state-of-the-art particle image velocimetry in a small test facility. The team will attempt to obtain full field temperature distribution using racks of thermocouples. The experimental data are intended to benchmark computational fluid dynamics (CFD) codes by providing detailed information. These experimental data are urgently needed for validation of the CFD codes. The following are the project tasks: • Construct a small-scale bench-top experiment to resemble the bypass flow between the graphite blocks, varying parameters to address their impact on bypass flow. Wall roughness of the graphite block walls, spacing between the blocks, and temperature of the blocks are some of the parameters to be tested. • Perform CFD to evaluate pre- and post-test calculations and turbulence models, including sensitivity studies to achieve high accuracy. • Develop state-of-the-art large eddy simulation (LES) using appropriate subgrid modeling. • Develop models to be used in systems thermal hydraulics codes to account for and estimate the bypass flows. These computer programs include, among others, RELAP5-3D, MELCOR, GAMMA, and GAS-NET. Actual core bypass flow rate may vary considerably from the design value. Although the uncertainty of the bypass flow rate is not known, some sources have stated that the bypass flow rates in the Fort St. Vrain reactor were between 8 and 25 percent of the total reactor mass flow rate.
If bypass flow rates are on the high side, the quantity of cooling flow through the core may be considerably less than the nominal design value, causing some regions of the core to operate at temperatures in excess of the design values. These effects are postulated to lead to localized hot regions in the core that must be considered when evaluating the VHTR operational and accident scenarios.
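The sensitivity of core temperature rise to bypass fraction follows from a simple energy balance. A back-of-the-envelope sketch (power and flow numbers below are illustrative, not from any specific VHTR design):

```python
# Energy balance: temperature rise across the active core when a fraction of
# the total coolant flow bypasses it. Q = m_dot * cp * dT, so dT scales as
# 1/(1 - bypass_fraction). Power and flow values are hypothetical.

def core_outlet_rise(q_core_mw, m_dot_total_kg_s, bypass_fraction,
                     cp_he_kj_per_kg_k=5.193):
    """Coolant temperature rise [C] across the core; helium cp ~5.193 kJ/(kg K)."""
    m_dot_core = m_dot_total_kg_s * (1.0 - bypass_fraction)
    return (q_core_mw * 1e3) / (m_dot_core * cp_he_kj_per_kg_k)

nominal = core_outlet_rise(q_core_mw=600.0, m_dot_total_kg_s=320.0,
                           bypass_fraction=0.0)
high_bypass = core_outlet_rise(600.0, 320.0, bypass_fraction=0.25)
print(f"no bypass: {nominal:.0f} C rise; 25% bypass: {high_bypass:.0f} C rise")
```

At the 25 percent bypass reported for Fort St. Vrain, the active-core temperature rise is a third larger than the no-bypass value, which is the mechanism behind the localized hot regions discussed above.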

  15. Creep-Fatigue Damage Investigation and Modeling of Alloy 617 at High Temperatures

    NASA Astrophysics Data System (ADS)

    Tahir, Fraaz

    The Very High Temperature Reactor (VHTR) is one of six conceptual designs proposed for Generation IV nuclear reactors. Alloy 617, a solid-solution-strengthened Ni-base superalloy, is currently the primary candidate material for the tubing of the Intermediate Heat Exchanger (IHX) in the VHTR design. Steady-state operation of the nuclear power plant at elevated temperatures leads to creep deformation, whereas loading transients, including startup and shutdown, generate fatigue. A detailed understanding of the creep-fatigue interaction in Alloy 617 is necessary before it can be considered as a material for nuclear construction in the ASME Boiler and Pressure Vessel Code. Current design codes for components undergoing creep-fatigue interaction at elevated temperatures require creep-fatigue testing data covering the entire range from fatigue-dominant to creep-dominant loading. Classical strain-controlled tests, which produce stress relaxation during the hold period, show a saturation in cycle life with increasing hold periods due to the rapid stress relaxation of Alloy 617 at high temperatures. Therefore, applying longer hold times in these tests cannot generate creep-dominated failure. In this study, uniaxial isothermal creep-fatigue tests with non-traditional loading waveforms were designed and performed at 850 and 950°C, with the objective of generating test data in the creep-dominant regime. The new loading waveforms are hybrid strain-controlled and force-controlled tests which avoid stress relaxation during the creep hold. The experimental data showed varying proportions of creep and fatigue damage, and provided evidence for the inadequacy of the widely used time fraction rule for estimating creep damage under creep-fatigue conditions. Micro-scale damage features in failed test specimens, such as fatigue cracks and creep voids, were quantified using a Scanning Electron Microscope (SEM) to find a correlation between creep and fatigue damage.
Quantitative statistical imaging analysis showed that the microstructural damage features (cracks and voids) are correlated with a new mechanical driving force parameter. The results from this image-based damage analysis were used to develop a phenomenological life-prediction methodology called the effective time fraction approach. Finally, the constitutive creep-fatigue response of the material at 950°C was modeled using a unified viscoplastic model coupled with a damage accumulation model. The simulation results were used to validate an energy-based constitutive life-prediction model, as a mechanistic model for potential component and structure level creep-fatigue analysis.
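The time fraction rule whose adequacy the study questions is the familiar linear damage summation used in elevated-temperature design codes; stated schematically:

```latex
% Linear creep-fatigue damage summation (time/cycle fraction rule):
D_c + D_f \;=\; \sum_j \frac{t_j}{t_{r,j}} \;+\; \sum_k \frac{n_k}{N_{f,k}} \;\le\; D
% t_j: time spent at stress/temperature condition j, t_{r,j}: creep-rupture
% time at that condition; n_k: applied cycles of type k, N_{f,k}: pure-fatigue
% life; D: allowable total damage (a bilinear interaction envelope in
% elevated-temperature design practice, e.g. ASME Section III, Division 5).
```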

  16. Baseline Concept Description of a Small Modular High Temperature Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hans Gougar

    2014-05-01

    The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s, which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As the vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650°C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950°C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950°C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures.
The VHTR is the focus of R&D under the Generation IV program, and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power-density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

  17. Baseline Concept Description of a Small Modular High Temperature Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans D.

    2014-10-01

    The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s, which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As the vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650°C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950°C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950°C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures.
The VHTR is the focus of R&D under the Generation IV program, and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power-density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

  18. Analytical modeling of helium turbomachinery using FORTRAN 77

    NASA Astrophysics Data System (ADS)

    Balaji, Purushotham

    Advanced Generation IV modular reactors, including Very High Temperature Reactors (VHTRs), utilize helium as the working fluid, with a potential for high-efficiency power production using helium turbomachinery. Helium is chemically inert and nonradioactive, which makes the gas ideal for a nuclear power-plant environment where radioactive leaks are a high concern. These properties of helium gas help to increase the safety features as well as to slow the aging of plant components. The lack of sufficient helium turbomachinery data has made it difficult to study the vital role played by the gas turbine components of these VHTR-powered cycles. Therefore, this research work focuses on predicting the performance of helium compressors. A FORTRAN 77 program is developed to simulate helium compressor operation, including surge line prediction. The resulting design-point and off-design performance data can be used to develop compressor map files readable by the Numerical Propulsion System Simulation (NPSS) software. This multi-physics simulation software, developed for propulsion system analysis, has also found applications in simulating power-plant cycles.
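The core of such a compressor model is the polytropic compression relation for an ideal gas; helium, being monatomic, has γ = 5/3. A minimal sketch of the kind of calculation involved (Python here for brevity, where the abstract's program uses FORTRAN 77; efficiency and pressure-ratio values are illustrative):

```python
# Ideal-gas polytropic compression for helium: exit temperature and specific
# work for a given pressure ratio and polytropic efficiency. A real design
# code adds stage geometry, loss models, and surge-line prediction on top.

GAMMA_HE = 5.0 / 3.0   # monatomic ideal gas
CP_HE = 5.193e3        # J/(kg K)

def compressor_exit_temp(t_in_k, pressure_ratio, eta_poly):
    """Exit temperature from T2/T1 = PR^((gamma-1)/(gamma*eta_poly))."""
    exponent = (GAMMA_HE - 1.0) / (GAMMA_HE * eta_poly)
    return t_in_k * pressure_ratio ** exponent

def specific_work(t_in_k, pressure_ratio, eta_poly):
    """Specific compression work [J/kg] from the enthalpy rise."""
    t_out = compressor_exit_temp(t_in_k, pressure_ratio, eta_poly)
    return CP_HE * (t_out - t_in_k)

t2 = compressor_exit_temp(t_in_k=300.0, pressure_ratio=2.0, eta_poly=0.9)
w = specific_work(300.0, 2.0, 0.9)
print(f"exit temp: {t2:.1f} K, specific work: {w / 1e3:.1f} kJ/kg")
```

Sweeping pressure ratio and corrected mass flow through relations like these is what produces the map files an NPSS-style cycle solver reads.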

  19. Comparison of Standardized Test Scores from Traditional Classrooms and Those Using Problem-Based Learning

    ERIC Educational Resources Information Center

    Needham, Martha Elaine

    2010-01-01

    This research compares differences between standardized test scores in problem-based learning (PBL) classrooms and a traditional classroom for 6th grade students using a mixed-method, quasi-experimental and qualitative design. The research shows that problem-based learning is as effective as traditional teaching methods on standardized tests. The…

  20. A Natural Fit: Problem-based Learning and Technology Standards.

    ERIC Educational Resources Information Center

    Sage, Sara M.

    2000-01-01

    Discusses the use of problem-based learning to meet technology standards. Highlights include technology as a tool for locating and organizing information; the Wolf Wars problem for elementary and secondary school students that provides resources, including Web sites, for information; Web-based problems; and technology as assessment and as a…

  1. Standardized Definitions for Code Verification Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  2. Molecular Tagging Velocimetry Development for In-situ Measurement in High-Temperature Test Facility

    NASA Technical Reports Server (NTRS)

    Andre, Matthieu A.; Bardet, Philippe M.; Burns, Ross A.; Danehy, Paul M.

    2015-01-01

    The High Temperature Test Facility, HTTF, at Oregon State University (OSU) is an integral-effect test facility designed to model the behavior of a Very High Temperature Gas Reactor (VHTR) during a Depressurized Conduction Cooldown (DCC) event. It also has the ability to conduct limited investigations into the progression of a Pressurized Conduction Cooldown (PCC) event in addition to phenomena occurring during normal operations. Both of these phenomena will be studied with in-situ velocity field measurements. Experimental measurements of velocity are critical to provide proper boundary conditions to validate CFD codes, as well as developing correlations for system level codes, such as RELAP5 (http://www4vip.inl.gov/relap5/). Such data will be the first acquired in the HTTF and will introduce a diagnostic with numerous other applications to the field of nuclear thermal hydraulics. A laser-based optical diagnostic under development at The George Washington University (GWU) is presented; the technique is demonstrated with velocity data obtained in ambient temperature air, and adaptation to high-pressure, high-temperature flow is discussed.

  3. Fatigue and Creep Crack Propagation Behaviour of Alloy 617 in the Annealed and Aged Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Julian K. Benz; Richard N. Wright

    2013-10-01

    The crack propagation behaviour of Alloy 617 was studied under various conditions. Elevated temperature fatigue and creep-fatigue crack growth experiments were conducted at 650 and 800°C under constant stress intensity range (ΔK) conditions and triangular or trapezoidal waveforms at various frequencies on as-received, aged, and carburized material. Environmental conditions included both laboratory air and characteristic VHTR impure helium. As-received Alloy 617 displayed an increase in the crack growth rate (da/dN) as the frequency was decreased in air, which indicated a time-dependent contribution to fatigue crack propagation. Material aged at 650°C did not display any influence on the fatigue crack growth rates, nor on the increasing trend of crack growth rate with decreasing frequency, even though significant microstructural evolution, including γ′ (Ni3Al) after short times, occurred during aging. In contrast, carburized Alloy 617 showed an increase in crack growth rates at all frequencies tested compared to the material in the standard annealed condition. Crack growth studies under quasi-constant K (i.e., creep) conditions were also completed at 650°C and a stress intensity of K = 40 MPa√m. The results indicate that crack growth is primarily intergranular and that increased creep crack growth rates exist in the impure helium environment when compared to the results in laboratory air. Furthermore, the propagation rates (da/dt) continually increased for the duration of the creep crack growth, either due to material aging or evolution of a crack tip creep zone. Finally, fatigue crack propagation tests at 800°C on annealed Alloy 617 indicated that crack propagation rates were higher in air than in impure helium at the largest frequencies and lowest stress intensities.
The rates in helium, however, eventually surpass the rates in air as the frequency is reduced and the stress intensity is decreased, which was not observed at 650°C.
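The purely cyclic component of crack growth in tests like these is commonly fit with a Paris-type power law, with the time-dependent (creep and environmental) contributions added on top. A sketch with hypothetical coefficients (C and m below are placeholders, not fitted Alloy 617 values):

```python
# Paris-law sketch of the cyclic component of fatigue crack growth,
# da/dN = C * (dK)^m, with dK in MPa*sqrt(m) and da/dN in m/cycle.
# C and m are hypothetical placeholders for illustration only.

def paris_growth_per_cycle(delta_k_mpa_sqrt_m, c=1e-11, m=3.0):
    """Crack extension per cycle for a given stress-intensity range."""
    return c * delta_k_mpa_sqrt_m ** m

# Growth rate at a stress-intensity range of 40 MPa*sqrt(m), matching the
# level used in the creep crack growth tests above:
rate = paris_growth_per_cycle(40.0)
print(f"da/dN at dK = 40: {rate:.2e} m/cycle")
```

The frequency dependence the abstract reports is precisely the deviation of the measured da/dN from a frequency-independent fit of this form.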

  4. INL Experimental Program Roadmap for Thermal Hydraulic Code Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glenn McCreery; Hugh McIlroy

    2007-09-01

    Advanced computer modeling and simulation tools and protocols will be heavily relied on for a wide variety of system studies, engineering design activities, and other aspects of the Next Generation Nuclear Plant (NGNP) Very High Temperature Reactor (VHTR), the DOE Global Nuclear Energy Partnership (GNEP), and light-water reactors. The goal is for all modeling and simulation tools to be demonstrated accurate and reliable through a formal Verification and Validation (V&V) process, especially where such tools are to be used to establish safety margins and support regulatory compliance, or to design a system in a manner that reduces the role of expensive mockups and prototypes. Recent literature identifies specific experimental principles that must be followed in order to ensure that experimental data meet the standards required for a “benchmark” database. Even for well conducted experiments, missing experimental details, such as geometrical definition, data reduction procedures, and manufacturing tolerances, have led to poor benchmark calculations. The INL has a long and deep history of research in thermal hydraulics, especially in the 1960s through 1980s when many programs such as LOFT and Semiscale were devoted to light-water reactor safety research, the EBR-II fast reactor was in operation, and a strong geothermal energy program was established. The past can serve as a partial guide for reinvigorating thermal hydraulic research at the laboratory. However, new research programs need to fully incorporate modern experimental methods such as measurement techniques using the latest instrumentation, computerized data reduction, and scaling methodology. The path forward for establishing experimental research for code model validation will require benchmark experiments conducted in suitable facilities located at the INL.
This document describes thermal hydraulic facility requirements and candidate buildings and presents examples of suitable validation experiments related to VHTRs, sodium-cooled fast reactors, and light-water reactors. These experiments range from relatively low-cost benchtop experiments for investigating individual phenomena to large electrically-heated integral facilities for investigating reactor accidents and transients.

  5. Youth Top Problems: using idiographic, consumer-guided assessment to identify treatment needs and to track change during psychotherapy.

    PubMed

    Weisz, John R; Chorpita, Bruce F; Frye, Alice; Ng, Mei Yi; Lau, Nancy; Bearman, Sarah Kate; Ugueto, Ana M; Langer, David A; Hoagwood, Kimberly E

    2011-06-01

    To complement standardized measurement of symptoms, we developed and tested an efficient strategy for identifying (before treatment) and repeatedly assessing (during treatment) the problems identified as most important by caregivers and youths in psychotherapy. A total of 178 outpatient-referred youths, 7-13 years of age, and their caregivers separately identified the 3 problems of greatest concern to them at pretreatment and then rated the severity of those problems weekly during treatment. The Top Problems measure thus formed was evaluated for (a) whether it added to the information obtained through empirically derived standardized measures (e.g., the Child Behavior Checklist [CBCL; Achenbach & Rescorla, 2001] and the Youth Self-Report [YSR; Achenbach & Rescorla, 2001]) and (b) whether it met conventional psychometric standards. The problems identified were significant and clinically relevant; most matched CBCL/YSR items while adding specificity. The top problems also complemented the information yield of the CBCL/YSR; for example, for 41% of caregivers and 79% of youths, the identified top problems did not correspond to any items of any narrowband scales in the clinical range. Evidence on test-retest reliability, convergent and discriminant validity, sensitivity to change, slope reliability, and the association of Top Problems slopes with standardized measure slopes supported the psychometric strength of the measure. The Top Problems measure appears to be a psychometrically sound, client-guided approach that complements empirically derived standardized assessment; the approach can help focus attention and treatment planning on the problems that youths and caregivers consider most important and can generate evidence on trajectories of change in those problems during treatment.

  6. Proficiency Standards and Cut-Scores for Language Proficiency Tests.

    ERIC Educational Resources Information Center

    Moy, Raymond H.

    The problem of standard setting on language proficiency tests is often approached by the use of norms derived from the group being tested, a process commonly known as "grading on the curve." One particular problem with this ad hoc method of standard setting is that it will usually result in a fluctuating standard dependent on the particular group…

  7. Automated Hypothesis Tests and Standard Errors for Nonstandard Problems with Description of Computer Package: A Draft.

    ERIC Educational Resources Information Center

    Lord, Frederic M.; Stocking, Martha

    A general computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…

  8. When procedures discourage insight: epistemological consequences of prompting novice physics students to construct force diagrams

    NASA Astrophysics Data System (ADS)

    Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.

    2017-05-01

    One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to construct force diagrams. International Journal of Science Education, 32(14), 1829-1851] to test how cuing the first step in a standard framework affects undergraduate students' approaches and evaluation of solutions in physics problem solving. Specifically, prompting the construction of a standard diagram before problem solving increases the use of standard procedures, decreasing the use of a conceptual shortcut. Providing a diagram prompt also lowers students' ratings of informal approaches to similar problems. These results suggest that reminding students to follow typical problem-solving frameworks limits their views of what counts as good problem solving.

  9. Intensive motivational interviewing for women with concurrent alcohol problems and methamphetamine dependence.

    PubMed

    Korcha, Rachael A; Polcin, Douglas L; Evans, Kristy; Bond, Jason C; Galloway, Gantt P

    2014-02-01

    Motivational interviewing (MI) for the treatment of alcohol and drug problems is typically conducted over 1 to 3 sessions. The current work evaluates an intensive 9-session version of MI (Intensive MI) compared to a standard single MI session (Standard MI) using 163 methamphetamine (MA) dependent individuals. The primary purpose of this paper is to report the unexpected finding that women with co-occurring alcohol problems in the Intensive MI condition reduced the severity of their alcohol problems significantly more than women in the Standard MI condition at the 6-month follow-up. Stronger perceived alliance with the therapist was inversely associated with alcohol problem severity scores. Findings indicate that Intensive MI is a beneficial treatment for alcohol problems among women with MA dependence.

  10. Air Pollution over the States

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1972

    1972-01-01

    State plans for implementing air quality standards are evaluated together with problems in modeling procedures and enforcement. Monitoring networks, standards, air quality regions, and industrial problems are also discussed. (BL)

  11. SGML-Based Markup for Literary Texts: Two Problems and Some Solutions.

    ERIC Educational Resources Information Center

    Barnard, David; And Others

    1988-01-01

    Identifies the Standard Generalized Markup Language (SGML) as the best basis for a markup standard for encoding literary texts. Outlines solutions to problems using SGML and discusses the problem of maintaining multiple views of a document. Examines several ways of reducing the burden of markups. (GEA)

  12. Mission Mathematics: Linking Aerospace and the NCTM Standards, K-6.

    ERIC Educational Resources Information Center

    Hynes, Mary Ellen, Ed.

    This book is designed to present mathematical problems and tasks that focus on the National Council of Teachers of Mathematics (NCTM) curriculum and evaluation standards in the context of aerospace activities. It aims at actively engaging students in NCTM's four process standards: (1) problem solving; (2) mathematical reasoning; (3) communicating…

  13. Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations

    NASA Astrophysics Data System (ADS)

    Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans

    2017-01-01

    Nowadays, micromagnetic simulations are a common tool for studying a wide range of magnetic phenomena, including ferromagnetic resonance. A technique for evaluating the reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well-defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with the OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
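The analysis pipeline the abstract describes, excite the sample, record the ringdown of the averaged magnetization, and Fourier-transform it to obtain the resonance spectrum, can be sketched as follows. The sampling interval, decay time, and mode frequencies below are placeholders, not the values of the proposed standard problem:

```python
import numpy as np

# Assumed setup: after a small field pulse, the spatially averaged in-plane
# magnetization rings down; |FFT|^2 of that signal gives the FMR spectrum.
dt = 5e-12                          # sampling interval (s), placeholder
t = np.arange(4096) * dt
f1, f2 = 8.25e9, 11.0e9             # hypothetical resonance modes (Hz)
ringdown = (1.0 * np.exp(-t / 2e-9) * np.sin(2 * np.pi * f1 * t)
            + 0.4 * np.exp(-t / 2e-9) * np.sin(2 * np.pi * f2 * t))

spectrum = np.abs(np.fft.rfft(ringdown)) ** 2   # power spectrum
freqs = np.fft.rfftfreq(t.size, d=dt)           # frequency axis (Hz)
dominant_mode = freqs[np.argmax(spectrum)]      # strongest resonance
```

The frequency resolution is 1/(N·dt), so the simulated time window must be long enough to separate neighboring modes; this is one of the simulation parameters whose effect the paper reports.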

  14. Addressing Beyond Standard Model physics using cosmology

    NASA Astrophysics Data System (ADS)

    Ghalsasi, Akshay

    We have consensus models for both particle physics (the standard model) and cosmology (LambdaCDM). Given certain assumptions about the initial conditions of the universe, the marriage of the standard model (SM) of particle physics and LambdaCDM cosmology has been phenomenally successful in describing the universe we live in. However, it is quite clear that all is not well. The three biggest problems the SM faces today are baryogenesis, dark matter and dark energy. These problems, along with the problem of neutrino masses, indicate the existence of physics beyond the SM. Evidence for baryogenesis, dark matter and dark energy all comes from astrophysical and cosmological observations. Cosmology also provides the best (model-dependent) constraints on neutrino masses. In this thesis I will try to address the following problems: 1) the origin of dark energy (DE), using non-standard neutrino cosmology, and the effects of that non-standard neutrino cosmology on terrestrial and cosmological experiments; 2) the matter-antimatter asymmetry of the universe.

  15. Incorporating the Common Core's Problem Solving Standard for Mathematical Practice into an Early Elementary Inclusive Classroom

    ERIC Educational Resources Information Center

    Fletcher, Nicole

    2014-01-01

    Mathematics curriculum designers and policy decision makers are beginning to recognize the importance of problem solving, even at the earliest stages of mathematics learning. The Common Core includes sense making and perseverance in solving problems in its standards for mathematical practice for students at all grade levels. Incorporating problem…

  16. Promoting Access to Common Core Mathematics for Students with Severe Disabilities through Mathematical Problem Solving

    ERIC Educational Resources Information Center

    Spooner, Fred; Saunders, Alicia; Root, Jenny; Brosh, Chelsi

    2017-01-01

    There is a need to teach the pivotal skill of mathematical problem solving to students with severe disabilities, moving beyond basic skills like computation to higher level thinking skills. Problem solving is emphasized as a Standard for Mathematical Practice in the Common Core State Standards across grade levels. This article describes a…

  17. Analyzing Multilevel Data: An Empirical Comparison of Parameter Estimates of Hierarchical Linear Modeling and Ordinary Least Squares Regression

    ERIC Educational Resources Information Center

    Rocconi, Louis M.

    2011-01-01

    Hierarchical linear models (HLM) solve the problems associated with the unit of analysis, such as misestimated standard errors, heterogeneity of regression, and aggregation bias, by modeling all levels of interest simultaneously. Hierarchical linear modeling resolves the problem of misestimated standard errors by incorporating a unique random…
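The misestimated standard errors mentioned above can be demonstrated with a small simulation: when observations are clustered in groups, ordinary least squares treats them as independent and understates the uncertainty of group-level effects. The sketch below compares the naive OLS standard error with a cluster-robust (sandwich) estimate; all sample sizes and effect sizes are chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed design: 40 groups of 25 observations; the predictor x varies only
# at the group level, and observations share a random group effect u.
n_groups, group_size = 40, 25
g = np.repeat(np.arange(n_groups), group_size)
x = np.repeat(rng.normal(size=n_groups), group_size)
u = np.repeat(rng.normal(size=n_groups), group_size)
y = 1.0 + 0.5 * x + u + rng.normal(size=n_groups * group_size)

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)      # OLS slope estimate
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Naive OLS SE: treats all observations as independent.
naive_se = np.sqrt((resid @ resid / (len(y) - 2) * XtX_inv)[1, 1])

# Cluster-robust sandwich SE: sums score contributions within each group.
meat = np.zeros((2, 2))
for grp in range(n_groups):
    s = X[g == grp].T @ resid[g == grp]
    meat += np.outer(s, s)
robust_se = np.sqrt((XtX_inv @ meat @ XtX_inv)[1, 1])
```

With this design the robust standard error is several times the naive one, which is exactly the distortion HLM avoids by modeling the group level explicitly.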

  18. The Problem of Correspondence of Educational and Professional Standards (Results of Empirical Research)

    ERIC Educational Resources Information Center

    Piskunova, Elena; Sokolova, Irina; Kalimullin, Aydar

    2016-01-01

    In the article, the problem of correspondence of educational standards of higher pedagogical education and teacher professional standards in Russia is actualized. Modern understanding of the quality of vocational education suggests that in the process of education the student develops a set of competencies that will enable him or her to carry out…

  19. Status and analysis of test standard for on-board charger

    NASA Astrophysics Data System (ADS)

    Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan

    2018-05-01

    This paper analyzes the test standards for on-board chargers (OBCs). In the process of testing, we found that there exist some problems in test method and functional status, such as failure to follow up the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.

  20. An information geometric approach to least squares minimization

    NASA Astrophysics Data System (ADS)

    Transtrum, Mark; Machta, Benjamin; Sethna, James

    2009-03-01

    Parameter estimation by nonlinear least squares minimization is a ubiquitous problem that has an elegant geometric interpretation: all possible parameter values induce a manifold embedded within the space of data. The minimization problem is then to find the point on the manifold closest to the origin. The standard algorithm for minimizing sums of squares, the Levenberg-Marquardt algorithm, also has geometric meaning. When the standard algorithm fails to efficiently find accurate fits to the data, geometric considerations suggest improvements. Problems involving large numbers of parameters, such as often arise in biological contexts, are notoriously difficult. We suggest an algorithm based on geodesic motion that may offer improvements over the standard algorithm for a certain class of problems.
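The Levenberg-Marquardt update discussed in the abstract interpolates between Gauss-Newton (small damping) and gradient descent (large damping). Below is a minimal numpy sketch of the standard algorithm, not the authors' geodesic variant, fitting an illustrative two-parameter exponential model chosen for this example:

```python
import numpy as np

def levenberg_marquardt(residual, jac, p0, n_iter=100, lam=1e-3):
    """Minimal Levenberg-Marquardt loop with Marquardt diagonal scaling."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        J = jac(p)
        JtJ = J.T @ J
        # Damped normal equations: (J'J + lam*diag(J'J)) step = J'r
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), J.T @ r)
        p_new = p - step
        if np.sum(residual(p_new) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam * 0.5   # accept: move toward Gauss-Newton
        else:
            lam *= 2.0                  # reject: damp more, shrink the step
    return p

# Fit y = a * exp(-b * x) to noise-free synthetic data (a=2, b=0.5 assumed).
x = np.linspace(0, 5, 40)
y = 2.0 * np.exp(-0.5 * x)
resid = lambda p: p[0] * np.exp(-p[1] * x) - y
jacob = lambda p: np.column_stack([np.exp(-p[1] * x),
                                   -p[0] * x * np.exp(-p[1] * x)])
p_fit = levenberg_marquardt(resid, jacob, p0=[1.0, 1.0])
```

In the geometric picture of the abstract, the damped step bends the parameter update away from directions in which the model manifold is strongly curved, which is what the authors' geodesic acceleration refines further.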

  1. Pupils' Visual Representations in Standard and Problematic Problem Solving in Mathematics: Their Role in the Breach of the Didactical Contract

    ERIC Educational Resources Information Center

    Deliyianni, Eleni; Monoyiou, Annita; Elia, Iliada; Georgiou, Chryso; Zannettou, Eleni

    2009-01-01

    This study investigated the modes of representations generated by kindergarteners and first graders while solving standard and problematic problems in mathematics. Furthermore, it examined the influence of pupils' visual representations on the breach of the didactical contract rules in problem solving. The sample of the study consisted of 38…

  2. Application of a Mixed Consequential Ethical Model to a Problem Regarding Test Standards.

    ERIC Educational Resources Information Center

    Busch, John Christian

    The work of the ethicist Charles Curran and the problem-solving strategy of the mixed consequentialist ethical model are applied to a traditional social science measurement problem--that of how to adjust a recommended standard in order to be fair to the test-taker and society. The focus is on criterion-referenced teacher certification tests.…

  3. Assessment of RANS and LES Turbulence Modeling for Buoyancy-Aided/Opposed Forced and Mixed Convection

    NASA Astrophysics Data System (ADS)

    Clifford, Corey; Kimber, Mark

    2017-11-01

    Over the last 30 years, an industry-wide shift within the nuclear community has led to increased utilization of computational fluid dynamics (CFD) to supplement nuclear reactor safety analyses. One such area that is of particular interest to the nuclear community, specifically to those performing loss-of-flow accident (LOFA) analyses for next-generation very-high temperature reactors (VHTR), is the capacity of current computational models to predict heat transfer across a wide range of buoyancy conditions. In the present investigation, a critical evaluation of Reynolds-averaged Navier-Stokes (RANS) and large-eddy simulation (LES) turbulence modeling techniques is conducted based on CFD validation data collected from the Rotatable Buoyancy Tunnel (RoBuT) at Utah State University. Four different experimental flow conditions are investigated: (1) buoyancy-aided forced convection; (2) buoyancy-opposed forced convection; (3) buoyancy-aided mixed convection; (4) buoyancy-opposed mixed convection. Overall, good agreement is found for both forced convection-dominated scenarios, but an overly-diffusive prediction of the normal Reynolds stress is observed for the RANS-based turbulence models. Low-Reynolds number RANS models perform adequately for mixed convection, while higher-order RANS approaches underestimate the influence of buoyancy on the production of turbulence.

  4. Nodal Diffusion Burnable Poison Treatment for Prismatic Reactor Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. M. Ougouag; R. M. Ferrer

    2010-10-01

    The prismatic block version of the High Temperature Reactor (HTR) considered as a candidate Very High Temperature Reactor (VHTR) design may use burnable poison pins in locations at some corners of the fuel blocks (i.e., assembly-equivalent structures). The presence of any highly absorbing materials, such as these burnable poisons, within fuel blocks of hexagonal-geometry, graphite-moderated High Temperature Reactors (HTRs) causes a local inter-block flux depression that most nodal diffusion-based methods have failed to properly model or otherwise represent. The location of these burnable poisons near vertices results in an asymmetry in the morphology of the assemblies (or blocks), hence the inadequacy of traditional homogenization methods, as these "spread" the actually local effect of the burnable poisons throughout the assembly. Furthermore, the actual effect of the burnable poison is primarily local, with influence in its immediate vicinity, which happens to include a small region within the same assembly as well as similar regions in the adjacent assemblies. Traditional homogenization methods miss this artifact entirely. This paper presents a novel method for treating the local effect of the burnable poison explicitly in the context of a modern nodal method.

  5. Nodal Green’s Function Method Singular Source Term and Burnable Poison Treatment in Hexagonal Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.A. Bingham; R.M. Ferrer; A.M. Ougouag

    2009-09-01

    An accurate and computationally efficient two- or three-dimensional neutron diffusion model will be necessary for the development, safety parameter computation, and fuel cycle analysis of a prismatic Very High Temperature Reactor (VHTR) design under the Next Generation Nuclear Plant (NGNP) Project. For this purpose, an analytical nodal Green's function solution for the transverse-integrated neutron diffusion equation is developed in two- and three-dimensional hexagonal geometry. This scheme is incorporated into HEXPEDITE, a code first developed by Fitzpatrick and Ougouag. HEXPEDITE neglects the non-physical discontinuity terms that arise in the transverse leakage when the transverse integration procedure is applied to hexagonal geometry, and it cannot account for the effects of burnable poisons across nodal boundaries. The test code being developed for this document accounts for these terms by maintaining an inventory of neutrons, using the nodal balance equation as a constraint on the neutron flux equation. The method developed in this report is intended to restore neutron conservation and increase the accuracy of the code by adding these terms to the transverse-integrated flux solution and applying the nodal Green's function solution to the resulting equation to derive a semi-analytical solution.

  6. Biaxial Thermal Creep of Alloy 617 and Alloy 230 for VHTR Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Kun; Lv, Wei; Tung, Hsiao-Ming

    2016-05-18

    In this study, we employed pressurized creep tubes to investigate the biaxial thermal creep behavior of Inconel 617 (alloy 617) and Haynes 230 (alloy 230). Both alloys are considered to be the primary candidate structural materials for very high-temperature reactors (VHTRs) due to their exceptional high-temperature mechanical properties. The current creep experiments were conducted at 900 °C for the effective stress range of 15-35 MPa. For both alloys, complete creep strain development with primary, secondary, and tertiary regimes was observed in all the studied conditions. Tertiary creep was found to be dominant over the entire creep lives of both alloys. With increasing applied creep stress, the fraction of the secondary creep regime decreases. The nucleation, diffusion, and coarsening of creep voids and carbides on grain boundaries were found to be the main reasons for the limited secondary regime and were also found to be the major causes of creep fracture. The creep curves computed using the adjusted creep equation of the form ε = cosh⁻¹(1 + rt) + Pσⁿtᵐ agree well with the experimental results for both alloys at temperatures of 850-950 °C.

  7. [Evaluation of the standard application of Delphi in the diagnosis of chronic obstructive pulmonary disease caused by occupational irritant chemicals].

    PubMed

    Zhao, L; Yan, Y J

    2017-11-20

    Objective: To investigate the problems encountered in the application of the standard (hereinafter referred to as "the standard") for the diagnosis of chronic obstructive pulmonary disease (COPD) caused by occupational irritant chemicals, to provide a reference for the revision of the new standard, to reduce the number of missed occupational COPD patients, and to help workers suffering from chronic respiratory diseases due to long-term exposure to poisons leave that working environment, slowing the progression of the disease. Methods: Using the Delphi expert research method, and after argumentation by senior experts, the problems encountered in the systematic evaluation of the standard GBZ 237-2011 "Diagnosis of chronic obstructive pulmonary disease caused by occupational irritant chemicals" were identified, expert advice was sought, and the problems encountered during the clinical implementation of the standard promulgated in 2011 are presented. Results: The Delphi expert investigation found that experts agree on the content evaluation and implementation evaluation of the standard, but the operational evaluation of the standard is disputed. Based on clinical experience, the experts believe that the range of occupational irritant gases should be expanded, and that the handling of smoking, seniority determination, and occupational contact history during diagnosis is open to challenge. Conclusions: Since the promulgation in 2011 of the criteria for the diagnosis of chronic obstructive pulmonary disease caused by occupational irritant chemicals, there have been some problems in the implementation process, which have left many workers occupationally exposed to irritant gases suffering from "occupational chronic respiratory diseases" without a definitive diagnosis.

  8. Authentication: A Standard Problem or a Problem of Standards?

    PubMed

    Capes-Davis, Amanda; Neve, Richard M

    2016-06-01

    Reproducibility and transparency in biomedical sciences have been called into question, and scientists have been found wanting as a result. Putting aside deliberate fraud, there is evidence that a major contributor to lack of reproducibility is insufficient quality assurance of reagents used in preclinical research. Cell lines are widely used in biomedical research to understand fundamental biological processes and disease states, yet most researchers do not perform a simple, affordable test to authenticate these key resources. Here, we provide a synopsis of the problems we face and how standards can contribute to an achievable solution.

  9. A process for reaching standardization of word processing software for Sandia National Laboratories (Albuquerque) secretaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, S.R.

    1989-04-01

    In the summer of 1986, a number of problems being experienced by Sandia secretaries due to multiple word processing packages being used were brought to the attention of Sandia's upper management. This report discusses how these problems evolved, how management chose to correct the problem, and how standardization of word processing for Sandia secretaries was achieved. 11 refs.

  10. Dependability of technical items: Problems of standardization

    NASA Astrophysics Data System (ADS)

    Fedotova, G. A.; Voropai, N. I.; Kovalev, G. F.

    2016-12-01

    This paper is concerned with problems that arose in the development of a new version of the Interstate Standard GOST 27.002 "Industrial product dependability. Terms and definitions". This Standard covers a wide range of technical items and is used in numerous regulations, specifications, and standard and technical documentation. The currently available State Standard GOST 27.002-89 was introduced in 1990. Its development involved scientists and experts from different technical areas, and its draft was debated before different audiences and constantly refined, so it was a high-quality document. However, after 25 years of application it has become necessary to develop a new version of the Standard that reflects the current understanding of industrial dependability, accounting for the changes taking place in Russia in the production, management and development of various technical systems and facilities. The development of a new version of the Standard makes it possible to generalize, on a terminological level, the knowledge and experience in the area of reliability of technical items accumulated over a quarter of a century in different industries and reliability research schools, and to account for domestic and foreign experience of standardization. Working on the new version of the Standard, we have faced a number of issues and problems of harmonization with the International Standard IEC 60050-192, caused first of all by different approaches to the use of terms and by differences in the mentalities of experts from different countries. The paper focuses on the problems related to the chapter "Maintenance, restoration and repair", whose term definitions proved difficult for the developers to harmonize both with experts and with the International Standard, mainly because the Russian concept and practice of maintenance and repair differ from foreign ones.

  11. Qualitative Differences in Real-Time Solution of Standardized Figural Analogies.

    ERIC Educational Resources Information Center

    Schiano, Diane J.; And Others

    Performance on standardized figural analogy tests is considered highly predictive of academic success. While information-processing models of analogy solution attribute performance differences to quantitative differences in processing parameters, the problem-solving literature suggests that qualitative differences in problem representation and…

  12. Assembling Appliances Standards from a Basket of Functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siderious, Hans-Paul; Meier, Alan

    2014-08-11

    Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats, displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.

  13. 77 FR 9239 - California State Motor Vehicle and Nonroad Engine Pollution Control Standards; Truck Idling...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... Pollution Control Standards; Truck Idling Requirements; Notice of Decision AGENCY: Environmental Protection... to meet its serious air pollution problems. Likewise, EPA has consistently recognized that California... and high concentrations of automobiles, create serious pollution problems.'' \\37\\ Furthermore, no...

  14. Learning to Write about Mathematics

    ERIC Educational Resources Information Center

    Parker, Renee; Breyfogle, M. Lynn

    2011-01-01

    Beginning in third grade, Pennsylvania students are required to take the Pennsylvania State Standardized Assessment (PSSA), which presents multiple-choice mathematics questions and open-ended mathematics problems. Consistent with the Communication Standard of the National Council of Teachers of Mathematics, while solving the open-ended problems,…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J E; Vassilevski, P S; Woodward, C S

    This paper provides extensions of an element agglomeration AMG method to nonlinear elliptic problems discretized by the finite element method on general unstructured meshes. The method constructs coarse discretization spaces and corresponding coarse nonlinear operators as well as their Jacobians. We introduce both standard (fairly quasi-uniformly coarsened) and non-standard (coarsened away) coarse meshes and respective finite element spaces. We use both kinds of spaces in FAS-type coarse subspace correction (or Schwarz) algorithms. Their performance is illustrated on a number of model problems. The coarsened-away spaces seem to perform better than the standard spaces for problems with nonlinearities in the principal part of the elliptic operator.

  16. [Development of a software standardizing optical density with operation settings related to several limitations].

    PubMed

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop a software tool that standardizes optical density and normalizes the procedures and results of standardization, in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.

  17. Solving standard traveling salesman problem and multiple traveling salesman problem by using branch-and-bound

    NASA Astrophysics Data System (ADS)

    Saad, Shakila; Wan Jaafar, Wan Nurhadani; Jamil, Siti Jasmida

    2013-04-01

    The standard Traveling Salesman Problem (TSP) is the classical single-salesman problem, while the Multiple Traveling Salesman Problem (MTSP) is an extension of the TSP in which more than one salesman is involved. The objective of the MTSP is to find the least costly route the salesmen can take if each of a list of n cities must be visited exactly once before returning to the home city. There are a few methods that can be used to solve the MTSP. The objective of this research is to implement an exact method called the Branch-and-Bound (B&B) algorithm. Briefly, the idea of the B&B algorithm is to start with the associated Assignment Problem (AP). A branching strategy, Breadth-First Search (BFS), is then applied to the TSP and MTSP. Instances with 11 city nodes are implemented for both problems, and the solutions are presented.
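The branch-and-bound idea described above, explore partial tours and prune any branch whose lower bound already exceeds the best complete tour found so far, can be sketched for the standard TSP. This toy version uses a simple cheapest-outgoing-edge bound and depth-first search rather than the AP-based bound and BFS branching of the paper, and the distance matrix is an illustrative 4-city instance:

```python
def tsp_branch_and_bound(dist):
    """Depth-first branch-and-bound for the standard TSP.

    Lower bound for a partial tour: its cost so far plus, for every city not
    yet visited, the cheapest edge leaving that city (admissible, so pruning
    never discards the optimum).
    """
    n = len(dist)
    min_out = [min(dist[i][j] for j in range(n) if j != i) for i in range(n)]
    best = {"cost": float("inf"), "tour": None}

    def search(city, visited, cost, tour):
        bound = cost + sum(min_out[i] for i in range(n) if i not in visited)
        if bound >= best["cost"]:
            return  # prune: this branch cannot beat the incumbent
        if len(visited) == n:
            total = cost + dist[city][0]  # close the tour at the home city
            if total < best["cost"]:
                best["cost"], best["tour"] = total, tour + [0]
            return
        for nxt in range(n):
            if nxt not in visited:
                search(nxt, visited | {nxt},
                       cost + dist[city][nxt], tour + [nxt])

    search(0, {0}, 0, [0])
    return best["cost"], best["tour"]

# Illustrative asymmetric 4-city instance (not from the paper).
dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
cost, tour = tsp_branch_and_bound(dist)
```

The MTSP variant branches additionally on which salesman takes each city, but the prune-against-incumbent logic is identical.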

  18. Modeling of tool path for the CNC sheet cutting machines

    NASA Astrophysics Data System (ADS)

    Petunin, Aleksandr A.

    2015-11-01

    In this paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as discrete optimization problems (the generalized traveling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP we propose the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.

  19. Boosting standard order sets utilization through clinical decision support.

    PubMed

    Li, Haomin; Zhang, Yinsheng; Cheng, Haixia; Lu, Xudong; Duan, Huilong

    2013-01-01

    Well-designed standard order sets have the potential to integrate and coordinate care by communicating best practices across multiple disciplines, levels of care, and services. However, several challenges limit the benefits expected from standard order sets. To boost standard order set utilization, a problem-oriented knowledge delivery solution is proposed in this study to facilitate access to standard order sets and evaluation of their treatment effect. In this solution, standard order sets are created along with diagnostic rule sets that can trigger a CDS-based reminder to help clinicians quickly discover hidden clinical problems and the corresponding standard order sets during ordering. Those rule sets also provide indicators for targeted evaluation of standard order sets during treatment. A prototype system was developed based on this solution and will be presented at Medinfo 2013.

  20. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  1. Research Problems Associated with Limiting the Applied Force in Vibration Tests and Conducting Base-Drive Modal Vibration Tests

    NASA Technical Reports Server (NTRS)

    Scharton, Terry D.

    1995-01-01

    The intent of this paper is to make a case for developing and conducting vibration tests which are both realistic and practical (a question of tailoring versus standards). Tests are essential for finding things overlooked in the analyses. The best test is often the most realistic test which can be conducted within the cost and budget constraints. Some standards are essential, but the author believes more in the individual's ingenuity to solve a specific problem than in the application of standards which reduce problems (and technology) to their lowest common denominator. Force limited vibration tests and base-drive modal tests are two examples of realistic, but practical testing approaches. Since both of these approaches are relatively new, a number of interesting research problems exist, and these are emphasized herein.

  2. The problem of epistemic jurisdiction in global governance: The case of sustainability standards for biofuels.

    PubMed

    Winickoff, David E; Mondou, Matthieu

    2017-02-01

    While there is ample scholarly work on regulatory science within the state, or single-sited global institutions, there is less on its operation within complex modes of global governance that are decentered, overlapping, multi-sectorial and multi-leveled. Using a co-productionist framework, this study identifies 'epistemic jurisdiction' - the power to produce or warrant technical knowledge for a given political community, topical arena or geographical territory - as a central problem for regulatory science in complex governance. We explore these dynamics in the arena of global sustainability standards for biofuels. We select three institutional fora as sites of inquiry: the European Union's Renewable Energy Directive, the Roundtable on Sustainable Biomaterials, and the International Organization for Standardization. These cases allow us to analyze how the co-production of sustainability science responds to problems of epistemic jurisdiction in the global regulatory order. First, different problems of epistemic jurisdiction beset different standard-setting bodies, and these problems shape both the content of regulatory science and the procedures designed to make it authoritative. Second, in order to produce global regulatory science, technical bodies must manage an array of conflicting imperatives - including scientific virtue, due process and the need to recruit adoptees to perpetuate the standard. At different levels of governance, standard drafters struggle to balance loyalties to country, to company or constituency and to the larger project of internationalization. Confronted with these sometimes conflicting pressures, actors across the standards system quite self-consciously maneuver to build or retain authority for their forum through a combination of scientific adjustment and political negotiation. 
Third, the evidentiary demands of regulatory science in global administrative spaces are deeply affected by 1) a market for standards, in which firms and states can choose the cheapest sustainability certification, and 2) the international trade regime, in which the long shadow of WTO law exerts a powerful disciplining function.

  3. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Vining, G. Geoffrey; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
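    The two approaches can be sketched on synthetic data (a minimal sketch; the data, parameter values, and helper names are illustrative, not from the paper):

```python
# Sketch: classical (forward-then-inverted) vs. reverse regression for
# the simple linear calibration problem, on synthetic standards/readings.
import numpy as np

rng = np.random.default_rng(0)
standards = np.linspace(1.0, 10.0, 20)  # known reference values x
readings = 2.0 * standards + 1.0 + rng.normal(0, 0.2, standards.size)  # noisy instrument output y

# Classical approach: regress readings on standards (y = a + b*x),
# then invert the fitted model to turn a new reading into a measurement.
b, a = np.polyfit(standards, readings, 1)

def classical_estimate(y_new):
    return (y_new - a) / b

# Reverse approach: regress standards on readings directly (x = c + d*y),
# avoiding the inversion step at the cost of violating regression assumptions.
d, c = np.polyfit(readings, standards, 1)

def reverse_estimate(y_new):
    return c + d * y_new

y_obs = 11.0  # a new instrument reading to be calibrated
print(classical_estimate(y_obs), reverse_estimate(y_obs))
```

    With a well-behaved linear instrument, the two estimates nearly coincide; the paper's point is that their statistical properties (bias, prediction error) differ.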

  4. DICOMweb™: Background and Application of the Web Standard for Medical Imaging.

    PubMed

    Genereaux, Brad W; Dennison, Donald K; Ho, Kinson; Horn, Robert; Silver, Elliot Lewis; O'Donnell, Kevin; Kahn, Charles E

    2018-05-10

    This paper describes why and how DICOM, the standard that has been the basis for medical imaging interoperability around the world for several decades, has been extended into a full web technology-based standard, DICOMweb. At the turn of the century, healthcare embraced information technology, which created new problems and new opportunities for the medical imaging industry; at the same time, web technologies matured and began serving other domains well. This paper describes DICOMweb, how it extended the DICOM standard, and how DICOMweb can be applied to problems facing healthcare applications to address workflow and the changing healthcare climate.

  5. [Problems Inherent in Attempting Standardization of Libraries].

    ERIC Educational Resources Information Center

    Port, Idelle

    In setting standards for a large and geographically dispersed library system, one must reconcile the many varying practices that affect what is being measured or discussed. The California State University and Colleges (CSUC) consists of 19 very distinct campuses. The problems and solutions of one type of CSUC library are not likely to be those of…

  6. Naturalness of Electroweak Symmetry Breaking

    NASA Astrophysics Data System (ADS)

    Espinosa, J. R.

    2007-02-01

    After revisiting the hierarchy problem of the Standard Model and its implications for the scale of New Physics, I consider the fine-tuning problem of electroweak symmetry breaking in two main scenarios beyond the Standard Model: SUSY and Little Higgs models. The main conclusions are that New Physics should appear within the reach of the LHC; that some SUSY models can solve the hierarchy problem with acceptable residual fine-tuning; and, finally, that Little Higgs models generically suffer from large tunings, often hidden.

  7. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criterion based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
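    The FIM-based comparison of sampling meshes can be sketched numerically for the Verhulst-Pearl logistic model using the classical D-optimality criterion, det(FIM). This is an illustrative assumption-laden sketch (parameter values, meshes, and function names are invented here), not the paper's Prohorov-metric framework or its SE-optimal criterion:

```python
# Sketch: compare two candidate sampling meshes for the logistic model
# x(t) = K*x0*exp(r*t) / (K + x0*(exp(r*t) - 1)) via det(FIM),
# with sensitivities obtained by central finite differences.
import numpy as np

def logistic(t, r, K, x0=1.0):
    return K * x0 * np.exp(r * t) / (K + x0 * (np.exp(r * t) - 1.0))

def fisher_information(times, theta, sigma=1.0, h=1e-6):
    # Sensitivity matrix S[i, j] = d x(t_i) / d theta_j.
    S = np.zeros((len(times), len(theta)))
    for j in range(len(theta)):
        up = list(theta); up[j] += h
        dn = list(theta); dn[j] -= h
        S[:, j] = (logistic(times, *up) - logistic(times, *dn)) / (2 * h)
    return S.T @ S / sigma**2  # FIM under i.i.d. Gaussian observation error

theta = (0.7, 17.5)                    # illustrative (r, K)
uniform = np.linspace(0.0, 25.0, 10)   # mesh spanning growth and plateau
early = np.linspace(0.0, 5.0, 10)      # mesh confined to the early growth phase
print(np.linalg.det(fisher_information(uniform, theta)),
      np.linalg.det(fisher_information(early, theta)))
```

    The mesh that also samples the plateau is far more informative about K, so its det(FIM) is much larger; this is the kind of ranking an optimal design criterion automates.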

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Christopher

    In this talk, I review recent work on using a generalization of the Next-to-Minimal Supersymmetric Standard Model (NMSSM), called the Singlet-extended Minimal Supersymmetric Standard Model (SMSSM), to raise the mass of the Standard Model-like Higgs boson without requiring extremely heavy top squarks or large stop mixing. In so doing, this model solves the little hierarchy problem of the minimal model (MSSM), at the expense of leaving the μ-problem of the MSSM unresolved. This talk is based on work published in Refs. [1, 2, 3].

  9. Quantum annealing of the traveling-salesman problem.

    PubMed

    Martonák, Roman; Santoro, Giuseppe E; Tosatti, Erio

    2004-11-01

    We propose a path-integral Monte Carlo quantum annealing scheme for the symmetric traveling-salesman problem, based on a highly constrained Ising-like representation, and we compare its performance against standard thermal simulated annealing. The Monte Carlo moves implemented are standard, and consist in restructuring a tour by exchanging two links (two-opt moves). The quantum annealing scheme, even with a drastically simple form of kinetic energy, appears definitely superior to the classical one, when tested on a 1002-city instance of the standard TSPLIB.
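    The classical baseline described above (thermal simulated annealing with two-opt moves) can be sketched as follows; the instance, cooling schedule, and parameters are illustrative, and the quantum path-integral scheme itself is not reproduced here:

```python
# Sketch: thermal simulated annealing for the symmetric TSP with
# two-opt moves (reversing the segment between two links).
import math, random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(30)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(steps=20000, T0=1.0, cooling=0.9995):
    tour = list(range(len(cities)))
    best = tour_length(tour)
    T = T0
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(tour)), 2))
        # Two-opt move: exchange two links by reversing the segment between them.
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        delta = tour_length(cand) - tour_length(tour)
        if delta < 0 or random.random() < math.exp(-delta / T):
            tour = cand  # Metropolis acceptance at temperature T
        best = min(best, tour_length(tour))
        T *= cooling
    return best

print(anneal())
```

    In the quantum annealing variant, the thermal Metropolis dynamics is replaced by path-integral Monte Carlo sampling of coupled replicas with a decreasing transverse-field kinetic term.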

  10. Does language ambiguity in clinical practice justify the introduction of standard terminology? An integrative review.

    PubMed

    Stallinga, Hillegonda A; ten Napel, Huib; Jansen, Gerard J; Geertzen, Jan H B; de Vries Robbé, Pieter F; Roodbol, Petrie F

    2015-02-01

    To research the use of ambiguous language in written information concerning patients' functioning and to identify problems resulting from the use of ambiguous language in clinical practice. Many projects that aimed to introduce standard terminology concerning patients' functioning in clinical practice are unsuccessful because standard terminology is rarely used in clinical practice. These projects mainly aim to improve communication by reducing ambiguous language. Considering their lack of success, the validity of the argument that language ambiguity is used in clinical practice is questioned. An integrative literature review. A systematic search of the MEDLINE (1950-2012) and CINAHL (1982-2012) databases was undertaken, including empirical and theoretical literature. The selected studies were critically appraised using a data assessment and extraction form. Seventeen of 767 papers were included in the review and synthesis. The use of ambiguous language in written information concerning patients' functioning was demonstrated. Problems resulting from the use of ambiguous language in clinical practice were not identified. However, several potential problems were suggested, including hindered clinical decision-making and limited research opportunities. The results of this review demonstrated the use of ambiguous language concerning patients' functioning, but health professionals in clinical practice did not experience this issue as a problem. This finding might explain why many projects aimed at introducing standard terminology concerning functioning in clinical practice to solve problems caused by ambiguous language are often unsuccessful. Language ambiguity alone is not a valid argument to justify the introduction of standard terminology. 
The introduction of standard terminology concerning patients' functioning will only be successful when clinical practice requires the aggregation and reuse of data from electronic patient records for different purposes, including multidisciplinary decision-making and research. © 2014 John Wiley & Sons Ltd.

  11. Golden Ratio in a Coupled-Oscillator Problem

    ERIC Educational Resources Information Center

    Moorman, Crystal M.; Goff, John Eric

    2007-01-01

    The golden ratio appears in a classical mechanics coupled-oscillator problem that many undergraduates may not solve. Once the symmetry is broken in a more standard problem, the golden ratio appears. Several student exercises arise from the problem considered in this paper.
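    One common version of this problem (assumed here, since the abstract is truncated) is two equal masses $m$ joined in series by two identical springs $k$, the upper spring anchored; the golden ratio then appears in the normal-mode frequencies:

```latex
% Equations of motion for displacements x_1 (upper mass) and x_2 (lower mass):
\begin{align}
  m\ddot{x}_1 &= -kx_1 + k(x_2 - x_1), \\
  m\ddot{x}_2 &= -k(x_2 - x_1).
\end{align}
% Seeking x_i \propto e^{i\omega t} gives the characteristic equation
% (2k - m\omega^2)(k - m\omega^2) - k^2 = 0, i.e.
\begin{equation}
  m^2\omega^4 - 3km\omega^2 + k^2 = 0
  \quad\Rightarrow\quad
  \frac{m\omega_\pm^2}{k} = \frac{3 \pm \sqrt{5}}{2} = \varphi^{\pm 2},
  \qquad \varphi = \frac{1+\sqrt{5}}{2}.
\end{equation}
```

    The two normal-mode frequencies are thus in the ratio $\omega_+/\omega_- = \varphi^2$.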

  12. RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2012-06-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2.

  13. RELAP5-3D results for phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, G.; Epiney, A. S.

    2012-07-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2. (authors)

  14. Fuel development for gas-cooled fast reactors

    NASA Astrophysics Data System (ADS)

    Meyer, M. K.; Fielding, R.; Gan, J.

    2007-09-01

    The Generation IV Gas-cooled Fast Reactor (GFR) concept is proposed to combine the advantages of high-temperature gas-cooled reactors (such as efficient direct conversion with a gas turbine and the potential for application of high-temperature process heat), with the sustainability advantages that are possible with a fast-spectrum reactor. The latter include the ability to fission all transuranics and the potential for breeding. The GFR is part of a consistent set of gas-cooled reactors that includes a medium-term Pebble Bed Modular Reactor (PBMR)-like concept, or concepts based on the Gas Turbine Modular Helium Reactor (GT-MHR), and specialized concepts such as the Very High-Temperature Reactor (VHTR), as well as actinide burning concepts [A Technology Roadmap for Generation IV Nuclear Energy Systems, US DOE Nuclear Energy Research Advisory Committee and the Generation IV International Forum, December 2002]. To achieve the necessary high power density and the ability to retain fission gas at high temperature, the primary fuel concept proposed for testing in the United States is dispersion coated fuel particles in a ceramic matrix. Alternative fuel concepts considered in the US and internationally include coated particle beds, ceramic clad fuel pins, and novel ceramic 'honeycomb' structures. Both mixed carbide and mixed nitride-based solid solutions are considered as fuel phases.

  15. Current Problems of Improving the Environmental Certification and Output Compliance Verification in the Context of Environmental Management in Kazakhstan

    ERIC Educational Resources Information Center

    Zhambaev, Yerzhan S.; Sagieva, Galia K.; Bazarbek, Bakhytzhan Zh.; Akkulov, Rustem T.

    2016-01-01

    The article discusses the issues of improving the activity of subjects of environmental management in accordance with international environmental standards and national environmental legislation. The article deals with the problem of ensuring the implementation of international environmental standards, the introduction of eco-management, and the…

  16. The Impact of a Standards Guided Equity and Problem Solving Institute on Participating Science Teachers and Their Students.

    ERIC Educational Resources Information Center

    Huber, Richard A.; Smith, Robert W.; Shotsberger, Paul G.

    This study examined the effect of a teacher enhancement project combining training on the National Science Education Standards, problem solving and equity education on middle school science teachers' attitudes and practices and, in turn, the attitudes of their students. Participating teachers reported changes in their instructional methods that…

  17. 40 CFR 61.346 - Standards: Individual drain systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS National Emission Standard for Benzene... of cracks, gaps, or other problems that could result in benzene emissions. (5) Except as provided in...

  18. 40 CFR 61.346 - Standards: Individual drain systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS National Emission Standard for Benzene... of cracks, gaps, or other problems that could result in benzene emissions. (5) Except as provided in...

  19. The effects of multi-disciplinary psycho-social care on socio-economic problems in cancer patients: a cluster-randomized trial.

    PubMed

    Singer, Susanne; Roick, Julia; Meixensberger, Jürgen; Schiefke, Franziska; Briest, Susanne; Dietz, Andreas; Papsdorf, Kirsten; Mössner, Joachim; Berg, Thomas; Stolzenburg, Jens-Uwe; Niederwieser, Dietger; Keller, Annette; Kersting, Anette; Danker, Helge

    2018-06-01

    We examined whether multi-disciplinary stepped psycho-social care decreases financial problems and improves return-to-work in cancer patients. In a university hospital, wards were randomly allocated to either stepped or standard care. Stepped care comprised screening for financial problems, consultation between doctor and patient, and the provision of social service. Outcomes were financial problems at the time of discharge and return-to-work in patients < 65 years old half a year after baseline. The analysis employed mixed-effect multivariate regression modeling. Thirteen wards were randomized and 1012 patients participated (n = 570 in stepped care and n = 442 in standard care). Those who reported financial problems at baseline were less likely to have financial problems at discharge when they had received stepped care (odds ratio (OR) 0.2, 95% confidence interval (CI) 0.1, 0.7; p = 0.01). There was no evidence for an effect of stepped care on financial problems in patients without such problems at baseline (OR 1.1, CI 0.5, 2.6; p = 0.82). There were 399 patients < 65 years old who were not retired at baseline. In this group, there was no evidence for an effect of stepped care on being employed half a year after baseline (OR 0.7, CI 0.3, 2.0; p = 0.52). Trial registration: NCT01859429. Conclusions: Financial problems can be avoided more effectively with multi-disciplinary stepped psycho-social care than with standard care in patients who have such problems.

  20. Problematic Alcohol Use and Mild Intellectual Disability: Standardization of Pictorial Stimuli for an Alcohol Cue Reactivity Task

    ERIC Educational Resources Information Center

    van Duijvenbode, Neomi; Didden, Robert; Bloemsaat, Gijs; Engels, Rutger C. M. E.

    2012-01-01

    The present study focused on the first step in developing a cue reactivity task for studying cognitive biases in individuals with mild to borderline intellectual disability (ID) and alcohol use-related problems: the standardization of pictorial stimuli. Participants (N = 40), both with and without a history of alcohol use-related problems and…

  1. Looking beyond RtI Standard Treatment Approach: It's Not Too Late to Embrace the Problem-Solving Approach

    ERIC Educational Resources Information Center

    King, Diane; Coughlin, Patricia Kathleen

    2016-01-01

    There are two approaches for providing Tier 2 interventions within Response to Intervention (RtI): standard treatment protocol (STP) and the problem-solving approach (PSA). This article describes the multi-tiered RtI prevention model being implemented across the United States through an analysis of these two approaches in reading instruction. It…

  2. The "Pedagogy of the Oppressed": The Necessity of Dealing with Problems in Students' Lives

    ERIC Educational Resources Information Center

    Reynolds, Patricia R.

    2007-01-01

    Students have problems in their lives, but can teachers help them? Should teachers help? The No Child Left Behind (NCLB) act and its emphasis on standardized test results have forced school systems to produce high scores, and in turn school administrators pressure teachers to prepare students for taking standardized tests. Teachers may want to…

  3. Standardization of 237Np by the CIEMAT/NIST LSC tracer method

    PubMed

    Gunther

    2000-03-01

    The standardization of 237Np presents some difficulties: several groups of alpha, beta and gamma radiation, chemical problems with the daughter nuclide 233Pa, an incomplete radioactive equilibrium after sample preparation, high conversion of some gamma transitions. To solve the chemical problems, a sample composition involving the Ultima Gold AB scintillator and a high concentration of HCl is used. Standardization by the CIEMAT/NIST method and by pulse shape discrimination is described. The results agree within 0.1% with those obtained by two other methods.

  4. An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The details of the Abstract Syntax Notation One (ASN.1) standard and the Basic Encoding Rules (BER) standard, which together solve the problem of data transfer across incompatible host environments, are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed, providing a quantitative analysis of the performance costs associated with applying these standards. An evaluation is offered of how well suited ASN.1 and BER are to solving the common data representation problem.
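    The tag-length-value idea at the heart of BER can be illustrated for a single ASN.1 type. This is a minimal sketch for INTEGER with definite short-form lengths only, not the compiler described in the paper:

```python
# Sketch: BER tag-length-value encoding of an ASN.1 INTEGER.
def ber_encode_integer(value: int) -> bytes:
    # Content octets are two's complement, using the fewest octets
    # that still carry a sign bit.
    n = max(1, (value.bit_length() + 8) // 8)
    content = value.to_bytes(n, "big", signed=True)
    return bytes([0x02, len(content)]) + content  # tag 0x02 = INTEGER

def ber_decode_integer(data: bytes) -> int:
    assert data[0] == 0x02, "not a BER INTEGER"
    length = data[1]  # short-form definite length only
    return int.from_bytes(data[2:2 + length], "big", signed=True)

print(ber_encode_integer(5).hex())  # tag 02, length 01, value 05
```

    A full codec adds long-form lengths, constructed types such as SEQUENCE, and the other universal tags; the machine-independent octet stream is what lets incompatible hosts exchange data.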

  5. Emission Standards for Particulates

    ERIC Educational Resources Information Center

    Walsh, George W.

    1974-01-01

    Promulgation of standards of performance under Section 111 and national emission standards for hazardous pollutants under Section 112 of the Clean Air Act is the responsibility of the Emission Standards and Engineering Division of the Environmental Protection Agency. The problems encountered and the bases used are examined. (Author/BT)

  6. Problem Solvers: Problem--Jesse's Train

    ERIC Educational Resources Information Center

    James, Julie; Steimle, Alice

    2014-01-01

    Persevering in problem solving and constructing and critiquing mathematical arguments are some of the mathematical practices included in the Common Core State Standards for Mathematics (CCSSI 2010). To solve unfamiliar problems, students must make sense of the situation and apply current knowledge. Teachers can present such opportunities by…

  7. Exploring creativity and critical thinking in traditional and innovative problem-based learning groups.

    PubMed

    Chan, Zenobia C Y

    2013-08-01

    To explore students' attitudes towards problem-based learning, creativity and critical thinking, and the relevance to nursing education and clinical practice. Critical thinking and creativity are crucial in nursing education. The teaching approach of problem-based learning can help to reduce the difficulties of nurturing problem-solving skills. However, there is little in the literature on how to improve the effectiveness of a problem-based learning lesson by designing appropriate and innovative activities such as composing songs, writing poems and using role plays. Exploratory qualitative study. A sample of 100 students participated in seven semi-structured focus groups, of which two were innovative groups and five were standard groups, adopting three activities in problem-based learning, namely composing songs, writing poems and performing role plays. The data were analysed using thematic analysis. There are three themes extracted from the conversations: 'students' perceptions of problem-based learning', 'students' perceptions of creative thinking' and 'students' perceptions of critical thinking'. Participants generally agreed that critical thinking is more important than creativity in problem-based learning and clinical practice. Participants in the innovative groups perceived a significantly closer relationship between critical thinking and nursing care, and between creativity and nursing care than the standard groups. Both standard and innovative groups agreed that problem-based learning could significantly increase their critical thinking and problem-solving skills. Further, by composing songs, writing poems and using role plays, the innovative groups had significantly increased their awareness of the relationship among critical thinking, creativity and nursing care. Nursing educators should include more types of creative activities than are typically offered in conventional problem-based learning classes. 
The results could help nurse educators design an appropriate curriculum for preparing professional and ethical nurses for future clinical practice. © 2013 Blackwell Publishing Ltd.

  8. Preservation of Digital Objects.

    ERIC Educational Resources Information Center

    Galloway, Patricia

    2004-01-01

    Presents a literature review that covers the following topics related to preservation of digital objects: practical examples; stakeholders; recordkeeping standards; genre-specific problems; trusted repository standards; preservation methods; preservation metadata standards; and future directions. (Contains 82 references.) (MES)

  9. California residential energy standards: problems and recommendations relating to implementation, enforcement, and design. [Thermal insulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-08-01

    Documents relevant to the development and implementation of the California energy insulation standards for new residential buildings were evaluated and a survey was conducted to determine problems encountered in the implementation, enforcement, and design aspects of the standards. The impact of the standards on enforcement agencies, designers, builders and developers, manufacturers and suppliers, consumers, and the building process in general is summarized. The impact on construction costs and energy savings varies considerably because of the wide variation in prior insulation practices and climatic conditions in California. The report concludes with a series of recommendations covering all levels of government and the building process. (MCW)

  10. Hierarchy problem and BSM physics

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Gautam

    2017-10-01

    The `hierarchy problem' plagues the Standard Model of particle physics. The source of this problem is our inability to answer the following question: Why is the Higgs mass so much below the GUT or Planck scale? A brief description about how `supersymmetry' and `composite Higgs' address this problem is given here.

  11. The Performance of Chinese Primary School Students on Realistic Arithmetic Word Problems

    ERIC Educational Resources Information Center

    Xin, Ziqiang; Lin, Chongde; Zhang, Li; Yan, Rong

    2007-01-01

    Compared with standard arithmetic word problems demanding only the direct use of number operations and computations, realistic problems are harder to solve because children need to incorporate "real-world" knowledge into their solutions. Using the realistic word problem testing materials developed by Verschaffel, De Corte, and Lasure…

  12. Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues

    ERIC Educational Resources Information Center

    Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.

    2010-01-01

    Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…

  13. Kindergarten Students Solving Mathematical Word Problems

    ERIC Educational Resources Information Center

    Johnson, Nickey Owen

    2013-01-01

    The purpose of this study was to explore problem solving with kindergarten students. This line of inquiry is highly significant given that Common Core State Standards emphasize deep, conceptual understanding in mathematics as well as problem solving in kindergarten. However, there is little research on problem solving with kindergarten students.…

  14. The Role of Expository Writing in Mathematical Problem Solving

    ERIC Educational Resources Information Center

    Craig, Tracy S.

    2016-01-01

    Mathematical problem-solving is notoriously difficult to teach in a standard university mathematics classroom. The project on which this article reports aimed to investigate the effect of the writing of explanatory strategies in the context of mathematical problem solving on problem-solving behaviour. This article serves to describe the…

  15. The Thinnest Path Problem

    DTIC Science & Technology

    2016-07-22

    their corresponding transmission powers. At first glance, one may wonder whether the thinnest path problem is simply a shortest path problem with the...nature of the shortest path problem. Another aspect that complicates the problem is the choice of the transmission power at each node (within a maximum...fixed transmission power at each node (in this case, the resulting hypergraph degenerates to a standard graph), the thinnest path problem is NP

  16. Problems of Technical Standards Teaching in the Context of the Globalization and Euro-Integration in Higher Education System of Ukraine

    ERIC Educational Resources Information Center

    Kornuta, Olena; Pryhorovska, Tetiana

    2015-01-01

    Globalization and Ukraine association with EU imply including Ukrainian universities into the world scientific space. The aim of this article is to analyze the problem of drawing standards teaching, based on the experience of Ivano-Frankivsk National Technical University of Oil and Gas (Ukraine) and to summarize the experience of post Soviet…

  17. Planning Model of Physics Learning In Senior High School To Develop Problem Solving Creativity Based On National Standard Of Education

    NASA Astrophysics Data System (ADS)

    Putra, A.; Masril, M.; Yurnetti, Y.

    2018-04-01

    One cause of students' low competence in high school physics learning is an instructional process that has not developed their creativity in problem solving, reflected in teachers' learning plans that do not accord with the National Education Standard. This study aims to produce a reconstructed model of physics learning that fulfills the competency standards, content standards, and assessment standards of the applicable curriculum. The development process comprised needs analysis, product design, product development, implementation, and product evaluation, and involved two peer reviewers, four expert judges, and two study groups of high school students in Padang. Qualitative and quantitative data were collected through documentation, observation, questionnaires, and tests. Up to the product development stage, the research has produced a physics learning plan model that meets content and construct validity in terms of fulfilling the Basic Competence, Content Standards, Process Standards, and Assessment Standards.

  18. A Five Stage Conceptual Model for Information Technology Standards.

    ERIC Educational Resources Information Center

    Cargill, Carl F.

    The advent of anticipatory and boundary layer standards used in information technology standardization has created a need for a new base level theory that can be used to anticipate the problems that will be encountered in standards planning, creation, and implementation. To meet this need, a five-level model of standards has been developed. The…

  19. The stage-value model: Implications for the changing standards of care.

    PubMed

    Görtz, Daniel Patrik; Commons, Michael Lamport

    2015-01-01

The standard of care is a legal and professional notion against which doctors and other medical personnel are held liable. The standard of care changes as new scientific findings and technological innovations within medicine, pharmacology, nursing and public health are developed and adopted. This study consists of four parts. Part 1 describes the problem and gives concrete examples of its occurrence. The second part discusses the application of the Model of Hierarchical Complexity to the field, giving examples of how standards of care are understood at different behavioral developmental stages. It presents the solution to the problem of standards of care at the Paradigmatic Stage 14. The solution at this stage is a deliberative, communicative process based on why certain norms should or should not apply in each specific case, by the use of "meta-norms". Part 3 proposes a Cross-Paradigmatic Stage 15 view of how the problem of changing standards of care can be solved. The proposed solution is to found the legal procedure in each case on well-established behavioral laws. We maintain that such a behavioristic, scientifically based justice would be much more proficient at effecting restorative legal interventions that create desired behaviors. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. The Virginia History Standards and the Cold War

    ERIC Educational Resources Information Center

    Altschuler, Glenn C.; Rauchway, Eric

    2002-01-01

President George W. Bush's approach to education policy has earned him cautious plaudits from otherwise hostile critics, who see much to admire in the implementation of standards for education. However useful such standards may be for testing students' technical skills, such as arithmetic and reading, they create problems for less-standardized processes like…

  1. Education Technology Standards Self-Efficacy (ETSSE) Scale: A Validity and Reliability Study

    ERIC Educational Resources Information Center

    Simsek, Omer; Yazar, Taha

    2016-01-01

    Problem Statement: The educational technology standards for teachers set by the International Society for Technology in Education (the ISTE Standards-T) represent an important framework for using technology effectively in teaching and learning processes. These standards are widely used by universities, educational institutions, and schools. The…

  2. The Federal Government and Information Technology Standards: Building the National Information Infrastructure.

    ERIC Educational Resources Information Center

    Radack, Shirley M.

    1994-01-01

    Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)

  3. A Harmonious Accounting Duo?

    ERIC Educational Resources Information Center

    Schapperle, Robert F.; Hardiman, Patrick F.

    1992-01-01

    Accountants have urged "harmonization" of standards between the Governmental Accounting Standards Board and the Financial Accounting Standards Board, recommending similar reporting of like transactions. However, varying display of similar accounting events does not necessarily indicate disharmony. The potential for problems because of…

  4. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.
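The governing equations named in this abstract can be made explicit; a common depth-averaged form of the two-dimensional nonlinear shallow water equations (the exact source terms used in TUNA-RP are not specified here) is:

```latex
\begin{aligned}
\frac{\partial \eta}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} &= 0,\\
\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} &= -g\,\frac{\partial \eta}{\partial x} - \frac{\tau_x}{\rho h},\\
\frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} &= -g\,\frac{\partial \eta}{\partial y} - \frac{\tau_y}{\rho h},
\end{aligned}
```

where $\eta$ is the free-surface elevation, $h$ the total water depth, $(u,v)$ the depth-averaged velocities, $g$ gravity, and $\tau_x,\tau_y$ bottom-friction stresses; the wet-dry moving boundary algorithm tracks the shoreline where $h \to 0$.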

  5. Unified heuristics to solve routing problem of reverse logistics in sustainable supply chain

    NASA Astrophysics Data System (ADS)

    Anbuudayasankar, S. P.; Ganesh, K.; Lenny Koh, S. C.; Mohandas, K.

    2010-03-01

A reverse logistics problem, motivated by many real-life applications, is examined where bottles/cans in which products are delivered from a processing depot to customers in one period are available for return to the depot in the following period. The picked-up bottles/cans need to be accommodated in place of the delivery load. This problem is termed the simultaneous delivery and pick-up problem with constrained capacity (SDPC). We develop three unified heuristics, based on an extended branch-and-bound heuristic, a genetic algorithm, and simulated annealing, to solve SDPC. These heuristics are also designed to solve the standard travelling salesman problem (TSP) and the TSP with simultaneous delivery and pick-up (TSDP). We tested the heuristics on standard, derived, and randomly generated datasets of TSP, TSDP, and SDPC and obtained satisfactory results with good convergence in reasonable time.
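Of the three heuristics named, simulated annealing is the simplest to sketch for the plain TSP case. The following is an illustrative implementation with 2-opt moves; the function names and cooling parameters are our own, and the pickup-capacity constraints of SDPC are omitted:

```python
import math
import random

def tour_length(tour, dist):
    """Total cycle length of a tour (a list of city indices)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing_tsp(dist, t0=10.0, cooling=0.995, iters=20000, seed=0):
    """Minimise tour length with random 2-opt moves and geometric cooling."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    best = tour[:]
    cur_len = best_len = tour_length(tour, dist)
    t = t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt: reverse a segment
        cand_len = tour_length(cand, dist)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling
    return best, best_len

# A random symmetric Euclidean instance with 15 cities.
rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(15)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
initial_len = tour_length(list(range(15)), dist)
best, best_len = simulated_annealing_tsp(dist)
```

The accepted-worse-move probability is what distinguishes simulated annealing from plain 2-opt hill climbing; as the temperature decays, the search hardens into local improvement.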

  6. Introductory Course Based on a Single Problem: Learning Nucleic Acid Biochemistry from AIDS Research

    ERIC Educational Resources Information Center

    Grover, Neena

    2004-01-01

    In departure from the standard approach of using several problems to cover specific topics in a class, I use a single problem to cover the contents of the entire semester-equivalent biochemistry classes. I have developed a problem-based service-learning (PBSL) problem on HIV/AIDS to cover nucleic acid concepts that are typically taught in the…

  7. The Role of Content Knowledge in Ill-Structured Problem Solving for High School Physics Students

    ERIC Educational Resources Information Center

    Milbourne, Jeff; Wiebe, Eric

    2018-01-01

    While Physics Education Research has a rich tradition of problem-solving scholarship, most of the work has focused on more traditional, well-defined problems. Less work has been done with ill-structured problems, problems that are better aligned with the engineering and design-based scenarios promoted by the Next Generation Science Standards. This…

  8. Dynamic simulation solves process control problem in Oman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-11-16

A dynamic simulation study solved the process control problems for a Saih Rawl, Oman, gas compressor station operated by Petroleum Development of Oman (PDO). PDO encountered persistent compressor failure that caused frequent facility shutdowns, oil production deferment, and gas flaring. It commissioned MSE (Consultants) Ltd., U.K., to find a solution for the problem. Saih Rawl, about 40 km from Qarn Alam, produces oil and associated gas from a large number of low- and high-pressure wells. Oil and gas are separated in three separators. The oil is pumped to Qarn Alam for treatment and export. Associated gas is compressed in two parallel trains. Train K-1115 is a 350,000 standard cu m/day, four-stage reciprocating compressor driven by a fixed-speed electric motor. Train K-1120 is a 1 million standard cu m/day, four-stage centrifugal compressor driven by a variable-speed motor. The paper describes tripping and surging problems with the gas compressor and the control simplifications that solved the problem.

  9. Naturalness of Electroweak Symmetry Breaking while Waiting for the LHC

    NASA Astrophysics Data System (ADS)

    Espinosa, J. R.

    2007-06-01

After revisiting the hierarchy problem of the Standard Model and its implications for the scale of New Physics, I consider the fine-tuning problem of electroweak symmetry breaking in several scenarios beyond the Standard Model: SUSY, Little Higgs, and "improved naturalness" models. The main conclusions are that New Physics should appear within the reach of the LHC; some SUSY models can solve the hierarchy problem with acceptable residual tuning; Little Higgs models generically suffer from large tunings, often hidden; and, finally, "improved naturalness" models do not generically improve the naturalness of the SM.
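The hierarchy problem referred to here is the quadratic sensitivity of the Higgs mass parameter to the cutoff scale; the standard one-loop estimate from the top-quark loop (a textbook expression, not taken from the talk itself) reads:

```latex
\delta m_H^2 \;\simeq\; -\,\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 ,
\qquad
\Delta \;\equiv\; \frac{\lvert \delta m_H^2 \rvert}{m_H^2},
```

so the fine-tuning measure $\Delta$ grows quadratically with the New Physics scale $\Lambda$: for $m_H$ near the electroweak scale, a cutoff of a few TeV already implies tuning at the percent level, which is why naturalness arguments point to New Physics within LHC reach.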

  10. Duality in non-linear programming

    NASA Astrophysics Data System (ADS)

    Jeyalakshmi, K.

    2018-04-01

    In this paper we consider duality and converse duality for a programming problem involving convex objective and constraint functions with finite dimensional range. We do not assume any constraint qualification. The dual is presented by reducing the problem to a standard Lagrange multiplier problem.
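The reduction to a standard Lagrange multiplier problem can be written out. For a primal problem $\min f(x)$ subject to $g_i(x) \le 0$, the Lagrangian dual (the generic convex-programming construction, not the paper's specific finite-dimensional-range formulation) is:

```latex
L(x,\lambda) = f(x) + \sum_i \lambda_i\, g_i(x), \quad \lambda_i \ge 0;
\qquad
\max_{\lambda \ge 0}\ \theta(\lambda), \quad \theta(\lambda) = \inf_x\, L(x,\lambda),
```

with weak duality $\theta(\lambda) \le f(x)$ holding for every feasible $x$ and every $\lambda \ge 0$, and no constraint qualification needed for weak duality itself.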

  11. SCALE PROBLEMS IN REPORTING LANDSCAPE PATTERN AT THE REGIONAL SCALE

    EPA Science Inventory

    Remotely sensed data for Southeastern United States (Standard Federal Region 4) are used to examine the scale problems involved in reporting landscape pattern for a large, heterogeneous region. Frequency distributions of landscape indices illustrate problems associated with the g...

  12. Relationships between Problem Behaviors and Academic Achievement in Adolescents: The Unique Role of Attention Problems.

    ERIC Educational Resources Information Center

    Barriga, Alvaro Q.; Doran, Jeffrey W.; Newell, Stephanie B.; Morrison, Elizabeth M.; Barbetti, Victor; Robbins, Brent Dean

    2002-01-01

This study examined relationships among eight teacher-reported problem behavior syndromes and standardized measures of academic achievement among 58 adolescents in an alternative school. Analysis suggested that the association between attention problems and academic achievement was primarily due to the inattention component of the syndrome rather than the…

  13. Listening Responsively

    ERIC Educational Resources Information Center

    Callahan, Kadian M.

    2011-01-01

    Standards documents, such as the Common Core State Standards for Mathematics and "Principles and Standards for School Mathematics", expect teachers to foster mathematics learning by engaging students in meaningful mathematical discourse to expose students to different ways of thinking about and solving problems and positively influence their…

  14. 42 CFR 493.1233 - Standard: Complaint investigations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived Testing General Laboratory Systems § 493.1233 Standard: Complaint investigations. The laboratory must have a system in place to ensure that it documents all complaints and problems reported to the laboratory...

  15. Three Perspectives on Standards: Positivism, Panopticism, and Intersubjectivism

    ERIC Educational Resources Information Center

    Lee, Cheu-jey

    2010-01-01

    Perhaps no other words occur more frequently than standards in today's discourse on educational reform. There is much debate about standards. Instead of taking sides on the debate, this paper argues that the problem with standards does not lie so much in standards themselves as in how they are viewed by those who make them and those who are held…

  16. [The environment and health. Of the difficulty of reconciling environmental and health standards in cultural nature].

    PubMed

    Mittelstrass, J

    1989-09-15

    Scientific cultures, i.e. modern industrial societies, create their own environment. The expression denoting such a creation is a Kultur-Natur ('cultural nature') determined by environmental and health standards. These standards are neither natural laws nor can they be derived from nature. They are instead a part of human rationality. They also have an ethical dimension. The argument focuses on the following aspects: (scientific and technological) rationality as problem solver and problem producer, exploration of the concept of the Kultur-Natur, the status of environmental and health standards, presenting the case for the concept of rational ethics (Vernunftethik) against the concept of ecological ethics and the supplementation of a research imperative by an ethical imperative.

  17. Ethics and choosing appropriate means to an end: problems with coal mine and nuclear workplace safety.

    PubMed

    Shrader-Frechette, Kristin; Cooke, Roger

    2004-02-01

    A common problem in ethics is that people often desire an end but fail to take the means necessary to achieve it. Employers and employees may desire the safety end mandated by performance standards for pollution control, but they may fail to employ the means, specification standards, necessary to achieve this end. This article argues that current (de jure) performance standards, for lowering employee exposures to ionizing radiation, fail to promote de facto worker welfare, in part because employers and employees do not follow the necessary means (practices known as specification standards) to achieve the end (performance standards) of workplace safety. To support this conclusion, the article argues that (1) safety requires attention to specification, as well as performance, standards; (2) coal-mine specification standards may fail to promote performance standards; (3) nuclear workplace standards may do the same; (4) choosing appropriate means to the end of safety requires attention to the ways uncertainties and variations in exposure may mask violations of standards; and (5) correcting regulatory inattention to differences between de jure and de facto is necessary for achievement of ethical goals for safety.

  18. Usability of HL7 and SNOMED CT standards in Java Persistence API environment.

    PubMed

    Antal, Gábor; Végh, Ádám Zoltán; Bilicki, Vilmos

    2014-01-01

Due to the need for efficient communication between the different stakeholders of healthcare (e.g. doctors, pharmacists, hospitals, patients), the need to integrate different healthcare systems arises. However, during the integration process several problems of heterogeneity can come up, which can turn integration into a difficult task. These problems motivated the development of healthcare information standards. The main goal of the HL7 family of standards is the standardization of communication between clinical systems and the unification of clinical document formats on the structural level. The SNOMED CT standard aims at the unification of healthcare terminology, i.e. the development of a standard on the lexical level. The goal of this article is to introduce the usability of these two standards in a Java Persistence API (JPA) environment, and to examine how standard-based system components can be efficiently generated. First, we briefly introduce the structure of the standards and their advantages and disadvantages. Then we present an architecture design method that can help eliminate possible structural drawbacks of the standards and makes code-generating tools applicable for the automatic production of certain system components.

  19. Keep It in Proportion.

    ERIC Educational Resources Information Center

    Snider, Richard G.

    1985-01-01

    The ratio factors approach involves recognizing a given fraction, then multiplying so that units cancel. This approach, which is grounded in concrete operational thinking patterns, provides a standard for science ratio and proportion problems. Examples are included for unit conversions, mole problems, molarity, speed/density problems, and…
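The ratio-factors approach can be mechanized: multiply by conversion fractions, each equal to one, chosen so unwanted units cancel step by step. A minimal sketch (the helper name and example numbers are illustrative, not from the article):

```python
def convert(value, factors):
    """Multiply a value by a chain of (numerator, denominator) ratio factors.

    Each factor is a fraction equal to 1 when units are attached, e.g.
    (60 s / 1 min), chosen so the unwanted unit cancels at each step.
    """
    for num, den in factors:
        value *= num / den
    return value

# Unit conversion: how many seconds in 2.5 hours?
# 2.5 h * (60 min / 1 h) * (60 s / 1 min)
seconds = convert(2.5, [(60, 1), (60, 1)])

# Mole problem: grams of water -> moles -> molecules
# 36.04 g H2O * (1 mol / 18.02 g) * (6.022e23 molecules / 1 mol)
molecules = convert(36.04, [(1, 18.02), (6.022e23, 1)])
```

Writing each step as an explicit fraction is exactly the "units cancel" discipline the article grounds in concrete operational thinking.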

  20. External Standards or Standard Addition? Selecting and Validating a Method of Standardization

    NASA Astrophysics Data System (ADS)

    Harvey, David T.

    2002-05-01

    A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
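The contrast the experiment teaches can be sketched numerically. In the single-point form below, standard addition cancels a matrix effect that biases an external calibration; the simplification of a negligible spike volume is our assumption (a full treatment corrects for dilution), and all names and numbers are illustrative:

```python
def external_standard(signal_sample, slope, intercept=0.0):
    """Concentration from an external calibration line S = slope*C + intercept."""
    return (signal_sample - intercept) / slope

def standard_addition(signal_sample, signal_spiked, conc_added):
    """Single-point standard addition, assuming negligible spike volume.

    conc_added is the concentration contributed by the spike; a linear
    response through the origin is assumed, so the (unknown) matrix-affected
    sensitivity cancels in the ratio.
    """
    return signal_sample * conc_added / (signal_spiked - signal_sample)

# Suppose the sample matrix suppresses sensitivity by 20%: true k = 0.8,
# while the external calibration was built in clean solvent with k = 1.0.
k, c_true = 0.8, 5.0
s_sample = k * c_true                  # signal of the sample alone
s_spiked = k * (c_true + 2.0)          # after spiking 2.0 concentration units
c_ext = external_standard(s_sample, slope=1.0)          # biased by the matrix
c_std_add = standard_addition(s_sample, s_spiked, 2.0)  # matrix effect cancels
```

The external-standard result is low by exactly the matrix suppression factor, while standard addition recovers the true concentration, which is the matrix-effect lesson of the experiment.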

  1. Faculty Perspectives on International Accounting Topics.

    ERIC Educational Resources Information Center

    Smith, L. Murphy; Salter, Stephen B.

    1996-01-01

    A survey of 63 professors specializing in international accounting identified the following topics as most important to incorporate into the curriculum: (1) foreign currency translation; (2) international accounting standards; (3) comparative standards and harmonizing of accounting standards; (4) reporting and disclosure problems of multinational…

  2. NHEXAS PHASE I MARYLAND STUDY--STANDARD OPERATING PROCEDURE FOR PROBLEM MANAGEMENT (G06)

    EPA Science Inventory

    The purpose of this SOP is to describe problem management, and to define a set of reporting actions to be taken in the event of a problem during any phase of the study. This procedure outlines the steps for making a problem known in order that it may be systematically resolved b...

  3. Neurons and the Process Standards

    ERIC Educational Resources Information Center

    Zambo, Ron; Zambo, Debby

    2011-01-01

    The classic Chickens and Pigs problem is considered to be an algebraic problem with two equations and two unknowns. In this article, the authors describe how third-grade teacher Maria is using it to develop a problem-based lesson because she is looking to her students' future needs. As Maria plans, she considers how a series of problems with the…

  4. Epistemic Beliefs about Justification Employed by Physics Students and Faculty in Two Different Problem Contexts

    ERIC Educational Resources Information Center

    Mercan, Fatih Caglayan

    2012-01-01

    This study examines the epistemic beliefs about justification employed by physics undergraduate and graduate students and faculty in the context of solving a standard classical physics problem and a frontier physics problem. Data were collected by a think-aloud problem solving session followed by a semi-structured interview conducted with 50…

  5. A Flipped Pedagogy for Expert Problem Solving

    NASA Astrophysics Data System (ADS)

    Pritchard, David

The internet provides free learning opportunities for declarative (Wikipedia, YouTube) and procedural (Khan Academy, MOOCs) knowledge, challenging colleges to provide learning at a higher cognitive level. Our "Modeling Applied to Problem Solving" pedagogy for Newtonian Mechanics imparts strategic knowledge - how to systematically determine which concepts to apply and why. Declarative and procedural knowledge is learned online before class via an e-text, checkpoint questions, and homework on edX.org (see http://relate.mit.edu/physicscourse); it is organized into five Core Models. Instructors then coach students on simple "touchstone problems", novel exercises, and multi-concept problems - meanwhile exercising three of the four C's: communication, collaboration, and critical thinking and problem solving. Students showed 1.2 standard deviations of improvement on the MIT final exam after three weeks of instruction, a significant positive shift in 7 of the 9 categories of the CLASS, and their grades improved by 0.5 standard deviation in their following physics course (Electricity and Magnetism).

  6. Iterative algorithms for a non-linear inverse problem in atmospheric lidar

    NASA Astrophysics Data System (ADS)

    Denevi, Giulia; Garbarino, Sara; Sorrentino, Alberto

    2017-08-01

We consider the inverse problem of retrieving aerosol extinction coefficients from Raman lidar measurements. In this problem the unknown and the data are related through the exponential of a linear operator, the unknown is non-negative, and the data follow the Poisson distribution. Standard methods work on the log-transformed data and solve the resulting linear inverse problem, but neglect the noise statistics. In this study we show that proper modelling of the noise distribution can substantially improve the quality of the reconstructed extinction profiles. To achieve this goal, we consider the non-linear inverse problem with a non-negativity constraint and propose two iterative algorithms derived using the Karush-Kuhn-Tucker conditions. We validate the algorithms with synthetic and experimental data. As expected, the proposed algorithms outperform standard methods in terms of sensitivity to noise and reliability of the estimated profile.
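The flavor of a KKT-derived multiplicative update under Poisson noise and a non-negativity constraint can be illustrated with the classic ML-EM (Richardson-Lucy) iteration for the simpler linear model y ~ Poisson(Ax). This is not the authors' algorithm (their forward model involves the exponential of the linear operator), just the textbook special case:

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Multiplicative ML-EM update for y ~ Poisson(A @ x), x >= 0.

    The KKT conditions of the Poisson negative log-likelihood with a
    non-negativity constraint yield the fixed point
        x = x * A^T(y / (A x)) / (A^T 1),
    which is iterated directly; positivity is preserved automatically.
    """
    x = np.ones(A.shape[1])
    norm = A.T @ np.ones(A.shape[0])               # A^T 1
    for _ in range(n_iter):
        ratio = y / np.clip(A @ x, 1e-12, None)    # guard against division by 0
        x *= (A.T @ ratio) / norm
    return x

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(40, 8))
x_true = rng.uniform(0.5, 2.0, size=8)
y = A @ x_true                                      # noise-free data for the sketch
x_hat = mlem(A, y)
```

Because the update is multiplicative, a non-negative start stays non-negative at every iteration, which is exactly what the constraint in the KKT system demands.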

  7. 75 FR 22291 - Safety Standard for Toddler Beds

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-28

    ... the next most commonly reported problems. However, only two injuries--one laceration and one ingestion--resulted from these problems. Product integrity issues, mostly integrity of the mattress-support, were the... assembly instructions because consumer misassembly has been a problem with similar products, such as cribs...

  8. MOTOR VEHICLE SAFETY: NHTSA’s Ability to Detect and Recall Defective Replacement Crash Parts Is Limited

    DTIC Science & Technology

    2001-01-01

Under the used-vehicle provision, NHTSA has not developed standards for vehicles that might incorporate airbags, because it has not identified significant problems with occupant restraint systems in such vehicles.

  9. Development of Learning Models Based on Problem Solving and Meaningful Learning Standards by Expert Validity for Animal Development Course

    NASA Astrophysics Data System (ADS)

    Lufri, L.; Fitri, R.; Yogica, R.

    2018-04-01

The purpose of this study is to produce a learning model based on problem solving and meaningful learning standards, validated by experts, for the Animal Development course. This is development research producing a learning model consisting of two sub-products: the syntax of the learning model and student worksheets. All of these products are standardized through expert validation. The research data are the validity levels of the sub-products, obtained using questionnaires filled in by validators from various fields of expertise (subject matter, learning strategy, and language). Data were analysed using descriptive statistics. The results show that the problem-solving and meaningful-learning model has been produced; the sub-products declared appropriate by the experts include the syntax of the learning model and the student worksheet.

  10. An improved random walk algorithm for the implicit Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keady, Kendra P., E-mail: keadyk@lanl.gov; Cleveland, Mathew A.

In this work, we introduce a modified Implicit Monte Carlo (IMC) Random Walk (RW) algorithm, which increases simulation efficiency for multigroup radiative transfer problems with strongly frequency-dependent opacities. To date, the RW method has only been implemented in “fully-gray” form; that is, the multigroup IMC opacities are group-collapsed over the full frequency domain of the problem to obtain a gray diffusion problem for RW. This formulation works well for problems with large spatial cells and/or opacities that are weakly dependent on frequency; however, the efficiency of the RW method degrades when the spatial cells are thin or the opacities are a strong function of frequency. To address this inefficiency, we introduce a RW frequency group cutoff in each spatial cell, which divides the frequency domain into optically thick and optically thin components. In the modified algorithm, opacities for the RW diffusion problem are obtained by group-collapsing IMC opacities below the frequency group cutoff. Particles with frequencies above the cutoff are transported via standard IMC, while particles below the cutoff are eligible for RW. This greatly increases the total number of RW steps taken per IMC time-step, which in turn improves the efficiency of the simulation. We refer to this new method as Partially-Gray Random Walk (PGRW). We present numerical results for several multigroup radiative transfer problems, which show that the PGRW method is significantly more efficient than standard RW for several problems of interest. In general, PGRW decreases runtimes by a factor of ∼2–4 compared to standard RW, and a factor of ∼3–6 compared to standard IMC. While PGRW is slower than frequency-dependent Discrete Diffusion Monte Carlo (DDMC), it is also easier to adapt to unstructured meshes and can be used in spatial cells where DDMC is not applicable. This suggests that it may be optimal to employ both DDMC and PGRW in a single simulation.
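The per-cell partial group collapse described above can be sketched schematically. The cutoff selection and spectral weighting in the actual code are more involved; here the weights and the weighted-mean collapse are assumptions for illustration only:

```python
def partial_gray_collapse(opacities, weights, cutoff):
    """Collapse groups below a cutoff index into one gray opacity (PGRW sketch).

    opacities[g], weights[g]: per-group opacity and spectral weight in a cell.
    Groups g < cutoff (the optically thick, low-frequency part here) feed the
    gray diffusion problem used for random walk; groups g >= cutoff remain
    with standard IMC transport. A weighted mean stands in for the actual
    group-collapse procedure.
    """
    thick = range(cutoff)
    wsum = sum(weights[g] for g in thick)
    gray_opacity = sum(weights[g] * opacities[g] for g in thick) / wsum
    imc_groups = list(range(cutoff, len(opacities)))
    return gray_opacity, imc_groups

# A strongly frequency-dependent opacity spectrum over five groups.
opacities = [50.0, 30.0, 10.0, 0.5, 0.1]
weights = [0.1, 0.2, 0.3, 0.3, 0.1]
gray_opacity, imc_groups = partial_gray_collapse(opacities, weights, cutoff=3)
```

The point of the split is visible in the numbers: collapsing only the thick groups keeps the gray opacity representative of the diffusive part of the spectrum instead of being diluted by the thin groups.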

  11. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
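BER expresses every value as a tag-length-value (TLV) triplet, which is how it achieves machine-independent data representation. A minimal sketch for non-negative INTEGERs (universal tag 0x02, short-form length octets only; negative values and long-form lengths are omitted):

```python
def ber_encode_integer(n):
    """BER-encode a non-negative INTEGER as a tag-length-value triplet.

    Tag 0x02 (universal INTEGER), short-form length octet, and big-endian
    two's-complement content; sizing the content as (bit_length + 8) // 8
    bytes keeps the top bit clear, so the value reads as positive.
    """
    if n < 0:
        raise ValueError("this sketch handles non-negative integers only")
    content = n.to_bytes((n.bit_length() + 8) // 8, "big")
    if len(content) > 127:
        raise ValueError("this sketch uses short-form lengths only")
    return bytes([0x02, len(content)]) + content
```

For example, the INTEGER 128 needs a leading zero octet (otherwise the high bit would flag it as negative), yielding the bytes 02 02 00 80; this explicit, byte-level definition is what lets disparate hosts agree on the data regardless of native word size or endianness.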

  12. The SPH consistency problem and some astrophysical applications

    NASA Astrophysics Data System (ADS)

    Klapp, Jaime; Sigalotti, Leonardo; Rendon, Otto; Gabbasov, Ruslan; Torres, Ayax

    2017-11-01

We discuss the SPH kernel and particle consistency problem and demonstrate that SPH has a limiting second-order convergence rate. We also present a solution to the SPH consistency problem, and give examples of how SPH implementations that are not mathematically consistent may lead to erroneous results. The new formalism has been implemented in the Gadget 2 code, including an improved scheme for the artificial viscosity. We present results for the "Standard Isothermal Test Case" of gravitational collapse and fragmentation of protostellar molecular cores, which produce a very different evolution than the standard SPH theory. A further application, accretion onto a black hole, is also presented.
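The consistency problem in question concerns the standard SPH particle estimate of a field,

```latex
\langle A(\mathbf r)\rangle \;\approx\; \sum_b \frac{m_b}{\rho_b}\, A_b\, W(\mathbf r-\mathbf r_b, h),
```

for which zeroth- and first-order particle consistency require

```latex
\sum_b \frac{m_b}{\rho_b}\, W(\mathbf r-\mathbf r_b, h) = 1,
\qquad
\sum_b \frac{m_b}{\rho_b}\,(\mathbf r_b-\mathbf r)\, W(\mathbf r-\mathbf r_b, h) = \mathbf 0 .
```

Disordered particle distributions generally violate these discrete conditions even when the continuous kernel satisfies them, which is the source of the errors the abstract describes.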

  13. An assessment of RELAP5-3D using the Edwards-O'Brien Blowdown problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomlinson, E.T.; Aumiller, D.L.

    1999-07-01

    The RELAP5-3D (version bt) computer code was used to assess the United States Nuclear Regulatory Commission's Standard Problem 1 (Edwards-O'Brien Blowdown Test). The RELAP5-3D standard installation problem based on the Edwards-O'Brien Blowdown Test was modified to model the appropriate initial conditions and to represent the proper location of the instruments present in the experiment. The results obtained using the modified model are significantly different from the original calculation indicating the need to model accurately the experimental conditions if an accurate assessment of the calculational model is to be obtained.

  14. Selection and properties of alternative forming fluids for TRISO fuel kernel production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, M. P.; King, J. C.; Gorman, B. P.

    2013-01-01

Current Very High Temperature Reactor (VHTR) designs incorporate TRi-structural ISOtropic (TRISO) fuel, which consists of a spherical fissile fuel kernel surrounded by layers of pyrolytic carbon and silicon carbide. An internal sol-gel process forms the fuel kernel using wet chemistry to produce uranium oxyhydroxide gel spheres by dropping a cold precursor solution into a hot column of trichloroethylene (TCE). Over time, gelation byproducts inhibit complete gelation, and the TCE must be purified or discarded. The resulting TCE waste stream contains both radioactive and hazardous materials and is thus considered a mixed hazardous waste. Changing the forming fluid to a non-hazardous alternative could greatly improve the economics of TRISO fuel kernel production. Selection criteria for a replacement forming fluid narrowed a list of ~10,800 chemicals to yield ten potential replacement forming fluids: 1-bromododecane, 1-bromotetradecane, 1-bromoundecane, 1-chlorooctadecane, 1-chlorotetradecane, 1-iododecane, 1-iodododecane, 1-iodohexadecane, 1-iodooctadecane, and squalane. The density, viscosity, and surface tension for each potential replacement forming fluid were measured as a function of temperature between 25 °C and 80 °C. Calculated settling velocities and heat transfer rates give an overall column height approximation. 1-bromotetradecane, 1-chlorooctadecane, and 1-iodododecane show the greatest promise as replacements, and future tests will verify their ability to form satisfactory fuel kernels.
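The settling-velocity piece of the column-height estimate can be sketched with Stokes' law for creeping flow. This is a standard low-Reynolds-number approximation, presented here with illustrative property values rather than the measured data from the study:

```python
G = 9.81  # gravitational acceleration, m/s^2

def stokes_settling_velocity(d, rho_p, rho_f, mu):
    """Terminal velocity of a sphere in creeping flow (Stokes' law).

    v = g * d^2 * (rho_p - rho_f) / (18 * mu); valid only for
    particle Reynolds numbers well below 1.
    d: droplet diameter [m]; rho_p, rho_f: droplet and fluid density
    [kg/m^3]; mu: fluid dynamic viscosity [Pa s].
    """
    return G * d**2 * (rho_p - rho_f) / (18.0 * mu)

# Illustrative values: a 0.5 mm gel droplet in a moderately viscous fluid.
v = stokes_settling_velocity(d=5.0e-4, rho_p=1500.0, rho_f=1000.0, mu=0.05)
```

Dividing the required residence time (set by the gelation heat-transfer rate) by this velocity gives the kind of overall column-height approximation the abstract mentions.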

  15. Enhancing the oxidation resistance of graphite by applying an SiC coat with crack healing at an elevated temperature

    NASA Astrophysics Data System (ADS)

    Park, Jae-Won; Kim, Eung-Seon; Kim, Jae-Un; Kim, Yootaek; Windes, William E.

    2016-08-01

    The potential of reducing the oxidation of the supporting graphite components during normal and/or accident conditions in the Very High Temperature Reactor (VHTR) design has been studied. In this work, efforts have been made to slow the oxidation of the graphite with a thin (∼10 μm) SiC coating. Upon heating at ≥ 1173 K in air, spallation and cracks formed in the dense, columnar-structured SiC coating layer grown on the graphite by functionally graded electron beam physical vapor deposition (EB-PVD). As these defects formed, the sample was vigorously oxidized, leaving only the SiC coating layer. Efforts were then made to heal the surface defects using additional EB-PVD with ion beam bombardment and chemical vapor deposition (CVD). The EB-PVD did not effectively heal the cracks, but the CVD was more effective for crack healing, likely due to its excellent crack-line filling capability with a high density and high aspect ratio. In the oxidation test with annealing at 1173 K, the CVD crack-healed sample took ∼34 min to reach a 20% weight loss, while the EB-PVD coated sample took ∼8 min; that is, the same weight reduction took ∼4 times longer in this experimental set-up.

  16. Selection and properties of alternative forming fluids for TRISO fuel kernel production

    NASA Astrophysics Data System (ADS)

    Baker, M. P.; King, J. C.; Gorman, B. P.; Marshall, D. W.

    2013-01-01

    Current Very High Temperature Reactor (VHTR) designs incorporate TRi-structural ISOtropic (TRISO) fuel, which consists of a spherical fissile fuel kernel surrounded by layers of pyrolytic carbon and silicon carbide. An internal sol-gel process forms the fuel kernel using wet chemistry to produce uranium oxyhydroxide gel spheres by dropping a cold precursor solution into a hot column of trichloroethylene (TCE). Over time, gelation byproducts inhibit complete gelation, and the TCE must be purified or discarded. The resulting TCE waste stream contains both radioactive and hazardous materials and is thus considered a mixed hazardous waste. Changing the forming fluid to a non-hazardous alternative could greatly improve the economics of TRISO fuel kernel production. Selection criteria for a replacement forming fluid narrowed a list of ˜10,800 chemicals to yield ten potential replacement forming fluids: 1-bromododecane, 1-bromotetradecane, 1-bromoundecane, 1-chlorooctadecane, 1-chlorotetradecane, 1-iododecane, 1-iodododecane, 1-iodohexadecane, 1-iodooctadecane, and squalane. The density, viscosity, and surface tension for each potential replacement forming fluid were measured as a function of temperature between 25 °C and 80 °C. Calculated settling velocities and heat transfer rates give an overall column height approximation. 1-bromotetradecane, 1-chlorooctadecane, and 1-iodododecane show the greatest promise as replacements, and future tests will verify their ability to form satisfactory fuel kernels.

  17. The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts.

    PubMed

    Coderre, Sylvain P; Harasym, Peter; Mandin, Henry; Fick, Gordon

    2004-11-05

    Pencil-and-paper examination formats, and specifically the standard five-option multiple-choice question, have often been questioned as a means for assessing higher-order clinical reasoning or problem solving. This study first investigated whether two paper formats with differing numbers of alternatives (standard five-option and extended-matching questions) can test problem-solving abilities. Second, the impact of the number of alternatives on psychometrics and problem-solving strategies was examined. Think-aloud protocols were collected to determine the problem-solving strategies used by experts and non-experts in answering gastroenterology questions across the two pencil-and-paper formats. The two formats demonstrated equal ability in testing problem-solving abilities, while the number of alternatives did not significantly impact psychometrics or the problem-solving strategies utilized. These results support the notion that well-constructed multiple-choice questions can in fact test higher-order clinical reasoning. Furthermore, it can be concluded that in testing clinical reasoning, the question stem, or content, remains more important than the number of alternatives.

  18. A bottom-up approach to the strong CP problem

    NASA Astrophysics Data System (ADS)

    Diaz-Cruz, J. L.; Hollik, W. G.; Saldana-Salazar, U. J.

    2018-05-01

    The strong CP problem is one of many puzzles in the theoretical description of elementary particle physics that still lacks an explanation. While top-down solutions to that problem usually comprise new symmetries or fields or both, we want to present a rather bottom-up perspective. The main problem seems to be how to achieve small CP violation in the strong interactions despite the large CP violation in weak interactions. In this paper, we show that with minimal assumptions on the structure of mass (Yukawa) matrices, they do not contribute to the strong CP problem and thus we can provide a pathway to a solution of the strong CP problem within the structures of the Standard Model and no extension at the electroweak scale is needed. However, to address the flavor puzzle, models based on minimal SU(3) flavor groups leading to the proposed flavor matrices are favored. Though we refrain from an explicit UV completion of the Standard Model, we provide a simple requirement for such models not to show a strong CP problem by construction.
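    For context, the quantity whose observed smallness constitutes the strong CP problem is conventionally written (this is the textbook expression, not a result derived in the cited paper) as

```latex
\bar{\theta} \;=\; \theta_{\mathrm{QCD}} \;+\; \arg\det\left(Y_{u}\,Y_{d}\right),
```

    where $\theta_{\mathrm{QCD}}$ is the coefficient of the QCD topological term and $Y_{u}$, $Y_{d}$ are the up- and down-type Yukawa matrices. Neutron electric dipole moment measurements bound $|\bar{\theta}|$ at roughly the $10^{-10}$ level; the paper's argument amounts to exhibiting Yukawa structures for which the $\arg\det$ contribution vanishes.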

  19. Solving Capacitated Closed Vehicle Routing Problem with Time Windows (CCVRPTW) using BRKGA with local search

    NASA Astrophysics Data System (ADS)

    Prasetyo, H.; Alfatsani, M. A.; Fauza, G.

    2018-05-01

    The main issue in the vehicle routing problem (VRP) is finding the shortest routes for product distribution from the depot to outlets so as to minimize the total cost of distribution. The Capacitated Closed Vehicle Routing Problem with Time Windows (CCVRPTW) is a variant of the VRP that accommodates vehicle capacity and the distribution period. Since the CCVRPTW is NP-hard, it requires an efficient and effective algorithm to solve. This study aimed to develop a Biased Random Key Genetic Algorithm (BRKGA) combined with local search to solve the CCVRPTW. The algorithm was then coded in MATLAB. Using numerical tests, optimum algorithm parameters were set and the method was compared with a heuristic method and standard BRKGA on a case study of soft drink distribution. Results showed that BRKGA combined with local search yielded a lower total distribution cost than the heuristic method. Moreover, the developed algorithm successfully improved on the performance of standard BRKGA.
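    The core BRKGA idea (not the authors' MATLAB implementation) is that each solution is encoded as a vector of random keys in [0, 1), and a problem-specific decoder maps keys to a feasible solution. A minimal sketch of such a decoder for a capacitated routing problem, with invented demand data, might look like this:

```python
import random

# Minimal sketch of the BRKGA decoding step for a capacitated routing
# problem: sorting the random keys yields a customer visiting order,
# which is then greedily split into capacity-feasible routes.
# The demand data and capacity below are invented for illustration.

demands = {1: 4, 2: 3, 3: 5, 4: 2, 5: 6}  # customer -> demand
capacity = 8                               # vehicle capacity

def decode(chromosome):
    """Map a vector of random keys (one per customer) to routes."""
    # Sort customers by their key value to get a giant tour.
    order = [c for _, c in sorted(zip(chromosome, demands))]
    routes, current, load = [], [], 0
    for c in order:
        if load + demands[c] > capacity:
            routes.append(current)        # open a new route
            current, load = [], 0
        current.append(c)
        load += demands[c]
    if current:
        routes.append(current)
    return routes

random.seed(42)
chromosome = [random.random() for _ in demands]
routes = decode(chromosome)
```

    In a full BRKGA, a population of such chromosomes evolves via biased crossover toward elite solutions, and the local search described in the paper would further improve each decoded set of routes.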

  20. [Geriatric assessment. Development, status quo and perspectives].

    PubMed

    Lüttje, D; Varwig, D; Teigel, B; Gilhaus, B

    2011-08-01

    Multimorbidity is typical for geriatric patients. Problems not identified in time may lead to increased hospitalisation or prolonged hospital stays. Problems of multimorbidity are not covered by most guidelines or clinical pathways. The geriatric assessment supplements standard clinical and technical assessment. Geriatric identification screening is a basic tool for general practitioners and emergency rooms to filter out those patients bearing a special risk. The geriatric basic assessment covers most of the problems relevant for people in old age, revealing even problems that had so far been hidden. It makes it possible to structure a comprehensive, holistic therapeutic approach and to evaluate the targets of treatment relevant for independent living and well-being. This results in a reduction of morbidity and mortality. Assessment tools focusing on pain, nutrition, and frailty should be added to the standardized geriatric basic assessment in Germany.

  1. Plasma equilibrium with fast ion orbit width, pressure anisotropy, and toroidal flow effects

    DOE PAGES

    Gorelenkov, Nikolai N.; Zakharov, Leonid E.

    2018-04-27

    Here, we formulate the problem of tokamak plasma equilibrium including the toroidal flow and fast ion (or energetic particle, EP) pressure anisotropy and the finite drift orbit width (FOW) effects. The problem is formulated via the standard Grad-Shafranov equation (GShE) amended by the solvability condition which imposes physical constraints on allowed spatial dependencies of the anisotropic pressure. The GShE problem employs the pressure coupling scheme and includes the dominant diagonal terms and non-diagonal corrections to the standard pressure tensor. The anisotropic tensor elements are obtained via the distribution function represented in factorized form via the constants of motion. The considered effects on the plasma equilibrium are estimated analytically, where possible, to understand their importance for the GShE tokamak plasma problem.
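    For reference, the standard static, isotropic Grad-Shafranov equation that the paper amends reads, in its textbook form (the flow and anisotropy corrections derived in the paper are not reproduced here):

```latex
\Delta^{*}\psi \;\equiv\; R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial \psi}{\partial R}\right) + \frac{\partial^{2}\psi}{\partial Z^{2}}
\;=\; -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi},
```

    where $\psi(R,Z)$ is the poloidal flux function, $p(\psi)$ the scalar pressure, and $F(\psi) = R\,B_{\phi}$ the poloidal current function.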

  2. Plasma equilibrium with fast ion orbit width, pressure anisotropy, and toroidal flow effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorelenkov, Nikolai N.; Zakharov, Leonid E.

    Here, we formulate the problem of tokamak plasma equilibrium including the toroidal flow and fast ion (or energetic particle, EP) pressure anisotropy and the finite drift orbit width (FOW) effects. The problem is formulated via the standard Grad-Shafranov equation (GShE) amended by the solvability condition which imposes physical constraints on allowed spatial dependencies of the anisotropic pressure. The GShE problem employs the pressure coupling scheme and includes the dominant diagonal terms and non-diagonal corrections to the standard pressure tensor. The anisotropic tensor elements are obtained via the distribution function represented in factorized form via the constants of motion. The considered effects on the plasma equilibrium are estimated analytically, where possible, to understand their importance for the GShE tokamak plasma problem.

  3. NEW U.S. EPA STANDARDS AND PROBLEMS ASSOCIATED WITH MEASUREMENT OF POLLUTANTS: IMPLICATION FOR FILTER MANUFACTURERS

    EPA Science Inventory

    This presentation will describe the following items: important epidemiologic data establishing the need for new particulate matter standards, the size distribution of suspended particulate matter, epidemiologic data demonstrating the need for a fine particle standard, indicator a...

  4. USL/DBMS NASA/PC R and D project system design standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Moreau, Dennis R.

    1984-01-01

    A set of system design standards intended to assure the completeness and quality of designs developed for PC research and development projects is established. The standards presented address the areas of problem definition, initial design plan, design specification, and re-evaluation.

  5. Measurement standards for interdisciplinary medical rehabilitation.

    PubMed

    Johnston, M V; Keith, R A; Hinderer, S R

    1992-12-01

    Rehabilitation must address problems inherent in the measurement of human function and health-related quality of life, as well as problems in diagnosis and measurement of impairment. This educational document presents an initial set of standards to be used as guidelines for development and use of measurement and evaluation procedures and instruments for interdisciplinary, health-related rehabilitation. Part I covers general measurement principles and technical standards, beginning with validity, the central consideration for use of measures. Subsequent sections focus on reliability and errors of measurement, norms and scaling, development of measures, and technical manuals and guides. Part II covers principles and standards for use of measures. General principles of application of measures in practice are discussed first, followed by standards to protect persons being measured and then by standards for administrative applications. Many explanations, examples, and references are provided to help professionals understand measurement principles. Improved measurement will ensure the basis of rehabilitation as a science and nourish its success as a clinical service.

  6. Review of USGS Open-file Report 95-525 ("Cartographic and digital standard for geologic map information") and plans for development of Federal draft standards for geologic map information

    USGS Publications Warehouse

    Soller, David R.

    1996-01-01

    This report summarizes a technical review of USGS Open-File Report 95-525, 'Cartographic and Digital Standard for Geologic Map Information' and OFR 95-526 (diskettes containing digital representations of the standard symbols). If you are considering the purchase or use of those documents, you should read this report first. For some purposes, OFR 95-525 (the printed document) will prove to be an excellent resource. However, technical review identified significant problems with the two documents that will be addressed by various Federal and State committees composed of geologists and cartographers, as noted below. Therefore, the 2-year review period noted in OFR 95-525 is no longer applicable. Until those problems are resolved and formal standards are issued, you may consult the following World-Wide Web (WWW) site which contains information about development of geologic map standards: URL: http://ncgmp.usgs.gov/ngmdbproject/home.html

  7. Gravitational Field as a Pressure Force from Logarithmic Lagrangians and Non-Standard Hamiltonians: The Case of Stellar Halo of Milky Way

    NASA Astrophysics Data System (ADS)

    El-Nabulsi, Rami Ahmad

    2018-03-01

    Recently, the notion of non-standard Lagrangians has been discussed widely in the literature in an attempt to explore the inverse variational problem of nonlinear differential equations. Different forms of non-standard Lagrangians have been introduced in the literature and have revealed nice mathematical and physical properties. One interesting form related to the inverse variational problem is the logarithmic Lagrangian, which has a number of motivating features related to the Liénard-type and Emden nonlinear differential equations. Such Lagrangians lead to nonlinear dynamics based on non-standard Hamiltonians. In this communication, we show that some new dynamical properties are obtained in stellar dynamics if standard Lagrangians are replaced by logarithmic Lagrangians and their corresponding non-standard Hamiltonians. One interesting consequence is the emergence of an extra pressure term, which is related to the gravitational field, suggesting that gravitation may act as a pressure in a strong gravitational field. The case of the stellar halo of the Milky Way is considered.

  8. Introduction to Problem Solving, Grades 6-8 [with CD-ROM]. The Math Process Standards, Grades 6-8 Series

    ERIC Educational Resources Information Center

    Schackow, Joy Bronston; O'Connell, Susan

    2008-01-01

    The National Council of Teachers of Mathematics' (NCTM's) Process Standards support teaching that helps students develop independent, effective mathematical thinking. The books in the Heinemann Math Process Standards Series give every middle grades math teacher the opportunity to explore each standard in depth. The series offers friendly,…

  9. Cataloguing Standards; The Report of the Canadian Task Group on Cataloguing Standards.

    ERIC Educational Resources Information Center

    National Library of Canada, Ottawa (Ontario).

    Following the recommendations of the National Conference on Cataloguing Standards held at the National Library of Canada in May 1970, a Canadian Task Group on Cataloguing Standards was set up to study and identify present deficiencies in the organizing and processing of Canadian material, and the cataloging problems of Canadian libraries, and to…

  10. SODA FOUNTAIN-LUNCHEONETTE EQUIPMENT AND APPURTENANCES. NATIONAL SANITATION FOUNDATION STANDARD NO. 1.

    ERIC Educational Resources Information Center

    National Sanitation Foundation, Ann Arbor, MI.

    THIS STANDARD OF SODA FOUNTAIN-LUNCHEONETTE EQUIPMENT IS THE FIRST IN A SERIES OF NATIONAL SANITATION FOUNDATION STANDARDS. THESE STANDARDS ARE ISSUED IN RECOGNITION OF THE LONG FELT NEED FOR A COMMON UNDERSTANDING OF THE PROBLEMS OF SANITATION INVOLVING INDUSTRIAL AND ADMINISTRATIVE HEALTH OFFICIALS WHOSE OBLIGATION IT IS TO ENFORCE REGULATIONS.…

  11. Effect of Causal Stories in Solving Mathematical Story Problems

    ERIC Educational Resources Information Center

    Smith, Glenn Gordon; Gerretson, Helen; Olkun, Sinan; Joutsenlahti, Jorma

    2010-01-01

    This study investigated whether infusing "causal" story elements into mathematical word problems improves student performance. In one experiment in the USA and a second in the USA, Finland, and Turkey, undergraduate elementary education majors worked word problems in three formats: 1) standard (minimal verbiage), 2) potential causation…

  12. Ethical Principles, Practices, and Problems in Higher Education.

    ERIC Educational Resources Information Center

    Baca, M. Carlota, Ed.; Stein, Ronald H., Ed.

    Eighteen professionals analyze the ethical principles, practices, and problems in institutions of higher learning by examining the major issues facing higher education today. Focusing on ethical standards and judgements that affect decision-making and problem-solving, the contributors review the rights and responsibilities of academic freedom,…

  13. Peer Victimization as a Mediator of the Relation between Facial Attractiveness and Internalizing Problems

    ERIC Educational Resources Information Center

    Rosen, Lisa H.; Underwood, Marion K.; Beron, Kurt J.

    2011-01-01

    This study examined the relations among facial attractiveness, peer victimization, and internalizing problems in early adolescence. We hypothesized that experiences of peer victimization would partially mediate the relationship between attractiveness and internalizing problems. Ratings of attractiveness were obtained from standardized photographs…

  14. Theory of wide-angle photometry from standard stars

    NASA Technical Reports Server (NTRS)

    Usher, Peter D.

    1989-01-01

    Wide-angle celestial structures, such as bright comet tails and nearby galaxies and clusters of galaxies, rely on photographic methods for quantified morphology and photometry, primarily because electronic devices with comparable resolution and sky coverage are beyond current technological capability. The problem of the photometry of extended structures, and of how this problem may be overcome through calibration by photometric standard stars, is examined. The perfect properties of the ideal field of view are stated in the guise of a radiometric paraxial approximation, in the hope that fields of view of actual telescopes will conform. Fundamental radiometric concepts are worked through before the issue of atmospheric attenuation is addressed. The independence of observed atmospheric extinction and surface brightness leads off the quest for formal solutions to the problem of surface photometry. Methods and problems of solution are discussed. The spectre is confronted in the spirit of standard stars and shown to be chimerical in that light, provided certain rituals are adopted. After a brief discussion of Baker-Sampson polynomials and the vexing issue of saturation, a pursuit is made of actual numbers to be expected in real cases. While the numbers crunched are gathered ex nihilo, they demonstrate the feasibility of Newton's method in the solution of this overdetermined, nonlinear, least-squares, multiparametric, photometric problem.
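    The closing remark about Newton's method for an overdetermined nonlinear least-squares problem can be illustrated with a generic Gauss-Newton iteration. The toy model y = a·exp(b·x) below is an assumption chosen for brevity, not the paper's actual photometric model:

```python
import math

# Generic Gauss-Newton sketch for an overdetermined nonlinear
# least-squares fit, here to the toy model y = a*exp(b*x).
# This illustrates the numerical machinery only; it is not the
# photometric calibration model of the cited paper.

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
a_true, b_true = 2.0, -0.7
ys = [a_true * math.exp(b_true * x) for x in xs]  # noise-free toy data

def gauss_newton(a, b, iters=20):
    for _ in range(iters):
        # Residuals r_i = y_i - a*exp(b*x_i); Jacobian entries
        # dr/da = -exp(b*x), dr/db = -a*x*exp(b*x).
        J11 = J12 = J22 = g1 = g2 = 0.0
        for x, y in zip(xs, ys):
            e = math.exp(b * x)
            r = y - a * e
            da, db = -e, -a * x * e
            J11 += da * da; J12 += da * db; J22 += db * db
            g1 += da * r;  g2 += db * r
        # Solve the 2x2 normal equations (J^T J) delta = -J^T r.
        det = J11 * J22 - J12 * J12
        d1 = (-g1 * J22 + g2 * J12) / det
        d2 = (-g2 * J11 + g1 * J12) / det
        a, b = a + d1, b + d2
    return a, b

a_fit, b_fit = gauss_newton(1.0, -0.3)  # converges to (2.0, -0.7)
```

    With seven data points and two parameters the system is overdetermined, as in the photometric case; a real application would add many more parameters and a damping strategy such as Levenberg-Marquardt for robustness.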

  15. Improving data quality in the linked open data: a survey

    NASA Astrophysics Data System (ADS)

    Hadhiatma, A.

    2018-03-01

    The Linked Open Data (LOD) is a “web of data”, a different paradigm from the “web of documents” commonly used today. However, the huge LOD still suffers from data quality problems such as incompleteness, inconsistency, and inaccuracy. Data quality problems relate to designing effective methods both to manage and to retrieve information at various data quality levels. Based on a review of papers and journals, addressing data quality requires standards for (1) identifying data quality problems, (2) assessing data quality for a given context, and (3) correcting data quality problems. However, most methods and strategies dealing with LOD data quality have not taken an integrative approach. Hence, based on those standards and an integrative approach, there are opportunities to improve LOD data quality in terms of incompleteness, inaccuracy, and inconsistency with respect to its schema and ontology, namely through ontology refinement. Ontology refinement here means not only improving data quality but also enriching the LOD. Therefore, what is needed is (1) a standard for data quality assessment and evaluation that is more appropriate to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.

  16. 40 CFR 92.11 - Compliance with emission standards in extraordinary circumstances.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standards in extraordinary circumstances. The provisions of this section are intended to address problems... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Compliance with emission standards in extraordinary circumstances. 92.11 Section 92.11 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  17. 40 CFR 63.5910 - What reports must I submit and when?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production Notifications, Reports, and... period into those that are due to startup, shutdown, control equipment problems, process problems, other...

  18. 40 CFR 63.5910 - What reports must I submit and when?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production Notifications, Reports, and... period into those that are due to startup, shutdown, control equipment problems, process problems, other...

  19. 42 CFR 493.1451 - Standard: Technical supervisor responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... testing samples; and (vi) Assessment of problem solving skills; and (9) Evaluating and documenting the... analysis and reporting of test results; (5) Resolving technical problems and ensuring that remedial actions...

  20. Literacy, Numeracy, and Problem Solving in Technology-Rich Environments among U.S. Adults: Results from the Program for the International Assessment of Adult Competencies 2012. Appendix D: Standard Error Tables. First Look. NCES 2014-008

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2013

    2013-01-01

    This paper provides Appendix D, Standard Error tables, for the full report, entitled "Literacy, Numeracy, and Problem Solving in Technology-Rich Environments among U.S. Adults: Results from the Program for the International Assessment of Adult Competencies 2012. First Look. NCES 2014-008." The full report presents results of the Program…

  1. Visual field defects may not affect safe driving.

    PubMed

    Dow, Jamie

    2011-10-01

    In Quebec a driver whose acquired visual field defect renders them ineligible for a driver's permit renewal may request an exemption from the visual field standard by demonstrating safe driving despite the defect. For safety reasons it was decided to attempt to identify predictors of failure on the road test in order to avoid placing driving evaluators in potentially dangerous situations when evaluating drivers with visual field defects. During a 4-month period in 2009 all requests for exemptions from the visual field standard were collected and analyzed. All available medical and visual field data were collated for 103 individuals, of whom 91 successfully completed the evaluation process and obtained a waiver. The collated data included age, sex, type of visual field defect, visual field characteristics, and concomitant medical problems. No single factor, or combination of factors, could predict failure of the road test. All 5 failures of the road test had cognitive problems but 6 of the successful drivers also had known cognitive problems. Thus, cognitive problems influence the risk of failure but do not predict certain failure. Most of the applicants for an exemption were able to complete the evaluation process successfully, thereby demonstrating safe driving despite their handicap. Consequently, jurisdictions that have visual field standards for their driving permit should implement procedures to evaluate drivers with visual field defects that render them unable to meet the standard but who wish to continue driving.

  2. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration

    PubMed Central

    Doss, Hani; Tan, Aixin

    2017-01-01

    In the classical biased sampling problem, we have k densities π1(·), …, πk(·), each known up to a normalizing constant, i.e. for l = 1, …, k, πl(·) = νl(·)/ml, where νl(·) is a known function and ml is an unknown constant. For each l, we have an iid sample from πl, and the problem is to estimate the ratios ml/ms for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the πl’s are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case. PMID:28706463

  3. Estimates and Standard Errors for Ratios of Normalizing Constants from Multiple Markov Chains via Regeneration.

    PubMed

    Doss, Hani; Tan, Aixin

    2014-09-01

    In the classical biased sampling problem, we have k densities π1(·), …, πk(·), each known up to a normalizing constant, i.e. for l = 1, …, k, πl(·) = νl(·)/ml, where νl(·) is a known function and ml is an unknown constant. For each l, we have an iid sample from πl, and the problem is to estimate the ratios ml/ms for all l and all s. This problem arises frequently in several situations in both frequentist and Bayesian inference. An estimate of the ratios was developed and studied by Vardi and his co-workers over two decades ago, and there has been much subsequent work on this problem from many different perspectives. In spite of this, there are no rigorous results in the literature on how to estimate the standard error of the estimate. We present a class of estimates of the ratios of normalizing constants that are appropriate for the case where the samples from the πl's are not necessarily iid sequences, but are Markov chains. We also develop an approach based on regenerative simulation for obtaining standard errors for the estimates of ratios of normalizing constants. These standard error estimates are valid for both the iid case and the Markov chain case.
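    The basic identity behind such ratio estimates can be sketched for the simple iid case: for X ~ πs, E[νl(X)/νs(X)] = ml/ms. The sketch below uses two unnormalized Gaussians chosen for illustration; the Markov-chain samples and regeneration-based standard errors treated in the paper are not implemented here:

```python
import math
import random

# Sketch of the identity E[nu_1(X)/nu_2(X)] = m_1/m_2 for X ~ pi_2,
# using iid draws. The two unnormalized densities below are invented
# for illustration; the cited paper handles Markov-chain samples and
# regeneration-based standard errors, which this sketch does not.

def nu1(x):
    # Unnormalized N(0, 1) density: normalizing constant m_1 = sqrt(2*pi).
    return math.exp(-x * x / 2.0)

def nu2(x):
    # Unnormalized N(0, 4) density: normalizing constant m_2 = sqrt(8*pi).
    return math.exp(-x * x / 8.0)

random.seed(0)
n = 100000
draws = [random.gauss(0.0, 2.0) for _ in range(n)]  # iid sample from pi_2

# Monte Carlo estimate of m_1/m_2 = sqrt(2*pi)/sqrt(8*pi) = 0.5.
ratio_hat = sum(nu1(x) / nu2(x) for x in draws) / n
```

    For Markov-chain (non-iid) draws the point estimate has the same form, but valid standard errors require accounting for serial dependence, which is exactly where the regenerative simulation approach of the paper comes in.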

  4. The Dreaded "Work" Problems Revisited: Connections through Problem Solving from Basic Fractions to Calculus

    ERIC Educational Resources Information Center

    Shore, Felice S.; Pascal, Matthew

    2008-01-01

    This article describes several distinct approaches taken by preservice elementary teachers to solving a classic rate problem. Their approaches incorporate a variety of mathematical concepts, ranging from proportions to infinite series, and illustrate the power of all five NCTM Process Standards. (Contains 8 figures.)

  5. Activities: Activities to Introduce Maxima-Minima Problems.

    ERIC Educational Resources Information Center

    Pleacher, David

    1991-01-01

    Presented are student activities that involve two standard problems from geometry and calculus--the volume of a box and the bank shot on a pool table. Problem solving is emphasized as a method of inquiry and application with descriptions of the results using graphical, numerical, and physical models. (JJK)

  6. The Problem of Faculty Relocation.

    ERIC Educational Resources Information Center

    Tabachnick, Stephen E.

    1992-01-01

    A faculty move to a new campus can be traumatic, but colleges and universities can take steps to lessen the strain. Solutions to faculty relocation problems should be a standard part of any hiring package, not left to chance and individual negotiation. Some problems are inexpensive and easy to solve. (MSE)

  7. Child and Family Predictors of Therapy Outcome for Children with Behavioral and Emotional Problems

    ERIC Educational Resources Information Center

    Hemphill, Sheryl A.; Littlefield, Lyn

    2006-01-01

    This study investigated the characteristics of 106 children primarily referred for externalizing behavior problems and their families, and assessed the prediction of treatment outcome following a standardized short-term, cognitive behavioral group program. "Exploring Together" comprised a children's group (anger management, problem-solving and…

  8. [Research progress on standards of commodity classes of Chinese materia medica and discussion on several key problems].

    PubMed

    Yang, Guang; Zeng, Yan; Guo, Lan-Ping; Huang, Lu-Qi; Jin, Yan; Zheng, Yu-Guang; Wang, Yong-Yan

    2014-05-01

    Standards of commodity classes of Chinese materia medica are an important way to solve the "Lemons Problem" of the traditional Chinese medicine market. Standards of commodity classes are also helpful for rebuilding market mechanisms of "high price for good quality". The previous edition of the commodity class standards of Chinese materia medica was made 30 years ago and is no longer adapted to market demand. This article reviews progress on standards of commodity classes of Chinese materia medica. It argues that biological activity is a better choice than chemical constituents for standards of commodity classes of Chinese materia medica, and that the key point in setting such standards is finding the factors that distinguish "good quality" from "bad quality". The article also discusses the scope of commodity classes of Chinese materia medica and how to coordinate pharmacopoeia standards with commodity class standards. According to different demands, diverse standards can be used for commodity classes of Chinese materia medica, but efficacy is considered the most important index of a commodity standard. Decoction pieces can be included in standards of commodity classes of Chinese materia medica. The authors also formulated the standards of commodity classes of Notoginseng Radix as an example, and hope this study can have a positive, promotional effect on research related to the traditional Chinese medicine market.

  9. 40 CFR 63.4520 - What reports must I submit?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for Hazardous Air Pollutants for Surface Coating of Plastic Parts and Products Notifications... problems, process problems, other known causes, and other unknown causes. (xi) A summary of the total...

  10. 40 CFR 63.4520 - What reports must I submit?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Standards for Hazardous Air Pollutants for Surface Coating of Plastic Parts and Products Notifications... problems, process problems, other known causes, and other unknown causes. (xi) A summary of the total...

  11. Meeting the New AASL Standards for the 21st-Century Learner via Big6 Problem Solving

    ERIC Educational Resources Information Center

    Needham, Joyce

    2010-01-01

    "AASL Standards for the 21st-Century Learner." New standards for library media programs! What does it mean to practicing library media specialists? Does this mean they must abandon all the strategies, activities, and lessons they have developed based upon "Information Power's Information Literacy Standards for Student Learning" and create all new…

  12. User-generated quality standards for youth mental health in primary care: a participatory research design using mixed methods

    PubMed Central

    Graham, Tanya; Rose, Diana; Murray, Joanna; Ashworth, Mark; Tylee, André

    2014-01-01

Objectives To develop user-generated quality standards for young people with mental health problems in primary care using a participatory research model. Methods 50 young people aged 16–25 from community settings and primary care participated in focus groups and interviews about their views and experiences of seeking help for mental health problems in primary care, cofacilitated by young service users and repeated to ensure respondent validation. A second group of young people, also aged 16–25, who had sought help for any mental health problem from primary or secondary care within the last 5 years, were trained as focus group cofacilitators (n=12), developed the quality standards from the qualitative data, and participated in four nominal groups (n=28). Results 46 quality standards were developed and ranked by young service users. Agreement was defined as 100% of scores within a two-point region. Group consensus existed for 16 quality standards representing the following aspects of primary care: better advertising and information (three); improved competence through mental health training and skill mix within the practice (two); alternatives to medication (three); improved referral protocol (three); and specific questions and reassurances (five). Alternatives to medication and specific questions and reassurances are aspects of quality that have not been previously reported. Conclusions We have demonstrated the feasibility of using participatory research methods to develop user-generated quality standards. The development of patient-generated quality standards may offer a more formal method of incorporating the views of service users into quality improvement initiatives. This method can be adapted for generating quality standards applicable to other patient groups. PMID:24920648
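The consensus rule quoted in the abstract (all scores within a "two-point region") can be sketched as a simple check. The function name, and the reading of the region as two adjacent scale points, are assumptions for illustration only; the abstract does not define the region precisely:

```python
# Hypothetical reading of the consensus rule: a standard reaches group
# consensus when every ranking score falls inside a region spanning
# `width` adjacent points (assumed here to mean max - min <= width - 1).
def has_consensus(scores, width=2):
    return max(scores) - min(scores) <= width - 1

print(has_consensus([8, 9, 9]))  # spread of 1 -> True
print(has_consensus([6, 8, 9]))  # spread of 3 -> False
```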

  13. Primary Discussion on Standardized Management of Purchasing Large Equipments for Measurement Technology Institution

    NASA Astrophysics Data System (ADS)

    Hu, Chang; Hu, Juanli; Zhou, Qi; Yang, Yue

In view of the current situation and existing problems in equipment purchasing for measurement technology institutions, this paper analyzes the key factors that affect the standardization of equipment procurement and proposes a set of scientific, standardized solutions for equipment procurement based on actual work.

  14. Effect of Directed Study of Mathematics Vocabulary on Standardized Mathematics Assessment Questions

    ERIC Educational Resources Information Center

    Waite, Adel Marlane

    2017-01-01

    The problems under investigation included (a) Did a directed study of mathematics vocabulary significantly affect student performance levels on standardized mathematical questions? and (b) Did the strategies used in this study significantly affect student performance levels on standardized mathematical questions? The population consisted of…

  15. Choosing the Right Tool

    ERIC Educational Resources Information Center

    Boote, Stacy K.

    2016-01-01

    Students' success with fourth-grade content standards builds on mathematical knowledge learned in third grade and creates a conceptual foundation for division standards in subsequent grades that focus on the division algorithm. The division standards in fourth and fifth grade are similar; but in fourth grade, division problem divisors are only one…

  16. International Cooperation for a Single World Production Standard of High Definition Television.

    ERIC Educational Resources Information Center

    Hongcharu, Boonchai

    Broadcasters, television engineers and the production industry have encountered many problems with diverse television standards since the introduction of color television. With the advent of high definition television (HDTV), the chance to have a common production standard for international exchange of programs and technical information has…

  17. Supporting Mathematics Instruction through Community

    ERIC Educational Resources Information Center

    Amidon, Joel C.; Trevathan, Morgan L.

    2016-01-01

    Raising expectations is nothing new. Every iteration of standards elevates the expectations for what students should know and be able to do. The Common Core State Standards for Mathematics (CCSSM) is no exception, with standards for content and practice that move beyond memorization of traditional algorithms to "make sense of problems and…

  18. Planning Questions and Persevering in the Practices

    ERIC Educational Resources Information Center

    Gurl, Theresa J.; Fox, Ryan; Dabovic, Nikolina; Leavitt, Arielle Eager

    2016-01-01

    The implementation of the Common Core's Standards for Mathematical Practice can pose a challenge to all teachers of mathematics but especially to preservice teachers. These standards require teaching in a way that often differs from what preservice teachers have experienced as learners. Standard 1--"Make sense of problems and persevere in…

  19. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, where an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by a standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to those in other areas of research that lead to large-scale eigenvalue problems for matrices with, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
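A minimal numerical sketch of the projection idea the abstract describes, assuming a symmetric matrix with a strongly dominant diagonal; the matrix, subspace choice, and sizes below are illustrative, not taken from the paper:

```python
import numpy as np

# Illustrative diagonally dominant symmetric matrix (assumed test case).
rng = np.random.default_rng(0)
n = 200
A = np.diag(np.arange(1.0, n + 1)) + 0.01 * rng.standard_normal((n, n))
A = 0.5 * (A + A.T)  # symmetrize

# Project onto the subspace spanned by the k basis vectors with the smallest
# diagonal entries ("low-energy" degrees of freedom), then solve the small
# effective model with a standard dense eigensolver.
k = 20
idx = np.argsort(np.diag(A))[:k]
P = np.eye(n)[:, idx]            # n x k projector onto the retained subspace
A_eff = P.T @ A @ P              # k x k effective model
approx = np.linalg.eigvalsh(A_eff)[0]
exact = np.linalg.eigvalsh(A)[0]
print(abs(approx - exact))       # small when the diagonal dominates
```

With a strongly dominant diagonal, the neglected couplings to the discarded subspace perturb the lowest eigenvalue only weakly, which is the regime the abstract identifies.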

  20. Combinatorial algorithms for design of DNA arrays.

    PubMed

    Hannenhalli, Sridhar; Hubell, Earl; Lipshutz, Robert; Pevzner, Pavel A

    2002-01-01

Optimal design of DNA arrays requires the development of algorithms with two-fold goals: reducing the effects caused by unintended illumination (the border length minimization problem) and reducing the complexity of masks (the mask decomposition problem). We describe algorithms that reduce the number of rectangles in mask decomposition by 20-30% compared to a standard array design, under the assumption that the arrangement of oligonucleotides on the array is fixed. This algorithm produces a provably optimal solution for all studied real instances of array design. We also address the difficult problem of finding an arrangement which minimizes the border length and introduce a new idea of threading that significantly reduces the border length compared to standard designs.
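The border-length objective can be illustrated on a toy placement. This is a generic sketch of the quantity being minimized (for each synthesis step, count grid edges separating cells that differ at that position), not the authors' algorithm:

```python
# Toy border-length computation for probes of equal length placed on a grid.
# For each string position p (one synthesis step), count the grid edges whose
# two cells carry different characters at p; sum over all positions.
def border_length(grid):
    rows, cols = len(grid), len(grid[0])
    length = len(grid[0][0])
    total = 0
    for p in range(length):
        for r in range(rows):
            for c in range(cols):
                if c + 1 < cols and grid[r][c][p] != grid[r][c + 1][p]:
                    total += 1
                if r + 1 < rows and grid[r][c][p] != grid[r + 1][c][p]:
                    total += 1
    return total

grid = [["AC", "AG"],
        ["TC", "TG"]]
print(border_length(grid))  # 4 for this placement
```

Rearranging probes so that neighbors share characters lowers this count, which is the intuition behind the threading idea mentioned above.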

  1. Development of Finnish Elementary Pupils' Problem-Solving Skills in Mathematics

    ERIC Educational Resources Information Center

    Laine, Anu; Näveri, Liisa; Ahtee, Maija; Pehkonen, Erkki

    2014-01-01

    The purpose of this study is to determine how Finnish pupils' problem-solving skills develop from the 3rd to 5th grade. As research data, we use one non-standard problem from pre- and post-test material from a three-year follow-up study, in the area of Helsinki, Finland. The problems in both tests consisted of four questions related to each other.…

  2. Computational strategy for the solution of large strain nonlinear problems using the Wilkins explicit finite-difference approach

    NASA Technical Reports Server (NTRS)

    Hofmann, R.

    1980-01-01

    The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.

  3. The importance of production standard operating procedure in a family business company

    NASA Astrophysics Data System (ADS)

    Hongdiyanto, C.

    2017-12-01

The plastics industry is a growing sector; therefore UD X, which engages in this business, has great potential to grow as well. The problem faced by this family business company is that no standard operating procedure is used, which leads to problems in the quality and quantity produced. This research aims to create a production standard operating procedure for UD X. Semi-structured interviews were used to gather information from respondents to help create the SOPs. Four SOPs were created: a classifying SOP, a sorting SOP, a milling SOP and a packing SOP. Having SOPs will improve the effectiveness of production because employees will already know how to work in each stage of the production process.

  4. The Posing of Arithmetic Problems by Mathematically Talented Students

    ERIC Educational Resources Information Center

    Espinoza González, Johan; Lupiáñez Gómez, José Luis; Segovia Alex, Isidoro

    2016-01-01

    Introduction: This paper analyzes the arithmetic problems posed by a group of mathematically talented students when given two problem-posing tasks, and compares these students' responses to those given by a standard group of public school students to the same tasks. Our analysis focuses on characterizing and identifying the differences between the…

  5. Following the Template: Transferring Modeling Skills to Nonstandard Problems

    ERIC Educational Resources Information Center

    Tyumeneva, Yu. A.; Goncharova, M. V.

    2017-01-01

    This study seeks to analyze how students apply a mathematical modeling skill that was previously learned by solving standard word problems to the solution of word problems with nonstandard contexts. During the course of an experiment involving 106 freshmen, we assessed how well they were able to transfer the mathematical modeling skill that is…

  6. An Introduction to Multilinear Formula Score Theory. Measurement Series 84-4.

    ERIC Educational Resources Information Center

    Levine, Michael V.

    Formula score theory (FST) associates each multiple choice test with a linear operator and expresses all of the real functions of item response theory as linear combinations of the operator's eigenfunctions. Hard measurement problems can then often be reformulated as easier, standard mathematical problems. For example, the problem of estimating…

  7. Pain as a Predictor of Sleep Problems in Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Tudor, Megan E.; Walsh, Caitlin E.; Mulder, Emile C.; Lerner, Matthew D.

    2015-01-01

    Evidence suggests that pain interferes with sleep in youth with developmental disabilities. This study examined the relationship between pain and sleep problems in a sample of youth with parent-reported autism spectrum disorder (N = 62). Mothers reported on standardized measures of pain and sleep problems. Youth demonstrated atypically high levels…

  8. Modelling Problem-Solving Situations into Number Theory Tasks: The Route towards Generalisation

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Iatridou, Maria

    2010-01-01

    This paper examines the way two 10th graders cope with a non-standard generalisation problem that involves elementary concepts of number theory (more specifically linear Diophantine equations) in the geometrical context of a rectangle's area. Emphasis is given on how the students' past experience of problem solving (expressed through interplay…

  9. Best Known Problem Solving Strategies in "High-Stakes" Assessments

    ERIC Educational Resources Information Center

    Hong, Dae S.

    2011-01-01

    In its mathematics standards, National Council of Teachers of Mathematics (NCTM) states that problem solving is an integral part of all mathematics learning and exposure to problem solving strategies should be embedded across the curriculum. Furthermore, by high school, students should be able to use, decide and invent a wide range of strategies.…

  10. The Impact of Tutoring on Early Reading Achievement for Children with and without Attention Problems

    ERIC Educational Resources Information Center

    Rabiner, David L.; Malone, Patrick S.

    2004-01-01

    This study examined whether the benefits of reading tutoring in first grade were moderated by children's level of attention problems. Participants were 581 children from the intervention and control samples of Fast Track, a longitudinal multisite investigation of the development and prevention of conduct problems. Standardized reading achievement…

  11. A Statewide Case Management, Surveillance, and Outcome Evaluation System for Children with Special Health Care Needs

    PubMed Central

    Monsen, Karen A.; Elsbernd, Scott A.; Barnhart, Linda; Stock, Jacquie; Prock, Carla E.; Looman, Wendy S.; Nardella, Maria

    2013-01-01

Objectives. To evaluate the feasibility of implementing a statewide children with special health care needs (CSHCN) program evaluation, case management, and surveillance system using a standardized instrument and protocol that operationalized the United States Health and Human Services CSHCN National Performance Measures. Methods. Public health nurses in local public health agencies in Washington State jointly developed and implemented the standardized system. The instrument was the Omaha System. Descriptive statistics were used for the analysis of standardized data. Results. From the sample of CSHCN visit reports (n = 127), 314 problems and 853 interventions were documented. The most common problem identified was growth and development, followed by health care supervision, communication with community resources, caretaking/parenting, income, neglect, and abuse. The most common intervention category was surveillance (60%), followed by case management (24%) and teaching, guidance, and counseling (16%). On average, there were 2.7 interventions per problem and 6.7 interventions per visit. Conclusions. This study demonstrates the feasibility of an approach for a statewide CSHCN program evaluation, case management, and surveillance system. Knowledge, behavior, and status ratings suggest that there are critical unmet needs in the Washington State CSHCN population for six major problems. PMID:23533804
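The reported averages follow directly from the counts given in the abstract (853 interventions, 314 problems, 127 visit reports):

```python
# Reproducing the reported per-problem and per-visit intervention averages.
interventions, problems, visits = 853, 314, 127
per_problem = round(interventions / problems, 1)
per_visit = round(interventions / visits, 1)
print(per_problem, per_visit)  # 2.7 6.7
```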

  12. Usability evaluation of Laboratory and Radiology Information Systems integrated into a hospital information system.

    PubMed

    Nabovati, Ehsan; Vakili-Arki, Hasan; Eslami, Saeid; Khajouei, Reza

    2014-04-01

This study was conducted to evaluate the usability of widely used laboratory and radiology information systems. Three usability experts independently evaluated the user interfaces of the Laboratory and Radiology Information Systems using the heuristic evaluation method. They applied Nielsen's heuristics to identify and classify usability problems and Nielsen's severity ratings to judge their severity. Overall, 116 unique heuristic violations were identified as usability problems. In terms of severity, 67% of problems were rated as major or catastrophic. Among the 10 heuristics, "consistency and standards" was violated most frequently. Moreover, the mean severity of problems concerning the "error prevention" and "help and documentation" heuristics was higher than that of the others. Despite the widespread use of specific healthcare information systems, they suffer from usability problems. Improving the usability of systems by following existing design standards and principles from the early phases of the system development life cycle is recommended. In particular, it is recommended that designers build systems that inhibit the initiation of erroneous actions and provide sufficient guidance to users.

  13. A Formidable Foe is Sabotaging Your Results: What You Should Know about Biofilms and Wound Healing

    PubMed Central

    Barker, Jenny C; Khansa, Ibrahim; Gordillo, Gayle M

    2017-01-01

    Learning Objectives After reading this article, the participant should be able to: 1. Describe biofilm pathogenesis as it relates to problem wounds, 2. Understand the pre-clinical and clinical evidence implicating biofilm in problem wounds, 3. Explain the diagnostic and treatment challenges that biofilms create for problem wounds, 4. Demonstrate a basic understanding of emerging strategies aimed at counteracting these processes. Summary Biofilm represents a protected mode of growth for bacteria, allowing them to evade standard diagnostic techniques and avoid eradication by standard therapies. Though only recently discovered, biofilm has existed for millennia and complicates nearly every aspect of medicine. Biofilm impacts wound healing by allowing bacteria to evade immune responses, prolonging inflammation and disabling skin barrier function. It is important to understand why problem wounds persist despite state-of-the-art treatment, why they are difficult to accurately diagnose, and why they recur. The aim of this article is to focus on current gaps in knowledge related to problem wounds, specifically, biofilm infection. PMID:28445380

  14. Extensions of the standard model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramond, P.

    1983-01-01

In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions; and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry, can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references.

  15. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
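The standard-error computation that SE-optimal design targets can be sketched for a simple model: with sensitivities S and noise variance sigma^2, the Fisher information matrix is F = S^T S / sigma^2 and the asymptotic standard errors are sqrt(diag(F^-1)). The model, parameter values, and sampling times below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Assumed toy model y(t; a, b) = a * exp(-b t), observed with noise
# standard deviation sigma at sampling times t.
a, b, sigma = 2.0, 0.5, 0.1
t = np.linspace(0.0, 8.0, 25)                  # one candidate sampling mesh

# Sensitivity matrix: columns are dy/da and dy/db evaluated at t.
S = np.column_stack([np.exp(-b * t),
                     -a * t * np.exp(-b * t)])

F = S.T @ S / sigma**2                         # Fisher information matrix
se = np.sqrt(np.diag(np.linalg.inv(F)))        # asymptotic standard errors
print(se)  # standard errors for (a, b); changing t changes these values
```

An SE-optimal design would choose the sampling times t to minimize a function of these standard errors, whereas D- and E-optimal designs act on the determinant and smallest eigenvalue of F, respectively.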

  16. Outreach pharmacy service in old age homes: a Hong Kong experience.

    PubMed

    Lau, Wai-Man; Chan, Kit; Yung, Tsz-Ho; Lee, Anna See-Wing

    2003-06-01

To explore drug-related problems in old age homes in Hong Kong through an outreach pharmacy service. A standard form was used by outreach pharmacists to identify drug-related problems at old age homes. Homes were selected through random sampling, voluntary participation or adverse selection. Initial observation and assessment were performed in the first and second weeks. Appropriate advice and recommendations were given upon assessment and supplemented by a written report. Educational talks were provided to staff of the homes in addition to other drug information materials. At weeks 7 to 9, evaluations were carried out. Eighty-five homes were assessed and identified to have problems in their drug management systems. These problems could generally be classified into physical storage (8.8%), quality of storage (19.2%), drug administration system (13.3%), documentation (16.4%), and drug knowledge of home staff (42.2%). Quality of drug storage was the most common problem found, followed by documentation and drug knowledge (73%, 50% and 44% of assessed points with problems, respectively). Apart from staff's lack of drug knowledge and unawareness of potential risks, the failure to meet minimal professional standards may be fundamentally related to a lack of professional input and inadequate legislation. Most homes demonstrated significant improvements after simple interventions, shifting from a majority of homes with more than 10 problems to a majority with fewer than 5. Diverse problems in drug management are common in old age homes, which warrants attention and professional input. Simple interventions and education by pharmacists are shown to be effective in improving the quality of drug management and hence the care of residents. While future financing of old age home services can be reviewed within the social context to provide incentives for improvement, reviewing regulatory policy, with enforcement, may be more fundamental and effective in upholding the service standard.

  17. Labor force participation and the influence of having back problems on income poverty in Australia.

    PubMed

    Schofield, Deborah J; Callander, Emily J; Shrestha, Rupendra N; Percival, Richard; Kelly, Simon J; Passey, Megan E

    2012-06-01

    Cross-sectional study of 45- to 64-year-old Australians. To assess the relationship between chronic back problems and being in income poverty among the older working-aged population. Older workers who leave the labor force due to chronic back problems have fragile economic situations and as such are likely to have poorer living standards. Poverty is one way of comparing the living standards of different individuals within society. The 2003 Survey of Disability, Ageing and Carers data were used, along with the 50% of the median equivalized income-unit income poverty line to identify those in poverty. Logistic regression models were used to look at the relationship between chronic back problems, labor force participation, and poverty. Regardless of labor force participation status (employed full-time, part-time, or not in the labor force at all), those with chronic back problems were significantly more likely to be in poverty. Those not in the labor force due to chronic back problems were significantly more likely to be in poverty than those in the labor force full-time with no chronic health condition (Odds ratio [OR]: 0.07, 95% CI: 0.07-0.07, P < 0.0001). Further, those employed part-time with no chronic health condition were 48% less likely to be in poverty (OR: 0.52, 95% CI: 0.51-0.53, P < 0.0001) than those also employed part-time but with chronic back problems. It was found that among those with back problems, those out of the labor force were significantly more likely to be in poverty than those employed part-time or full-time (OR: 0.44, 95% CI: 0.43-0.44, P < 0.0001; OR: 0.10, 95% CI: 0.10-0.10, P < 0.0001, respectively). This highlights the need to prevent and effectively treat chronic back problems, as these conditions are associated with reduced living standards.
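The abstract reads odds ratios as percentage differences in likelihood (e.g. OR = 0.52 reported as "48% less likely"). A sketch of that conversion, noting that an odds ratio is not strictly a relative risk, so this reading is an approximation:

```python
# Converting an odds ratio below 1 into the "X% less likely" phrasing
# used in the abstract (an approximation; OR is not a relative risk).
def pct_less_likely(odds_ratio):
    return round((1 - odds_ratio) * 100)

print(pct_less_likely(0.52))  # 48, as reported for part-time workers
print(pct_less_likely(0.07))  # 93
```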

  18. National Education Standards: Getting beneath the Surface. Policy Information Perspective

    ERIC Educational Resources Information Center

    Barton, Paul E.

    2009-01-01

    This report discusses issues involved in the debate over whether the United States should have national education standards, what must be considered in creating such standards, what problems must be addressed, and what trade-offs might be required among conflicting objectives. The first section provides a short summary of developments in education…

  19. Minority Language Standardisation and the Role of Users

    ERIC Educational Resources Information Center

    Lane, Pia

    2015-01-01

    Developing a standard for a minority language is not a neutral process; this has consequences for the status of the language and how the language users relate to the new standard. A potential inherent problem with standardisation is whether the language users themselves will accept and identify with the standard. When standardising minority…

  20. Beyond Standards: The Rest of the Agenda.

    ERIC Educational Resources Information Center

    Sobol, Thomas

    1997-01-01

Argues that new high standards of curriculum content and student performance are important, but they alone are not enough. If traditional aspirations to make students wise and just are to be realized, it is necessary to move beyond standards to support teachers, provide necessary resources, nurture community, handle problems of race effectively,…

  1. The Problems of Educational Standards in the United States and Russia.

    ERIC Educational Resources Information Center

    Bespal'ko, V. P.

    1996-01-01

    Compares and contrasts the need for educational standards in the United States and Russia. Argues that both systems burden their students with an excess of peripheral and inconsequential material in order to satisfy outdated pedagogical objectives. Praises American efforts at creating national standards but questions their applicability to Russia.…

  2. Cost minimizing of cutting process for CNC thermal and water-jet machines

    NASA Astrophysics Data System (ADS)

    Tavaeva, Anastasia; Kurennov, Dmitry

    2015-11-01

This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy of calculating the objective function parameters for the optimization problem is investigated. The paper shows that the working tool path speed is not a constant value: it depends on several parameters described in the paper. Relations for the working tool path speed as a function of the number of NC program frames, the length of a straight cut, and the part configuration are presented. Based on these results, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model that takes into account the additional restrictions of thermal cutting (choice of the piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard cutting techniques, and evaluates their effectiveness. Future research directions are indicated at the end of the paper.
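The role of the correction coefficients can be sketched with a simple cost model: cutting time is segment length divided by an effective speed, where the nominal speed is scaled per segment type. The function, segment types, and coefficient values below are hypothetical illustrations, not the paper's model:

```python
# Hedged sketch: total cutting time with per-segment-type speed corrections.
# segments: list of (length, kind); correction: dict kind -> coefficient
# scaling the nominal speed (e.g. slower on complex contours).
def cutting_time(segments, nominal_speed, correction):
    return sum(length / (nominal_speed * correction[kind])
               for length, kind in segments)

segments = [(100.0, "straight"), (20.0, "contour")]
coeffs = {"straight": 1.0, "contour": 0.5}   # illustrative values
print(cutting_time(segments, 10.0, coeffs))  # 100/10 + 20/5 = 14.0
```

Treating speed as constant would give 12.0 for the same path, which is the kind of objective-function inaccuracy the paper measures.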

  3. What's the Problem? Familiarity, Working Memory, and Transfer in a Problem-Solving Task.

    PubMed

    Kole, James A; Snyder, Hannah R; Brojde, Chandra L; Friend, Angela

    2015-01-01

    The contributions of familiarity and working memory to transfer were examined in the Tower of Hanoi task. Participants completed 3 different versions of the task: a standard 3-disk version, a clothing exchange task that included familiar semantic content, and a tea ceremony task that included unfamiliar semantic content. The constraints on moves were equivalent across tasks, and each could be solved with the same sequence of movements. Working memory demands were manipulated by the provision of a (static or dynamic) visual representation of the problem. Performance was equivalent for the standard Tower of Hanoi and clothing exchange tasks but worse for the tea ceremony task, and it decreased with increasing working memory demands. Furthermore, the standard Tower of Hanoi task and clothing exchange tasks independently, additively, and equivalently transferred to subsequent tasks, whereas the tea ceremony task did not. The results suggest that both familiarity and working memory demands determine overall level of performance, whereas familiarity influences transfer.
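The standard Tower of Hanoi task used in the study has a well-known recursive solution; since the isomorphic cover stories impose equivalent move constraints, the same optimal move sequence solves all three versions:

```python
# Optimal Tower of Hanoi solution: move n disks from src to dst via aux.
def hanoi(n, src, aux, dst, moves):
    if n == 0:
        return moves
    hanoi(n - 1, src, dst, aux, moves)  # clear the n-1 smaller disks to aux
    moves.append((src, dst))            # move the largest remaining disk
    hanoi(n - 1, aux, src, dst, moves)  # re-stack the smaller disks on dst
    return moves

moves = hanoi(3, "A", "B", "C", [])
print(len(moves))  # 7 moves for 3 disks, i.e. 2**3 - 1
print(moves[0])    # ('A', 'C')
```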

  4. The inverse problem of estimating the gravitational time dilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gusev, A. V., E-mail: avg@sai.msu.ru; Litvinov, D. A.; Rudenko, V. N.

    2016-11-15

Precise testing of the gravitational time dilation effect suggests comparing the clocks at points with different gravitational potentials. Such a configuration arises when radio frequency standards are installed at orbital and ground stations. The ground-based standard is accessible directly, while the spaceborne one is accessible only via the electromagnetic signal exchange. Reconstructing the current frequency of the spaceborne standard is an ill-posed inverse problem whose solution depends significantly on the characteristics of the stochastic electromagnetic background. The solution for Gaussian noise is known, but the nature of the standards themselves is associated with nonstationary fluctuations of a wide class of distributions. A solution is proposed for a background of flicker fluctuations with a spectrum (1/f)^γ, where 1 < γ < 3, and stationary increments. The results include formulas for the error in reconstructing the frequency of the spaceborne standard and numerical estimates for the accuracy of measuring the relativistic redshift effect.
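A noise record with the (1/f)^γ spectrum described above can be synthesized by shaping white Gaussian noise in the frequency domain; this illustrates the class of background the paper addresses, not its estimator, and the record length and γ value are arbitrary choices:

```python
import numpy as np

# Illustrative flicker-type noise with power spectral density ~ f^(-gamma),
# 1 < gamma < 3, via frequency-domain shaping of white Gaussian noise.
rng = np.random.default_rng(1)
n, gamma = 4096, 2.0
white = rng.standard_normal(n)

freqs = np.fft.rfftfreq(n, d=1.0)
shaping = np.zeros_like(freqs)
shaping[1:] = freqs[1:] ** (-gamma / 2)   # amplitude ~ f^(-gamma/2) => PSD ~ f^(-gamma)

flicker = np.fft.irfft(np.fft.rfft(white) * shaping, n)
print(flicker.shape)
```

Because the DC bin is zeroed rather than diverging, the synthesized record is finite while its low-frequency content still dominates, mimicking the nonstationary flicker background.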

  5. The Health Care Financing Administration's new examination documentation criteria: minimum auditing standards for the neurologic examination to be used by Medicare and other payors. Report from the American Academy of Neurology Medical Economics and Management Subcommittee.

    PubMed

    Nuwer, M R; Sigsbee, B

    1998-02-01

    Medicare recently announced the adoption of minimum documentation criteria for the neurologic examination. These criteria are added to existing standards for the history and medical decision-making. These criteria will be used in compliance audits by Medicare and other payors. Given the current federal initiative to eliminate fraud in the Medicare program, all neurologists need to comply with these standards. These criteria are for documentation only. Neurologic standards of care require a more complex and diverse examination pertinent to the problem(s) under consideration. Further guidance as to the content of a neurologic evaluation is outlined in the article "Practice guidelines: Neurologic evaluation" (Neurology 1990; 40: 871). The level of history and examination required for specific services is defined in the American Medical Association current procedural terminology book. Documentation standards for examination of children are not yet defined.

  6. Standardized Tests as Outcome Measures for Evaluating Instructional Interventions in Mathematics and Science

    NASA Astrophysics Data System (ADS)

    Sussman, Joshua Michael

This three-paper dissertation explores problems with the use of standardized tests as outcome measures for the evaluation of instructional interventions in mathematics and science. Investigators commonly use students' scores on standardized tests to evaluate the impact of instructional programs designed to improve student achievement. However, evidence suggests that the standardized tests may not measure, or may not measure well, the student learning caused by the interventions. This problem is a special case of a basic problem in applied measurement related to understanding whether a particular test provides accurate and useful information about the impact of an educational intervention. The three papers explore different aspects of the issue and highlight the potential benefits of (a) using particular research methods and (b) implementing changes to educational policy that would strengthen efforts to reform instructional intervention in mathematics and science. The first paper investigates measurement problems related to the use of standardized tests in applied educational research. Analysis of the research projects funded by the Institute of Education Sciences (IES) Mathematics and Science Education Program permitted me to address three main research questions. One, how often are standardized tests used to evaluate new educational interventions? Two, do the tests appear to measure the same thing that the intervention teaches? Three, do investigators establish validity evidence for the specific uses of the test? The research documents potential and actual problems related to the use of standardized tests in leading applied research, and suggests changes to policy that would address measurement issues and improve the rigor of applied educational research. The second paper explores the practical consequences of misalignment between an outcome measure and an educational intervention in the context of summative evaluation. 
Simulated evaluation data and a psychometric model of alignment grounded in item response modeling generate the results that address the following research question: how do differences between what a test measures and what an intervention teaches influence the results of an evaluation? The simulation derives a functional relationship between alignment, defined as the match between the test and the intervention, and treatment sensitivity, defined as the statistical power for detecting the impact of an intervention. The paper presents a new model of the effect of misalignment on the results of an evaluation and recommendations for outcome measure selection. The third paper documents the educational effectiveness of the Learning Mathematics through Representations (LMR) lesson sequence for students classified as English Learners (ELs). LMR is a research-based curricular unit designed to support upper elementary students' understandings of integers and fractions, areas considered foundational for the development of higher mathematics. The experimental evaluation contains a multilevel analysis of achievement data from two assessments: a standardized test and a researcher-developed assessment. The study coordinates the two sources of research data with a theoretical mechanism of action in order to rigorously document the effectiveness and educational equity of LMR for ELs using multiple sources of information.

  7. Development and validation of a new method for the registration of overuse injuries in sports injury epidemiology: the Oslo Sports Trauma Research Centre (OSTRC) overuse injury questionnaire.

    PubMed

    Clarsen, Benjamin; Myklebust, Grethe; Bahr, Roald

    2013-05-01

    Current methods for injury registration in sports injury epidemiology studies may substantially underestimate the true burden of overuse injuries due to a reliance on time-loss injury definitions. To develop and validate a new method for the registration of overuse injuries in sports. A new method, including a new overuse injury questionnaire, was developed and validated in a 13-week prospective study of injuries among 313 athletes from five different sports, cross-country skiing, floorball, handball, road cycling and volleyball. All athletes completed a questionnaire by email each week to register problems in the knee, lower back and shoulder. Standard injury registration methods were also used to record all time-loss injuries that occurred during the study period. The new method recorded 419 overuse problems in the knee, lower back and shoulder during the 3-month-study period. Of these, 142 were classified as substantial overuse problems, defined as those leading to moderate or severe reductions in sports performance or participation, or time loss. Each week, an average of 39% of athletes reported having overuse problems and 13% reported having substantial problems. In contrast, standard methods of injury registration registered only 40 overuse injuries located in the same anatomical areas, the majority of which were of minimal or mild severity. Standard injury surveillance methods only capture a small percentage of the overuse problems affecting the athletes, largely because few problems led to time loss from training or competition. The new method captured a more complete and nuanced picture of the burden of overuse injuries in this cohort.

  8. An investigation of dynamic-analysis methods for variable-geometry structures

    NASA Technical Reports Server (NTRS)

    Austin, F.

    1980-01-01

    Selected space structure configurations were reviewed in order to define dynamic analysis problems associated with variable geometry. The dynamics of a beam being constructed from a flexible base and the relocation of the completed beam by rotating the remote manipulator system about the shoulder joint were selected. Equations of motion were formulated in physical coordinates for both of these problems, and FORTRAN programs were developed to generate solutions by numerically integrating the equations. These solutions served as a standard of comparison to gauge the accuracy of approximate solution techniques that were developed and studied. Good control was achieved in both problems. Unstable control system coupling with the system flexibility did not occur. An approximate method was developed for each problem to enable the analyst to investigate variable geometry effects during a short time span using standard fixed geometry programs such as NASTRAN. The average angle and average length techniques are discussed.

  9. Impact of Early Intervention on Psychopathology, Crime, and Well-Being at Age 25

    PubMed Central

    2015-01-01

    Objective This randomized controlled trial tested the efficacy of early intervention to prevent adult psychopathology and improve well-being in early-starting conduct-problem children. Method Kindergarteners (N=9,594) in three cohorts (1991–1993) at 55 schools in four communities were screened for conduct problems, yielding 979 early starters. A total of 891 (91%) consented (51% African American, 47% European American; 69% boys). Children were randomly assigned by school cluster to a 10-year intervention or control. The intervention goal was to develop social competencies in children that would carry them throughout life, through social skills training, parent behavior-management training with home visiting, peer coaching, reading tutoring, and classroom social-emotional curricula. Manualization and supervision ensured program fidelity. Ninety-eight percent participated during grade 1, and 80% continued through grade 10. At age 25, arrest records were reviewed (N=817, 92%), and condition-blinded adults psychiatrically interviewed participants (N=702; 81% of living participants) and a peer (N=535) knowledgeable about the participant. Results Intent-to-treat logistic regression analyses indicated that 69% of participants in the control arm displayed at least one externalizing, internalizing, or substance abuse psychiatric problem (based on self- or peer interview) at age 25, in contrast with 59% of those assigned to intervention (odds ratio=0.59, CI=0.43–0.81; number needed to treat=8). This pattern also held for self-interviews, peer interviews, scores using an “and” rule for self- and peer reports, and separate tests for externalizing problems, internalizing problems, and substance abuse problems, as well as for each of three cohorts, four sites, male participants, female participants, African Americans, European Americans, moderate-risk, and high-risk subgroups. 
Intervention participants also received lower severity-weighted violent (standardized estimate=-0.37) and drug (standardized estimate=-0.43) crime conviction scores, lower risky sexual behavior scores (standardized estimate=-0.24), and higher well-being scores (standardized estimate=0.19). Conclusions This study provides evidence for the efficacy of early intervention in preventing adult psychopathology among high-risk early-starting conduct-problem children. PMID:25219348

  10. Problem Solving in Everyday Office Work--A Diary Study on Differences between Experts and Novices

    ERIC Educational Resources Information Center

    Rausch, Andreas; Schley, Thomas; Warwas, Julia

    2015-01-01

    Contemporary office work is becoming increasingly challenging as many routine tasks are automated or outsourced. The remaining problem solving activities may also offer potential for lifelong learning in the workplace. In this study, we analyzed problem solving in an office work setting using an Internet-based, semi-standardized diary to collect…

  11. Parental Divorce, Marital Conflict and Children's Behavior Problems: A Comparison of Adopted and Biological Children

    ERIC Educational Resources Information Center

    Amato, Paul R.; Cheadle, Jacob E.

    2008-01-01

    We used adopted and biological children from Waves 1 and 2 of the National Survey of Families and Households to study the links between parents' marital conflict, divorce and children's behavior problems. The standard family environment model assumes that marital conflict and divorce increase the risk of children's behavior problems. The passive…

  12. A Methodology for Validation of High Resolution Combat Models

    DTIC Science & Technology

    1988-06-01

    Theoretical issues addressed include "The Teleological Problem"--how a model by its nature formulates an explicit cause-and-effect relationship that excludes other…--the epistemological problem, and the uncertainty principle. Also discussed is the role of "experts" in establishing the standard for reality; generalization from personal experience is often hampered by the parochial aspects of…

  13. The Effects of Differentiating Instruction by Learning Styles on Problem Solving in Cooperative Groups

    ERIC Educational Resources Information Center

    Westbrook, Amy F.

    2011-01-01

    It can be difficult to find adequate strategies when teaching problem solving in a standard based mathematics classroom. The purpose of this study was to improve students' problem solving skills and attitudes through differentiated instruction when working on lengthy performance tasks in cooperative groups. This action research studied for 15 days…

  14. Understanding Problem-Solving Errors by Students with Learning Disabilities in Standards-Based and Traditional Curricula

    ERIC Educational Resources Information Center

    Bouck, Emily C.; Bouck, Mary K.; Joshi, Gauri S.; Johnson, Linley

    2016-01-01

    Students with learning disabilities struggle with word problems in mathematics classes. Understanding the type of errors students make when working through such mathematical problems can further describe student performance and highlight student difficulties. Through the use of error codes, researchers analyzed the type of errors made by 14 sixth…

  15. Scale problems in reporting landscape pattern at the regional scale

    Treesearch

    R.V. O' Neill; C.T. Hunsaker; S.P. Timmins; B.L. Jackson; K.B. Jones; Kurt H. Riitters; James D. Wickham

    1996-01-01

    Remotely sensed data for the Southeastern United States (Standard Federal Region 4) are used to examine the scale problems involved in reporting landscape pattern for a large, heterogeneous region. Frequency distributions of landscape indices illustrate problems associated with the grain or resolution of the data. Grain should be 2 to 5 times smaller than the...

  16. Additive Relations Word Problems in the South African Curriculum and Assessment Policy Standard at Foundation Phase

    ERIC Educational Resources Information Center

    Roberts, Nicky

    2016-01-01

    Drawing on a literature review of classifications developed by each of Riley, Verschaffel and Carpenter and their respective research groups, a refined typology of additive relations word problems is proposed and then used as analytical tool to classify the additive relations word problems in South African Curriculum and Assessment Policy Standard…

  17. Progressing From Initially Ambiguous Functional Analyses: Three Case Examples

    PubMed Central

    Tiger, Jeffrey H.; Fisher, Wayne W.; Toussaint, Karen A.; Kodak, Tiffany

    2009-01-01

    Most often functional analyses are initiated using a standard set of test conditions, similar to those described by Iwata, Dorsey, Slifer, Bauman, and Richman (1982/1994). These test conditions involve the careful manipulation of motivating operations, discriminative stimuli, and reinforcement contingencies to determine the events related to the occurrence and maintenance of problem behavior. Some individuals display problem behavior that is occasioned and reinforced by idiosyncratic or otherwise unique combinations of environmental antecedents and consequences of behavior, which are unlikely to be detected using these standard assessment conditions. For these individuals, modifications to the standard test conditions or the inclusion of novel test conditions may result in clearer assessment outcomes. The current study provides three case examples of individuals whose functional analyses were initially undifferentiated; however, modifications to the standard conditions resulted in the identification of behavioral functions and the implementation of effective function-based treatments. PMID:19233611

  18. Self-aligned quadruple patterning-compliant placement

    NASA Astrophysics Data System (ADS)

    Nakajima, Fumiharu; Kodama, Chikaaki; Nakayama, Koichi; Nojima, Shigeki; Kotani, Toshiya

    2015-03-01

    Self-Aligned Quadruple Patterning (SAQP) will be one of the leading candidates for the sub-14nm node and beyond. However, compared with triple patterning, producing a feasible standard cell placement poses the following problems: (1) when coloring conflicts occur between two adjoining cells, they may not be easily resolved, since an SAQP layout has stronger coloring constraints; (2) an SAQP layout cannot use stitching to resolve coloring conflicts. In this paper, we present a framework for SAQP-aware standard cell placement that addresses these problems. When a standard cell is placed, the proposed method tries to resolve coloring conflicts between two cells by exchanging two of the three colors. If conflicts remain between adjoining cells, dummy space is inserted to satisfy the coloring constraints of SAQP. We show examples confirming the effectiveness of the proposed framework. To the best of our knowledge, this is the first framework for SAQP-aware standard cell placement.
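
The color-exchange step described in the abstract can be sketched as follows. Each placed cell carries a 3-coloring of its boundary features, and a conflict at the shared boundary may be resolvable by swapping two of the three colors in the incoming cell; if no swap works, dummy space is inserted. All names and the data representation here are illustrative assumptions, not the authors' implementation:

```python
from itertools import combinations

COLORS = (0, 1, 2)

def color_exchanges():
    """Yield the identity mapping plus the three two-color swaps
    of a 3-coloring (the 'exchange two of three colors' moves)."""
    yield {c: c for c in COLORS}
    for a, b in combinations(COLORS, 2):
        mapping = {c: c for c in COLORS}
        mapping[a], mapping[b] = b, a
        yield mapping

def place_cell(left_boundary, right_boundary):
    """Try to recolor the incoming (right) cell so no feature on the
    shared boundary has the same color on both sides. Returns the
    recolored boundary, or None when no exchange works and dummy
    space must be inserted between the cells."""
    for mapping in color_exchanges():
        recolored = [mapping[c] for c in right_boundary]
        if all(l != r for l, r in zip(left_boundary, recolored)):
            return recolored
    return None

print(place_cell([0, 1], [0, 1]))  # [1, 0] after swapping colors 0 and 1
```

Note that a boundary like `[0, 1, 2]` facing `[0, 1, 2]` cannot be fixed by any two-color swap, which is exactly the case where the framework falls back to inserting dummy space.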

  19. Using standardized patients versus video cases for representing clinical problems in problem-based learning.

    PubMed

    Yoon, Bo Young; Choi, Ikseon; Choi, Seokjin; Kim, Tae-Hee; Roh, Hyerin; Rhee, Byoung Doo; Lee, Jong-Tae

    2016-06-01

    The quality of problem representation is critical for developing students' problem-solving abilities in problem-based learning (PBL). This study investigates preclinical students' experience with standardized patients (SPs) as a problem representation method compared to using video cases in PBL. A cohort of 99 second-year preclinical students from Inje University College of Medicine (IUCM) responded to a Likert scale questionnaire on their learning experiences after they had experienced both video cases and SPs in PBL. The questionnaire consisted of 14 items with eight subcategories: problem identification, hypothesis generation, motivation, collaborative learning, reflective thinking, authenticity, patient-doctor communication, and attitude toward patients. The results reveal that using SPs led to the preclinical students having significantly positive experiences in boosting patient-doctor communication skills; the perceived authenticity of their clinical situations; development of proper attitudes toward patients; and motivation, reflective thinking, and collaborative learning when compared to using video cases. The SPs also provided more challenges than the video cases during problem identification and hypotheses generation. SPs are more effective than video cases in delivering higher levels of authenticity in clinical problems for PBL. The interaction with SPs engages preclinical students in deeper thinking and discussion; growth of communication skills; development of proper attitudes toward patients; and motivation. Considering the higher cost of SPs compared with video cases, SPs could be used most advantageously during the preclinical period in the IUCM curriculum.

  20. Stencils and problem partitionings: Their influence on the performance of multiple processor systems

    NASA Technical Reports Server (NTRS)

    Reed, D. A.; Adams, L. M.; Patrick, M. L.

    1986-01-01

    Given a discretization stencil, partitioning the problem domain is an important first step for the efficient solution of partial differential equations on multiple processor systems. Partitions are derived that minimize interprocessor communication when the number of processors is known a priori and each domain partition is assigned to a different processor. This partitioning technique uses the stencil structure to select appropriate partition shapes. For square problem domains, it is shown that non-standard partitions (e.g., hexagons) are frequently preferable to the standard square partitions for a variety of commonly used stencils. This investigation is concluded with a formalization of the relationship between partition shape, stencil structure, and architecture, allowing selection of optimal partitions for a variety of parallel systems.
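
The intuition behind preferring hexagonal partitions can be seen with a back-of-the-envelope calculation: for a 5-point stencil, interprocessor communication scales roughly with partition perimeter, so we compare the perimeter of a square and a regular hexagon of equal area. This is an illustrative sketch only; the paper's analysis also accounts for stencil structure and architecture:

```python
import math

def square_perimeter(area):
    """Perimeter of a square partition of the given area."""
    return 4.0 * math.sqrt(area)

def hexagon_perimeter(area):
    """Perimeter of a regular hexagon of the given area:
    area = (3*sqrt(3)/2) * side**2, perimeter = 6 * side."""
    side = math.sqrt(2.0 * area / (3.0 * math.sqrt(3.0)))
    return 6.0 * side

area = 100.0
print(square_perimeter(area))   # 40.0
print(hexagon_perimeter(area))  # ~37.2: less boundary, so less communication
```

For equal area, the hexagon has about 7% less perimeter than the square, which is why non-standard partitions can reduce interprocessor communication for stencils whose communication volume tracks boundary length.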

  1. International aspects of problems associated with the use of psychoactive drugs.

    PubMed

    Chruściel, T L

    1976-01-01

    Problems of terminology, use and consumption, advertising, effectiveness and appropriate information and education on psychoactive drugs are outlined and advantages of international collaboration in attempts to establish standards for controlled clinical trials in psychopharmacology are discussed.

  2. Using General Education Student Data to Calibrate a Mathematics Alternate Assessment Based on Modified Academic Achievement Standards

    ERIC Educational Resources Information Center

    Jung, Eunju

    2012-01-01

    The U.S. Department of Education released regulations governing the development of alternate assessments for students with persistent learning problems who are eligible for Modified Academic Achievement Standards (MAAS) in 2007. To date, state regular assessments or alternate assessments based on Alternate Academic Achievement Standards have not…

  3. Proficiency Standards and Cut-Scores for Language Proficiency Tests.

    ERIC Educational Resources Information Center

    Moy, Raymond H.

    1984-01-01

    Discusses the problems associated with "grading on a curve," the approach often used for standard setting on language proficiency tests. Proposes four main steps presented in the setting of a non-arbitrary cut-score. These steps not only establish a proficiency standard checked by external criteria, but also check to see that the test covers the…

  4. Adapting to Change: Teacher Perceptions of Implementing the Common Core State Standards

    ERIC Educational Resources Information Center

    Burks, Brooke A.; Beziat, Tara L. R.; Danley, Sheree; Davis, Kashara; Lowery, Holly; Lucas, Jessica

    2015-01-01

    The current research study looked at secondary teachers' (grades 6-12) perceptions of their preparedness to implement the Common Core State Standards as well as their feelings about the training they have or have not received related to implementing the standards. The problem: Many conflicting views exist among teachers, parents, and others…

  5. An Examination of the Statistical Problem-Solving Process as a Potential Means for Developing an Understanding of Argumentation

    ERIC Educational Resources Information Center

    Smith Baum, Brittany Deshae

    2017-01-01

    As part of the recent history of the mathematics curriculum, reasoning and argument have been emphasized throughout mathematics curriculum standards. Specifically, as part of the Common Core State Standards for Mathematics, the Standards for Mathematical Practice were presented, which included the expectation that students develop arguments and…

  6. Lapses in Education Policy Formulation Processes in Nigeria: Implications for the Standard of Education

    ERIC Educational Resources Information Center

    Oyedeji, Samson Oyelola

    2015-01-01

    Nigeria's Education Policy is quite laudable yet her investments in education are not too rewarding considering the deteriorating educational standards. The poor performance of the education sector in Nigeria, which is evident in the falling in standard of education and poor quality, has become very worrisome. What is the problem? Is the…

  7. Contracting for Computer Software in Standardized Computer Languages

    PubMed Central

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  8. Contingency management treatment in cocaine using methadone maintained patients with and without legal problems.

    PubMed

    Ginley, Meredith K; Rash, Carla J; Olmstead, Todd A; Petry, Nancy M

    2017-11-01

    Legal difficulties and cocaine use are prevalent in methadone maintenance patients, and they are related to one another, as well as to poor response to methadone treatment. Contingency management (CM) is efficacious for decreasing cocaine use, but the relation of CM treatment to criminal activities has rarely been studied. This study evaluated whether baseline legal problems are related to subsequent substance use and illegal activities for cocaine-using methadone-maintained patients and whether CM differentially improves outcomes depending on baseline legal problems. Using data from four randomized CM trials (N=323), we compared methadone-maintained patients with legal problems at the start of study participation to those without initial legal problems. Overall, the addition of CM to standard methadone care improved substance use outcomes regardless of initial legal problems. Endorsement of legal problems within 30 days of study initiation was associated with a reduced proportion of negative samples submitted during the 12-week treatment period. A significant interaction effect of baseline legal problems and treatment condition was present for subsequent self-reports of illegal activities. Those with baseline legal problems who were assigned to CM had reduced self-reports of reengagement in illegal activity throughout a six-month follow-up compared to their counterparts randomized to standard care. Adding CM to methadone treatment improves substance use outcomes and reduces subsequent illegal activity in cocaine-using methadone patients with legal problems. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Solving the transient water age distribution problem in environmental flow systems

    NASA Astrophysics Data System (ADS)

    Cornaton, F. J.

    2011-12-01

    The temporal evolution of groundwater age and its frequency distributions can display important changes as flow regimes vary due to the natural change in climate and hydrologic conditions and/or to human-induced pressures on the resource to satisfy the water demand. Groundwater age being nowadays frequently used to investigate reservoir properties and recharge conditions, special attention needs to be paid to the way this property is characterized, be it using isotopic methods, multiple tracer techniques, or mathematical modelling. Steady-state age frequency distributions can be modelled using standard numerical techniques, since the general balance equation describing age transport under steady-state flow conditions is exactly equivalent to a standard advection-dispersion equation. The time-dependent problem is, however, described by an extended transport operator that incorporates an additional coordinate for water age. The consequence is that numerical solutions can hardly be achieved, especially for real 3-D applications over large time periods of interest. The absence of any robust method has thus left the quantitative hydrogeology community dodging the issue of transience. Novel algorithms for solving the age distribution problem under time-varying flow regimes are presented and, for some specific configurations, extended to the problem of generalized component exposure time. The solution strategy is based on the combination of the Laplace Transform technique applied to the age (or exposure time) coordinate with standard time-marching schemes. The method is well-suited for groundwater problems with possible density-dependency of fluid flow (e.g. coupled flow and heat/salt concentration problems), but also presents significance to the homogeneous flow (compressible case) problem. 
The approach is validated using 1-D analytical solutions and exercised on some demonstration problems that are relevant to topical issues in groundwater age, including analysis of transfer times in the vadose zone, aquifer-aquitard interactions and the induction of transient age distributions when a well pump is started.
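
The extended transport operator mentioned above can be sketched in one common formulation of age-distribution transport (the notation here is assumed for illustration, not taken from the paper): writing $g(\mathbf{x}, t, a)$ for the density of water of age $a$, with porosity $\theta$, Darcy flux $\mathbf{q}$, and dispersion tensor $\mathbf{D}$, the additional age coordinate appears as an advective term at unit velocity,

```latex
% Transient age-distribution transport (sketch): age advances at unit
% rate, so an extra first-order term appears along the age coordinate a.
\frac{\partial (\theta g)}{\partial t}
  + \frac{\partial (\theta g)}{\partial a}
  + \nabla \cdot \left( \mathbf{q}\, g - \theta \mathbf{D} \nabla g \right) = 0
```

Under steady flow, integrating over the age coordinate recovers the standard advection-dispersion form noted in the abstract. Applying the Laplace transform in $a$ replaces the $\partial/\partial a$ term with multiplication by the transform variable (plus a boundary term), leaving a family of standard transport equations that can be advanced with the usual time-marching schemes, which is the combination the abstract describes.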

  10. Regularizing cosmological singularities by varying physical constants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dąbrowski, Mariusz P.; Marosek, Konrad, E-mail: mpdabfz@wmf.univ.szczecin.pl, E-mail: k.marosek@wmf.univ.szczecin.pl

    2013-02-01

    Varying physical constant cosmologies were claimed to solve standard cosmological problems such as the horizon, the flatness and the Λ-problem. In this paper, we suggest yet another possible application of these theories: solving the singularity problem. By specifying some examples we show that various cosmological singularities may be regularized provided the physical constants evolve in time in an appropriate way.

  11. Solving the "Rural School Problem": New State Aid, Standards, and Supervision of Local Schools, 1900-1933

    ERIC Educational Resources Information Center

    Steffes, Tracy L.

    2008-01-01

    In 1918, Minnesota county superintendent Julius Arp argued that the greatest educational problem facing the American people was the Rural School Problem, saying: "There is no defect more glaring today than the inequality that exists between the educational facilities of the urban and rural communities. Rural education in the United States has…

  12. Progress and Problems in Reforming Public Language Examinations in Europe: Cameos from the Baltic States, Greece, Hungary, Poland, Slovenia, France and Germany

    ERIC Educational Resources Information Center

    Eckes, Thomas; Ellis, Melanie; Kalnberzina, Vita; Pizorn, Karmen; Springer, Claude; Szollas, Krisztina; Tsagari, Constance

    2005-01-01

    Contributions from seven European countries pinpoint major projects, problems, and prospects of reforming public language assessment procedures. Each country has faced unique problems in the reform process, yet there have also been several common themes emerging, such as a focus on multilingualism, communicative skills, standardization, reference…

  13. Incorporation of epidemiological findings into radiation protection standards.

    PubMed

    Goldsmith, J R

    In standard setting there is a tendency to use data from experimental studies in preference to findings from epidemiological studies. Yet the epidemiological studies are usually the first and at times the only source of data on such critical effects as cancer, reproductive failure, and chronic cardiac and cardiovascular disease in exposed humans. A critique of the protection offered by current and proposed standards for ionizing and non-ionizing radiation illustrates some of the problems. Similar problems occur with water and air pollutants and with occupational exposures of many types. The following sorts of problems were noted: (a) Consideration of both thermal and non-thermal effects especially of non-ionizing radiation. (b) Interpretation of non-significant results as equivalent to no effect. (c) Accepting author's interpretation of a study, rather than examining its data independently for evidence of hazard. (d) Discounting data on unanticipated effects because of poor fit to preconceptions. (e) Dependence on threshold assumptions and demonstrations of dose-response relationships. (f) Choice of insensitive epidemiological indicators and procedures. (g) Consideration of each study separately, rather than giving weight to the conjunction of evidence from all available studies. These problems may be minimized by greater involvement of epidemiologists and their professional organizations in decisions about health protection.

  14. Impact of lightning strikes on hospital functions.

    PubMed

    Mortelmans, Luc J M; Van Springel, Gert L J; Van Boxstael, Sam; Herrijgers, Jan; Hoflacks, Stefaan

    2009-01-01

Two regional hospitals were struck by lightning during a one-month period. The first hospital, which had 236 beds, suffered a direct strike to the building. This resulted in a direct spread of the power peak and temporary failure of the standard power supply. The principal problems, after restoring standard power supply, were with the fire alarm system and peripheral network connections in the digital radiology systems. No direct impact on the hardware could be found. Restarting the servers resolved all problems. The second hospital, which had 436 beds, had a lightning strike on the premises and mainly experienced problems due to induction. All affected installations had a cable connection from outside in one way or another. The power supplies were never endangered. The main problem was the failure of different communication systems (telephone, radio, intercom, fire alarm system). The electronic entrance control also failed. During the days after the lightning strike, multiple software problems became apparent, as well as failures of the network connections controlling the technical support systems. There are very few ways to prepare for induction problems. The use of fiber-optic networks can limit damage. To the knowledge of the authors, these are the first cases of lightning striking hospitals reported in the medical literature.

  15. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE PAGES

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.; ...

    2018-03-26

We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.
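
    The kind of evolution such benchmark problems exercise can be illustrated with a minimal sketch. The following is a generic 1D Allen-Cahn phase-field relaxation, not the CHiMaD/NIST benchmark specification itself; the grid size, mobility, and gradient coefficient are arbitrary illustrative choices.

```python
import numpy as np

# Minimal 1D Allen-Cahn phase-field sketch (illustrative only; not the
# CHiMaD/NIST benchmark specification). phi = -1 and phi = +1 mark the two
# phases; the double-well term drives phi toward +/-1 while the gradient
# (Laplacian) term smooths the interface between them.
def allen_cahn_step(phi, dx, dt, eps=1.0, M=1.0):
    # Periodic finite-difference Laplacian
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    dF = phi**3 - phi  # derivative of the double-well (phi^2 - 1)^2 / 4
    return phi + dt * M * (eps**2 * lap - dF)

n, dx, dt = 128, 0.5, 0.05
L = n * dx
x = dx * np.arange(n)
# Two phases separated at mid-domain, plus a small perturbation
phi = np.where(x < L / 2, -1.0, 1.0) + 0.1 * np.sin(2 * np.pi * x / L)
for _ in range(2000):
    phi = allen_cahn_step(phi, dx, dt)
# phi relaxes into the wells at -1 and +1 with a smooth interface between them
```

    Different time integrators (explicit Euler here, implicit or adaptive schemes in practice) applied to the same initial condition are exactly what such benchmarks compare.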

  16. A problem-solving task specialized for functional neuroimaging: validation of the Scarborough adaptation of the Tower of London (S-TOL) using near-infrared spectroscopy

    PubMed Central

    Ruocco, Anthony C.; Rodrigo, Achala H.; Lam, Jaeger; Di Domenico, Stefano I.; Graves, Bryanna; Ayaz, Hasan

    2014-01-01

    Problem-solving is an executive function subserved by a network of neural structures of which the dorsolateral prefrontal cortex (DLPFC) is central. Whereas several studies have evaluated the role of the DLPFC in problem-solving, few standardized tasks have been developed specifically for use with functional neuroimaging. The current study adapted a measure with established validity for the assessment of problem-solving abilities to design a test more suitable for functional neuroimaging protocols. The Scarborough adaptation of the Tower of London (S-TOL) was administered to 38 healthy adults while hemodynamic oxygenation of the PFC was measured using 16-channel continuous-wave functional near-infrared spectroscopy (fNIRS). Compared to a baseline condition, problems that required two or three steps to achieve a goal configuration were associated with higher activation in the left DLPFC and deactivation in the medial PFC. Individuals scoring higher in trait deliberation showed consistently higher activation in the left DLPFC regardless of task difficulty, whereas individuals lower in this trait displayed less activation when solving simple problems. Based on these results, the S-TOL may serve as a standardized task to evaluate problem-solving abilities in functional neuroimaging studies. PMID:24734017

  17. Eigensensitivity analysis of rotating clamped uniform beams with the asymptotic numerical method

    NASA Astrophysics Data System (ADS)

    Bekhoucha, F.; Rechak, S.; Cadou, J. M.

    2016-12-01

In this paper, free vibrations of a rotating clamped Euler-Bernoulli beam with uniform cross section are studied using a continuation method, namely the asymptotic numerical method. The governing equations of motion are derived using Lagrange's method. The kinetic and strain energy expressions are derived from the Rayleigh-Ritz method using a set of hybrid variables and a linear deflection assumption. The derived equations are transformed into two eigenvalue problems: the first is a linear gyroscopic eigenvalue problem that couples the lagging and stretch motions through gyroscopic terms, while the second is a standard eigenvalue problem corresponding to the flapping motion. These two eigenvalue problems are transformed into two functionals treated by the continuation method, the asymptotic numerical method. A new method is proposed for the solution of the linear gyroscopic system, based on an augmented system that transforms the original problem into a standard form with real symmetric matrices. By using these techniques to resolve the singular problems within the continuation method, evolution curves of the natural frequencies against dimensionless angular velocity are determined. At high angular velocity, some singular points, due to the linear elastic assumption, are computed. Numerical tests of convergence are conducted and the obtained results are compared to exact values. Results obtained by continuation are also compared to those computed with the discrete eigenvalue problem.
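
    The augmented-system idea, turning a quadratic gyroscopic eigenvalue problem into a standard generalized one, can be sketched as follows. The 2x2 matrices below are toy examples, not the rotating-beam operators of the paper.

```python
import numpy as np

# Quadratic gyroscopic eigenproblem: (lam^2 M + lam G + K) q = 0,
# with M, K symmetric positive definite and G skew-symmetric (gyroscopic).
# Toy matrices for illustration only:
M = np.eye(2)
K = np.diag([2.0, 3.0])
g = 0.5
G = g * np.array([[0.0, 1.0], [-1.0, 0.0]])  # skew-symmetric

# First-order augmented (linearized) form: A z = lam B z with z = [q, lam q]
Z, I = np.zeros((2, 2)), np.eye(2)
A = np.block([[Z, I], [-K, -G]])
B = np.block([[I, Z], [Z, M]])

# Solve the generalized problem (here via B^-1 A since B is well conditioned)
lam = np.linalg.eigvals(np.linalg.solve(B, A))
# For a conservative gyroscopic system (K > 0, G skew), the eigenvalues lie
# on the imaginary axis: the natural frequencies are their imaginary parts.
```

    The paper's own augmented system additionally achieves real symmetric matrices, which this generic linearization does not; the sketch only shows the reduction from quadratic to standard form.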

  18. Phase field benchmark problems for dendritic growth and linear elasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jokisaari, Andrea M.; Voorhees, P. W.; Guyer, Jonathan E.

We present the second set of benchmark problems for phase field models that are being jointly developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST) along with input from other members in the phase field community. As the integrated computational materials engineering (ICME) approach to materials design has gained traction, there is an increasing need for quantitative phase field results. New algorithms and numerical implementations increase computational capabilities, necessitating standard problems to evaluate their impact on simulated microstructure evolution as well as their computational performance. We propose one benchmark problem for solidification and dendritic growth in a single-component system, and one problem for linear elasticity via the shape evolution of an elastically constrained precipitate. We demonstrate the utility and sensitivity of the benchmark problems by comparing the results of 1) dendritic growth simulations performed with different time integrators and 2) elastically constrained precipitate simulations with different precipitate sizes, initial conditions, and elastic moduli. As a result, these numerical benchmark problems will provide a consistent basis for evaluating different algorithms, both existing and those to be developed in the future, for accuracy and computational efficiency when applied to simulate physics often incorporated in phase field models.

  19. Supported employment: cost-effectiveness across six European sites

    PubMed Central

    Knapp, Martin; Patel, Anita; Curran, Claire; Latimer, Eric; Catty, Jocelyn; Becker, Thomas; Drake, Robert E; Fioritti, Angelo; Kilian, Reinhold; Lauber, Christoph; Rössler, Wulf; Tomov, Toma; van Busschbach, Jooske; Comas-Herrera, Adelina; White, Sarah; Wiersma, Durk; Burns, Tom

    2013-01-01

    A high proportion of people with severe mental health problems are unemployed but would like to work. Individual Placement and Support (IPS) offers a promising approach to establishing people in paid employment. In a randomized controlled trial across six European countries, we investigated the economic case for IPS for people with severe mental health problems compared to standard vocational rehabilitation. Individuals (n=312) were randomized to receive either IPS or standard vocational services and followed for 18 months. Service use and outcome data were collected. Cost-effectiveness analysis was conducted with two primary outcomes: additional days worked in competitive settings and additional percentage of individuals who worked at least 1 day. Analyses distinguished country effects. A partial cost-benefit analysis was also conducted. IPS produced better outcomes than alternative vocational services at lower cost overall to the health and social care systems. This pattern also held in disaggregated analyses for five of the six European sites. The inclusion of imputed values for missing cost data supported these findings. IPS would be viewed as more cost-effective than standard vocational services. Further analysis demonstrated cost-benefit arguments for IPS. Compared to standard vocational rehabilitation services, IPS is, therefore, probably cost-saving and almost certainly more cost-effective as a way to help people with severe mental health problems into competitive employment. PMID:23471803

  20. Physics Features of TRU-Fueled VHTRs

    DOE PAGES

    Lewis, Tom G.; Tsvetkov, Pavel V.

    2009-01-01

The current waste management strategy for spent nuclear fuel (SNF) mandated by the US Congress is the disposal of high-level waste (HLW) in a geological repository at Yucca Mountain. Ongoing efforts on closed-fuel cycle options and difficulties in opening and safeguarding such a repository have led to investigations of alternative waste management strategies. One potential strategy for the US fuel cycle would be to make use of fuel loadings containing high concentrations of transuranic (TRU) nuclides in the next-generation reactors. The use of such fuels would not only increase fuel supply but could also potentially facilitate prolonged operation modes (via fertile additives) on a single fuel loading. The idea is to approach autonomous operation on a single fuel loading that would allow marketing power units as nuclear batteries for worldwide deployment. Studies have already shown that high-temperature gas-cooled reactors (HTGRs) and their Generation IV (GEN IV) extensions, very-high-temperature reactors (VHTRs), have encouraging performance characteristics. This paper is focused on possible physics features of TRU-fueled VHTRs. One of the objectives of a 3-year U.S. DOE NERI project was to show that TRU-fueled VHTRs have the possibility of prolonged operation on a single fuel loading. A 3D temperature distribution was developed based on conceivable operation conditions of the 600 MWth VHTR design. Results of extensive criticality and depletion calculations with varying fuel loadings showed that VHTRs are capable of autonomous operation and HLW reduction when loaded with TRU fuel.

  1. Evaluation of the Effects of Hidden Node Problems in IEEE 802.15.7 Uplink Performance

    PubMed Central

    Ley-Bosch, Carlos; Alonso-González, Itziar; Sánchez-Rodríguez, David; Ramírez-Casañas, Carlos

    2016-01-01

In the last few years, the use of LEDs in illumination systems has increased with the emergence of Visible Light Communication (VLC) technologies, in which data communication is performed by transmitting through the visible band of the electromagnetic spectrum. In 2011, the Institute of Electrical and Electronics Engineers (IEEE) published the IEEE 802.15.7 standard for Wireless Personal Area Networks based on VLC. Due to limitations in the coverage of the transmitted signal, wireless networks can suffer from the hidden node problem, which arises when there are nodes in the network whose transmissions are not detected by other nodes. This problem can cause an important degradation in communications when they are made by means of the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) access control method, which is used in IEEE 802.15.7. This research work evaluates the effects of the hidden node problem on the performance of the IEEE 802.15.7 standard. We implement a simulator and analyze VLC performance in terms of parameters like end-to-end goodput and message loss rate. As part of this research work, a solution to the hidden node problem is proposed, based on the use of idle patterns defined in the standard. Idle patterns are sent by the network coordinator node to communicate to the other nodes that there is an ongoing transmission. The validity of the proposed solution is demonstrated with simulation results. PMID:26861352
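
    Why hidden nodes degrade CSMA/CA can be seen with a toy slotted model (an illustrative sketch, not the authors' IEEE 802.15.7 simulator): two nodes that cannot sense each other never defer to one another, so any slot in which both transmit collides at the coordinator.

```python
import random

# Two hidden nodes each transmit in a slot with probability p, independently.
# Because carrier sensing cannot detect the other node, no deferral happens
# and simultaneous transmissions collide at the receiver.
def collision_fraction(p, slots, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducibility
    collisions = sum(
        1 for _ in range(slots)
        if rng.random() < p and rng.random() < p  # both nodes transmit
    )
    return collisions / slots

frac = collision_fraction(p=0.2, slots=200_000)
# The per-slot collision probability is p * p = 0.04 in this model
```

    An idle-pattern style remedy maps onto this model as giving the nodes a coordinator-driven busy signal, which restores the deferral that carrier sensing alone cannot provide.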

  2. Evaluation of the Effects of Hidden Node Problems in IEEE 802.15.7 Uplink Performance.

    PubMed

    Ley-Bosch, Carlos; Alonso-González, Itziar; Sánchez-Rodríguez, David; Ramírez-Casañas, Carlos

    2016-02-06

In the last few years, the use of LEDs in illumination systems has increased with the emergence of Visible Light Communication (VLC) technologies, in which data communication is performed by transmitting through the visible band of the electromagnetic spectrum. In 2011, the Institute of Electrical and Electronics Engineers (IEEE) published the IEEE 802.15.7 standard for Wireless Personal Area Networks based on VLC. Due to limitations in the coverage of the transmitted signal, wireless networks can suffer from the hidden node problem, which arises when there are nodes in the network whose transmissions are not detected by other nodes. This problem can cause an important degradation in communications when they are made by means of the Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) access control method, which is used in IEEE 802.15.7. This research work evaluates the effects of the hidden node problem on the performance of the IEEE 802.15.7 standard. We implement a simulator and analyze VLC performance in terms of parameters like end-to-end goodput and message loss rate. As part of this research work, a solution to the hidden node problem is proposed, based on the use of idle patterns defined in the standard. Idle patterns are sent by the network coordinator node to communicate to the other nodes that there is an ongoing transmission. The validity of the proposed solution is demonstrated with simulation results.

  3. Focusing on Main Street's Problems from Secluded Laboratory Retreats

    ERIC Educational Resources Information Center

    Kushner, Lawrence M.

    1973-01-01

A report on the National Bureau of Standards is presented. It provides national measurement standards for some 40 physical quantities related through the laws of physics to the basic six: length, time, mass, temperature, electric current, and luminous intensity. (DF)

  4. Evaluation of Standardized Instruments for Use in Universal Screening of Very Early School-Age Children: Suitability, Technical Adequacy, and Usability

    ERIC Educational Resources Information Center

    Miles, Sandra; Fulbrook, Paul; Mainwaring-Mägi, Debra

    2018-01-01

    Universal screening of very early school-age children (age 4-7 years) is important for early identification of learning problems that may require enhanced learning opportunity. In this context, use of standardized instruments is critical to obtain valid, reliable, and comparable assessment outcomes. A wide variety of standardized instruments is…

  5. Cracks in Continuing Education's Mirror and a Fix To Correct Its Distorted Internal and External Image.

    ERIC Educational Resources Information Center

    Loch, John R.

    2003-01-01

    Outlines problems in continuing higher education, suggesting that it lacks (1) a standard name; (2) a unified voice on national issues; (3) a standard set of roles and functions; (4) a standard title for the chief administrative officer; (5) an accreditation body and process; and (6) resolution of the centralization/decentralization issue. (SK)

  6. Framework for Assessing the ICT Competency in Teachers up to the Requirements of "Teacher" Occupational Standard

    ERIC Educational Resources Information Center

    Avdeeva, Svetlana; Zaichkina, Olga; Nikulicheva, Nataliya; Khapaeva, Svetlana

    2016-01-01

    The paper deals with problems of working out a test framework for the assessment of teachers' ICT competency in line with the requirements of "Teacher" occupational standard. The authors have analyzed the known approaches to assessing teachers' ICT competency--ISTE Standards and UNESCO ICT CFT and have suggested their own approach to…

  7. Some Practical Solutions to Standard-Setting Problems: The Georgia Teacher Certification Test Experience.

    ERIC Educational Resources Information Center

    Cramer, Stephen E.

    A standard-setting procedure was developed for the Georgia Teacher Certification Testing Program as tests in 30 teaching fields were revised. A list of important characteristics of a standard-setting procedure was derived, drawing on the work of R. A. Berk (1986). The best method was found to be a highly formalized judgmental, empirical Angoff…

  8. Basic Materials for Electromagnetic Field Standards

    DTIC Science & Technology

    2003-03-04

Stepanov. “Problem of population electromagnetic safety”. International Medical Congress “New technologies in medicine. National and international...Rubtcova N.B. Harmonization options EMF standards: proposals of Russian national committee on non-ionizing radiation protection (RNCNIRP). 3rd...international and national EMF standards of different countries as well as to evaluate the population health danger of electromagnetic fields of

  9. Learning to Love the Questions: How Essential Questions Promote Creativity and Deep Learning

    ERIC Educational Resources Information Center

    Wilhelm, Jeffrey D.

    2014-01-01

    Educators know that creativity and innovation involve questioning and the capacity to frame topics as problems to be solved. They know that we are living in a time of a new generation of standards, including the Common Core State Standards (CCSS). In the U.S., compliance with these standards requires that educators encourage students to ask…

  10. Impact of Gadget Based Learning of Grammar in English at Standard II

    ERIC Educational Resources Information Center

    Singaravelu, G.

    2014-01-01

The study enlightens the impact of Gadget Based Learning of English Grammar at standard II. The objectives of the study are to find out the learning problems of the students of standard II in learning English grammar in Shri Vani Vilas Middle School and to find whether there is any significant difference in achievement mean score between pre test of…

  11. Better Serving the Children of Our Servicemen and Women: How the Common Core Improves Education for Military-Connected Children

    ERIC Educational Resources Information Center

    Center for American Progress, 2014

    2014-01-01

    States across the country have always established their own academic standards, curricula, and achievement goals. What students are expected to know and be able to do often differs from state to state. Additionally, states with low standards may leave students unprepared for higher standards in other states. This inconsistency creates problems for…

  12. A Phenomenological Study on the Lived Experience of First and Second Year Teachers in Standards-Based Grading Districts

    ERIC Educational Resources Information Center

    Battistone, William A., Jr.

    2017-01-01

    Problem: There is an existing cycle of questionable grading practices at the K-12 level. As a result, districts continue to search for innovative methods of evaluating and reporting student progress. One result of this effort has been the adoption of a standards-based grading approach. Research concerning standards-based grading implementation has…

  13. Numerical Solution of Time-Dependent Problems with a Fractional-Power Elliptic Operator

    NASA Astrophysics Data System (ADS)

    Vabishchevich, P. N.

    2018-03-01

    A time-dependent problem in a bounded domain for a fractional diffusion equation is considered. The first-order evolution equation involves a fractional-power second-order elliptic operator with Robin boundary conditions. A finite-element spatial approximation with an additive approximation of the operator of the problem is used. The time approximation is based on a vector scheme. The transition to a new time level is ensured by solving a sequence of standard elliptic boundary value problems. Numerical results obtained for a two-dimensional model problem are presented.
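
    The structure of such a scheme can be illustrated with a minimal sketch (not the paper's vector/additive scheme or its Robin boundary conditions): a 1D Dirichlet Laplacian A is assembled, its fractional power A^alpha is formed through the eigendecomposition, and one backward-Euler step of u_t + A^alpha u = 0 then reduces to a standard linear solve.

```python
import numpy as np

# 1D finite-difference Dirichlet Laplacian on (0,1) with n interior nodes
n, alpha, dt = 64, 0.5, 0.01
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# A is symmetric positive definite, so its fractional power is well defined
# through the eigendecomposition A = V diag(w) V^T, A^alpha = V diag(w^alpha) V^T
w, V = np.linalg.eigh(A)
A_alpha = V @ np.diag(w**alpha) @ V.T

# Backward Euler for u_t + A^alpha u = 0: (I + dt A^alpha) u_new = u_old
x = h * np.arange(1, n + 1)
u = np.sin(np.pi * x)  # lowest Dirichlet mode as initial data
for _ in range(10):
    u = np.linalg.solve(np.eye(n) + dt * A_alpha, u)
# Each mode decays by 1 / (1 + dt * w_k^alpha) per step, so the solution
# decays monotonically; for sin(pi x) the rate is roughly 1 / (1 + dt * pi)
```

    The dense eigendecomposition is only affordable at this toy scale; the point of schemes like the one in the abstract is precisely to avoid it by solving a sequence of standard elliptic problems instead.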

  14. [Principles and Methods for Formulating National Standards of "Regulations of Acupuncture-needle Manipulating Techniques"].

    PubMed

    Gang, Wei-juan; Wang, Xin; Wang, Fang; Dong, Guo-feng; Wu, Xiao-dong

    2015-08-01

The national standard "Regulations of Acupuncture-needle Manipulating Techniques" is one of the national Criteria of Acupuncturology, of which a total of 22 items have already been established. In the process of formulation, a series of common and specific problems were encountered. In the present paper, the authors expound these problems from 3 aspects, namely principles for formulation, methods for formulating criteria, and considerations about some problems. The formulating principles include selection and regulation of principles for technique classification and technique-related key factors. The main methods for formulating criteria are 1) taking the literature as the theoretical foundation, 2) taking clinical practice as the supporting evidence, and 3) taking the expounded suggestions or conclusions through peer review.

  15. Finding the strong CP problem at the LHC

    NASA Astrophysics Data System (ADS)

    D'Agnolo, Raffaele Tito; Hook, Anson

    2016-11-01

    We show that a class of parity based solutions to the strong CP problem predicts new colored particles with mass at the TeV scale, due to constraints from Planck suppressed operators. The new particles are copies of the Standard Model quarks and leptons. The new quarks can be produced at the LHC and are either collider stable or decay into Standard Model quarks through a Higgs, a W or a Z boson. We discuss some simple but generic predictions of the models for the LHC and find signatures not related to the traditional solutions of the hierarchy problem. We thus provide alternative motivation for new physics searches at the weak scale. We also briefly discuss the cosmological history of these models and how to obtain successful baryogenesis.

  16. Related Rates and the Speed of Light.

    ERIC Educational Resources Information Center

    Althoen, S. C.; Weidner, J. F.

    1985-01-01

Standard calculus textbooks often include a related rates problem involving light cast onto a straight line by a revolving light source. Mathematical aspects of these problems (both in the solution and in the method by which that solution is obtained) are examined. (JN)
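
    The standard version of this problem can be worked as follows (a generic textbook setup, not necessarily the article's exact example):

```latex
% A light at perpendicular distance $d$ from a straight wall revolves at
% rate $\frac{d\theta}{dt} = \omega$. The beam meets the wall at
% $x = d\tan\theta$, measured from the foot of the perpendicular, so
\[
  x = d\tan\theta
  \quad\Longrightarrow\quad
  \frac{dx}{dt} = d\sec^2\theta\,\frac{d\theta}{dt} = d\,\omega\sec^2\theta .
\]
% The spot's speed grows without bound as $\theta \to \pi/2$, which is the
% feature such problems usually ask students to interpret.
```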

  17. 45 CFR Appendix A to Part 1211 - Standards for Examiners

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., professional, investigative, or technical work which has demonstrated the possession of: (i) The personal... problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; Interpret and apply regulations and other complex written material; Communicate...

  18. 45 CFR Appendix A to Part 1211 - Standards for Examiners

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., professional, investigative, or technical work which has demonstrated the possession of: (i) The personal... problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; Interpret and apply regulations and other complex written material; Communicate...

  19. Quest for Quality.

    ERIC Educational Resources Information Center

    Wilson, Richard B.; Schmoker, Mike

    1992-01-01

Unlike traditional school management, Toyota of America recognizes thinking employees and emphasizes problems and measurable approaches to improvement. Instead of meeting to discuss short-term goals, specific problems, and concrete successes, school leaders often alienate staff by leading year-end discussions of standardized test score data…

  20. What Are the Signs of Alzheimer's Disease? | NIH MedlinePlus the Magazine

    MedlinePlus

    … in behavior and personality; conduct tests of memory, problem solving, attention, counting, and language; carry out standard medical … over and over; having trouble paying bills or solving simple math problems; getting lost; losing things or putting them in …

  1. The Four Billion Dollar Lunch

    ERIC Educational Resources Information Center

    Sautter, R. Craig

    1978-01-01

    Discusses problems with the National School Lunch Program, including the high proportion of food thrown away by students, problems with food preparation, nutritional standards, and competition from junk foods. Suggestions for nutrition education are offered and organizations and books for further reference are listed. (JMB)

  2. New Dental Accreditation Standard on Critical Thinking: A Call for Learning Models, Outcomes, Assessments.

    PubMed

    Johnsen, David C; Williams, John N; Baughman, Pauletta Gay; Roesch, Darren M; Feldman, Cecile A

    2015-10-01

    This opinion article applauds the recent introduction of a new dental accreditation standard addressing critical thinking and problem-solving, but expresses a need for additional means for dental schools to demonstrate they are meeting the new standard because articulated outcomes, learning models, and assessments of competence are still being developed. Validated, research-based learning models are needed to define reference points against which schools can design and assess the education they provide to their students. This article presents one possible learning model for this purpose and calls for national experts from within and outside dental education to develop models that will help schools define outcomes and assess performance in educating their students to become practitioners who are effective critical thinkers and problem-solvers.

  3. Digital combined instrument transformer for automated electric power supply control systems of mining companies

    NASA Astrophysics Data System (ADS)

    Topolsky, D. V.; Gonenko, T. V.; Khatsevskiy, V. F.

    2017-10-01

The present paper discusses ways to enhance the operating efficiency of automated electric power supply control systems of mining companies. According to the authors, one way to solve this problem is to make the control system equipment more intelligent. To enhance the efficiency of electric power supply control and electricity metering, it is proposed to use specially designed digital combined instrument current and voltage transformers. This equipment conforms to the IEC 61850 international standard and is adapted for integration into the digital substation structure. Tests were performed to check conformity of an experimental prototype of the digital combined instrument current and voltage transformer with IEC 61850. The test results show that the considered equipment meets the requirements of the standard.

  4. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    PubMed Central

    Tamayo, Alain; Granell, Carlos; Huerta, Joaquín

    2012-01-01

    Although smartphone applications represent the most typical data consumer tool from the citizen perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to limited bandwidth and processing capabilities on mobile phones. In this article we present a performance analysis about the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
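
    The XML-overhead issue can be illustrated with a quick size comparison (a generic sketch, not the article's SWE benchmark; the element names are made up): the same 100 sensor readings encoded as verbose XML, as gzip-compressed XML, and as plain CSV.

```python
import gzip

# 100 synthetic (id, temperature) readings
readings = [(i, 20.0 + 0.1 * i) for i in range(100)]

# Verbose XML encoding, loosely in the spirit of O&M-style observations
xml = "<observations>" + "".join(
    f'<obs><id>{i}</id><temp uom="Cel">{t:.1f}</temp></obs>'
    for i, t in readings
) + "</observations>"

# Compact CSV encoding of the same data
csv = "\n".join(f"{i},{t:.1f}" for i, t in readings)

sizes = {
    "xml": len(xml.encode()),
    "xml_gz": len(gzip.compress(xml.encode())),  # compressed XML
    "csv": len(csv.encode()),
}
# Both compression and a compact format shrink the payload several-fold,
# which matters on bandwidth- and CPU-limited mobile devices
```

    The trade-off the article measures is that compression recovers bandwidth but spends the phone's CPU, while alternative compact encodings save both at the cost of interoperability.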

  5. On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images

    NASA Astrophysics Data System (ADS)

    Eid, Ahmed; Farag, Aly

    2005-12-01

    The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.
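
    The image-reprojection testing idea, scoring a 3D reconstruction by how well it reprojects into a view with known observations, can be sketched as follows (generic pinhole model; the camera intrinsics and points are made-up illustrative values).

```python
import numpy as np

def project(P, X):
    """Project Nx3 world points with a 3x4 camera matrix P to Nx2 pixels."""
    Xh = np.hstack([X, np.ones((len(X), 1))])  # homogeneous coordinates
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]                # perspective divide

# Illustrative intrinsics and an identity camera pose
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P = K @ np.hstack([np.eye(3), np.zeros((3, 1))])

# Ground-truth points generate the "observed" pixels; a slightly perturbed
# copy stands in for an imperfect reconstruction
X_true = np.array([[0.0, 0.0, 5.0], [0.5, -0.2, 4.0], [-0.3, 0.4, 6.0]])
observed = project(P, X_true)
X_recon = X_true + 0.01

err = np.linalg.norm(project(P, X_recon) - observed, axis=1)
rmse = float(np.sqrt(np.mean(err**2)))  # reprojection RMSE in pixels
```

    Measuring error in the 2D image domain rather than in 3D is what reduces the dimensionality of the evaluation, as the abstract notes.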

  6. Standard Model—axion—seesaw—Higgs portal inflation. Five problems of particle physics and cosmology solved in one stroke

    NASA Astrophysics Data System (ADS)

    Ballesteros, Guillermo; Redondo, Javier; Ringwald, Andreas; Tamarit, Carlos

    2017-08-01

We present a minimal extension of the Standard Model (SM) providing a consistent picture of particle physics from the electroweak scale to the Planck scale and of cosmology from inflation until today. Three right-handed neutrinos Ni, a new color triplet Q and a complex SM-singlet scalar σ, whose vacuum expectation value vσ ~ 1011 GeV breaks lepton number and a Peccei-Quinn symmetry simultaneously, are added to the SM. At low energies, the model reduces to the SM, augmented by seesaw generated neutrino masses and mixing, plus the axion. The latter solves the strong CP problem and accounts for the cold dark matter in the Universe. The inflaton is a mixture of σ and the SM Higgs, and reheating of the Universe after inflation proceeds via the Higgs portal. Baryogenesis occurs via thermal leptogenesis. Thus, five fundamental problems of particle physics and cosmology are solved in one stroke in this unified Standard Model—axion—seesaw—Higgs portal inflation (SMASH) model. It can be probed decisively by upcoming cosmic microwave background and axion dark matter experiments.

  7. Using standardized patients versus video cases for representing clinical problems in problem-based learning

    PubMed Central

    2016-01-01

    Purpose: The quality of problem representation is critical for developing students' problem-solving abilities in problem-based learning (PBL). This study investigates preclinical students' experience with standardized patients (SPs) as a problem representation method compared to using video cases in PBL. Methods: A cohort of 99 second-year preclinical students from Inje University College of Medicine (IUCM) responded to a Likert scale questionnaire on their learning experiences after they had experienced both video cases and SPs in PBL. The questionnaire consisted of 14 items with eight subcategories: problem identification, hypothesis generation, motivation, collaborative learning, reflective thinking, authenticity, patient-doctor communication, and attitude toward patients. Results: The results reveal that, compared to video cases, using SPs gave the preclinical students significantly more positive experiences in patient-doctor communication skills; the perceived authenticity of their clinical situations; the development of proper attitudes toward patients; and motivation, reflective thinking, and collaborative learning. The SPs also provided more challenges than the video cases during problem identification and hypothesis generation. Conclusion: SPs are more effective than video cases in delivering higher levels of authenticity in clinical problems for PBL. The interaction with SPs engages preclinical students in deeper thinking and discussion, growth of communication skills, development of proper attitudes toward patients, and motivation. Considering the higher cost of SPs compared with video cases, SPs could be used most advantageously during the preclinical period in the IUCM curriculum. PMID:26923094

  8. Clothes washer standards in China -- The problem of water and energy trade-offs in establishing efficiency standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biermayer, Peter J.; Lin, Jiang

    2004-05-19

    Currently the sales of clothes washers in China consist of several general varieties. Some use more energy (with or without including hot water energy use) and some use more water. Both energy and water are in short supply in China. This poses the question: how do you trade off water versus energy in establishing efficiency standards? This paper discusses how China dealt with this situation and how it established minimum efficiency standards for clothes washers.

  9. The standardization of urine particle counting in medical laboratories--a Polish experience with the EQA programme.

    PubMed

    Cwiklińska, Agnieszka; Kąkol, Judyta; Kuchta, Agnieszka; Kortas-Stempak, Barbara; Pacanis, Anastasis; Rogulski, Jerzy; Wróblewska, Małgorzata

    2012-02-01

    Given the common problems with the standardization of urine particle counting methods and the great variability in the results obtained by Polish laboratories under the international Labquality External Quality Assessment (EQA), we initiated educational recovery activities. Detailed instructions on how to perform the standardized examination were sent to EQA participants, as was a questionnaire form which enabled information to be gathered on the procedures being applied. Laboratory results were grouped according to the method declared on the EQA 'Result' form or according to a manual examination procedure established on the basis of the questionnaire. The between-laboratory CVs for leukocyte and erythrocyte counts were calculated for each group and compared using the Mann-Whitney test. Significantly lower between-laboratory CVs (p = 0.03) were achieved for leukocyte counting among the laboratories that analysed control specimens in accordance with standardized procedures as compared with those which used non-standardized procedures. We also observed visibly lower variability for erythrocyte counting. Unfortunately, despite our activities, only a few of the Polish laboratories applied the standardized examination procedures, and only 29% of the results could be considered standardized (16% manual methods, 13% automated systems). The standardization of urine particle counting methods continues to be a significant problem in medical laboratories and requires further recovery activities, which can be conducted using the EQA scheme.

  10. STEM Gives Meaning to Mathematics

    ERIC Educational Resources Information Center

    Hefty, Lukas J.

    2015-01-01

    The National Council of Teachers of Mathematics' (NCTM's) "Principles and Standards for School Mathematics" (2000) outlines five Process Standards that are essential for developing deep understanding of mathematics: (1) Problem Solving; (2) Reasoning and Proof; (3) Communication; (4) Connections; and (5) Representation. The Common Core…

  11. "Standard" versus "Dialect" in Bilingual Education: An Old Problem in a New Context

    ERIC Educational Resources Information Center

    Fishman, Joshua A.

    1977-01-01

    A survey discussion of the question of standard languages versus dialects in education observes practice and conditions in America and Europe with attention to the definition of dialect. Responsibilities of the bilingual education teacher are outlined. (CHK)

  12. AN ELECTRIFYING NEW SOLUTION TO AN OLD PROBLEM?

    EPA Science Inventory

    The adverse health effects of particles have been linked to many factors, including particle size. The U.S. Environmental Protection Agency (EPA) first issued National Ambient Air Quality Standards (NAAQS) for particulate matter (PM) in 1971, amended the standards in 1987 for part...

  13. OSI: Will It Ever See the Light of Day?

    ERIC Educational Resources Information Center

    Moloney, Peter

    1997-01-01

    Examines issues of viability and necessity regarding the Open System Interconnections (OSI) reference service model with a view on future developments. Discusses problems with the standards; conformance testing; OSI bureaucracy; standardized communications; security; the transport level; applications; the stakeholders (communications providers,…

  14. Education on Trial. Strategies for the Future.

    ERIC Educational Resources Information Center

    Johnston, William J., Ed.

    Problems and opportunities in educational reform at all educational levels are considered in this collection of 18 articles. Titles and authors are as follows: Introduction (William J. Johnston); "Evidence of Decline in Educational Standards" (Philip N. Marcus); "Standards--by What Criteria?" (Francis Keppel); "Educational…

  15. Identification of Gambling Problems in Primary Care: Properties of the NODS-CLiP Screening Tool.

    PubMed

    Cowlishaw, Sean; McCambridge, Jim; Kessler, David

    2018-06-25

    There are several brief screening tools for gambling that possess promising psychometric properties, but have uncertain utility in generalist healthcare environments which prioritize prevention and brief interventions. This study describes an examination of the National Opinion Research Centre Diagnostic and Statistical Manual of Mental Disorders Screen for Gambling Problems (NODS-CLiP), in comparison with the Problem Gambling Severity Index (PGSI), when used to operationalize gambling problems across a spectrum of severity. Data were obtained from 1058 primary care attendees recruited from 11 practices in England who completed various measures including the NODS-CLiP and PGSI. The performance of the former was defined by estimates of sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs), when PGSI indicators of problem gambling (5+) and any gambling problems (1+), respectively, were the reference standards. The NODS-CLiP demonstrated perfect sensitivity for problem gambling, along with high specificity and NPV, but a low PPV. There was much lower sensitivity when the indicator of any gambling problems was the reference standard, with capture rates indicating that only 20% of patients exhibiting low to moderate severity gambling problems (PGSI 1-4) were identified by the NODS-CLiP. The NODS-CLiP performs well when identifying severe cases of problem gambling, but lacks sensitivity for less severe problems and may be unsuitable for settings which prioritize prevention and brief interventions. There is a need for screening measures which are sensitive across the full spectrum of risk and severity, and can support initiatives for improving identification of and responses to gambling problems in healthcare settings such as primary care.
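
    The four accuracy measures used above can be computed directly from screen-versus-reference-standard counts. A hedged sketch follows; the counts are illustrative, not the study's data, and are chosen only to mirror the reported pattern (perfect sensitivity but low PPV when the condition is rare).

    ```python
    # Classification accuracy of a brief screen against a reference standard,
    # from true/false positive and negative counts (toy values).
    def screen_accuracy(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),  # share of true cases the screen flags
            "specificity": tn / (tn + fp),  # share of non-cases correctly cleared
            "ppv": tp / (tp + fp),          # chance a positive screen is a true case
            "npv": tn / (tn + fn),          # chance a negative screen is a true non-case
        }

    # Rare condition: even a perfectly sensitive screen yields a low PPV.
    m = screen_accuracy(tp=10, fp=40, fn=0, tn=950)
    print(m["sensitivity"], m["ppv"])  # → 1.0 0.2
    ```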

  16. International aerospace standards - An overview

    NASA Astrophysics Data System (ADS)

    Mason, J. L.

    1983-10-01

    Factors to be considered in adopting and extending international standards in the U.S. aerospace industry are reviewed. Cost-related advantages and disadvantages of standardization are weighed, and further obstacles are identified in the English/metric rivalry and the pacing of metrification. The problem of standard duplication is examined, and the issue of revenues from the sale of copyrighted documents describing standards is addressed. It is recommended that international metric-system standards be introduced, with proper timing, wherever possible, and that prompt negotiations be undertaken to prevent or resolve document-sales disagreements. The continuation of English-system standards for safety-related cockpit readouts and communications systems is suggested.

  17. An Experimental Copyright Moratorium: Study of a Proposed Solution to the Copyright Photocopying Problem. Final Report to the American Society for Testing and Materials (ASTM).

    ERIC Educational Resources Information Center

    Heilprin, Laurence B.

    The Committee to Investigate Copyright Problems (CICP), a non-profit organization dedicated to resolving the conflict known as the "copyright photocopying problem" was joined by the American Society for Testing and Materials (ASTM), a large national publisher of technical and scientific standards, in a plan to simulate a long-proposed…

  18. The Language Factor in Elementary Mathematics Assessments: Computational Skills and Applied Problem Solving in a Multidimensional IRT Framework

    ERIC Educational Resources Information Center

    Hickendorff, Marian

    2013-01-01

    The results of an exploratory study into measurement of elementary mathematics ability are presented. The focus is on the abilities involved in solving standard computation problems on the one hand and problems presented in a realistic context on the other. The objectives were to assess to what extent these abilities are shared or distinct, and…

  19. ACCESS: Design and Sub-System Performance

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary Elizabeth; Morris, Matthew J.; McCandliss, Stephan R.; Rasucher, Bernard J.; Kimble, Randy A.; Kruk, Jeffrey W.; Pelton, Russell; Mott, D. Brent; Wen, Hiting; Foltz, Roger

    2012-01-01

    Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. ACCESS, "Absolute Color Calibration Experiment for Standard Stars", is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35-1.7 micrometer bandpass.

  20. Problem solving and decisionmaking: An integration

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1980-01-01

    An attempt was made to redress a critical fault of decisionmaking and problem solving research: the lack of a standard method to classify problem or decision states or conditions. A basic model was identified and expanded to indicate a possible taxonomy of conditions which may be used in reviewing previous research or for systematically pursuing new research designs. A generalization of the basic conditions was then made to indicate that the conditions are essentially the same for both concepts, problem solving and decisionmaking.

  1. Mental Health Problems in Adults with Williams Syndrome

    ERIC Educational Resources Information Center

    Stinton, Chris; Elison, Sarah; Howlin, Patricia

    2010-01-01

    Although many researchers have investigated emotional and behavioral difficulties in individuals with Williams syndrome, few have used standardized diagnostic assessments. We examined mental health problems in 92 adults with Williams syndrome using the Psychiatric Assessment Schedule for Adults with Developmental Disabilities--PAS-ADD (Moss,…

  2. College Basketball on the Line.

    ERIC Educational Resources Information Center

    Suggs, Welch

    1999-01-01

    The National Collegiate Athletic Association (NCAA) has convened a working group to address problems in recruiting, gambling, academic standards, and other corrupt practices in college basketball programs. Such problems are neither new nor unique to basketball, and changing college sports has proven to be difficult. Recommendations are anticipated…

  3. Fostering Perseverance

    ERIC Educational Resources Information Center

    Lewis, Jennifer M.; Özgün-Koca, S. Asli

    2016-01-01

    Sustaining engagement with a mathematics task is not a novel suggestion for effective mathematics teaching. "Principles and Standards for School Mathematics" (2000) specified that "students need to know that a challenging problem will take some time and that perseverance is an important aspect of the problem-solving process and of…

  4. Behaviour of 4- to 5-year-old nondisabled ELBW children: Outcomes following group-based physiotherapy intervention.

    PubMed

    Brown, L; Burns, Y R; Watter, P; Gray, P H; Gibbons, K S

    2018-03-01

    Extreme prematurity or extremely low birth weight (ELBW) can adversely affect behaviour. Nondisabled ELBW children are at risk of behavioural problems, which may become a particular concern after commencement of formal education. This study explored the frequency of behavioural and emotional problems amongst nondisabled ELBW children at 4 to 5 years of age and whether intervention had a positive influence on behaviour. The relationship between behaviour, gender, and other areas of performance at 5 years was explored. Fifty 4-year-old children (born <28 weeks gestation or birth weight <1,000 g) with minimal/mild motor impairment were randomly allocated to intervention (n = 24) or standard care (n = 26). The intervention comprised six weekly group-based physiotherapy sessions and a home programme. Standard care was best practice advice. The Child Behavior Checklist (CBCL) for preschool children was completed at baseline and at 1-year post-baseline. Other measures at follow-up included the Movement Assessment Battery for Children Second Edition, Beery Visual-Motor Integration Test 5th Edition, and Peabody Picture Vocabulary Test 4th Edition. The whole cohort improved on CBCL total problems score between baseline (mean 50.0, SD 11.1) and 1-year follow-up (mean 45.2, SD 10.3), p = .004. There were no significant differences between groups over time on CBCL internalizing, externalizing, or total problems scores. The intervention group showed a mean difference in total problems score of -3.8 (CI [1.5, 9.1]) between times, with standard care group values being -4.4 (CI [1.6, 7.1]). Males had higher total problems scores than females (p = .026), although still performed within the "normal" range. CBCL scores did not correlate with other scores. The behaviour of nondisabled ELBW children was within the "normal" range at 4 to 5 years, and both intervention and standard care may have contributed to improved behavioural outcomes. Behaviour was not related to performance in other developmental domains. © 2017 John Wiley & Sons Ltd.

  5. Design optimization of steel frames using an enhanced firefly algorithm

    NASA Astrophysics Data System (ADS)

    Carbas, Serdar

    2016-12-01

    Mathematical modelling of real-world-sized steel frames under the Load and Resistance Factor Design-American Institute of Steel Construction (LRFD-AISC) steel design code provisions, where the steel profiles for the members are selected from a table of steel sections, turns out to be a discrete nonlinear programming problem. Finding the optimum design of such design optimization problems using classical optimization techniques is difficult. Metaheuristic algorithms provide an alternative way of solving such problems. The firefly algorithm (FFA) belongs to the swarm intelligence group of metaheuristics. The standard FFA has the drawback of being caught up in local optima in large-sized steel frame design problems. This study attempts to enhance the performance of the FFA by suggesting two new expressions for the attractiveness and randomness parameters of the algorithm. Two real-world-sized design examples are designed by the enhanced FFA and its performance is compared with standard FFA as well as with particle swarm and cuckoo search algorithms.
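
    The standard firefly move that the study enhances can be sketched as below. This is a hedged illustration of the textbook FFA update, not the paper's enhanced version; the beta0, gamma, and alpha values are illustrative, not taken from the study.

    ```python
    # Standard firefly algorithm move rule: attractiveness decays with
    # distance, plus a small random perturbation (toy parameters).
    import math
    import random

    def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
        """Move firefly i toward brighter firefly j (continuous variables)."""
        rng = rng or random.Random(0)
        r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))  # squared distance r^2
        beta = beta0 * math.exp(-gamma * r2)              # attractiveness beta0*exp(-gamma*r^2)
        return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                for a, b in zip(x_i, x_j)]

    # With the random term switched off, the move is purely attraction.
    new_pos = firefly_move([0.0, 0.0], [1.0, 1.0], alpha=0.0)
    print(new_pos)
    ```

    For a discrete problem like steel-frame design, the continuous positions would additionally be mapped to indices into the table of available steel sections; that mapping is assumed here, not shown.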

  6. Randomized trial of intensive motivational interviewing for methamphetamine dependence.

    PubMed

    Polcin, Douglas L; Bond, Jason; Korcha, Rachael; Nayak, Madhabika B; Galloway, Gantt P; Evans, Kristy

    2014-01-01

    An intensive, 9-session motivational interviewing (IMI) intervention was assessed in a randomized clinical trial of 217 methamphetamine (MA) dependent individuals. IMI was compared with a single standard session of MI (SMI) combined with eight nutrition education sessions. Interventions were delivered weekly over 2 months. All study participants also received standard outpatient group treatment three times per week. Both study groups showed significant decreases in MA use and Addiction Severity Index drug scores, but there were no significant differences between the two groups. However, reductions in Addiction Severity Index psychiatric severity scores and days of psychiatric problems during the past 30 days were found for clients in the IMI group but not the SMI group. SMI may be equally beneficial to IMI in reducing MA use and problem severity, but IMI may help alleviate co-occurring psychiatric problems that are unaffected by shorter MI interventions. Additional studies are needed to assess the problems, populations, and contexts for which IMI is effective.

  7. Reduced Risk-Taking After Prior Losses in Pathological Gamblers Under Treatment and Healthy Control Group but not in Problem Gamblers.

    PubMed

    Bonini, Nicolao; Grecucci, Alessandro; Nicolè, Manuel; Savadori, Lucia

    2018-06-01

    A group of pathological gamblers and a group of problem gamblers (i.e., gamblers at risk of becoming pathological) were compared to healthy controls on their risk-taking propensity after prior losses. Each participant played both the Balloon Analogue Risk Taking task (BART) and a modified version of the same task, where individuals face five repeated predetermined early losses at the onset of the game. No significant difference in risk-taking was found between groups on the standard BART task, while significant differences emerged when comparing behaviors in the two tasks: both pathological gamblers and controls reduced their risk-taking tendency after prior losses in the modified BART compared to the standard BART, whereas problem gamblers showed no reduction in risk-taking after prior losses. We interpret these results as a sign of a reduced sensitivity to negative feedback in problem gamblers which might contribute to explain their loss-chasing tendency.

  8. The problem of natural funnel asymmetries: a simulation analysis of meta-analysis in macroeconomics.

    PubMed

    Callot, Laurent; Paldam, Martin

    2011-06-01

    Effect sizes in macroeconomics are estimated by regressions on data published by statistical agencies. Funnel plots are a representation of the distribution of the resulting regression coefficients. They are normally much wider than predicted by the t-ratios of the coefficients, and often asymmetric. The standard method of meta-analysts in economics assumes that the asymmetries are due to publication bias causing censoring, and adjusts the average accordingly. The paper shows that some funnel asymmetries may be 'natural', so that they occur without censoring. We investigate such asymmetries by simulating funnels from pairs of data generating processes (DGPs) and estimating models (EMs), in which the EM has the problem that it disregards a property of the DGP. The problems are data dependency, structural breaks, non-normal residuals, non-linearity, and omitted variables. We show that some of these problems generate funnel asymmetries. When they do, the standard method often fails. Copyright © 2011 John Wiley & Sons, Ltd.

  9. The efficacy of problem-solving treatments after deliberate self-harm: meta-analysis of randomized controlled trials with respect to depression, hopelessness and improvement in problems.

    PubMed

    Townsend, E; Hawton, K; Altman, D G; Arensman, E; Gunnell, D; Hazell, P; House, A; Van Heeringen, K

    2001-08-01

    Brief problem-solving therapy is regarded as a pragmatic treatment for deliberate self-harm (DSH) patients. A recent meta-analysis of randomized controlled trials (RCTs) evaluating this approach indicated a trend towards reduced repetition of DSH but the pooled odds ratio was not statistically significant. We have now examined other important outcomes using this procedure, namely depression, hopelessness and improvement in problems. Six trials in which problem-solving therapy was compared with control treatment were identified from an extensive literature review of RCTs of treatments for DSH patients. Data concerning depression, hopelessness and improvement in problems were extracted. Where relevant statistical data (e.g. standard deviations) were missing these were imputed using various statistical methods. Results were pooled using meta-analytical procedures. At follow-up, patients who were offered problem-solving therapy had significantly greater improvement in scores for depression (standardized mean difference = -0.36; 95% CI -0.61 to -0.11) and hopelessness (weighted mean difference =-3.2; 95% CI -4.0 to -2.41), and significantly more reported improvement in their problems (odds ratio = 2.31; 95% CI 1.29 to 4.13), than patients who were in the control treatment groups. Problem-solving therapy for DSH patients appears to produce better results than control treatment with regard to improvement in depression, hopelessness and problems. It is desirable that this finding is confirmed in a large trial, which will also allow adequate testing of the impact of this treatment on repetition of DSH.
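
    The pooling step in a meta-analysis like the one above can be sketched with inverse-variance (fixed-effect) weighting of standardized mean differences. This is a hedged illustration: the three trial effects and standard errors below are hypothetical, not the study's data.

    ```python
    # Inverse-variance (fixed-effect) pooling of per-trial effect sizes.
    def pool_fixed_effect(effects, std_errors):
        """Pooled estimate and 95% CI from per-trial effects and standard errors."""
        weights = [1.0 / se ** 2 for se in std_errors]    # precision weights
        pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
        se_pooled = (1.0 / sum(weights)) ** 0.5           # SE of the weighted mean
        return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

    # Negative SMDs favour the treatment (e.g. less depression than control).
    est, ci = pool_fixed_effect([-0.40, -0.30, -0.35], [0.15, 0.20, 0.10])
    print(round(est, 3), tuple(round(x, 3) for x in ci))
    ```

    A CI that excludes zero, as in the depression and hopelessness results reported above, indicates a statistically significant pooled effect.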

  10. Plantar pressure cartography reconstruction from 3 sensors.

    PubMed

    Abou Ghaida, Hussein; Mottet, Serge; Goujon, Jean-Marc

    2014-01-01

    Foot problem diagnosis is often made using pressure mapping systems, which are unfortunately confined to laboratories. In the context of e-health and telemedicine for home monitoring of patients with foot problems, our focus is to present a system acceptable for daily use. We developed an ambulatory instrumented insole using three pressure sensors to visualize plantar pressure cartographies. We show that a standard insole with a fixed sensor position could be used for different foot sizes. The results show an average error, measured at each pixel, of 0.01 daN, with a standard deviation of 0.005 daN.

  11. Workmanship Challenges for NASA Mission Hardware

    NASA Technical Reports Server (NTRS)

    Plante, Jeannette

    2010-01-01

    This slide presentation reviews several challenges in workmanship for NASA mission hardware development. Several standards for NASA workmanship exist and are required for all programs, projects, contracts, and subcontracts. These standards contain the best known methods for avoiding past assembly problems and defects. These best practices may not be followed if suppliers are used who are not compliant with them; compliance includes having certified operators and inspectors. Some examples of problems that have occurred from the lack of requirements flow-down to contractors are reviewed. The presentation contains a detailed example of the challenge regarding the packaging "design" dilemma.

  12. Screening for problem gambling within mental health services: a comparison of the classification accuracy of brief instruments.

    PubMed

    Dowling, Nicki A; Merkouris, Stephanie S; Manning, Victorian; Volberg, Rachel; Lee, Stuart J; Rodda, Simone N; Lubman, Dan I

    2018-06-01

    Despite the over-representation of people with gambling problems in mental health populations, there is limited information available to guide the selection of brief screening instruments within mental health services. The primary aim was to compare the classification accuracy of nine brief problem gambling screening instruments (two to five items) with a reference standard among patients accessing mental health services. The classification accuracy of nine brief screening instruments was compared with multiple cut-off scores on a reference standard. Eight mental health services in Victoria, Australia. A total of 837 patients were recruited consecutively between June 2015 and January 2016. The brief screening instruments were the Lie/Bet Questionnaire, Brief Problem Gambling Screen (BPGS) (two- to five-item versions), NODS-CLiP, NODS-CLiP2, Brief Biosocial Gambling Screen (BBGS) and NODS-PERC. The Problem Gambling Severity Index (PGSI) was the reference standard. The five-item BPGS was the only instrument displaying satisfactory classification accuracy in detecting any level of gambling problem (low-risk, moderate-risk or problem gambling) (sensitivity = 0.803, specificity = 0.982, diagnostic efficiency = 0.943). Several shorter instruments adequately detected both problem and moderate-risk, but not low-risk, gambling: two three-item instruments (NODS-CLiP, three-item BPGS) and two four-item instruments (NODS-PERC, four-item BPGS) (sensitivity = 0.854-0.966, specificity = 0.901-0.954, diagnostic efficiency = 0.908-0.941). The four-item instruments, however, did not provide any considerable advantage over the three-item instruments. Similarly, the very brief (two-item) instruments (Lie/Bet and two-item BPGS) adequately detected problem gambling (sensitivity = 0.811-0.868, specificity = 0.938-0.943, diagnostic efficiency = 0.933-0.934), but not moderate-risk or low-risk gambling. 
    The optimal brief screening instrument for mental health services wanting to screen for any level of gambling problem is the five-item Brief Problem Gambling Screen (BPGS). Services wanting to employ a shorter instrument or to screen only for more severe gambling problems (moderate-risk/problem gambling) can employ the NODS-CLiP or the three-item BPGS. Services that are only able to accommodate a very brief instrument can employ the Lie/Bet Questionnaire or the two-item BPGS. © 2017 Society for the Study of Addiction.

  13. Implementation of a Standards-Based Grading Model: A Study of Parent and Teacher Perceptions of Success

    ERIC Educational Resources Information Center

    Wheeler, Amber D.

    2017-01-01

    The purpose of this study is to explore the perceptions of parents and teachers regarding the success of a standards-based grading initiative in meeting its goals. Furthermore, findings from this study will be used to inform decisions made in future grade level implementations. Standards-based grading meets all criteria for a problem of practice.…

  14. Neglecting the Importance of the Decision Making and Care Regimes of Personal Support Workers: A Critique of Standardization of Care Planning through the RAI/MDS

    ERIC Educational Resources Information Center

    Kontos, Pia C.; Miller, Karen-Lee; Mitchell, Gail J.

    2010-01-01

    Purpose: The Resident Assessment Instrument-Minimum Data Set (RAI/MDS) is an interdisciplinary standardized process that informs care plan development in nursing homes. This standardized process has failed to consistently result in individualized care planning, which may suggest problems with content and planning integrity. We examined the…

  15. A Descriptive Case Study of Writing Standards-Based Individualized Education Plan Goals via Problem-Based Learning in a Virtual World

    ERIC Educational Resources Information Center

    Blair, Peter J.

    2017-01-01

    The goal of this study was to examine the professional development experiences of two contrastive participants while they were creating standards-based individualized education plan (IEP) goals using a virtual world called TeacherSim. Two specific focuses of the study were on how special educators engaged with the task of creating standards-based…

  16. Materials and Process Specifications and Standards

    DTIC Science & Technology

    1977-11-01

    Integrity Requirements; Fracture Control ... 5.9.3 Some Special Problems in Electronic Materials Specifications ... 5.9.3.1 Thermal Stresses ... fatigue and fracture and by defining human engineering concepts. Conform to OSHA regulations such as toxicity, noise levels, etc. Develop ... Standardization Society of the Valves and Fittings Industry ... 4.6.2.4 Other Organizations: There are a number of standards-making organizations that cannot

  17. "It's Not My Problem": The Growth of Non-Standard Work and Its Impact on Vocational Education and Training in Australia.

    ERIC Educational Resources Information Center

    Hall, Richard; Bretherton, Tanya; Buchanan, John

    A study investigated implications of the increase in non-standard forms of employment (casual work, working through labor-hire companies, and work that is outsourced) for vocational education and training (VET) in Australia. Data sources were published statistics on growth of non-standard work; research on reasons for the growth and the business…

  18. Can Performance-Related Learning Outcomes Have Standards?

    ERIC Educational Resources Information Center

    Brockmann, Michaela; Clarke, Linda; Winch, Christopher

    2008-01-01

    Purpose: This paper aims to explain the distinction between educational standards and learning outcomes and to indicate the problems that potentially arise when a learning outcomes approach is applied to a qualification meta-framework like the European Qualification Framework, or indeed to national qualification frameworks.…

  19. Testing and the Testing Industry: A Third View.

    ERIC Educational Resources Information Center

    Williams, John D.

    Different viewpoints regarding educational testing are described. While some people advocate continuing reliance upon standardized tests, others favor the discontinuation of such achievement and intelligence tests. The author recommends a moderate view somewhere between these two extremes. Problems associated with standardized testing in the…

  20. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  1. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  2. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  3. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  4. 40 CFR 171.4 - Standards for certification of commercial applicators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS CERTIFICATION OF PESTICIDE APPLICATORS § 171.4 Standards for certification of commercial applicators. (a) Determination of competency. Competence in the use and handling of pesticides... pesticides. Testing shall be based on examples of problems and situations appropriate to the particular...

  5. Generating Linear Equations Based on Quantitative Reasoning

    ERIC Educational Resources Information Center

    Lee, Mi Yeon

    2017-01-01

    The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…

  6. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the validity of solutions for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  7. Detecting effects of the indicated prevention Programme for Externalizing Problem behaviour (PEP) on child symptoms, parenting, and parental quality of life in a randomized controlled trial.

    PubMed

    Hanisch, Charlotte; Freund-Braier, Inez; Hautmann, Christopher; Jänen, Nicola; Plück, Julia; Brix, Gabriele; Eichelberger, Ilka; Döpfner, Manfred

    2010-01-01

    Behavioural parent training is effective in improving child disruptive behavioural problems in preschool children by increasing parenting competence. The indicated Prevention Programme for Externalizing Problem behaviour (PEP) is a group training programme for parents and kindergarten teachers of children aged 3-6 years with externalizing behavioural problems. To evaluate the effects of PEP on child problem behaviour, parenting practices, parent-child interactions, and parental quality of life. Parents and kindergarten teachers of 155 children were randomly assigned to an intervention group (n = 91) and a nontreated control group (n = 64). They rated children's problem behaviour before and after PEP training; parents also reported on their parenting practices and quality of life. Standardized play situations were video-taped and rated for parent-child interactions, e.g. parental warmth. In the intention-to-treat analysis, mothers of the intervention group described less disruptive child behaviour and better parenting strategies, and showed more parental warmth during a standardized parent-child interaction. Dosage analyses confirmed these results for parents who attended at least five training sessions. Children were also rated by their kindergarten teachers as showing fewer behaviour problems. Training effects were especially positive for parents who attended at least half of the training sessions. CBCL: Child Behaviour Checklist; CII: Coder Impressions Inventory; DASS: Depression anxiety Stress Scale; HSQ: Home-situation Questionnaire; LSS: Life Satisfaction Scale; OBDT: observed behaviour during the test; PCL: Problem Checklist; PEP: prevention programme for externalizing problem behaviour; PPC: Parent Problem Checklist; PPS: Parent Practices Scale; PS: Parenting Scale; PSBC: Problem Setting and Behaviour checklist; QJPS: Questionnaire on Judging Parental Strains; SEFS: Self-Efficacy Scale; SSC: Social Support Scale; TRF: Caregiver-Teacher Report Form.

  8. Constructed-Response Problems

    ERIC Educational Resources Information Center

    Swinford, Ashleigh

    2016-01-01

    With rigor outlined in state and Common Core standards and the addition of constructed-response test items to most state tests, math constructed-response questions have become increasingly popular in today's classroom. Although constructed-response problems can present a challenge for students, they do offer a glimpse of students' learning through…

  9. Anticipation Guides: Reading for Mathematics Understanding

    ERIC Educational Resources Information Center

    Adams, Anne E.; Pegg, Jerine; Case, Melissa

    2015-01-01

    With the acceptance by many states of the Common Core State Standards for Mathematics, new emphasis is being placed on students' ability to engage in mathematical practices such as understanding problems (including word problems), reading and critiquing arguments, and making explicit use of definitions (CCSSI 2010). Engaging students in…

  10. The Real World of the Beginning Teacher.

    ERIC Educational Resources Information Center

    National Education Association, Washington, DC. National Commission on Teacher Education and Professional Standards.

    Problems and goals of beginning teachers are the subject of these speeches presented by both experienced and beginning teachers at the 1965 national conference of the National Commission on Teacher Education and Professional Standards. The problems include the differences between teacher expectations and encounters, unrealistic teaching and…

  11. 45 CFR Appendix A to Part 1210 - Standard for Examiners

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... of: (i) The personal attributes essential to the effective performance of the duties of an Examiner... causes of complex problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; —Interpret and apply regulations and other complex written material...

  12. 45 CFR Appendix A to Part 1210 - Standard for Examiners

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of: (i) The personal attributes essential to the effective performance of the duties of an Examiner... causes of complex problems and apply mature judgment in assessing the practical implications of alternative solutions to those problems; —Interpret and apply regulations and other complex written material...

  13. The Cake Contest

    ERIC Educational Resources Information Center

    Haberern, Colleen

    2016-01-01

    With the adoption of the Common Core State Standards for Mathematics (CCSSM), many teachers are changing their classroom structure from teacher-directed to student-centered. When the author began designing and using problem-based tasks she saw a drastic improvement in student engagement and problem-solving skills. The author describes the Cake…

  14. Mastery Multiplied

    ERIC Educational Resources Information Center

    Shumway, Jessica F.; Kyriopoulos, Joan

    2014-01-01

    Being able to find the correct answer to a math problem does not always indicate solid mathematics mastery. A student who knows how to apply the basic algorithms can correctly solve problems without understanding the relationships between numbers or why the algorithms work. The Common Core standards require that students actually understand…

  15. [The status and current problems of the radiation protection support for Naval personnel].

    PubMed

    Sharaevskiĭ, G Iu; Murin, M B; Belikov, A D; Petrov, O I

    1999-07-01

    The article focuses on radiation protection problems for Navy personnel handling nuclear and radioactive waste, since the existing standards have become obsolete in light of new materials technologies that endanger the environment and human health.

  16. The Behavioural Profile of Psychiatric Disorders in Persons with Intellectual Disability

    ERIC Educational Resources Information Center

    Kishore, M. T.; Nizamie, S. H.; Nizamie, A.

    2005-01-01

    Background: Problems associated with psychiatric diagnoses could be minimized by identifying behavioural clusters of specific psychiatric disorders. Methods: Sixty persons with intellectual disability (ID) and behavioural problems, aged 12-55 years, were assessed with standardized Indian tools for intelligence and adaptive behaviour. Clinical…

  17. Unravelling the confusion caused by GASB, FASB accounting rules.

    PubMed

    Duis, T E

    1994-11-01

    Separate GASB and FASB accounting and financial reporting rules for governmental healthcare providers are producing confusion. Among other problems, they reduce the usefulness of aggregated data about the healthcare industry. This article addresses the inconsistencies of the various reporting standards and identifies problems they can cause.

  18. Using information technology for an improved pharmaceutical care delivery in developing countries. Study case: Benin.

    PubMed

    Edoh, Thierry Oscar; Teege, Gunnar

    2011-10-01

    One of the problems in health care in developing countries is the poor accessibility of medicines in pharmacies for patients. Since this is mainly due to a lack of organization and information, it should be possible to improve the situation by introducing information and communication technology. However, for several reasons, standard solutions are not applicable here. In this paper, we describe a case study in Benin, a West African developing country. We identify the problem and the existing obstacles for applying standard e-commerce solutions. We develop an adapted system approach and describe a practical test which has shown that the approach has the potential of actually improving the pharmaceutical care delivery. Finally, we consider the security aspects of the system and propose an organizational solution for some specific security problems.

  19. Toward the automated analysis of plasma physics problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynick, H.E.

    1989-04-01

    A program (CALC) is described, which carries out nontrivial plasma physics calculations, in a manner intended to emulate the approach of a human theorist. This includes the initial process of gathering the relevant equations from a plasma knowledge base, and then determining how to solve them. Solution of the sets of equations governing physics problems, which in general have a nonuniform, irregular structure not amenable to solution by standardized algorithmic procedures, is facilitated by an analysis of the structure of the equations and the relations among them. This often permits decompositions of the full problem into subproblems, and other simplifications in form, which render the resultant subsystems soluble by more standardized tools. CALC's operation is illustrated by a detailed description of its treatment of a sample plasma calculation. 5 refs., 3 figs.

  20. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    PubMed

    Ng, Lauren C; Kanyanganzi, Frederick; Munyanah, Morris; Mushashi, Christine; Betancourt, Theresa S

    2014-01-01

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested with 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally-derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.
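    The ROC analysis mentioned above reduces to a single discrimination statistic, the area under the curve (AUC). A minimal sketch of how such an AUC is computed, using its Mann-Whitney rank-statistic equivalence; the scores below are hypothetical, not the study's data.

```python
# Illustrative sketch (not the authors' code): AUC as the probability that a
# randomly chosen positive case scores higher than a randomly chosen negative
# case, counting ties as one half.

def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U equivalence: fraction of (pos, neg) pairs
    where the positive case outscores the negative one."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical screening totals: youth with vs. without a conduct disorder
# diagnosis (names and numbers invented for illustration).
with_cd    = [14, 11, 9, 12, 8]
without_cd = [3, 7, 5, 2, 9]

auc = roc_auc(with_cd, without_cd)   # 0.5 = chance level, 1.0 = perfect
```

A "fair" classification accuracy, as reported for the YCPS-R, typically corresponds to an AUC in the 0.7-0.8 range.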

  1. The effectiveness of the Stop Now and Plan (SNAP) program for boys at risk for violence and delinquency.

    PubMed

    Burke, Jeffrey D; Loeber, Rolf

    2015-02-01

    Among the available treatments for disruptive behavior problems, a need remains for additional service options to reduce antisocial behavior and prevent further development along delinquent and violent pathways. The Stop Now and Plan (SNAP) Program is an intervention for antisocial behavior among boys between 6 and 11. This paper describes a randomized controlled treatment effectiveness study of SNAP versus standard behavioral health services. The treatment program was delivered to youth with aggressive, rule-breaking, or antisocial behavior in excess of clinical criterion levels. Outcomes were measured at 3, 9, and 15 months from baseline. Youth in the SNAP condition showed significantly greater reduction in aggression, conduct problems, and overall externalizing behavior, as well as counts of oppositional defiant disorder and attention deficit hyperactivity disorder symptoms. Additional benefits for SNAP were observed on measures of depression and anxiety. Further analyses indicated that the SNAP program was more effective among those with a higher severity of initial behavioral problems. At 1 year follow-up, treatment benefits for SNAP were maintained on some outcome measures (aggression, ADHD and ODD, depression and anxiety) but not others. Although overall juvenile justice system contact was not significantly different, youth in SNAP had significantly fewer charges against them relative to those receiving standard services. The SNAP Program, when contrasted with standard services alone, was associated with greater, clinically meaningful, reductions in targeted behaviors. It may be particularly effective for youth with more severe behavioral problems and may result in improvements in internalizing problems as well.

  2. Evaluation of fluoride levels in bottled water and their contribution to health and teeth problems in the United Arab Emirates.

    PubMed

    Abouleish, Mohamed Yehia Z

    2016-10-01

    Fluoride is needed for better health, yet if ingested at higher levels it may lead to health problems. Fluoride can be obtained from different sources, with drinking water being a major contributor. In the United Arab Emirates (UAE), bottled water is the major source for drinking. The aim of this research is to measure fluoride levels in different bottled water brands sold in UAE, to determine whether fluoride contributes to better health or health problems. The results were compared to international and local standards. Fluoride was present in seven out of 23 brands. One brand exhibited high fluoride levels, which exceeded all standards, suggesting it may pose health problems. Other brands were either below or above standards, suggesting either contribution to better health or health problems, depending on ingested amount. A risk assessment suggested a potential for non-cancer effects from some brands. The results were compared to fluoride levels in bottled water sold in UAE and neighboring countries (e.g. Saudi Arabia, Qatar, Kuwait, and Bahrain), over 24 years, to reflect on changes in fluoride levels in bottled water in this region. The research presents the need for creating stricter regulations that require careful fluoride monitoring, and new regulations that require listing the fluoride level on the bottled water label, internationally and regionally. The research will have local and global health impact, as bottled water sold in UAE and neighboring countries is produced locally and imported from international countries, e.g. Switzerland, the USA, France, Italy, New Zealand, and Fiji.
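    Non-cancer risk assessments of the kind mentioned above commonly reduce to a hazard quotient: estimated chronic daily intake divided by a reference dose. A minimal sketch with invented inputs; the concentration, intake, body weight, and reference dose below are illustrative assumptions, not the study's data.

```python
# Illustrative hazard-quotient screening sketch (not the study's calculation).
# HQ = chronic daily intake / reference dose; HQ > 1 flags potential
# non-cancer risk.

def hazard_quotient(conc_mg_per_l, intake_l_per_day, body_weight_kg,
                    rfd_mg_per_kg_day):
    cdi = conc_mg_per_l * intake_l_per_day / body_weight_kg  # mg/kg-day
    return cdi / rfd_mg_per_kg_day

# Hypothetical inputs: 1.5 mg/L fluoride (the WHO guideline value),
# 2 L/day intake, 70 kg adult, and an assumed reference dose of
# 0.06 mg/kg-day.
hq = hazard_quotient(1.5, 2.0, 70.0, 0.06)   # below the HQ = 1 threshold
```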

  3. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper focuses on the proposition of a portfolio selection problem considering an investor's subjectivity and the sensitivity analysis for the change of subjectivity. Since this proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well-defined. Therefore, introducing the Sharpe ratio, one of the important performance measures of portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using the sensitivity analysis for fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.
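    The Sharpe-ratio criterion invoked above is the ratio of expected excess return to return standard deviation. A minimal sketch of that criterion for a two-asset portfolio; the means, covariance matrix, weights, and risk-free rate are hypothetical values, not the paper's model.

```python
# Illustrative Sharpe-ratio sketch (not the authors' random fuzzy model).
import math

def sharpe_ratio(expected_return, risk_free_rate, return_std):
    """Expected excess return per unit of return standard deviation."""
    return (expected_return - risk_free_rate) / return_std

def portfolio_stats(weights, means, cov):
    """Expected return and standard deviation of a weighted portfolio."""
    mu = sum(w * m for w, m in zip(weights, means))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    return mu, math.sqrt(var)

means = [0.08, 0.12]                    # hypothetical asset mean returns
cov = [[0.04, 0.01], [0.01, 0.09]]      # hypothetical covariance matrix
mu, sigma = portfolio_stats([0.5, 0.5], means, cov)
sr = sharpe_ratio(mu, 0.03, sigma)
```

Maximizing this ratio over the weights is what turns a mean-risk trade-off into a single scalar objective, which is the role it plays in the transformation the abstract describes.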

  4. Optimal control of a harmonic oscillator: Economic interpretations

    NASA Astrophysics Data System (ADS)

    Janová, Jitka; Hampel, David

    2013-10-01

    Optimal control is a popular technique for modelling and solving dynamic decision problems in economics. A standard interpretation of the criterion function and Lagrange multipliers in the profit maximization problem is well known. On a particular example, we aim at a deeper understanding of the possible economic interpretations of further mathematical and solution features of the optimal control problem: we focus on the solution of the optimal control problem for a harmonic oscillator serving as a model for the Phillips business cycle. We discuss the economic interpretations of the arising mathematical objects with respect to the well-known reasoning for these objects in other problems.
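    As a hedged sketch of the structure such a problem takes (a generic quadratic cost for a controlled harmonic oscillator, not necessarily the authors' exact Phillips-cycle specification), the Pontryagin conditions read:

```latex
% Controlled harmonic oscillator \ddot{x} + \omega^2 x = u in state form,
% with a generic quadratic cost:
\dot{x}_1 = x_2, \qquad \dot{x}_2 = -\omega^2 x_1 + u, \qquad
J = \tfrac12 \int_0^T \bigl(q\,x_1^2 + r\,u^2\bigr)\,dt .
% Hamiltonian and first-order conditions:
H = \tfrac12\bigl(q\,x_1^2 + r\,u^2\bigr)
    + \lambda_1 x_2 + \lambda_2\bigl(-\omega^2 x_1 + u\bigr),
\qquad
\frac{\partial H}{\partial u} = r\,u + \lambda_2 = 0
\;\Rightarrow\; u^* = -\lambda_2 / r,
% Costate equations:
\dot{\lambda}_1 = -\frac{\partial H}{\partial x_1} = -q\,x_1 + \omega^2 \lambda_2,
\qquad
\dot{\lambda}_2 = -\frac{\partial H}{\partial x_2} = -\lambda_1 .
```

The costates λ₁, λ₂ carry the shadow-price reading (marginal value of relaxing each state equation), which is the kind of economic interpretation the abstract refers to.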

  5. Technical Report, Onondaga Lake, New York, Main Report

    DTIC Science & Technology

    1992-01-01

    growth. Section 3 of this report will expand upon the specific water quality problems. EXISTING CONDITIONS Page 23 Table V - Comparison of Current...This technical report on Onondaga Lake, New York has compiled existing data to determine which water quality and environmental enhancements are... bacteria is a problem during storm events causing contravention of the State swimming standards. The source of the problem has been identified as the

  6. The associations of indoor environment and psychosocial factors on the subjective evaluation of Indoor Air Quality among lower secondary school students: a multilevel analysis.

    PubMed

    Finell, E; Haverinen-Shaughnessy, U; Tolvanen, A; Laaksonen, S; Karvonen, S; Sund, R; Saaristo, V; Luopa, P; Ståhl, T; Putus, T; Pekkanen, J

    2017-03-01

    Subjective evaluation of Indoor Air Quality (subjective IAQ) reflects both building-related and psychosocial factors, but their associations have rarely been studied other than on the individual level in occupational settings and their interactions have not been assessed. Therefore, we studied whether schools' observed indoor air problems and psychosocial factors are associated with subjective IAQ and their potential interactions. The analysis was performed with a nationwide sample (N = 195 schools/26946 students) using multilevel modeling. Two datasets were merged: (i) survey data from students, including information on schools' psychosocial environment and subjective IAQ, and (ii) data from school principals, including information on observed indoor air problems. On the student level, school-related stress, poor teacher-student relationship, and whether the student did not easily receive help from school personnel, were significantly associated with poor subjective IAQ. On the school level, observed indoor air problem (standardized β = -0.43) and poor teacher-student relationship (standardized β = -0.22) were significant predictors of poor subjective IAQ. In addition, school-related stress was associated with poor subjective IAQ, but only in schools without observed indoor air problem (standardized β = -0.44). © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
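    The standardized β coefficients reported above are regression slopes after z-scoring predictor and outcome, so effects are comparable across variables measured in different units. A minimal single-predictor sketch; the school-level numbers below are made up for illustration, not the study's dataset.

```python
# Illustrative sketch of a standardized regression coefficient (β): for one
# predictor, the slope of z(y) on z(x), which equals Pearson's r.
import math

def zscore(xs):
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return [(x - mean) / sd for x in xs]

def standardized_beta(x, y):
    zx, zy = zscore(x), zscore(y)
    return sum(a * b for a, b in zip(zx, zy)) / (len(x) - 1)

# Hypothetical school-level data: indoor-air problem indicator vs. mean
# subjective IAQ rating (higher = better perceived air quality).
problem = [0, 0, 0, 1, 1, 1]
iaq     = [4.1, 3.8, 4.3, 3.2, 3.5, 2.9]
beta = standardized_beta(problem, iaq)   # negative, as in the study
```

The multilevel model in the study additionally partitions variance between the student and school levels, which this single-level sketch does not attempt to reproduce.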

  7. Adaptation of interoperability standards for cross domain usage

    NASA Astrophysics Data System (ADS)

    Essendorfer, B.; Kerth, Christian; Zaschke, Christian

    2017-05-01

    As globalization affects most aspects of modern life, challenges of quick and flexible data sharing apply to many different domains. To protect a nation's security, for example, one has to look well beyond borders and understand economical, ecological, cultural as well as historical influences. Most of the time information is produced and stored digitally, and one of the biggest challenges is to receive relevant readable information applicable to a specific problem out of a large data stock at the right time. These challenges to enable data sharing across national, organizational and systems borders are known to other domains (e.g., ecology or medicine) as well. Solutions like specific standards have been worked on for the specific problems. The question is: what can the different domains learn from each other, and do we have solutions when we need to interlink the information produced in these domains? A known problem is to make civil security data available to the military domain and vice versa in collaborative operations. But what happens if an environmental crisis leads to the need to quickly cooperate with civil or military security in order to save lives? How can we achieve interoperability in such complex scenarios? The paper introduces an approach to adapt standards from one domain to another and outlines problems that have to be overcome and limitations that may apply.

  8. Health Complaints Associated with Poor Rental Housing Conditions in Arkansas: The Only State without a Landlord’s Implied Warranty of Habitability

    PubMed Central

    Bachelder, Ashley E.; Stewart, M. Kate; Felix, Holly C.; Sealy, Neil

    2016-01-01

    Arkansas is the only U.S. state that does not have a landlord’s implied warranty of habitability, meaning tenants have a requirement for maintaining their rental properties at certain habitability standards, but landlords are not legally required to contribute to those minimum health and safety standards. This project assessed the possibility that this lack of landlord responsibility affects tenants’ perceived health. Using surveys and interviews, we collected self-reported data on the prevalence and description of problems faced by renters who needed household repairs from their landlords. Of almost 1,000 renters, one-third of them had experienced a problem with their landlord making needed repairs; and one-quarter of those had a health issue they attributed to their housing conditions. Common issues included problems with plumbing, heating, or cooling systems, and pest or rodent control. Reported health problems included elevated stress levels, breathing problems, headaches, high blood pressure, and bites or infections. Hispanic respondents and those with less than a high school education were both significantly more likely to report problems with their landlords not making repairs as requested. These data suggest that the lack of landlord requirements may negatively impact the condition of rental properties and, therefore, may negatively impact the health of Arkansas renters. PMID:27933288

  9. Health Complaints Associated with Poor Rental Housing Conditions in Arkansas: The Only State without a Landlord's Implied Warranty of Habitability.

    PubMed

    Bachelder, Ashley E; Stewart, M Kate; Felix, Holly C; Sealy, Neil

    2016-01-01

    Arkansas is the only U.S. state that does not have a landlord's implied warranty of habitability, meaning tenants have a requirement for maintaining their rental properties at certain habitability standards, but landlords are not legally required to contribute to those minimum health and safety standards. This project assessed the possibility that this lack of landlord responsibility affects tenants' perceived health. Using surveys and interviews, we collected self-reported data on the prevalence and description of problems faced by renters who needed household repairs from their landlords. Of almost 1,000 renters, one-third of them had experienced a problem with their landlord making needed repairs; and one-quarter of those had a health issue they attributed to their housing conditions. Common issues included problems with plumbing, heating, or cooling systems, and pest or rodent control. Reported health problems included elevated stress levels, breathing problems, headaches, high blood pressure, and bites or infections. Hispanic respondents and those with less than a high school education were both significantly more likely to report problems with their landlords not making repairs as requested. These data suggest that the lack of landlord requirements may negatively impact the condition of rental properties and, therefore, may negatively impact the health of Arkansas renters.

  10. Problems and methods of calculating the Legendre functions of arbitrary degree and order

    NASA Astrophysics Data System (ADS)

    Novikova, Elena; Dmitrenko, Alexander

    2016-12-01

    The known standard recursion methods of computing the fully normalized associated Legendre functions do not give the necessary precision under the IEEE 754-2008 standard, which creates problems of underflow and overflow. The analysis of the problems of calculating the Legendre functions shows that the underflow problem is not dangerous by itself. The main problem that generates gross errors in the calculations is the effect of "absolute zero". Once it appears in a forward column recursion, "absolute zero" converts to zero all values which are multiplied by it, regardless of whether a zero result of the multiplication is real or not. Three methods of calculating the Legendre functions that remove the effect of "absolute zero" from the calculations are discussed here. These methods are also of interest because they have almost no limit on the maximum degree of the Legendre functions. It is shown that the numerical accuracy of these three methods is the same, but the CPU time of the Fukushima method is minimal; therefore, the Fukushima method is the best. Its main advantage is computational speed, which is an important factor in calculating such a large number of Legendre functions as the 2,401,336 required for EGM2008.
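    The "absolute zero" effect can be illustrated on the sectoral seed of the forward column recursion, which is proportional to sin(θ)^m and underflows to exactly 0.0 for large order m near the poles. The mantissa-plus-integer-exponent representation below is a simplified sketch of the idea behind extended-range arithmetic of the kind Fukushima's method uses, not his actual implementation; the base B is an arbitrary illustrative choice.

```python
# Illustrative sketch: once sin(theta)**m underflows to exactly 0.0
# ("absolute zero"), every later recursion value multiplied by it stays 0,
# even when the true value would later grow back into range.
import math

def naive_seed(theta, m):
    """Naive power: underflows to exactly 0.0 for large m, small theta."""
    return math.sin(theta) ** m

B = 2.0 ** 900  # illustrative renormalization base, well inside double range

def scaled_seed(theta, m):
    """Same quantity kept as (x, e) with value x * B**(-e), x renormalized
    before it can underflow, so the magnitude is never lost."""
    x, e = 1.0, 0
    s = math.sin(theta)
    for _ in range(m):
        x *= s
        if abs(x) < 1.0 / B:   # renormalize before underflow strikes
            x *= B
            e += 1
    return x, e
```

For theta = 0.01 and m = 400 the true value is about 1e-800, far below the smallest subnormal double (about 5e-324): the naive power is exactly 0.0, while the scaled pair still carries the full magnitude and can be fed into subsequent recursion steps.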

  11. Cognitive, emotive, and cognitive-behavioral correlates of suicidal ideation among Chinese adolescents in Hong Kong.

    PubMed

    Kwok, Sylvia Lai Yuk Ching; Shek, Daniel Tan Lei

    2010-03-05

    Utilizing Daniel Goleman's theory of emotional competence, Beck's cognitive theory, and Rudd's cognitive-behavioral theory of suicidality, the relationships between hopelessness (cognitive component), social problem solving (cognitive-behavioral component), emotional competence (emotive component), and adolescent suicidal ideation were examined. Based on the responses of 5,557 Secondary 1 to Secondary 4 students from 42 secondary schools in Hong Kong, results showed that suicidal ideation was positively related to adolescent hopelessness, but negatively related to emotional competence and social problem solving. While standard regression analyses showed that all the above variables were significant predictors of suicidal ideation, hierarchical regression analyses showed that hopelessness was the most important predictor of suicidal ideation, followed by social problem solving and emotional competence. Further regression analyses found that all four subscales of emotional competence, i.e., empathy, social skills, self-management of emotions, and utilization of emotions, were important predictors of male adolescent suicidal ideation. However, the subscale of social skills was not a significant predictor of female adolescent suicidal ideation. Standard regression analysis also revealed that all three subscales of social problem solving, i.e., negative problem orientation, rational problem solving, and impulsiveness/carelessness style, were important predictors of suicidal ideation. Theoretical and practice implications of the findings are discussed.

  12. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between various knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, can express process knowledge, and at the same time has a strong ability to solve problems. In addition, the standard model of knowledge representation provides a way to solve problems of imprecise and inconsistent knowledge.

  13. Epistemic Beliefs about Justification Employed by Physics Students and Faculty in Two Different Problem Contexts

    NASA Astrophysics Data System (ADS)

    Çağlayan Mercan, Fatih

    2012-06-01

    This study examines the epistemic beliefs about justification employed by physics undergraduate and graduate students and faculty in the context of solving a standard classical physics problem and a frontier physics problem. Data were collected in a think-aloud problem-solving session followed by a semi-structured interview conducted with 50 participants: 10 each at the freshman, senior, masters, PhD, and faculty levels. Seven modes of justification were identified and used to explore the relationships between each justification mode, problem context, and expertise level. The data showed that justification modes were not mutually exclusive, and many respondents combined different modes in their responses in both problem contexts. Success in solving the standard classical physics problem was not related to any of the justification modes and was independent of expertise level. The strength of the association across the problem contexts fell in the medium range of practical significance for the authoritative, rational, and empirical justification modes, and in the large range for the modeling justification mode. Expertise level was not related to the empirical and religious justification modes. The strength of the association between expertise level and the authoritative, rational, experiential, and relativistic justification modes fell in the medium range, and for the modeling justification mode it fell in the large range of practical significance. The results provide support for the importance of context for epistemic beliefs about justification and are discussed in terms of the implications for teaching and learning science.

  14. Bootstrap Estimates of Standard Errors in Generalizability Theory

    ERIC Educational Resources Information Center

    Tong, Ye; Brennan, Robert L.

    2007-01-01

    Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
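    The generic bootstrap idea behind such standard-error estimates can be sketched as follows. This is a minimal sketch of the naive bootstrap SE of an arbitrary statistic, not Brennan's bias-correcting procedure; the function name and defaults are illustrative assumptions.

```python
import numpy as np

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Naive bootstrap standard error: resample the data with replacement,
    recompute the statistic on each resample, and take the standard
    deviation of the replicates. (For variance components this naive
    estimator is known to be biased, which is the problem that
    bias-correcting procedures address.)"""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])
    return reps.std(ddof=1)
```

    For example, `bootstrap_se(sample, lambda x: x.var(ddof=1))` estimates the standard error of a sample variance, the simplest analogue of a variance-component SE.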

  15. Software database creature for investment property measurement according to international standards

    NASA Astrophysics Data System (ADS)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national and enterprise levels. The need to create the software for investment property measurement according to International Accounting Standards was substantiated. The necessary software functions and the processes were described.

  16. 76 FR 38431 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-30

    ... Commission's minimum performance standards regarding registered transfer agents, and (2) to assure that issuers are aware of certain problems and poor performances with respect to the transfer agents that are... failure to comply with the Commission's minimum performance standards then the issuer will be unable to...

  17. Variations on an Historical Case Study

    ERIC Educational Resources Information Center

    Field, Patrick

    2006-01-01

    The National Inquiry Standard for Science Education Preparation requires science teachers to introduce students to scientific inquiry to solve problems by various methods, including active learning in a collaborative environment. In order for science teachers to comply with this inquiry standard, activities must be designed for students to…

  18. Qualification Journey in Teacher Training: Case in Northern Cyprus

    ERIC Educational Resources Information Center

    Erden, Hale

    2016-01-01

    Problem Statement: The identification of professional teaching standards has great value on initial teacher training, hiring teachers, assessing teacher performance, as well as planning and organizing teacher professional development. In Northern Cyprus there are not any identified professional teaching standards. This study aimed at filling this…

  19. Implementing the Curriculum and Evaluation Standards: First-Year Algebra.

    ERIC Educational Resources Information Center

    Kysh, Judith

    1991-01-01

    Described is an alternative first year algebra program developed to bridge the gap between the NCTM's Curriculum and Evaluation Standards and institutional demands of schools. Increased attention is given to graphing as a context for algebra, calculator use, solving "memorable problems," and incorporating geometry concepts, while…

  20. Early Identification of At-Risk LPN-to-RN Students

    ERIC Educational Resources Information Center

    Hawthorne, Lisa K.

    2013-01-01

    Nurse education programs are implementing standardized assessments without evaluating their effectiveness. Graduates of associate degree nursing programs continue to be unsuccessful with licensure examinations, despite standardized testing and stronger admission criteria. This problem is also prevalent for LPN-to-RN education programs due to a…

  1. The Best of Both Worlds

    ERIC Educational Resources Information Center

    Schneider, Jack; Feldman, Joe; French, Dan

    2016-01-01

    Relying on teachers' assessments for the information currently provided by standardized test scores would save instructional time, better capture the true abilities of diverse students, and reduce the problem of teaching to the test. A California high school is implementing standards-based reporting, ensuring that teacher-issued grades function as…

  2. 42 CFR 493.1233 - Standard: Complaint investigations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Complaint investigations. 493.1233 Section 493.1233 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... a system in place to ensure that it documents all complaints and problems reported to the laboratory...

  3. 40 CFR 171.5 - Standards for certification of private applicators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Standards for certification of private applicators. 171.5 Section 171.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... practical knowledge of the pest problems and pest control practices associated with his agricultural...

  4. A Math-Box Tale

    ERIC Educational Resources Information Center

    Nelson, Catherine J.

    2012-01-01

    The author is a strong proponent of incorporating the Content and Process Standards (NCTM 2000) into the teaching of mathematics. For candidates in her methods course, she models research-based best practices anchored in the Standards. Her students use manipulatives, engage in problem-solving activities, listen to children's literature, and use…

  5. Gaussian-input Gaussian mixture model for representing density maps and atomic models.

    PubMed

    Kawabata, Takeshi

    2018-07-01

    A new Gaussian mixture model (GMM) has been developed for better representations of both atomic models and electron microscopy 3D density maps. The standard GMM algorithm employs an EM algorithm to determine the parameters; it accepts a set of weighted 3D points corresponding to voxel or atomic centers. Although the standard algorithm works reasonably well, it has three problems. First, it ignores the size (voxel width or atomic radius) of the input and can therefore yield a GMM with a smaller spread than the input. Second, it has a singularity problem: the iterative procedure sometimes stops because a Gaussian function collapses to almost zero variance. Third, a map with a large number of voxels requires a long computation time to convert to a GMM. To solve these problems, we have introduced a Gaussian-input GMM algorithm, which treats the input atoms or voxels as a set of Gaussian functions; the standard EM algorithm for GMMs was extended to optimize the new model. The new GMM has a radius of gyration identical to that of the input and does not stop prematurely due to the singularity problem. For fast computation, we have introduced down-sampled Gaussian functions (DSG), which merge neighboring voxels into anisotropic Gaussian functions, providing a GMM with thousands of Gaussian components in a short computation time. We have also introduced a DSG-input GMM: the Gaussian-input GMM with the DSG as the input. This new algorithm is much faster than the standard algorithm. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
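    The standard EM fitting loop the abstract refers to can be sketched as follows. This is a minimal sketch of the ordinary weighted-point EM for an isotropic GMM, not the paper's Gaussian-input variant; the function name, the isotropic-covariance simplification, and the deterministic initialization are illustrative assumptions, and the small variance floor `eps` shows one crude guard against the singularity problem described above.

```python
import numpy as np

def fit_gmm_em(points, weights, k, n_iter=100, eps=1e-9):
    """Weighted EM for a k-component isotropic GMM on 3D points.

    Each input point is treated as a weighted delta function (the
    Gaussian-input variant would instead treat each point as a small
    Gaussian). `eps` is a crude variance floor guarding against a
    component collapsing to near-zero variance.
    """
    n, d = points.shape
    w = weights / weights.sum()
    # Deterministic init: spread the k means along the first coordinate.
    order = np.argsort(points[:, 0])
    mu = points[order[np.linspace(0, n - 1, k).astype(int)]].copy()
    var = np.full(k, points.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] ∝ pi_j * N(x_i | mu_j, var_j I).
        d2 = ((points[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        log_p = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        log_p -= log_p.max(axis=1, keepdims=True)    # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        r *= w[:, None]                              # fold in the point weights
        # M-step: update mixture weights, means, and variances.
        nk = r.sum(axis=0) + eps
        pi = nk / nk.sum()
        mu = (r.T @ points) / nk[:, None]
        d2 = ((points[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (r * d2).sum(axis=0) / (d * nk) + eps  # variance floor
    return pi, mu, var
```

    The floor on `var` only masks the singularity; the paper's approach removes it by construction, since Gaussian inputs keep every component's variance bounded below by the input width.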

  6. What consequences should result from failure to meet internal standards?

    PubMed

    Schramm, J

    1997-01-01

    This paper approaches a difficult problem, namely how to deal with a resident who has failed to meet the internal standards of a residency training program. First, the problems of defining a standard, and the associated problems of its reproducibility, documentation, teaching, updating, and internal variability within the same teaching program, are dealt with. The question of what constitutes a failure to meet the standard then needs to be answered. The results of a survey of residents' attitudes are quoted, as are some responses to a survey among the chiefs of teaching programs. Regarding residents' attitudes on how to handle breaches of a standard, the basic message was that residents want to be told when they are not functioning. Both parties want the collaboration of senior staff members on this topic. Whereas residents want to re-train, practice, and talk, they do not want sanctions; chiefs, however, want much less re-training, practicing, and talking, but earlier sanctions. The difficult point of dealing with a true failure is discussed in the light of the German legal situation and the actual options for handling such a case. Before it comes to the point of discontinuing a resident's training, there needs to be agreement on what would constitute a classic situation of failure, in which both the chiefs responsible for training and the residents agree that training is better discontinued. The author describes his experience with the real course of events in 7 cases he witnessed over 22 years.

  7. Comparing implementations of penalized weighted least-squares sinogram restoration.

    PubMed

    Forthmann, Peter; Koehler, Thomas; Defrise, Michel; La Riviere, Patrick

    2010-11-01

    A CT scanner measures the energy that is deposited in each channel of a detector array by x rays that have been partially absorbed on their way through the object. The measurement process is complex and quantitative measurements are inevitably associated with errors, so CT data must be preprocessed prior to reconstruction. In recent years, the authors have formulated CT sinogram preprocessing as a statistical restoration problem in which the goal is to obtain the best estimate of the line integrals needed for reconstruction from the set of noisy, degraded measurements. The authors have explored both penalized Poisson likelihood (PL) and penalized weighted least-squares (PWLS) objective functions. At low doses, the authors found that the PL approach outperforms PWLS in terms of resolution-noise tradeoffs, but at standard doses they perform similarly. The PWLS objective function, being quadratic, is more amenable to computational acceleration than the PL objective. In this work, the authors develop and compare two different methods for implementing PWLS sinogram restoration with the hope of improving computational performance relative to PL in the standard-dose regime. Sinogram restoration remains relevant in the standard-dose regime, since it can outperform standard approaches and allows for correction of effects that are not usually modeled in standard CT preprocessing. The authors have explored and compared two implementation strategies for PWLS sinogram restoration: (1) a direct matrix-inversion strategy based on the closed-form solution to the PWLS optimization problem and (2) an iterative approach based on the conjugate-gradient algorithm. Obtaining optimal performance from each strategy required modifying the naive off-the-shelf implementations of the algorithms to exploit the particular symmetry and sparseness of the sinogram-restoration problem.
For the closed-form approach, the authors subdivided the large matrix inversion into smaller coupled problems and exploited sparseness to minimize matrix operations. For the conjugate-gradient approach, the authors exploited sparseness and preconditioned the problem to speed up convergence. All methods produced qualitatively and quantitatively similar images as measured by resolution-variance tradeoffs and difference images. Despite the acceleration strategies, the direct matrix-inversion approach was found to be uncompetitive with iterative approaches, with a computational burden higher by an order of magnitude or more. The iterative conjugate-gradient approach, however, does appear promising, with computation times half that of the authors' previous penalized-likelihood implementation. Iterative conjugate-gradient based PWLS sinogram restoration with careful matrix optimizations has computational advantages over direct matrix PWLS inversion and over penalized-likelihood sinogram restoration and can be considered a good alternative in standard-dose regimes.

  8. Issues Involved in Developing Ada Real-Time Systems

    DTIC Science & Technology

    1989-02-15

    expensive modifications to the compiler or Ada runtime system to fit a particular application. Whether we can solve the problems of programming real-time systems in...lock in solutions to problems that are not yet well understood in standards as rigorous as the Ada language. Moreover, real-time systems typically have

  9. On Present State of Teaching Russian Language in Russia

    ERIC Educational Resources Information Center

    Tekucheva, Irina V.; Gromova, Liliya Y.

    2016-01-01

    The article discusses the current state of teaching Russian language, discovers the nature of philological education, outlines the main problems of the implementation of the standard in school practice, analyzes the problems of formation of universal educational actions within the context of the implementation of cognitive-communicative approach,…

  10. College Students' Alcohol-Related Problems: An Autophotographic Approach

    ERIC Educational Resources Information Center

    Casey, Patrick F.; Dollinger, Stephen J.

    2007-01-01

    This study related standard self-report measures to an innovative approach (the autophotographic essay) as a way to provide insight into patterns of alcohol consumption and associated problem behaviors. College students (N = 135) completed self-report measures of alcohol consumption and created autophotographic essays of identity coded for alcohol…

  11. Group Mirrors to Support Interaction Regulation in Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Jermann, Patrick; Dillenbourg, Pierre

    2008-01-01

    Two experimental studies test the effect of group mirrors upon quantitative and qualitative aspects of participation in collaborative problem solving. Mirroring tools consist of a graphical representation of the group's actions which is dynamically updated and displayed to the collaborators. In addition, metacognitive tools display a standard for…

  12. 20 CFR 632.23 - Termination and corrective action of a CAP and/or Master Plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... substantiates serious management, fiscal and/or performance problems, information from the Inspector General or gained through incident reports of poor performance, serious administrative problems and/or inability to... termination: (1) Poor performance and inability to meet Federal standards related to such debt collection...

  13. The Locker Problem: An Open and Shut Case

    ERIC Educational Resources Information Center

    Kimani, Patrick M.; Olanoff, Dana; Masingila, Joanna O.

    2016-01-01

    This article discusses how teaching via problem solving helps enact the Mathematics Teaching Practices and supports students' learning and development of the Standards for Mathematical Practice. This approach involves selecting and implementing mathematical tasks that serve as vehicles for meeting the learning goals for the lesson. For the lesson…

  14. On the numerical treatment of Coulomb forces in scattering problems

    NASA Astrophysics Data System (ADS)

    Randazzo, J. M.; Ancarani, L. U.; Colavecchia, F. D.; Gasaneo, G.; Frapiccini, A. L.

    2012-11-01

    We investigate the limiting procedures to obtain Coulomb interactions from short-range potentials. The application of standard techniques used for the two-body case (exponential and sharp cutoff) to the three-body break-up problem is illustrated numerically by considering the Temkin-Poet (TP) model of e-H processes.
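    The exponential-cutoff limiting procedure mentioned for the two-body case can be illustrated numerically: a screened potential exp(-r/a)/r approaches the Coulomb potential 1/r pointwise as the cutoff radius a grows. The radial grid and cutoff values below are illustrative assumptions.

```python
import numpy as np

def screened_coulomb(r, a):
    """Exponentially screened (Yukawa-type) potential exp(-r/a) / r."""
    return np.exp(-r / a) / r

# On a fixed radial grid, the deviation from the pure Coulomb potential
# 1/r shrinks roughly like 1/a as the cutoff radius a increases.
r = np.linspace(0.5, 10.0, 200)
coulomb = 1.0 / r
errors = [np.abs(screened_coulomb(r, a) - coulomb).max()
          for a in (10.0, 100.0, 1000.0)]
```

    The subtlety the paper addresses is that this pointwise convergence is straightforward for two bodies but becomes delicate in the three-body break-up problem.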

  15. The Problem of Underqualified Teachers: A Sociological Perspective

    ERIC Educational Resources Information Center

    Ingersoll, Richard M.

    2005-01-01

    Few educational problems have received more attention than has the failure to ensure that the nation's classrooms are staffed by qualified teachers. Many states have pushed for more-rigorous preservice teacher education, training, and certification standards. Moreover, a host of recruitment initiatives have attempted to increase the supply of…

  16. Problem Space Matters: The Development of Creativity and Intelligence in Primary School Children

    ERIC Educational Resources Information Center

    Welter, Marisete Maria; Jaarsveld, Saskia; Lachmann, Thomas

    2017-01-01

    Previous research showed that in primary school, children's intelligence develops continually, but creativity develops more irregularly. In this study, the development of intelligence, measured traditionally, i.e., operating within well-defined problem spaces (Standard Progressive Matrices) was compared with the development of intelligence…

  17. Evaluation of Undergraduate Teaching at Institutions of Higher Education in China: Problems and Reform

    ERIC Educational Resources Information Center

    Yukun, Chen

    2009-01-01

    This paper reviews the achievements of the first cycle of undergraduate teaching evaluation at institutions of higher education in China. Existing problems are identified, and suggestions are made for corresponding reforms for improving the standard and quality of China's undergraduate teaching evaluation.

  18. Protocol Analysis of Aptitude Differences in Figural Analogy Problem Representation.

    ERIC Educational Resources Information Center

    Schiano, Diane J.

    Individual differences in performance on figural analogy tests are usually attributed to quantitative differences in processing parameters rather than to qualitative differences in the formation and use of representations. Yet aptitude-related differences in categorizing standardized figural analogy problems between high and low scorers have been…

  19. Examining the Relationship between Electronic Health Record Interoperability and Quality Management

    ERIC Educational Resources Information Center

    Purcell, Bernice M.

    2013-01-01

    A lack of interoperability impairs data quality among health care providers' electronic health record (EHR) systems. The problem is whether the International Organization for Standardization (ISO) 9000 principles relate to the problem of interoperability in implementation of EHR systems. The purpose of the nonexperimental quantitative research…

  20. Pathways to Suicidal Behaviors in Childhood

    ERIC Educational Resources Information Center

    Greening, Leilani; Stoppelbein, Laura; Fite, Paula; Dhossche, Dirk; Erath, Stephen; Brown, Jacqueline; Cramer, Robert; Young, Laura

    2008-01-01

    Path analyses were applied to test a model that includes internalizing and externalizing behavior problems as predictors of suicidal behaviors in children. Parents of an inpatient sample of boys (N = 87; M age = 9.81 years) rated the frequency of suicidal ideation and completed standardized measures of behavior problems. Blind raters rated the…

  1. Improving Procedural Knowledge and Transfer by Teaching a Shortcut Strategy First

    ERIC Educational Resources Information Center

    DeCaro, Marci S.

    2015-01-01

    Students often memorize and apply procedures to solve mathematics problems without understanding why these procedures work. In turn, students demonstrate limited ability to transfer strategies to new problem types. Math curriculum reform standards underscore the importance of procedural flexibility and transfer, emphasizing that students need to…

  2. Problem-Solving Therapy for Depression in Adults: A Systematic Review

    ERIC Educational Resources Information Center

    Gellis, Zvi D.; Kenaley, Bonnie

    2008-01-01

    Objectives: This article presents a systematic review of the evidence on problem-solving therapy (PST) for depressive disorders in noninstitutionalized adults. Method: Intervention studies using randomized controlled designs are included and methodological quality is assessed using a standard set of criteria from the Cochrane Collaborative Review…

  3. 7 CFR 205.206 - Crop pest, weed, and disease management practice standard.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... problems may be controlled through mechanical or physical methods including but not limited to: (1... problems may be controlled through: (1) Mulching with fully biodegradable materials; (2) Mowing; (3...) Plastic or other synthetic mulches: Provided, That, they are removed from the field at the end of the...

  4. 7 CFR 205.206 - Crop pest, weed, and disease management practice standard.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... problems may be controlled through mechanical or physical methods including but not limited to: (1... problems may be controlled through: (1) Mulching with fully biodegradable materials; (2) Mowing; (3...) Plastic or other synthetic mulches: Provided, That, they are removed from the field at the end of the...

  5. The Geoboard Triangle Quest

    ERIC Educational Resources Information Center

    Allen, Kasi C.

    2013-01-01

    In line with the Common Core and Standards for Mathematical Practice that portray a classroom where students are engaged in problem-solving experiences, and where various tools and arguments are employed to grow their strategic thinking, this article is the story of such a student-initiated problem. A seemingly simple question was posed by…

  6. Special Problems and Procedures for Identifying Minority Gifted Students.

    ERIC Educational Resources Information Center

    Bernal, Ernest M.

    The author reviews the key problems associated with generally accepted practices for identifying the gifted from the perspective of minority gifted students, particularly the gifted bilingual child; and presents some alternative approaches for testing. Noted among the shortcomings of testing minority students are that standardized tests are not…

  7. Mathematical Problem Solving Ability of Eleventh Standard Students

    ERIC Educational Resources Information Center

    Priya, J. Johnsi

    2017-01-01

    There is a general assertion among mathematics instructors that learners need to acquire problem-solving expertise, learn to communicate using mathematical knowledge and aptitude, develop numerical reasoning and thinking, and see the interconnectedness between mathematics and other subjects. Based on this perspective, the present study aims…

  8. Rescuing Computerized Testing by Breaking Zipf's Law.

    ERIC Educational Resources Information Center

    Wainer, Howard

    2000-01-01

    Suggests that because of the nonlinear relationship between item usage and item security, the problems of test security posed by continuous administration of standardized tests cannot be resolved merely by increasing the size of the item pool. Offers alternative strategies to overcome these problems, distributing test items so as to avoid the…

  9. Problem-Based Learning in a General Psychology Course.

    ERIC Educational Resources Information Center

    Willis, Sandra A.

    2002-01-01

    Describes the adoption of problem-based learning (PBL) techniques in a general psychology course. States that the instructor used a combination of techniques, including think-pair-share, lecture/discussion, and PBL. Notes means and standard deviations for graded components of PBL format versus lecture/discussion format. (Contains 18 references.)…

  10. Electron Probe MicroAnalysis (EPMA) Standards. Issues Related to Measurement and Accuracy Evaluation in EPMA

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul

    2003-01-01

    Electron-probe microanalysis standards and issues related to measurement and accuracy of microanalysis will be discussed. Critical evaluation of standards based on homogeneity and comparison with wet-chemical analysis will be made. Measurement problems such as spectrometer dead-time will be discussed. Analytical accuracy issues will be evaluated for systems by alpha-factor analysis and comparison with experimental k-ratio databases.

  11. Using Quality Management Systems to Improve Test Development and Standards and to Promote Good Practice: A Case Study of Testing Italian as a Foreign Language

    ERIC Educational Resources Information Center

    Grego Bolli, Giuliana

    2014-01-01

    This article discusses the problem of quality in the production of language tests in the context of Italian language examinations. The concept of quality is closely related to the application of stated standards and related procedures. These standards, developed over the last thirty years, are mainly related to the concepts of the accountability…

  12. Implementation of an Evidence-Based and Content Validated Standardized Ostomy Algorithm Tool in Home Care: A Quality Improvement Project.

    PubMed

    Bare, Kimberly; Drain, Jerri; Timko-Progar, Monica; Stallings, Bobbie; Smith, Kimberly; Ward, Naomi; Wright, Sandra

    Many nurses have limited experience with ostomy management. We sought to provide a standardized approach to ostomy education and management to support nurses in the early identification of stomal and peristomal complications and pouching problems, and to provide standardized solutions for managing ostomy care in general while improving utilization of formulary products. This article describes the development and testing of an ostomy algorithm tool.

  13. ON THE PERSISTENCE OF TWO SMALL-SCALE PROBLEMS IN ΛCDM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlowski, Marcel S.; Famaey, Benoit; Merritt, David

    2015-12-10

    We investigate the degree to which the inclusion of baryonic physics can overcome two long-standing problems of the standard cosmological model on galaxy scales: (1) the problem of satellite planes around Local Group galaxies, and (2) the “too big to fail” problem. By comparing dissipational and dissipationless simulations, we find no indication that the addition of baryonic physics results in more flattened satellite distributions around Milky-Way-like systems. Recent claims to the contrary are shown to derive in part from a non-standard metric for the degree of flattening, which ignores the satellites’ radial positions. If the full 3D positions of the satellite galaxies are considered, none of the simulations we analyze reproduce the observed flattening nor the observed degree of kinematic coherence of the Milky Way satellite system. Our results are consistent with the expectation that baryonic physics should have little or no influence on the structure of satellite systems on scales of hundreds of kiloparsecs. Claims that the “too big to fail” problem can be resolved by the addition of baryonic physics are also shown to be problematic.

  14. Evaluation of students' experience with Problem-based Learning (PBL) applied at the College of Medicine, Al-Jouf University, Saudi Arabia.

    PubMed

    Alduraywish, Abdulrahman Abdulwahab; Mohager, Mazin Omer; Alenezi, Mohammed Jayed; Nail, Abdelsalam Mohammed; Aljafari, Alfatih Saifudinn

    2017-12-01

    To evaluate the students' experience with problem-based learning. This cross-sectional, qualitative study was conducted at the College of Medicine, Al Jouf University, Sakakah, Saudi Arabia, in October 2015, and comprised medical students of the 1st to 5th levels. Interviews were conducted using the Students' Course Experience Questionnaire. The questionnaire contained 37 questions covering six evaluative categories: appropriate assessment, appropriate workload, clear goals and standards, generic skills, good teaching, and overall satisfaction. The questionnaire follows the Likert scale model. Mean values were interpreted as follows: below 2.5 = at least disagree, 2.5 to below 3 = uncertain (neither agree nor disagree), and 3 or more = at least agree. Of the 170 respondents, 72 (42.7%) agreed that there was an appropriate assessment accompanying the problem-based learning. Also, 107 (63.13%) students agreed that there was a heavy workload on them. The goals and standards of the course were clear for 71 (42.35%) students, 104 (61.3%) agreed that problem-based learning improved their generic skills, 65 (38.07%) agreed the teaching was good, and 82 (48.08%) students showed overall satisfaction. The students were satisfied with their experience with problem-based learning.

  15. Long-term neurodevelopmental outcomes of congenital diaphragmatic hernia survivors not treated with extracorporeal membrane oxygenation.

    PubMed

    Frisk, Virginia; Jakobson, Lorna S; Unger, Sharon; Trachsel, Daniel; O'Brien, Karel

    2011-07-01

    Although there has been a marked improvement in the survival of children with congenital diaphragmatic hernia (CDH) in the past 2 decades, there are few reports of long-term neurodevelopmental outcome in this population. The present study examined neurodevelopmental outcomes in 10- to 16-year-old CDH survivors not treated with extracorporeal membrane oxygenation (ECMO). Parents of 27 CDH survivors completed questionnaires assessing medical problems, daily living skills, educational outcomes, behavioral problems, and executive functioning. Fifteen CDH survivors and matched full-term controls completed standardized intelligence, academic achievement, phonological processing, and working memory tests. Non-ECMO-treated CDH survivors demonstrated high rates of clinically significant difficulties on standardized academic achievement measures, and 14 of the 27 survivors had a formal diagnosis of specific learning disability, attention deficit hyperactivity disorder, or developmental disability. Specific problems with executive function, cognitive and attentional weaknesses, and social difficulties were more common in CDH patients than controls. Perioperative hypocapnia was linked to executive dysfunction, behavioral problems, lowered intelligence, and poor achievement in mathematics. Non-ECMO-treated CDH survivors are at substantial risk for neurodevelopmental problems in late childhood and adolescence. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Clear, Complete, and Justified Problem Formulations for Aquatic Life Benchmark Values: Specifying the Dimensions

    EPA Science Inventory

    Nations that develop water quality benchmark values have relied primarily on standard data and methods. However, experience with chemicals such as Se, ammonia, and tributyltin has shown that standard methods do not adequately address some taxa, modes of exposure and effects. Deve...

  17. Behavior Intervention for Students with Externalizing Behavior Problems: Primary-Level Standard Protocol

    ERIC Educational Resources Information Center

    Benner, Gregory J.; Nelson, J. Ron; Sanders, Elizabeth A.; Ralston, Nicole C.

    2012-01-01

    This article examined the efficacy of a primary-level, standard-protocol behavior intervention for students with externalizing behavioral disorders. Elementary schools were randomly assigned to treatment (behavior intervention) or control (business as usual) conditions, and K-3 students were screened for externalizing behavior risk status. The…

  18. Assessing the Complexity of Students' Knowledge in Chemistry

    ERIC Educational Resources Information Center

    Bernholt, Sascha; Parchmann, Ilka

    2011-01-01

    Current reforms in the education policy of various countries are intended to produce a paradigm shift in the educational system towards an outcome orientation. After implementing educational standards as normative objectives, the development of test procedures that adequately reflect these targets and standards is a central problem. This paper…

  19. The Use of Leadership Standards in the Hiring Practices of Effective Principals

    ERIC Educational Resources Information Center

    Kracht, Ritchie E.; Hensley, Melissa A.; Strange, Martha A.

    2013-01-01

    This is a problem-based learning project focusing on superintendent use of ISLLC standards in hiring practices for human resource management. Research notes that student achievement is affected by effective leadership of principals. School district superintendents charged with hiring effective principals must determine the best candidate for that…

  20. Creating School Communities through Music

    ERIC Educational Resources Information Center

    Marasco, Katelyn

    2011-01-01

    There are many problems facing educators today. Student retention, standardized test scores, and motivational issues are only a few. It seems that students are dropping out of school at higher rates and having more difficulty finding motivation to do well on their school work and standardized tests. This study sought to investigate strategies that…

  1. Jumping to Quadratic Models

    ERIC Educational Resources Information Center

    Gunter, Devon

    2016-01-01

    It is no easy feat to engage young people with abstract material as well as push them to greater depths of understanding. Add in the extra pressures of curriculum expectations and standards and the problem is exacerbated. Projects designed around standards and having multiple entry points clearly offer students the best opportunity to engage with…

  2. Facility Management Child Care Resource Book. Child Care Operations Center of Expertise.

    ERIC Educational Resources Information Center

    General Services Administration, Washington, DC. Public Buildings Service.

    This guidebook provides maintenance and operations guidelines for managing General Services Administration (GSA) child care centers within the same standards and level of a GSA operated facility. Areas covered address cleaning standards and guidelines; equipment funding and inventory; maintenance of living environments and problem areas;…

  3. Toward a Standardized Internet Measurement.

    ERIC Educational Resources Information Center

    Chen, Hsiang; Tan, Zixiang

    This paper investigates measurement issues related to elements of the Internet and calls for a standardized measuring scheme to resolve these measurement problems. The dilemmas in measuring the elements of the Internet are identified, and previous studies are reviewed. Elements of the Internet are categorized into population, usage, protocol…

  4. Helping Children Learn Mathematics through Multiple Intelligences and Standards for School Mathematics.

    ERIC Educational Resources Information Center

    Adams, Thomasenia Lott

    2001-01-01

    Focuses on the National Council of Teachers of Mathematics 2000 process-oriented standards of problem solving, reasoning and proof, communication, connections, and representation as providing a framework for using the multiple intelligences that children bring to mathematics learning. Presents ideas for mathematics lessons and activities to…

  5. Going on a Science Trek!

    ERIC Educational Resources Information Center

    Kreider, Gail Yohe

    2008-01-01

    In this problem-based learning activity (PBL), students embark on a science trek to answer the question "Where is the science in my neighborhood?" The project serves as an excellent review of science curriculum in anticipation of Virginia's year-end standardized test--the Standards of Learning (SOL). This has proved to be an interesting…

  6. Children's Competence or Adults' Incompetence: Different Developmental Trajectories in Different Tasks

    ERIC Educational Resources Information Center

    Furlan, Sarah; Agnoli, Franca; Reyna, Valerie F.

    2013-01-01

    Dual-process theories have been proposed to explain normative and heuristic responses to reasoning and decision-making problems. Standard unitary and dual-process theories predict that normative responses should increase with age. However, research has focused recently on exceptions to this standard pattern, including developmental increases in…

  7. Job security and fear: Do these drive our radiation guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, R.G.

    1994-01-01

    This commentary asks why scientists want radiation standard setting at a level well below that at which any health related problem has been observed in a human being. The idea that job security and fear actually may drive radiation standards is presented as a possibility. 3 refs.

  8. Robust Confidence Interval for a Ratio of Standard Deviations

    ERIC Educational Resources Information Center

    Bonett, Douglas G.

    2006-01-01

    Comparing variability of test scores across alternate forms, test conditions, or subpopulations is a fundamental problem in psychometrics. A confidence interval for a ratio of standard deviations is proposed that performs as well as the classic method with normal distributions and performs dramatically better with nonnormal distributions. A simple…
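The abstract does not reproduce Bonett's interval itself; as a rough illustration of the underlying task (interval-estimating a ratio of standard deviations), here is a simple percentile-bootstrap sketch in Python. It is a stand-in for comparison purposes, not Bonett's proposed method, and the sample data are invented:

```python
import random
import statistics

def sd_ratio_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for sd(x)/sd(y)."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        bx = rng.choices(x, k=len(x))   # resample each group with replacement
        by = rng.choices(y, k=len(y))
        ratios.append(statistics.stdev(bx) / statistics.stdev(by))
    ratios.sort()
    lo = ratios[int(n_boot * alpha / 2)]
    hi = ratios[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Two hypothetical alternate test forms whose score spreads we compare.
form_a = [52, 61, 47, 70, 55, 66, 49, 58, 63, 71, 45, 60]
form_b = [54, 57, 56, 59, 55, 58, 53, 60, 57, 56, 58, 55]
lo, hi = sd_ratio_ci(form_a, form_b)
print(f"95% CI for SD ratio: ({lo:.2f}, {hi:.2f})")
```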

  9. Training Requirements in OSHA Standards and Training Guidelines. Revised.

    ERIC Educational Resources Information Center

    Occupational Safety and Health Administration, Washington, DC.

    This guide provides an overview of Occupational Safety and Health Act (OSHA) standards and training guidelines for various industries. The first section introduces the concept of voluntary training guidelines, explaining that the guidelines are designed to help employers determine whether a worksite problem can be solved by training, what training…

  11. Paper Towers: Building Students' Understandings of Technological Design

    ERIC Educational Resources Information Center

    Minogue, James; Guentensberger, Todd

    2006-01-01

    One set of ideas at the core of the National Science Education Standards (NSES) Science and Technology Standards is that of engaging middle school students in activities that help them develop their understandings of technological design. More precisely, students should be able to identify appropriate problems for technological design, design a…

  12. Modern "Challenges" in the System of Personnel Training: Standardization and Innovations

    ERIC Educational Resources Information Center

    Zaitseva, Natalia; Dzhandzhugazova, Elena; Bondarchuk, Natalya; Zhukova, Marina

    2017-01-01

    Purpose: The study of the problems hindering improvement of the system of training through standardization of qualification requirements is relevant because, in a globalized system of teaching staff and high rates of migration, not only national but also international requirements for employees should be considered. This increases the…

  13. Smart Moves: Powering up the Brain with Physical Activity

    ERIC Educational Resources Information Center

    Conyers, Marcus; Wilson, Donna

    2015-01-01

    The Common Core State Standards emphasize higher-order thinking, problem solving, and the creation, retention, and application of knowledge. Achieving these standards creates greater cognitive demands on students. Recent research suggests that active play and regular exercise have a positive effect on brain regions associated with executive…

  14. Replication and Reporting: A Commentary.

    ERIC Educational Resources Information Center

    Polio, Charlene; Gass, Susan

    1997-01-01

    Addresses the need for replication studies in the field of second-language acquisition and discusses the problems surrounding standards of reporting research. Notes a lack of uniform standards in reporting second-language learners' proficiency levels and proposes ways to achieve more thorough reporting of research that will allow others to engage…

  15. NCTM Principles and Standards for Mathematically Talented Students

    ERIC Educational Resources Information Center

    Deal, Linda J.; Wismer, Michael G.

    2010-01-01

    The "Principles and Standards for School Mathematics" published in 2000 by the National Council of Teachers of Mathematics (NCTM) created a vision of mathematical concepts and processes to establish core educational guidelines for instruction from grades K to 12. The overall plan does emphasize higher level thinking, problem solving, and…

  16. Standards for Privacy in Medical Information Systems: A Technico-Legal Revolution

    PubMed Central

    Brannigan, Vincent; Beier, Bernd

    1990-01-01

    The treatment of non-poor patients in hospitals creates a conflict between the privacy expectations of the patients and the historical traditions and administrative convenience of the hospital. Resolution of this problem requires a sophisticated theory of privacy, and one possible solution includes consensus standards for data privacy.

  17. Sources of Biased Inference in Alcohol and Drug Services Research: An Instrumental Variable Approach

    PubMed Central

    Schmidt, Laura A.; Tam, Tammy W.; Larson, Mary Jo

    2012-01-01

    Objective: This study examined the potential for biased inference due to endogeneity when using standard approaches for modeling the utilization of alcohol and drug treatment. Method: Results from standard regression analysis were compared with those that controlled for endogeneity using instrumental variables estimation. Comparable models predicted the likelihood of receiving alcohol treatment based on the widely used Aday and Andersen medical care–seeking model. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions and included a representative sample of adults in households and group quarters throughout the contiguous United States. Results: Findings suggested that standard approaches for modeling treatment utilization are prone to bias because of uncontrolled reverse causation and omitted variables. Compared with instrumental variables estimation, standard regression analyses produced downwardly biased estimates of the impact of alcohol problem severity on the likelihood of receiving care. Conclusions: Standard approaches for modeling service utilization are prone to underestimating the true effects of problem severity on service use. Biased inference could lead to inaccurate policy recommendations, for example, by suggesting that people with milder forms of substance use disorder are more likely to receive care than is actually the case. PMID:22152672
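The contrast the authors draw between standard regression and instrumental variables estimation can be illustrated with generic two-stage least squares on synthetic data (the variable names and data-generating process below are invented, not the Aday and Andersen model or NESARC data from the study). An unobserved confounder `u` pushes the naive slope below the true effect of 1.5, which the instrument recovers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                  # instrument: correlated with severity, not with u
u = rng.normal(size=n)                  # unobserved confounder (source of endogeneity)
severity = 0.8 * z + u + rng.normal(size=n)       # endogenous regressor
use = 1.5 * severity - u + rng.normal(size=n)     # outcome; true effect is 1.5

def ols_slope(y, x):
    """Slope from a one-regressor OLS fit with intercept."""
    X = np.column_stack([np.ones(n), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

naive = ols_slope(use, severity)        # downwardly biased by the confounder u

# Two-stage least squares: project severity onto the instrument, then regress.
b0, b1 = np.linalg.lstsq(np.column_stack([np.ones(n), z]), severity, rcond=None)[0]
fitted = b0 + b1 * z
iv = ols_slope(use, fitted)

print(f"naive OLS slope: {naive:.2f}, IV slope: {iv:.2f}")
```

With this setup the naive slope lands well below 1.5 while the IV slope sits near it, mirroring the paper's finding that uncontrolled endogeneity understates the effect of problem severity on service use.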

  18. Problems of the development of international standards of “green building” in Russia

    NASA Astrophysics Data System (ADS)

    Meshcheryakova, Tatiana

    2017-10-01

    Problems of environmental friendliness and energy efficiency in recent decades have become not only the most important issues of economic development of the main industrial economies, but also the basis for the processes of maintaining the security and relative stability of the global ecosystem. The article presents the results of a study of the status and trends of the development of environmental standards for the construction and maintenance of real estate in the world and particularly in Russia. Special market instruments for assessing the compliance of real estate projects under construction with modern principles of environmental friendliness and energy efficiency include voluntary building certification systems that are actively used in international practice. In Russia the following international certification systems are in active use: BREEAM, LEED, DGNB, HQE. Also in the Russian certification market, the national standard STO NOSTROY 2.35.4-2011 “Residential and public buildings” is being implemented, which summarizes the best international experience of the rating evaluation procedure. Comparative characteristics of the “green” standards and the principles of rating assessments of the ecological compatibility of buildings give an idea of how these standards are applied in Russia.

  19. Peer Victimization as a Mediator of the Relation between Facial Attractiveness and Internalizing Problems

    PubMed Central

    Rosen, Lisa H.; Underwood, Marion K.; Beron, Kurt J.

    2011-01-01

    This study examined the relations between facial attractiveness, peer victimization, and internalizing problems in early adolescence. We hypothesized that experiences of peer victimization would partially mediate the relationship between attractiveness and internalizing problems. Ratings of attractiveness were obtained from standardized photographs of participants (93 girls, 82 boys). Teachers provided information regarding peer victimization experiences in sixth grade, and seventh grade teachers assessed internalizing problems. Attractiveness was negatively correlated with victimization and internalizing problems. Experiences of peer victimization were positively correlated with internalizing problems. Structural equation modeling provided support for the hypothesized model of peer victimization partially mediating the relationship between attractiveness and internalizing problems. Implications for intervention programs and future research directions are discussed. PMID:21984861

  20. The random fractional matching problem

    NASA Astrophysics Data System (ADS)

    Lucibello, Carlo; Malatesta, Enrico M.; Parisi, Giorgio; Sicuro, Gabriele

    2018-05-01

    We consider two formulations of the random-link fractional matching problem, a relaxed version of the more standard random-link (integer) matching problem. In one formulation, we allow each node to be linked to itself in the optimal matching configuration. In the other one, on the contrary, such a link is forbidden. Both problems have the same asymptotic average optimal cost of the random-link matching problem on the complete graph. Using a replica approach and previous results of Wästlund (2010 Acta Mathematica 204 91–150), we analytically derive the finite-size corrections to the asymptotic optimal cost. We compare our results with numerical simulations and we discuss the main differences between random-link fractional matching problems and the random-link matching problem.
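Stated as a linear program (notation mine, consistent with the description above), the random-link fractional matching problem on a weighted graph G = (V, E) reads:

```latex
\min_{x} \; \sum_{e \in E} w_e \, x_e
\qquad \text{subject to} \qquad
\sum_{e \in \partial v} x_e = 1 \;\; \forall\, v \in V,
\qquad x_e \ge 0 \;\; \forall\, e \in E,
```

where ∂v denotes the set of edges incident on v and the w_e are the random link weights. Restricting x_e ∈ {0, 1} recovers the standard integer matching problem; the "loopy" formulation discussed in the paper enlarges E with self-links e = (v, v).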

  1. Current challenges in fundamental physics

    NASA Astrophysics Data System (ADS)

    Egana Ugrinovic, Daniel

    The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.

  2. The European experience.

    PubMed

    Bisgaard, N

    2001-06-01

    This article presents an overview of past and current experiences with time division multiple access-based (Global System for Mobile Communication) mobile telephones in Europe as seen by the European Hearing Instrument Manufacturers Association. Initial fear of widespread interference problems for hearing aid users in general owing to use of a new generation of mobile telephones seems unjustified. The background for the International Electrotechnical Commission 118-13 standard for measuring interference is described. No solution for the complete elimination of interference problems resulting from direct contact between hearing aids and mobile telephones has yet been found. Several reports on the subject are cited, and new work on measurement standards for near-field situations is mentioned.

  3. Combustion and fires in low gravity

    NASA Technical Reports Server (NTRS)

    Friedman, Robert

    1994-01-01

    Fire safety always receives priority attention in NASA mission designs and operations, with emphasis on fire prevention and material acceptance standards. Recently, interest in spacecraft fire-safety research and development has increased because improved understanding of the significant differences between low-gravity and normal-gravity combustion suggests that present fire-safety techniques may be inadequate or, at best, non-optimal; and the complex and permanent orbital operations in Space Station Freedom demand a higher level of safety standards and practices. This presentation outlines current practices and problems in fire prevention and detection for spacecraft, specifically the Space Station Freedom's fire protection. Also addressed are current practices and problems in fire extinguishment for spacecraft.

  4. Applying a Genetic Algorithm to Reconfigurable Hardware

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl; Weir, John; Trevino, Luis; Patrick, Clint; Steincamp, Jim

    2004-01-01

    This paper investigates the feasibility of applying genetic algorithms to solve optimization problems that are implemented entirely in reconfigurable hardware. The paper highlights the performance/design space trade-offs that must be understood to effectively implement a standard genetic algorithm within a modern Field Programmable Gate Array (FPGA) reconfigurable hardware environment and presents a case study where this stochastic search technique is applied to standard test-case problems taken from the technical literature. In this research, the targeted FPGA-based platform and high-level design environment was the Starbridge Hypercomputing platform, which incorporates multiple Xilinx Virtex II FPGAs, and the Viva graphical hardware description language.
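For reference, the "standard genetic algorithm" the paper maps onto FPGA hardware follows the usual select-crossover-mutate loop; a minimal software sketch in Python (the toy one-max objective and all parameters here are illustrative, not taken from the paper):

```python
import random

def genetic_algorithm(n_bits=32, pop_size=40, generations=60, seed=7):
    """Standard GA: tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)  # toy "one-max" objective: count the 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():  # size-2 tournament
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)                        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < 1 / n_bits) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = genetic_algorithm()
print("best fitness:", sum(best))  # converges toward the optimum of n_bits
```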

  5. Simultaneous and semi-alternating projection algorithms for solving split equality problems.

    PubMed

    Dong, Qiao-Li; Jiang, Dan

    2018-01-01

    In this article, we first introduce two simultaneous projection algorithms for solving the split equality problem by using a new choice of the stepsize, and then propose two semi-alternating projection algorithms. The weak convergence of the proposed algorithms is analyzed under standard conditions. As applications, we extend the results to solve the split feasibility problem. Finally, a numerical example is presented to illustrate the efficiency and advantage of the proposed algorithms.
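The paper's new step-size choice is not reproduced in the abstract, but the generic simultaneous-projection iteration for the split equality problem (find x ∈ C, y ∈ Q with Ax = By) can be sketched with a textbook fixed step; the sets, matrices, and iteration count below are illustrative:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])
proj_C = lambda x: np.maximum(x, 0.0)       # C = nonnegative orthant
proj_Q = lambda y: np.clip(y, 0.0, 5.0)     # Q = box [0, 5]^2

# Fixed step inside the usual convergence range gamma < 2 / (||A||^2 + ||B||^2).
gamma = 1.0 / (np.linalg.norm(A, 2) ** 2 + np.linalg.norm(B, 2) ** 2)

x, y = np.ones(2), np.zeros(2)
for _ in range(2000):
    r = A @ x - B @ y                       # residual of the coupling constraint
    x = proj_C(x - gamma * A.T @ r)         # gradient step on x, then project onto C
    y = proj_Q(y + gamma * B.T @ r)         # gradient step on y, then project onto Q

residual = np.linalg.norm(A @ x - B @ y)
print("final residual:", residual)          # tends to 0 when a solution exists
```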

  6. Fokker-Planck-Based Acceleration for SN Equations with Highly Forward Peaked Scattering in Slab Geometry

    NASA Astrophysics Data System (ADS)

    Patel, Japan

    Short mean free paths are characteristic of charged particles. High energy charged particles often have highly forward peaked scattering cross sections. Transport problems involving such charged particles are also highly optically thick. When problems simultaneously have forward peaked scattering and high optical thickness, their solution, using standard iterative methods, becomes very inefficient. In this dissertation, we explore Fokker-Planck-based acceleration for solving such problems.

  7. Standard Model–axion–seesaw–Higgs portal inflation. Five problems of particle physics and cosmology solved in one stroke

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballesteros, Guillermo; Redondo, Javier; Ringwald, Andreas

    We present a minimal extension of the Standard Model (SM) providing a consistent picture of particle physics from the electroweak scale to the Planck scale and of cosmology from inflation until today. Three right-handed neutrinos N_i, a new color triplet Q and a complex SM-singlet scalar σ, whose vacuum expectation value v_σ ∼ 10^11 GeV breaks lepton number and a Peccei-Quinn symmetry simultaneously, are added to the SM. At low energies, the model reduces to the SM, augmented by seesaw-generated neutrino masses and mixing, plus the axion. The latter solves the strong CP problem and accounts for the cold dark matter in the Universe. The inflaton is a mixture of σ and the SM Higgs, and reheating of the Universe after inflation proceeds via the Higgs portal. Baryogenesis occurs via thermal leptogenesis. Thus, five fundamental problems of particle physics and cosmology are solved at one stroke in this unified Standard Model–axion–seesaw–Higgs portal inflation (SMASH) model. It can be probed decisively by upcoming cosmic microwave background and axion dark matter experiments.

  8. Toward a North American Standard for Mobile Data Services

    NASA Technical Reports Server (NTRS)

    Dean, Richard A.; Levesque, Allen H.

    1991-01-01

    The rapid introduction of digital mobile communications systems is an important part of the emerging digital communications scene. These developments pose both a potential problem and a challenge. On one hand, these separate market driven developments can result in an uncontrolled mixture of analog and digital links which inhibit data modem services across the mobile/Public Switched network (PSTN). On the other hand, the near coincidence of schedules for development of some of these systems, i.e., Digital Cellular, Mobile Satellite, Land Mobile Radio, and ISDN, provides an opportunity to address interoperability problems by defining interfaces, control, and service standards that are compatible among these new services. In this paper we address the problem of providing data services interoperation between mobile terminals and data devices on the PSTN. The expected data services include G3 Fax, asynchronous data, and the government's STU-3 secure voice system, and future data services such as ISDN. We address a common architecture and a limited set of issues that are key to interoperable mobile data services. We believe that common mobile data standards will both improve the quality of data service and simplify the systems for manufacturers, data users, and service providers.

  10. Recommended fine positioning test for the Development Test Flight (DTF-1) of the NASA Flight Telerobotic Servicer (FTS)

    NASA Technical Reports Server (NTRS)

    Dagalakis, N.; Wavering, A. J.; Spidaliere, P.

    1991-01-01

    Test procedures are proposed for the NASA DTF-1 (Development Test Flight) positioning tests of the FTS (Flight Telerobotic Servicer). The unique problems associated with the DTF-1 mission are discussed, standard robot performance tests and terminology are reviewed, and a detailed description of flight-like testing and analysis is presented. The major technical problem associated with DTF-1 is that only one position sensor can be used, which will be fixed at one location, with a working volume that is probably smaller than some of the robot errors to be measured. Radiation heating of the arm and the sensor could also cause distortions that would interfere with the test. Two robot performance testing committees have established standard testing procedures relevant to the DTF-1. Owing to the technical problems associated with DTF-1, these procedures cannot be applied directly. These standard tests call for the use of several test positions at specific locations. Only one position, that of the position sensor, can be used by DTF-1. Off-line programming accuracy might be impossible to measure, and in that case it will have to be replaced by forward kinematics accuracy.

  11. Teaching cross-cultural communication skills online: a multi-method evaluation.

    PubMed

    Lee, Amy L; Mader, Emily M; Morley, Christopher P

    2015-04-01

    Cultural competency education is an important and required part of undergraduate medical education. The objective of this study was to evaluate whether an online cross-cultural communication module could increase student use of cross-cultural communication questions that assess the patient's definition of the problem, the way the problem affects their life, their concerns about the problem, and what the treatment should be (PACT). We used multi-method assessment of students assigned to family medicine clerkship blocks that were randomized to receive online cultural competency and PACT training added to their standard curriculum or to a control group receiving the standard curriculum only. Outcomes included comparison, via analysis of variance, of the number of PACT questions used during an observed Standardized Patient Exercise, end-of-year OSCE scores, and qualitative analysis of student narratives. Of the 119 students, those who completed the online module (n=60) demonstrated increased use of cross-cultural communication PACT questions compared with the control group (n=59) and generally had positive themes emerge from their reflective writing. The module had the biggest impact on students who later went on to match in high-communication specialties. Online teaching of cross-cultural communication skills can be effective at changing medical student behavior.

  12. Skin problems in individuals with lower-limb loss: literature review and proposed classification system.

    PubMed

    Bui, Kelly M; Raugi, Gregory J; Nguyen, Viet Q; Reiber, Gayle E

    2009-01-01

    Problems with skin integrity can disrupt daily prosthesis use and lead to decreased mobility and function in individuals with lower-limb loss. This study reviewed the literature to examine how skin problems are defined and diagnosed and to identify the prevalence and types of skin problems in individuals with lower-limb loss. We searched the literature for terms related to amputation and skin problems. We identified 777 articles. Of the articles, 90 met criteria for review of research methodology. Four clinical studies met our selection criteria. The prevalence rate of skin problems was 15% to 41%. The most commonly reported skin problems were wounds, abscesses, and blisters. Given the lack of standardized definitions of skin problems on residual limbs, we conclude this article with a system for classification.

  13. Problem Internet Overuse Behaviors in College Students: Readiness-to-Change and Receptivity to Treatment.

    PubMed

    O'Brien, Jennifer E; Li, Wen; Snyder, Susan M; Howard, Matthew O

    2016-01-01

    This mixed methods study explores college students' readiness-to-change and receptivity to treatment for problem Internet overuse behaviors. Focus groups were conducted with 27 college students who self-identified as Internet over-users, and had experienced biopsychosocial problems related to Internet overuse. Participants completed standardized questionnaires assessing their Internet use and sociodemographic forms. Focus groups explored readiness to change problem Internet overuse behaviors and receptivity to treatment. Similar to college students with other addictive behaviors, students with problem Internet overuse fall along a continuum vis-à-vis readiness-to-change their behaviors. Over half of the participants were receptive to treatment for their problem Internet overuse behaviors.

  14. Use of human engineering standards in design

    NASA Technical Reports Server (NTRS)

    Rogers, J. G.; Armstrong, R.

    1977-01-01

    Results are presented for a research study intended to assess the impact of present human engineering standards on product design. The approach consisted of three basic steps: a comparison of two display panels to determine if, in fact, products designed to the same standards are truly standardized; a review of two existing standards to determine how well their information can be used to solve design problems; and a survey of human factors specialists to assess their opinions about standards. It is shown that standards have less than the desired influence on product design. This is evidenced by a lack of standardization between hardware designed under common standards, by deficiencies within the standards that detract from their usefulness and encourage users to ignore them, and by the respondents of the survey who consider standards less valuable than other reference sources for design implementation. Recommendations aimed at enhancing the use of standards are set forth.

  15. Axions, Inflation and String Theory

    NASA Astrophysics Data System (ADS)

    Mack, Katherine J.; Steinhardt, P. J.

    2009-01-01

    The QCD axion is the leading contender to rid the standard model of the strong-CP problem. If the Peccei-Quinn symmetry breaking occurs before inflation, which is likely in string theory models, axions manifest themselves cosmologically as a form of cold dark matter with a density determined by the axion's initial conditions and by the energy scale of inflation. Constraints on the dark matter density and on the amplitude of CMB isocurvature perturbations currently demand an exponential degree of fine-tuning of both axion and inflationary parameters beyond what is required for particle physics. String theory models generally produce large numbers of axion-like fields; the prospect that any of these fields exist at scales close to that of the QCD axion makes the problem drastically worse. I will discuss the challenge of accommodating string-theoretic axions in standard inflationary cosmology and show that the fine-tuning problems cannot be fully addressed by anthropic principle arguments.

  16. Too easily lead? Health effects of gasoline additives.

    PubMed Central

    Menkes, D B; Fawcett, J P

    1997-01-01

    Octane-enhancing constituents of gasoline pose a number of health hazards. This paper considers the relative risks of metallic (lead, manganese), aromatic (e.g., benzene), and oxygenated additives in both industrialized and developing countries. Technological advances, particularly in industrialized countries, have allowed the progressive removal of lead from gasoline and the increased control of exhaust emissions. The developing world, by contrast, has relatively lax environmental standards and faces serious public health problems from vehicle exhaust and the rapid increase in automobile use. Financial obstacles to the modernization of refineries and vehicle fleets compound this problem, and the developing world continues to import large quantities of lead additives and other hazardous materials. Progress in decreasing environmental health problems depends both on the adoption of international public health standards and on efforts to decrease dependence on the private automobile for urban transport. PMID:9171982

  17. An intervention for parents with AIDS and their adolescent children.

    PubMed

    Rotheram-Borus, M J; Lee, M B; Gwadz, M; Draimin, B

    2001-08-01

    This study evaluated an intervention designed to improve behavioral and mental health outcomes among adolescents and their parents with AIDS. Parents with AIDS (n = 307) and their adolescent children (n = 412) were randomly assigned to an intensive intervention or a standard care control condition. Ninety-five percent of subjects were reassessed at least once annually over 2 years. Adolescents in the intensive intervention condition reported significantly lower levels of emotional distress, of multiple problem behaviors, of conduct problems, and of family-related stressors and higher levels of self-esteem than adolescents in the standard care condition. Parents with AIDS in the intervention condition also reported significantly lower levels of emotional distress and multiple problem behaviors. Coping style, levels of disclosure regarding serostatus, and formation of legal custody plans were similar across intervention conditions. Interventions can reduce the long-term impact of parents' HIV status on themselves and their children.

  18. Intrinsic optimization using stochastic nanomagnets

    PubMed Central

    Sutton, Brian; Camsari, Kerem Yunus; Behin-Aein, Behtash; Datta, Supriyo

    2017-01-01

    This paper draws attention to a hardware system which can be engineered so that its intrinsic physics is described by the generalized Ising model and can encode the solution to many important NP-hard problems as its ground state. The basic constituents are stochastic nanomagnets which switch randomly between the ±1 Ising states and can be monitored continuously with standard electronics. Their mutual interactions can be short or long range, and their strengths can be reconfigured as needed to solve specific problems and to anneal the system at room temperature. The natural laws of statistical mechanics guide the network of stochastic nanomagnets at GHz speeds through the collective states with an emphasis on the low energy states that represent optimal solutions. As proof-of-concept, we present simulation results for standard NP-complete examples including a 16-city traveling salesman problem using experimentally benchmarked models for spin-transfer torque driven stochastic nanomagnets. PMID:28295053
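
    The annealing idea described here can be mimicked in software: a toy Metropolis annealer over ±1 Ising spins whose ground state encodes a small MAX-CUT instance. The graph, couplings, and cooling schedule below are invented for illustration and are not taken from the paper.

```python
import math
import random

# Toy Ising annealer. Spins s_i in {-1, +1}; energy E = -sum J_ij s_i s_j.
# Antiferromagnetic couplings (J_ij = -1) on a graph make the ground
# state a MAX-CUT solution: anti-aligned neighbors are "cut" edges.
def anneal(n, edges, steps=20000, t0=2.0, t1=0.01, seed=0):
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    nbrs = {i: [] for i in range(n)}
    for i, j in edges:
        nbrs[i].append((j, -1.0))  # J_ij = -1 (antiferromagnetic)
        nbrs[j].append((i, -1.0))
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)  # geometric cooling
        i = rng.randrange(n)
        # Energy change if spin i flips: dE = 2 s_i * sum_j J_ij s_j
        dE = 2 * s[i] * sum(w * s[j] for j, w in nbrs[i])
        if dE <= 0 or rng.random() < math.exp(-dE / t):
            s[i] = -s[i]  # Metropolis accept
    # Greedy cleanup: take strictly improving flips until none remain,
    # so the returned assignment is at least a local optimum.
    improved = True
    while improved:
        improved = False
        for i in range(n):
            if 2 * s[i] * sum(w * s[j] for j, w in nbrs[i]) < 0:
                s[i] = -s[i]
                improved = True
    cut = sum(1 for i, j in edges if s[i] != s[j])
    return s, cut
```

    With antiferromagnetic couplings, minimizing the Ising energy maximizes the number of edges joining the two spin groups, which is exactly MAX-CUT; on a 5-cycle (maximum cut 4) the annealer settles into one of the optimal assignments.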

  19. Intrinsic optimization using stochastic nanomagnets

    NASA Astrophysics Data System (ADS)

    Sutton, Brian; Camsari, Kerem Yunus; Behin-Aein, Behtash; Datta, Supriyo

    2017-03-01

    This paper draws attention to a hardware system which can be engineered so that its intrinsic physics is described by the generalized Ising model and can encode the solution to many important NP-hard problems as its ground state. The basic constituents are stochastic nanomagnets which switch randomly between the ±1 Ising states and can be monitored continuously with standard electronics. Their mutual interactions can be short or long range, and their strengths can be reconfigured as needed to solve specific problems and to anneal the system at room temperature. The natural laws of statistical mechanics guide the network of stochastic nanomagnets at GHz speeds through the collective states with an emphasis on the low energy states that represent optimal solutions. As proof-of-concept, we present simulation results for standard NP-complete examples including a 16-city traveling salesman problem using experimentally benchmarked models for spin-transfer torque driven stochastic nanomagnets.

  20. A Speculative Study on Negative-Dimensional Potential and Wave Problems by Implicit Calculus Modeling Approach

    NASA Astrophysics Data System (ADS)

    Chen, Wen; Wang, Fajie

    Based on the implicit calculus equation modeling approach, this paper proposes a speculative concept of potential and wave operators on negative dimensionality. Unlike standard partial differential equation (PDE) modeling, the implicit calculus modeling approach does not require an explicit expression of the governing PDE. Instead, the fundamental solution of the physical problem is used to implicitly define the differential operator and to carry out simulation in conjunction with the appropriate boundary conditions. In this study, we conjecture an extension of the fundamental solutions of the standard Laplace and Helmholtz equations to negative dimensionality. Then, using the singular boundary method, a recent boundary discretization technique, we investigate potential and wave problems using the fundamental solution on negative dimensionality. Numerical experiments reveal that the physical behaviors on negative dimensionality may differ from those on positive dimensionality. This speculative study might open an unexplored territory for research.
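
    For context, the standard fundamental solution of the d-dimensional Laplace equation (d ≠ 2) is Φ_d(r) = Γ(d/2) r^(2−d) / (2 π^(d/2) (d − 2)); since the Gamma function extends to negative non-integer arguments, this formula continues formally to negative d. The sketch below only illustrates that formal continuation; it is not the paper's singular-boundary-method computation.

```python
import math

# Fundamental solution of the d-dimensional Laplace equation (d != 2):
#   Phi_d(r) = Gamma(d/2) * r**(2 - d) / (2 * pi**(d/2) * (d - 2))
# math.gamma accepts negative non-integer arguments, so the same formula
# can be evaluated formally at negative, non-even "dimensions".
def laplace_fundamental(d, r):
    if d == 2 or (d <= 0 and d % 2 == 0):
        raise ValueError("formula breaks down at d = 2 and nonpositive even d")
    return math.gamma(d / 2) * r ** (2 - d) / (2 * math.pi ** (d / 2) * (d - 2))
```

    Sanity checks against known cases: d = 3 gives 1/(4πr), and d = 1 gives −r/2, while formal evaluation at, say, d = −1 remains finite.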

  1. Why Massachusetts Should Abandon the PARCC Tests and the 2011 Coleman et al English Language Arts Standards on Which the MCAS Tests Are Based. Testimony

    ERIC Educational Resources Information Center

    Stotsky, Sandra

    2015-01-01

    In this testimony, the author first describes her qualifications, as well as the lack of relevant qualifications in Common Core's standards writers and in most of the members of Common Core's Validation Committee, on which she served in 2009-2010. The author then details some of the many problems in the 2011 Massachusetts ELA standards, written by…

  2. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    SCALE--a modular code system for performing Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  3. Geometric representation methods for multi-type self-defining remote sensing data sets

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1980-01-01

    Efficient and convenient representation of remote sensing data is highly important for its effective utilization. The task of merging different data types is currently handled by treating each case as an individual problem. This paper describes work carried out to standardize the multidata merging process. The basic concept of the new approach is the self-defining data set (SDDS). The creation of a standard is proposed such that data of interest in a large number of earth resources remote sensing applications would be in a format that allows convenient and automatic merging. Attention is given to details regarding the multidata merging problem, a geometric description of multitype data sets, image reconstruction from track-type data, a data set generation system, and an example multitype data set.

  4. Lepton number violation in theories with a large number of standard model copies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich

    2011-03-01

    We examine lepton number violation (LNV) in theories with a saturated black hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, violation of lepton number can be a potential phenomenological problem of this N-copy extension of the standard model, since at the low quantum gravity scale black holes may induce TeV-scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U(1)_{B-L}. Then, due to the existence of a specific compensation mechanism between contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.

  5. Interlaboratory studies and initiatives developing standards for proteomics

    PubMed Central

    Ivanov, Alexander R.; Colangelo, Christopher M.; Dufresne, Craig P.; Friedman, David B.; Lilley, Kathryn S.; Mechtler, Karl; Phinney, Brett S.; Rose, Kristie L.; Rudnick, Paul A.; Searle, Brian C.; Shaffer, Scott A.; Weintraub, Susan T.

    2013-01-01

    Proteomics is a rapidly transforming interdisciplinary field of research that embraces a diverse set of analytical approaches to tackle problems in fundamental and applied biology. This viewpoint article highlights the benefits of interlaboratory studies and standardization initiatives to enable investigators to address many of the challenges found in proteomics research. Among these initiatives, we discuss our efforts on a comprehensive performance standard for characterizing PTMs by MS that was recently developed by the Association of Biomolecular Resource Facilities (ABRF) Proteomics Standards Research Group (sPRG). PMID:23319436

  6. "The NASA Sci Files": The Case of the Biological Biosphere. [Videotape].

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Hampton, VA. Langley Research Center.

    The NASA Science Files is a series of instructional programs consisting of broadcast, print, and online elements. Emphasizing standards-based instruction, problem-based learning, and science as inquiry, the series seeks to motivate students in grades 3-5 to become critical thinkers and active problem solvers. Each program supports the national…

  7. Mathematics Achievement of Children in China and the United States.

    ERIC Educational Resources Information Center

    Stevenson, Harold W.; And Others

    1990-01-01

    Performance of U.S. first and fifth graders was consistently inferior to that of Chinese children on both problems requiring computation and problems requiring application of knowledge about mathematics. The poor performance of U.S. students appeared to be attributable to low motivation of students, low standards of parents, and low interest of…

  8. Postural Determinants in the Blind. Final Report.

    ERIC Educational Resources Information Center

    Siegel, Irwin M.; Murphy, Thomas J.

    The problem of malposture in the blind and its effect on orientation and travel skills was explored. A group of 45 students was enrolled in a standard 3-month mobility training program. Each student suffered a postural problem, some compounded by severe orthopedic and/or neurological deficit. All subjects were given complete orthopedic and…

  9. Students' Use of Technological Tools for Verification Purposes in Geometry Problem Solving

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Dagdilelis, Vassilios

    2008-01-01

    Despite its importance in mathematical problem solving, verification receives rather little attention by the students in classrooms, especially at the primary school level. Under the hypotheses that (a) non-standard tasks create a feeling of uncertainty that stimulates the students to proceed to verification processes and (b) computational…

  10. 75 FR 75761 - Water Quality Standards for the State of Florida's Lakes and Flowing Waters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-06

    ... widespread, persistent, and growing problem. Nitrogen/phosphorus pollution in fresh water systems can... Florida's regulated drinking water systems and a 10 mg/L criterion for nitrate in Class I waters. FDEP..., kidney, and central nervous system problems (USEPA 2009, National Primary Drinking Water...)

  11. Moore and Less!

    ERIC Educational Resources Information Center

    Asghari, Amir

    2012-01-01

    This article is the story of a very non-standard, absolutely student-centered multivariable calculus course. The course advocates the so-called problem method in which the problems used are a bridge between what the learners know and what they are about to know. The main feature of the course is a unique conceptual story that runs through the…

  12. Silvicultural Use of Wastewater Sludge

    Treesearch

    J.B. Hart; P.V. Nguyen; D.H. Urie; Dale G. Brockway

    1988-01-01

    Generation of wastewater sludge in the United States has become a problem of increasing proportions, with annual production at 4 million tons in 1970 (Walsh 1976) and 7 million tons currently (Maness 1987). While population and industrial growth have contributed to this problem, legislation requiring higher standards of treatment for wastewater processed in the 15,378...

  13. Multidimensional Perfectionism and Internalizing Problems: Do Teacher and Classmate Support Matter?

    ERIC Educational Resources Information Center

    Fredrick, Stephanie Secord; Demaray, Michelle Kilpatrick; Jenkins, Lyndsay N.

    2017-01-01

    Adolescent stressors coupled with environmental demands, such as pressures to achieve, might lead to negative outcomes for some students. Students who worry about their ability to meet high standards might be more at risk of internalizing problems. The current study investigated the relations among perfectionism, social support, and internalizing…

  14. Creating School and Community Partnerships for Substance Abuse Prevention Programs.

    ERIC Educational Resources Information Center

    Adelman, Howard S.; Taylor, Linda

    2003-01-01

    The article reviews the scope and scale of the problem, explores a transactional view of etiology, and summarizes the prevailing approaches to prevention, exemplary and promising approaches, and standards for research and practice. The authors stress the importance of addressing the complexity of the problem through creation of comprehensive,…

  15. Poverty on the Land - In a Land of Plenty.

    ERIC Educational Resources Information Center

    Myers, Robin, Comp.

    The problems of agricultural laborers were discussed in this report on the public hearings held by the National Advisory Committee on Farm Labor. Problems were identified in the areas of wages of farm workers, labor standards and labor shortages, welfare, marginal farmers, Federal aid, public assistance in rural areas, family farmers, and the…

  16. What's on Your Radar Screen? Distance-Rate-Time Problems from NASA

    ERIC Educational Resources Information Center

    Condon, Gregory W.; Landesman, Miriam F.; Calasanz-Kaiser, Agnes

    2006-01-01

    This article features NASA's FlyBy Math, a series of six standards-based distance-rate-time investigations in air traffic control. Sixth-grade students--acting as pilots, air traffic controllers, and NASA scientists--conduct an experiment and then use multiple mathematical representations to analyze and solve a problem involving two planes flying…
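
    All six investigations rest on the relation d = r · t. A minimal sketch of one common problem type, a trailing plane overtaking a leading plane on the same route, follows; the speeds and distances are invented for illustration and are not taken from FlyBy Math.

```python
# Distance-rate-time: d = r * t.
# Time for a faster trailing plane to draw even with a leading plane
# that starts lead_miles ahead. All numbers are invented for illustration.
def catch_up_time(lead_miles, lead_speed, chase_speed):
    if chase_speed <= lead_speed:
        raise ValueError("the trailing plane never catches up")
    # The gap closes at the difference of the two speeds.
    return lead_miles / (chase_speed - lead_speed)
```

    For example, a plane 60 miles behind flying 460 mph catches a 400 mph leader after 60 / (460 − 400) = 1 hour.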

  17. The Public School Infrastructure Problem: Deteriorating Buildings and Deferred Maintenance

    ERIC Educational Resources Information Center

    Hunter, Richard C.

    2009-01-01

    The deterioration of public school buildings is more prevalent in large cities that, because of funding shortfalls, have deferred maintenance and require huge sums to bring their buildings up to acceptable standards. Cities such as New York will require approximately $680 million to address the problem of deferred maintenance for needed painting,…

  18. Preserving Pelicans with Models That Make Sense

    ERIC Educational Resources Information Center

    Moore, Tamara J.; Doerr, Helen M.; Glancy, Aran W.; Ntow, Forster D.

    2015-01-01

    Getting students to think deeply about mathematical concepts is not an easy job, which is why we often use problem-solving tasks to engage students in higher-level mathematical thinking. Mathematical modeling, one of the mathematical practices found in the Common Core State Standards for Mathematics (CCSSM), is a type of problem solving that can…

  19. Judging Plagiarism: A Problem of Morality and Convention

    ERIC Educational Resources Information Center

    East, Julianne

    2010-01-01

    This paper considers the problem of plagiarism as an issue of morality. Outrage about student plagiarism in universities positions it as dishonesty and a transgression of standards. Despite this, there has been little work analysing the implications of positioning plagiarism as a moral matter in the making of judgments about plagiarism and…

  20. Science Modelling in Pre-Calculus: How to Make Mathematics Problems Contextually Meaningful

    ERIC Educational Resources Information Center

    Sokolowski, Andrzej; Yalvac, Bugrahan; Loving, Cathleen

    2011-01-01

    "Use of mathematical representations to model and interpret physical phenomena and solve problems is one of the major teaching objectives in high school math curriculum" [National Council of Teachers of Mathematics (NCTM), "Principles and Standards for School Mathematics", NCTM, Reston, VA, 2000]. Commonly used pre-calculus textbooks provide a…

  1. Interactive Problem-Solving Geography: An Introduction in Chinese Classrooms to Locational Analysis

    ERIC Educational Resources Information Center

    Wai, Nu Nu; Giles, John H.

    2006-01-01

    Reform in geography education, as reflected in "Geography for Life: National Geography Standards" (1994) for the U.S.A., favors a constructivist approach to learning. This study examines the acceptance of this approach among students in two upper secondary schools in China. A lesson was developed to illustrate interactive problem solving…

  2. THE USES AND ABUSES OF VISUAL TRAINING FOR CHILDREN WITH PERCEPTUAL-MOTOR LEARNING PROBLEMS.

    ERIC Educational Resources Information Center

    CARLSON, PAUL V.; GREENSPOON, MORTON K.

    THE ROLE OF THE OPTOMETRIST IN DIAGNOSING AND CORRECTING PERCEPTUAL-MOTOR LEARNING PROBLEMS IS DISCUSSED. ONE GROUP OF OPTOMETRISTS ADHERES TO STANDARD TECHNIQUES, INCLUDING THE PRESCRIPTION OF CORRECTIVE LENSES AND THE USE OF ORTHOPTIC TECHNIQUES FOR THE SAKE OF CLEAR, COMFORTABLE, AND EFFECTIVE VISUAL PERFORMANCE. OTHERS EMPLOY DIVERSE…

  3. Van: An Open Letter

    ERIC Educational Resources Information Center

    Tieman, John Samuel

    2011-01-01

    This essay is an open letter from a classroom teacher to a concerned citizen. The letter lists a variety of problems caused largely by standardization and the more corrosive effects of positivism. Many of these problems are unknown to those outside the immediate school setting. While the letter focuses on a specific setting, an inner city school…

  4. Model of Distributed Learning Objects Repository for a Heterogenic Internet Environment

    ERIC Educational Resources Information Center

    Kaczmarek, Jerzy; Landowska, Agnieszka

    2006-01-01

    In this article, an extension of the existing structure of learning objects is described. The solution addresses the problem of the access and discovery of educational resources in the distributed Internet environment. An overview of e-learning standards, reference models, and problems with educational resources delivery is presented. The paper…

  5. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  6. A new shock-capturing numerical scheme for ideal hydrodynamics

    NASA Astrophysics Data System (ADS)

    Fecková, Z.; Tomášik, B.

    2015-05-01

    We present a new algorithm for solving ideal relativistic hydrodynamics based on the Godunov method with an exact solution of the Riemann problem for an arbitrary equation of state. Standard numerical tests are performed, such as sound wave propagation and the shock tube problem. Low numerical viscosity and high precision are attained with proper discretization.
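
    The Godunov idea, computing interface fluxes from exact Riemann solutions, can be illustrated on the simplest hyperbolic equation, linear advection u_t + a u_x = 0, where the exact Riemann solution at each cell face is just the upwind state. This is a toy sketch, far simpler than the relativistic solver of the paper.

```python
# First-order Godunov scheme for linear advection u_t + a*u_x = 0 on a
# periodic grid. For this equation the exact Riemann solution at each
# cell face is the upwind state, so the Godunov flux is a * u_upwind.
def godunov_advection(u, a, dx, dt, steps):
    n = len(u)
    assert abs(a) * dt / dx <= 1.0, "CFL condition violated"
    u = list(u)
    for _ in range(steps):
        # flux[i] is the flux through the face between cells i and i+1
        flux = [a * (u[i] if a > 0 else u[(i + 1) % n]) for i in range(n)]
        # Conservative update; flux[i-1] is the face between cells i-1 and i.
        u = [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]
    return u
```

    At CFL number a·dt/dx = 1 the update advects a step profile exactly; smaller CFL values smear discontinuities, which is the numerical viscosity that higher-order Godunov-type schemes are designed to reduce.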

  7. Teaching Science Problem Solving: An Overview of Experimental Work.

    ERIC Educational Resources Information Center

    Taconis, R.; Ferguson-Hessler, M. G. M.; Broekkamp, H.

    2001-01-01

    Performs analysis on a number of articles published between 1985 and 1995 describing experimental research into the effectiveness of a wide variety of teaching strategies for science problem solving. Identifies 22 articles describing 40 experiments that met standards for meta-analysis. Indicates that few of the independent variables were found to…

  8. Standardized Observational Assessment of Attention Deficit Hyperactivity Disorder Combined and Predominantly Inattentive Subtypes. II. Classroom Observations.

    PubMed

    McConaughy, Stephanie H; Ivanova, Masha Y; Antshel, Kevin; Eiraldi, Ricardo B; Dumenci, Levent

    2009-07-01

    Trained classroom observers used the Direct Observation Form (DOF; McConaughy & Achenbach, 2009) to rate observations of 163 6- to 11-year-old children in their school classrooms. Participants were assigned to four groups based on a parent diagnostic interview and parent and teacher rating scales: Attention Deficit Hyperactivity Disorder (ADHD)-Combined type (n = 64); ADHD-Inattentive type (n = 22); clinically referred without ADHD (n = 51); and nonreferred control children (n = 26). The ADHD-Combined group scored significantly higher than the referred without ADHD group and controls on the DOF Intrusive and Oppositional syndromes, Attention Deficit Hyperactivity Problems scale, Hyperactivity-Impulsivity subscale, and Total Problems; and significantly lower on the DOF On-Task score. The ADHD-Inattentive group scored significantly higher than controls on the DOF Sluggish Cognitive Tempo and Attention Problems syndromes, Inattention subscale, and Total Problems; and significantly lower on the DOF On-Task score. Implications are discussed regarding the discriminative validity of standardized classroom observations for identifying children with ADHD and differentiating between the two ADHD subtypes.

  9. Standardized Observational Assessment of Attention Deficit Hyperactivity Disorder Combined and Predominantly Inattentive Subtypes. II. Classroom Observations

    PubMed Central

    McConaughy, Stephanie H.; Ivanova, Masha Y.; Antshel, Kevin; Eiraldi, Ricardo B.; Dumenci, Levent

    2010-01-01

    Trained classroom observers used the Direct Observation Form (DOF; McConaughy & Achenbach, 2009) to rate observations of 163 6- to 11-year-old children in their school classrooms. Participants were assigned to four groups based on a parent diagnostic interview and parent and teacher rating scales: Attention Deficit Hyperactivity Disorder (ADHD)—Combined type (n = 64); ADHD—Inattentive type (n = 22); clinically referred without ADHD (n = 51); and nonreferred control children (n = 26). The ADHD—Combined group scored significantly higher than the referred without ADHD group and controls on the DOF Intrusive and Oppositional syndromes, Attention Deficit Hyperactivity Problems scale, Hyperactivity-Impulsivity subscale, and Total Problems; and significantly lower on the DOF On-Task score. The ADHD—Inattentive group scored significantly higher than controls on the DOF Sluggish Cognitive Tempo and Attention Problems syndromes, Inattention subscale, and Total Problems; and significantly lower on the DOF On-Task score. Implications are discussed regarding the discriminative validity of standardized classroom observations for identifying children with ADHD and differentiating between the two ADHD subtypes. PMID:20802813

  10. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    NASA Astrophysics Data System (ADS)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H∞ filters with low sensitivity to sampling time jitter for a class of linear discrete-time systems via the delta operator approach. A delta-domain model is used to avoid the inherent numerical ill-conditioning that results from using the standard shift-domain model at high sampling rates. Based on the projection lemma, in combination with the descriptor system approach often used to solve delay-related problems, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. The problem of designing a low-sensitivity filter can then be reduced to a convex optimisation problem. An important consideration in the design of such filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
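
    The delta operator's numerical advantage is easy to demonstrate on a scalar example: for a continuous system ẋ = a·x sampled with period T, the shift-form coefficient 1 + T·a crowds toward 1 as T shrinks, while the delta-form coefficient (q − 1)/T stays at the well-scaled value a. This is a toy illustration with invented numbers, not the filter design of the article.

```python
# Shift vs delta parameterization of a sampled scalar system xdot = a*x.
# Euler discretization in shift (q) form: x[k+1] = (1 + T*a) * x[k].
# The delta operator stores (q - 1)/T, which stays near the well-scaled
# continuous value a instead of crowding toward 1 as T -> 0.
def shift_and_delta(a, T):
    a_shift = 1.0 + T * a          # shift-domain coefficient
    a_delta = (a_shift - 1.0) / T  # delta-domain coefficient
    return a_shift, a_delta
```

    At T = 1e-7 the shift coefficient is within 3e-7 of 1, so the system's dynamics live entirely in the last digits of the coefficient, while the delta coefficient remains close to a.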

  11. Beyond the Standard Model: The pragmatic approach to the gauge hierarchy problem

    NASA Astrophysics Data System (ADS)

    Mahbubani, Rakhi

    The current favorite solution to the gauge hierarchy problem, the Minimal Supersymmetric Standard Model (MSSM), is looking increasingly fine-tuned, as recent results from LEP-II have pushed it to regions of its parameter space where a light Higgs seems unnatural. Given this fact, it seems sensible to explore other approaches to this problem; we study three alternatives here. The first is a Little Higgs theory, in which the Higgs particle is realized as the pseudo-Goldstone boson of an approximate global chiral symmetry and so is naturally light. We analyze precision electroweak observables in the Minimal Moose model, one example of such a theory, and look for regions in its parameter space that are consistent with current limits on these. It is also possible to find a solution within a supersymmetric framework by adding a λS H_u H_d term to the MSSM superpotential and UV-completing with new strong dynamics under which S is a composite before λ becomes non-perturbative. This allows us to increase the MSSM tree-level Higgs mass bound to a value that alleviates the supersymmetric fine-tuning problem with elementary Higgs fields, maintaining gauge coupling unification in a natural way. Finally, we try an entirely different tack, in which we do not attempt to solve the hierarchy problem but rather assume that the tuning of the Higgs can be explained in some unnatural way, from environmental considerations for instance. With this philosophy in mind, we study in detail the low-energy phenomenology of the minimal extension to the Standard Model with a dark matter candidate and gauge coupling unification, consisting of additional fermions with the quantum numbers of SUSY higgsinos, and a singlet.

  12. Benchmarking and Threshold Standards in Higher Education. Staff and Educational Development Series.

    ERIC Educational Resources Information Center

    Smith, Helen, Ed.; Armstrong, Michael, Ed.; Brown, Sally, Ed.

    This book explores the issues involved in developing standards in higher education, examining the practical issues involved in benchmarking and offering a critical analysis of the problems associated with this developmental tool. The book focuses primarily on experience in the United Kingdom (UK), but looks also at international activity in this…

  13. Where Is Lake Wobegone, Anyway? The Controversy Surrounding Social Promotion.

    ERIC Educational Resources Information Center

    Rothstein, Richard

    1998-01-01

    The dilemma of what to do with children who do not progress "normally" is not new, and did not arise because educators grew too timid to uphold academic standards. The problem is an unavoidable consequence of compulsory education. Advantages of social promotion still outweigh difficulties. Deterioration of school standards cannot be blamed on…

  14. Kuhn's Paradigm and Example-Based Teaching of Newtonian Mechanics.

    ERIC Educational Resources Information Center

    Whitaker, M. A. B.

    1980-01-01

    Makes a recommendation for more direct teaching of the basic principles of mechanics. Contends that students currently learn mechanics in terms of standard examples. This causes difficulty when the student is confronted with a problem that can be solved from basic principles, but which does not fit a standard category. (GS)

  15. Successfully Transitioning to Linear Equations

    ERIC Educational Resources Information Center

    Colton, Connie; Smith, Wendy M.

    2014-01-01

    The Common Core State Standards for Mathematics (CCSSI 2010) asks students in as early as fourth grade to solve word problems using equations with variables. Equations studied at this level generate a single solution, such as the equation x + 10 = 25. For students in fifth grade, the Common Core standard for algebraic thinking expects them to…

  16. Return of the Tug-of-War

    ERIC Educational Resources Information Center

    McNamara, Julie

    2017-01-01

    Long before the release of the Common Core State Standards (CCSSI 2010), the Mathematical Tug-of-War was engaging students in the type of reasoning and problem solving described by the Standards for Mathematical Practice (SMP). In this updated version of a Marilyn Burns task, students use algebraic reasoning to determine the outcome of a contest…

  17. "Standards Will Drop"--and Other Fears about the Equality Agenda in Higher Education

    ERIC Educational Resources Information Center

    Brink, Chris

    2009-01-01

    I discuss, on the basis of experience in Australia, South Africa, and the United Kingdom, some common fears and negative opinions about the equality agenda in higher education. These include: (1) "Standards will drop."; (2) "Our reputation will suffer."; (3) "It's not our problem."; (4) "It's social…

  18. Mobile Learning to Enrich Vocabulary in English

    ERIC Educational Resources Information Center

    Singaravelu, G.

    2009-01-01

    The study examines the impact of mobile learning in enriching vocabulary in English at standard VIII. Objectives of the study: 1. To find out the problems in enriching vocabulary in English at standard VIII. 2. To find out the impact of mobile learning in enriching vocabulary in English. Hypothesis: There is no significant difference in…

  19. 78 FR 55037 - Approval and Promulgation of Implementation Plans; Texas; Attainment Demonstration for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ... protective of human health, especially children and adults who are active outdoors, and individuals with a... trigger a variety of health problems including chest pain, coughing, throat irritation, and congestion. It...). Primary standards are set to protect human health while secondary standards are set to protect public...

  20. Higher Standards: We'd Love to But . . .

    ERIC Educational Resources Information Center

    Kosar, Kevin R.

    During the past 12 years, a sudden and unexpected consensus emerged among U.S. Congressmen and Presidents that students in U.S. public schools were learning less than they should. Moreover, conservatives and liberals agreed that the proper policy response to this public problem was to raise education standards. Recent years brought five…
