Sample records for standardized high current

  1. Gigahertz single-electron pumping in silicon with an accuracy better than 9.2 parts in 10^7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamahata, Gento, E-mail: yamahata.gento@lab.ntt.co.jp; Karasawa, Takeshi; Fujiwara, Akira

    2016-07-04

    High-speed and high-accuracy pumping of a single electron is crucial for realizing an accurate current source, which is a promising candidate for a quantum current standard. Here, using a high-accuracy measurement system traceable to primary standards, we evaluate the accuracy of a Si tunable-barrier single-electron pump driven by a single sinusoidal signal. The pump operates at frequencies up to 6.5 GHz, producing a current of more than 1 nA. At 1 GHz, the current plateau with a level of about 160 pA is found to be accurate to better than 0.92 ppm (parts per million), which is a record value for 1-GHz operation. At 2 GHz, a current plateau offset from 1ef (∼320 pA) by 20 ppm is observed. The current quantization accuracy is improved by applying a magnetic field of 14 T, and we observe a current level of 1ef with an accuracy of a few ppm. The presented gigahertz single-electron pumping with high accuracy is an important step towards a metrological current standard.
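
    The quantized current of such a pump follows I = n·e·f (n electrons transferred per cycle, elementary charge e, drive frequency f), which is what the plateau levels above reflect. A quick numeric check of those quoted values:

```python
# Quantized current of a single-electron pump: I = n * e * f
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def pump_current(freq_hz, n_per_cycle=1):
    """Current delivered when n electrons are transferred each drive cycle."""
    return n_per_cycle * E_CHARGE * freq_hz

print(pump_current(1e9))    # about 160 pA at 1 GHz, as quoted in the abstract
print(pump_current(2e9))    # about 320 pA at 2 GHz (the 1ef level)
print(pump_current(6.5e9))  # just over 1 nA at 6.5 GHz
```

    The 0.92 ppm figure in the abstract is the measured deviation of the 1 GHz plateau from this ideal 1ef value.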

  2. Current activities in standardization of high-temperature, low-cycle-fatigue testing techniques in the United States

    NASA Technical Reports Server (NTRS)

    Verrilli, Michael J.; Ellis, J. Rodney; Swindeman, Robert W.

    1990-01-01

    The American Society for Testing and Materials (ASTM) standard E606-80 is the most often used recommended testing practice for low-cycle-fatigue (LCF) testing in the United States. The standard was first adopted in 1977 for LCF testing at room temperature and was modified in 1980 to include high-temperature testing practices. Current activity within ASTM is aimed at extending the E606-80 recommended practices to LCF under thermomechanical conditions, LCF in high-pressure hydrogen, and LCF in metal-matrix composite materials. Interlaboratory testing programs conducted to generate a technical base for modifying E606-80 for the aforementioned LCF test types are discussed.

  3. Superintendent's Commentary: The High School Diploma--A Standards-Based Passport or Reward for Effort?

    ERIC Educational Resources Information Center

    Livovich, Michael

    2004-01-01

    There has been a continuing drama over the unanswered question of whether students with disabilities deserve a high school diploma. Currently, a student is awarded a high school diploma if he or she meets local standards, if he or she passes the state's competency test, or if his or her passage of the Indiana Standardized Test of Educational…

  4. National Standards for High School Psychology Curricula

    ERIC Educational Resources Information Center

    American Psychologist, 2013

    2013-01-01

    The "National Standards for High School Psychology Curricula" attempts to represent current knowledge in the field of psychology in developmentally appropriate ways. Psychology is a popular high school course, one that can introduce students to scientific ideas and engage students in the learning process. However, it is difficult for even the best…

  5. Relational Dynamics in Teacher Professional Development

    ERIC Educational Resources Information Center

    Finkelstein, Carla

    2013-01-01

    Teacher professional development (PD) is considered essential to improving student achievement toward high standards. I argue that while current notions of high quality PD foreground cognitive aspects of learning, they undertheorize the influence of relational dynamics in teacher learning interactions. That is, current conceptions of high quality…

  6. Trade Electricity. Motors & Controls--Level 3. Standardized Curriculum.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Occupational and Career Education.

    This curriculum guide consists of seven modules on motors and controls, one of the three divisions of the standardized trade electricity curriculum in high schools in New York City. The seven modules cover the following subjects: energy conservation wiring, direct current (DC) motor repair and rewinding, DC motor controls, alternating current (AC)…

  7. Digital High-Current Monitor

    NASA Technical Reports Server (NTRS)

    Cash, B.

    1985-01-01

    Simple technique developed for monitoring direct currents up to several hundred amperes and digitally displaying values directly in current units. Used to monitor current magnitudes beyond range of standard laboratory ammeters, which typically measure 10 to 20 amperes maximum. Technique applicable to any current-monitoring situation.
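
    One common way to monitor currents beyond an ammeter's range (offered here only as a generic illustration; the NASA report does not specify its circuit) is a precision shunt resistor: the current produces a small voltage drop that is easy to digitize, and I = V/R recovers the current. A minimal sketch with illustrative values:

```python
def shunt_current(v_measured_mv, shunt_resistance_mohm):
    """Current through a precision shunt: I = V / R.
    Millivolts divided by milliohms gives amperes directly."""
    return v_measured_mv / shunt_resistance_mohm

# illustrative example: a 0.5 mOhm shunt dropping 150 mV carries 300 A,
# far beyond the 10-20 A range of a standard laboratory ammeter
print(shunt_current(150.0, 0.5))  # 300.0
```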

  8. STATUS OF EPA/DOE MOU TECHNICAL WORKGROUP ACTIVITIES: HG WASTE TREATMENT

    EPA Science Inventory

    EPA's Land Disposal Restrictions program currently has technology-specific treatment standards for hazardous wastes containing greater than or equal to 260 ppm total mercury (Hg) (i.e., high Hg subcategory wastes). The treatment standards specify RMERC for high Hg subcategory wast...

  9. A review of the International Atomic Energy Agency (IAEA) international standards for tissue banks.

    PubMed

    Morales Pedraza, Jorge; Lobo Gajiwala, Astrid; Martinez Pardo, María Esther

    2012-03-01

    The IAEA International Standards for Tissue Banks published in 2003 were based on the standards then in use in the USA and the European Union, among others, and reflect the best practices associated with the operation of a tissue bank. They cover legal, ethical and regulatory controls as well as requirements and procedures from donor selection and tissue retrieval to processing and distribution of finished tissue for clinical use. The application of these standards allows tissue banks to operate with current good tissue practice, thereby providing grafts of high quality that satisfy the national and international demand for safe and biologically useful grafts. The objective of this article is to review the IAEA Standards and recommend new topics that could improve the current version.

  10. Data standards for clinical research data collection forms: current status and challenges.

    PubMed

    Richesson, Rachel L; Nadkarni, Prakash

    2011-05-01

    Case report forms (CRFs) are used for structured-data collection in clinical research studies. Existing CRF-related standards encompass structural features of forms and data items, content standards, and specifications for using terminologies. This paper reviews existing standards and discusses their current limitations. Because clinical research is highly protocol-specific, forms-development processes are more easily standardized than is CRF content. Tools that support retrieval and reuse of existing items will enable standards adoption in clinical research applications. Such tools will depend upon formal relationships between items and terminological standards. Future standards adoption will depend upon standardized approaches for bridging generic structural standards and domain-specific content standards. Clinical research informatics can help define tools requirements in terms of workflow support for research activities, reconcile the perspectives of varied clinical research stakeholders, and coordinate standards efforts toward interoperability across healthcare and research data collection.

  11. 1998 Conference on Precision Electromagnetic Measurements Digest. Proceedings.

    NASA Astrophysics Data System (ADS)

    Nelson, T. L.

    The following topics were dealt with: fundamental constants; caesium standards; AC-DC transfer; impedance measurement; length measurement; units; statistics; cryogenic resonators; time transfer; QED; resistance scaling and bridges; mass measurement; atomic fountains and clocks; single electron transport; Newtonian constant of gravitation; stabilised lasers and frequency measurements; cryogenic current comparators; optical frequency standards; high voltage devices and systems; international compatibility; magnetic measurement; precision power measurement; high resolution spectroscopy; DC transport standards; waveform acquisition and analysis; ion trap standards; optical metrology; quantised Hall effect; Josephson array comparisons; signal generation and measurement; Avogadro constant; microwave networks; wideband power standards; antennas, fields and EMC; quantum-based standards.

  12. Oral Reading Fluency and Maze Measures as Predictors of Performance on North Carolina End-of-Grade Assessment of Reading Comprehension

    ERIC Educational Resources Information Center

    Galloway, Tara Watkins

    2010-01-01

    Current legislation (IDEA, 2004; NCLB, 2001) mandates all students, including students with disabilities, demonstrate progress toward the same standards. However, students continue to struggle with attainment of statewide academic standards as measured by high-stakes assessment. The purpose of the current study was to examine the degree that…

  13. Blood Cholesterol Measurement in Clinical Laboratories in the United States. Current Status. A Report from the Laboratory Standardization Panel of the National Cholesterol Education Program.

    ERIC Educational Resources Information Center

    National Heart, Lung, and Blood Inst. (DHHS/NIH), Bethesda, MD.

    Precise and accurate cholesterol measurements are required to identify and treat individuals with high blood cholesterol levels. However, the current state of reliability of blood cholesterol measurements suggests that considerable inaccuracy in cholesterol testing exists. This report describes the Laboratory Standardization Panel findings on the…

  14. [Ophthalmologic reading charts : Part 2: Current logarithmically scaled reading charts].

    PubMed

    Radner, W

    2016-12-01

    To analyze currently available reading charts regarding print size, logarithmic print size progression, and the background of test-item standardization. For the present study, the following logarithmically scaled reading charts were investigated using a measuring microscope (iNexis VMA 2520; Nikon, Tokyo): Eschenbach, Zeiss, OCULUS, MNREAD (Minnesota Near Reading Test), Colenbrander, and RADNER. Calculations were made according to EN ISO 8596 and the International Research Council recommendations. Modern reading charts and cards exhibit a logarithmic progression of print sizes. The RADNER reading charts comprise four different cards with standardized test items (sentence optotypes), a well-defined stop criterion, accurate letter sizes, and a high print quality. Numbers and Landolt rings are also given in the booklet. The OCULUS cards have recently been reissued according to current standards and also exhibit a high print quality. In addition to letters, numbers, Landolt rings, and examples taken from a timetable and the telephone book, sheet music is also offered. The Colenbrander cards use short sentences of 44 characters, including spaces, and exhibit inaccuracy at smaller letter sizes, as do the MNREAD cards. The MNREAD cards use sentences of 60 characters, including spaces, and have a high print quality. Modern reading charts show that international standards can be achieved with test items similar to optotypes by using recent technology and developing new concepts of test-item standardization. Accurate print sizes, high print quality, and a logarithmic progression should become the minimum requirements for reading charts and reading cards in ophthalmology.
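
    A logarithmic print-size progression of the kind these charts use steps letter size geometrically: each step multiplies the size by 10^0.1 ≈ 1.2589, the 0.1 log-unit step familiar from logMAR acuity charts, so ten steps span exactly one decade. A minimal sketch (the base size below is an arbitrary illustrative value, not taken from any of the charts reviewed):

```python
# Logarithmic print-size progression: each step is a factor of 10**0.1 (~1.2589)
STEP = 10 ** 0.1

def print_sizes(base_size_mm, n_steps):
    """Geometric sequence of print sizes, largest first; base size is illustrative."""
    return [base_size_mm * STEP ** k for k in range(n_steps - 1, -1, -1)]

sizes = print_sizes(1.0, 5)
# consecutive sizes differ by a constant factor, and ten steps span one decade
```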

  15. National standards for high school psychology curricula.

    PubMed

    2013-01-01

    The National Standards for High School Psychology Curricula attempts to represent current knowledge in the field of psychology in developmentally appropriate ways. Psychology is a popular high school course, one that can introduce students to scientific ideas and engage students in the learning process. However, it is difficult for even the best of teachers to present all of psychology in a single course for students who begin with virtually no formal knowledge of psychology. The standards presented here constitute the first of two reports in this issue of the American Psychologist (January 2013) representing recent American Psychological Association (APA) policies that support high-quality instruction in the teaching of high school psychology. These standards provide curricular benchmarks for student learning in the high school course.

  16. Research Says…/High-Stakes Testing Narrows the Curriculum

    ERIC Educational Resources Information Center

    David, Jane L.

    2011-01-01

    The current rationale for standards-based reform goes like this: If standards are demanding and tests accurately measure achievement of those standards, then curriculum and instruction will become richer and more rigorous. By attaching serious consequences to schools that fail to increase test scores, U.S. policymakers believe that educators will…

  17. State Standards Rise in Reading, Fall in Math

    ERIC Educational Resources Information Center

    Peterson, Paul E.; Lastra-Anadon, Carlos Xabel

    2010-01-01

    Much ado has been made about setting high standards over the past year. Current conversations about creating a common national standard largely focus on the substantive curriculum to be taught at various grade levels. Even more important is each state's expectations for student performance with respect to the curriculum, as expressed through its…

  18. "Natural Philosophy" as a Foundation for Science Education in an Age of High-Stakes Accountability

    ERIC Educational Resources Information Center

    Buxton, Cory; Provenzo, Eugene F., Jr.

    2011-01-01

    Science curriculum and instruction in K-12 settings in the United States is currently dominated by an emphasis on the science standards movement of the 1990s and the resulting standards-based high-stakes assessment and accountability movement of the 2000s. We argue that this focus has moved the field away from important philosophical…

  19. Calibration of High Heat Flux Sensors at NIST

    PubMed Central

    Murthy, A. V.; Tsai, B. K.; Gibson, C. E.

    1997-01-01

    An ongoing program at the National Institute of Standards and Technology (NIST) is aimed at improving and standardizing heat-flux sensor calibration methods. The current calibration needs of U.S. science and industry exceed the current NIST capability of 40 kW/m² irradiance. To meet this goal, as well as the lower-level non-radiative heat-flux calibration needs of science and industry, three different types of calibration facilities are currently under development at NIST: convection, conduction, and radiation. This paper describes the research activities associated with the NIST Radiation Calibration Facility. Two different techniques, transfer and absolute, are presented. The transfer calibration technique employs a transfer standard calibrated with reference to a radiometric standard for calibrating the sensors using a graphite tube blackbody. Plans for an absolute calibration facility include the use of a spherical blackbody and a cooled aperture and sensor-housing assembly to calibrate the sensors in a low-convection environment. PMID:27805156

  20. Content Standards: High Stakes Anti-Differentiation

    ERIC Educational Resources Information Center

    Schroeder-Davis, Stephen

    2011-01-01

    Currently, American schooling, driven by No Child Left Behind (NCLB) and standardized tests, emphasizes development of intelligence. Because of this, teachers must heavily emphasize acquisition of foundational information (facts) in lectures, assessments, and of course, time-consuming test preparation, at the expense of intellect, that…

  1. The Devil's in the Details: Evidence from the GED on Large Effects of Small Differences in High Stakes Exams

    ERIC Educational Resources Information Center

    Tyler, John H.; Murnane, Richard J.; Willett, John B.

    2004-01-01

    As part of standards-based educational reform efforts, more than 40 states will soon require students to achieve passing scores on standardized exams in order to obtain a high school diploma. Currently, many states are struggling with the design of their examination systems, debating such questions as which subjects should be tested, what should…

  2. "What Turns You On!": An Exploration of Urban South African Xhosa and Zulu Youth Texts

    ERIC Educational Resources Information Center

    Finlayson, Rosalie; Slabbert, Sarah

    2003-01-01

    The status of the current standard African languages has been seriously undermined by factors such as the association of the standardisation process with colonial and neo-colonial structures, the lack of function of the standards and the rise of high status non-standard urban varieties. This paper describes the process leading to and some…

  3. A Comprehensive Analysis of High School Genetics Standards: Are States Keeping Pace with Modern Genetics?

    ERIC Educational Resources Information Center

    Dougherty, M. J.; Pleasants, C.; Solow, L.; Wong, A.; Zhang, H.

    2011-01-01

    Science education in the United States will increasingly be driven by testing and accountability requirements, such as those mandated by the No Child Left Behind Act, which rely heavily on learning outcomes, or "standards," that are currently developed on a state-by-state basis. Those standards, in turn, drive curriculum and instruction.…

  4. FBI compression standard for digitized fingerprint images

    NASA Astrophysics Data System (ADS)

    Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas

    1996-11-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
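
    The wavelet/scalar quantization idea can be illustrated in miniature: transform the data into subbands, quantize the coefficients with a uniform step, and invert. The one-level Haar transform and fixed step below are deliberate simplifications for illustration only, not the FBI standard's actual filter bank, subband structure, or adaptive bit allocation:

```python
import numpy as np

def haar_forward(x):
    """One-level 1-D Haar transform: averages (low band) and differences (high band)."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / 2, (x[0::2] - x[1::2]) / 2

def haar_inverse(low, high):
    """Exact inverse of haar_forward."""
    out = np.empty(low.size * 2)
    out[0::2] = low + high
    out[1::2] = low - high
    return out

def quantize(coeffs, step):
    """Uniform scalar quantization: round coefficients to integer multiples of step.
    The integer indices are what an entropy coder would then compress."""
    return np.round(coeffs / step)

signal = np.array([10.0, 12.0, 9.0, 7.0, 14.0, 15.0, 8.0, 8.0])
low, high = haar_forward(signal)
step = 0.5
rec = haar_inverse(quantize(low, step) * step, quantize(high, step) * step)
# without quantization the transform is lossless; with it, the per-sample
# reconstruction error is bounded by the quantization step
```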

  5. Estimating Adolescent Risk for Hearing Loss Based on Data From a Large School-Based Survey

    PubMed Central

    Verschuure, Hans; van der Ploeg, Catharina P. B.; Brug, Johannes; Raat, Hein

    2010-01-01

    Objectives. We estimated whether and to what extent a group of adolescents were at risk of developing permanent hearing loss as a result of voluntary exposure to high-volume music, and we assessed whether such exposure was associated with hearing-related symptoms. Methods. In 2007, 1512 adolescents (aged 12–19 years) in Dutch secondary schools completed questionnaires about their music-listening behavior and whether they experienced hearing-related symptoms after listening to high-volume music. We used their self-reported data in conjunction with published average sound levels of music players, discotheques, and pop concerts to estimate their noise exposure, and we compared that exposure to our own “loosened” (i.e., less strict) version of current European safety standards for occupational noise exposure. Results. About half of the adolescents exceeded safety standards for occupational noise exposure. About one third of the respondents exceeded safety standards solely as a result of listening to MP3 players. Hearing symptoms that occurred after using an MP3 player or going to a discotheque were associated with exposure to high-volume music. Conclusions. Adolescents often exceeded current occupational safety standards for noise exposure, highlighting the need for specific safety standards for leisure-time noise exposure. PMID:20395587
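
    Occupational noise limits of the kind the authors "loosened" follow an equal-energy rule: the allowed exposure time halves for every 3 dB above the criterion level. A sketch, assuming the common 85 dB(A) / 8-hour criterion with a 3 dB exchange rate (the exact criterion and the 94 dB(A) listening level below are illustrative assumptions, not figures from the study):

```python
def allowed_hours(level_dba, criterion_dba=85.0, exchange_db=3.0, base_hours=8.0):
    """Permitted daily exposure at a given A-weighted level (equal-energy rule)."""
    return base_hours / 2 ** ((level_dba - criterion_dba) / exchange_db)

def daily_dose(exposures, **kwargs):
    """Fraction of the daily limit consumed; > 1.0 means the standard is exceeded."""
    return sum(hours / allowed_hours(level, **kwargs) for level, hours in exposures)

# e.g. a loud MP3 player at ~94 dB(A): only 1 hour/day is allowed,
# so 2 hours of listening is already twice the daily limit
print(daily_dose([(94.0, 2.0)]))  # 2.0
```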

  6. The Best of Both Worlds

    ERIC Educational Resources Information Center

    Schneider, Jack; Feldman, Joe; French, Dan

    2016-01-01

    Relying on teachers' assessments for the information currently provided by standardized test scores would save instructional time, better capture the true abilities of diverse students, and reduce the problem of teaching to the test. A California high school is implementing standards-based reporting, ensuring that teacher-issued grades function as…

  7. An accurate online calibration system based on combined clamp-shape coil for high voltage electronic current transformers.

    PubMed

    Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi

    2013-07-01

    Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as an emerging technology, electronic transformers have a higher failure rate than traditional transformers, so the calibration period needs to be shortened. Traditional calibration methods require that the power of the transmission line be cut off, which results in complicated operation and losses from the outage. This paper proposes an online calibration system that can calibrate electronic current transformers without cutting power. In this work, the high-accuracy standard current transformer and the online operation method are the key techniques. Based on the clamp-shape iron-core coil and the clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the combined clamp-shape coil can verify its own accuracy, so the accuracy of the online calibration system can be guaranteed. Moreover, by employing the earth-potential working method and using two insulating rods to connect the combined clamp-shape coil to the high voltage bus, the operation becomes simple and safe. Tests at the China National Center for High Voltage Measurement and field experiments show that the proposed system achieves a high accuracy, up to class 0.05.
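
    Calibrating a current transformer against such a standard reduces to comparing the two readings referred to the same primary current; the figure of merit is the ratio error, which for a class 0.05 device must stay within ±0.05 %. A minimal sketch with illustrative readings (not data from the paper):

```python
def ratio_error_percent(i_reference, i_under_test):
    """Ratio error of the device under test relative to the standard, in percent."""
    return (i_under_test - i_reference) / i_reference * 100.0

def within_class(error_percent, accuracy_class=0.05):
    """True if the measured error satisfies the stated accuracy class limit."""
    return abs(error_percent) <= accuracy_class

# illustrative readings, both referred to primary amperes
err = ratio_error_percent(1000.0, 1000.4)
print(err, within_class(err))  # ~0.04 %, inside the class 0.05 limit
```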

  8. Very high speed integrated circuits - Into the second generation. V - The issues of standardization and technology insertion

    NASA Astrophysics Data System (ADS)

    Martin, J.

    1982-04-01

    It is shown that the fulfillment of very high speed integrated circuit (VHSIC) device development goals entails the restructuring of military electronics acquisition policy, standardization which produces the maximum number of systems and subsystems by means of the minimum number of flexible, broad-purpose, high-power semiconductors, and especially the standardization of bus structures incorporating a prioritization system. It is expected that the Design Specification Handbook currently under preparation by the VHSIC program office of the DOD will make the design of such systems a task whose complexity is comparable to that of present integrated circuit electronics.

  9. 78 FR 16051 - Vehicle/Track Interaction Safety Standards; High-Speed and High Cant Deficiency Operations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-13

    ...FRA is amending the Track Safety Standards and Passenger Equipment Safety Standards to promote the safe interaction of rail vehicles with the track over which they operate under a variety of conditions at speeds up to 220 m.p.h. The final rule revises standards for track geometry and safety limits for vehicle response to track conditions, enhances vehicle/track qualification procedures, and adds flexibility for permitting high cant deficiency train operations through curves at conventional speeds. The rule accounts for a range of vehicle types that are currently in operation, as well as vehicle types that may likely be used in future high-speed or high cant deficiency rail operations, or both. The rule is based on the results of simulation studies designed to identify track geometry irregularities associated with unsafe wheel/rail forces and accelerations, thorough reviews of vehicle qualification and revenue service test data, and consideration of international practices.

  10. Current National Approach to Healthcare ICT Standardization: Focus on Progress in New Zealand.

    PubMed

    Park, Young-Taek; Atalag, Koray

    2015-07-01

    Many countries try to deliver high quality healthcare services efficiently at lower, manageable costs, and healthcare information and communication technology (ICT) standardisation may play an important role in doing so. New Zealand provides a good model of healthcare ICT standardisation. The purpose of this study was to review current healthcare ICT standardisation and progress in New Zealand. This study reviewed reports on healthcare ICT standardisation in New Zealand. We also investigated relevant websites related to healthcare ICT standards, most of which were run by the government. We then summarised the governance structure, standardisation processes, and outputs that make up the current status of healthcare ICT standards in New Zealand. New Zealand government bodies have established a set of healthcare ICT standards and clear guidelines and procedures for healthcare ICT standardisation. Government has actively participated in various enactments of healthcare ICT standards, from the inception of ideas to their eventual retirement. Great achievements in eHealth have already been realised, and various standards are currently utilised at all levels of healthcare, regionally and nationally. Standard clinical terminologies such as the International Classification of Diseases (ICD) and the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT) have been adopted, and Health Level Seven (HL7) standards are actively used in health information exchanges. The government of New Zealand has well organised ICT institutions, guidelines, and regulations, as well as various programs such as e-Medications and integrated care services. Local district health boards directly running hospitals have effectively adopted various new ICT standards. They might already be benefiting from improved efficiency resulting from healthcare ICT standardisation.

  11. Titration Calorimetry Standards and the Precision of Isothermal Titration Calorimetry Data

    PubMed Central

    Baranauskienė, Lina; Petrikaitė, Vilma; Matulienė, Jurgita; Matulis, Daumantas

    2009-01-01

    Current Isothermal Titration Calorimetry (ITC) data in the literature have relatively high errors in the measured enthalpies of protein-ligand binding reactions. There is a need for universal validation standards for titration calorimeters. Several inorganic salt co-precipitation and buffer protonation reactions have been suggested as possible enthalpy standards. The performances of several commercial calorimeters, including the VP-ITC, ITC200, and Nano ITC-III, were validated using these suggested standard reactions. PMID:19582227

  12. Performances of OsO(4) stabilized CO(2) lasers as optical frequency standards near 29 THz.

    PubMed

    Daussy, C; Ducos, F; Rovera, G D; Acef, O

    2000-01-01

    In this paper, we report on the metrological capabilities of CO(2)/OsO(4) optical frequency standards operating around 29 THz. These frequency standards are currently involved in various fields, such as frequency metrology, high resolution spectroscopy, and Rydberg constant measurements. The most impressive feature of these standards lies in the 10(-15)-level frequency stability allied to a long-term (1 yr) reproducibility of 1.3x10(-13).
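
    The numbers quoted translate directly: 29 THz corresponds to a wavelength of λ = c/f ≈ 10.3 µm (the CO2 laser band), and a fractional stability of 10^-15 at that frequency corresponds to an absolute uncertainty of only about 0.03 Hz. A quick check:

```python
C = 299_792_458.0  # speed of light in m/s (exact SI value)

freq_hz = 29e12                     # ~29 THz, the CO2/OsO4 standard region
wavelength_um = C / freq_hz * 1e6   # wavelength in micrometres (~10.3 um)
abs_stability_hz = freq_hz * 1e-15  # 1e-15 fractional stability -> ~0.029 Hz
print(wavelength_um, abs_stability_hz)
```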

  13. Educational Partnership and the Dilemmas of School Reform.

    ERIC Educational Resources Information Center

    Seeley, David

    Today's educational reform proposals are undermined by four dilemmas. First, the public may demand visible results before it will provide the funding needed to achieve them. Second, higher academic standards will increase failure rates, while more attainable standards will inadequately educate students. Third, the current focus on high schools may…

  14. Implementing Standards-Based Mathematics Instruction: A Casebook for Professional Development.

    ERIC Educational Resources Information Center

    Stein, Mary Kay; Smith, Margaret Schwan; Henningsen, Marjorie A.; Silver, Edward A.

    Teachers and teacher educators interested in synthesizing their current practice with new mathematics standards will welcome this highly useful volume. The QUASAR Project at the University of Pittsburgh presents prevalent cases of mathematics instruction drawn from their research of nearly 500 classroom lessons. The Mathematical Tasks…

  15. Exploring the Relationships between Dialogue and Inclusive School Communities

    ERIC Educational Resources Information Center

    Orzel, Courtney Leigh

    2012-01-01

    High standards and expectations for all students drive current school reform efforts which target accountability measures and focus on standardized tests, leaving many American students, particularly those who have been traditionally underserved and marginalized, feeling excluded and silenced in school. Thus, it is important for school leaders to…

  16. Low Dose MDCT with Tube Current Modulation: Role in Detection of Urolithiasis and Patient Effective Dose Reduction

    PubMed Central

    Kakkar, Chandan; Sripathi, Smiti; Parakh, Anushri; Shrivastav, Rajendra

    2016-01-01

    Introduction: Urolithiasis is one of the major recurring problems in young individuals, and CT is the most common diagnostic modality used. Because these patients are young and stone formation is a recurrent process, one of the simplest ways to reduce the radiation dose is low-dose CT with tube current modulation. Aim: To compare the sensitivity and specificity of a low-dose (70 mAs) protocol with a standard-dose (250 mAs) protocol in detecting urolithiasis, and to determine the tube current and mean effective patient dose for each protocol. Materials and Methods: A prospective study was conducted over a period of two years in 200 patients presenting with acute flank pain. CT was performed with the standard-dose protocol in 100 cases and with the low-dose protocol using tube current modulation in the other 100. Sensitivity and specificity for calculus detection and the percentage reduction in dose and tube current with the low-dose protocol were calculated. Results: Urolithiasis was detected in 138 patients; 67 were examined with the high-dose protocol and 71 with the low-dose protocol. Sensitivity and specificity of the low-dose protocol were 97.1% and 96.4%, with similar results in patients with high BMI. Tube current modulation reduced the effective tube current by 12.17%. The mean effective patient dose was 10.33 mSv for the standard dose versus 2.92 mSv for the low dose, a 51.13–53.8% reduction with the low-dose protocol. Conclusion: The study reinforced that low-dose CT with tube current modulation is appropriate for diagnosis of urolithiasis, with significant reduction in tube current and patient effective dose. PMID:27437322

  17. Low Dose MDCT with Tube Current Modulation: Role in Detection of Urolithiasis and Patient Effective Dose Reduction.

    PubMed

    Koteshwar, Prakashini; Kakkar, Chandan; Sripathi, Smiti; Parakh, Anushri; Shrivastav, Rajendra

    2016-05-01

    Urolithiasis is one of the major recurring problems in young individuals, and CT is the most common diagnostic modality used. Because these patients are young and stone formation is a recurrent process, one of the simplest ways to reduce the radiation dose is low-dose CT with tube current modulation. The aim of this study was to compare the sensitivity and specificity of a low-dose (70 mAs) protocol with a standard-dose (250 mAs) protocol in detecting urolithiasis, and to determine the tube current and mean effective patient dose for each protocol. A prospective study was conducted over a period of two years in 200 patients presenting with acute flank pain. CT was performed with the standard-dose protocol in 100 cases and with the low-dose protocol using tube current modulation in the other 100. Sensitivity and specificity for calculus detection and the percentage reduction in dose and tube current with the low-dose protocol were calculated. Urolithiasis was detected in 138 patients; 67 were examined with the high-dose protocol and 71 with the low-dose protocol. Sensitivity and specificity of the low-dose protocol were 97.1% and 96.4%, with similar results in patients with high BMI. Tube current modulation reduced the effective tube current by 12.17%. The mean effective patient dose was 10.33 mSv for the standard dose versus 2.92 mSv for the low dose, a 51.13-53.8% reduction with the low-dose protocol. The study reinforced that low-dose CT with tube current modulation is appropriate for diagnosis of urolithiasis, with significant reduction in tube current and patient effective dose.

  18. A complete electrical shock hazard classification system and its application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon, Lloyd; Cartelli, Laura; Graham, Nicole

    Current electrical safety standards evolved to address the hazards of 60-Hz power that are faced primarily by electricians, linemen, and others performing facility and utility work. As a result, a substantial gap remains in the management of electrical hazards in Research and Development (R&D) and in specialized high-voltage and high-power equipment. We find substantial use of direct current (dc) electrical energy, and the use of capacitors, inductors, batteries, and radiofrequency (RF) power. The electrical hazards of these forms of electricity and their systems are different than for 50/60 Hz power. This paper proposes a method of classifying all of the electrical shock hazards found in all types of R&D and utilization equipment. Examples of the variation of these hazards from NFPA 70E include (a) high voltage can be harmless if the available current is sufficiently low, (b) low voltage can be harmful if the available current/power is high, (c) high-voltage capacitor hazards are unique and include severe reflex action, effects on the heart, and tissue damage, and (d) arc flash hazard analysis for dc and capacitor systems is not provided in existing standards. This work has led to a comprehensive electrical hazard classification system that is based on various research conducted over the past 100 years, on analysis of such systems in R&D, and on decades of experience. Lastly, the new comprehensive electrical shock hazard classification system uses a combination of voltage, shock current available, fault current available, power, energy, and waveform to classify all forms of electrical hazards.

  19. A complete electrical shock hazard classification system and its application

    DOE PAGES

    Gordon, Lloyd; Cartelli, Laura; Graham, Nicole

    2018-02-08

    Current electrical safety standards evolved to address the hazards of 60-Hz power that are faced primarily by electricians, linemen, and others performing facility and utility work. As a result, a substantial gap remains in the management of electrical hazards in Research and Development (R&D) and in specialized high-voltage and high-power equipment. We find substantial use of direct current (dc) electrical energy, and the use of capacitors, inductors, batteries, and radiofrequency (RF) power. The electrical hazards of these forms of electricity and their systems are different than for 50/60 Hz power. This paper proposes a method of classifying all of the electrical shock hazards found in all types of R&D and utilization equipment. Examples of the variation of these hazards from NFPA 70E include (a) high voltage can be harmless if the available current is sufficiently low, (b) low voltage can be harmful if the available current/power is high, (c) high-voltage capacitor hazards are unique and include severe reflex action, effects on the heart, and tissue damage, and (d) arc flash hazard analysis for dc and capacitor systems is not provided in existing standards. This work has led to a comprehensive electrical hazard classification system that is based on various research conducted over the past 100 years, on analysis of such systems in R&D, and on decades of experience. Lastly, the new comprehensive electrical shock hazard classification system uses a combination of voltage, shock current available, fault current available, power, energy, and waveform to classify all forms of electrical hazards.

  20. A novel mechanism for electrical currents inducing ventricular fibrillation: The three-fold way to fibrillation.

    PubMed

    Kroll, Mark W; Panescu, Dorin; Hinz, Andrew F; Lakkireddy, Dhanunjaya

    2010-01-01

    It has long been recognized that there are 2 methods for inducing VF (ventricular fibrillation) with electrical currents. These are: (1) delivering a high-charge shock into the cardiac T-wave, and (2) delivering lower-level currents for 1-5 seconds. Present electrical safety standards are based on this understanding. We present new data showing a 3rd mechanism of inducing VF: delivering sufficient current to cause high-rate cardiac capture leads to cardiac output collapse and then to ischemia, which, if sustained long enough, lowers the VFT (VF threshold) to the level of the applied current, finally resulting in VF. This requires about 40% of the normal VF-induction current but requires a duration of minutes instead of seconds for VF to be induced. Anesthetized and ventilated swine (n=6) had current delivered from a probe tip 10 mm from the epicardium sufficient to cause hypotensive capture but not directly induce VF within 5 s. After a median time of 90 s, VF was induced. This 3rd mechanism of VF induction should be studied further and considered for electrical safety standards, and it is relevant to long-duration TASER Electronic Control Device applications.

  1. Ion transport and loss in the earth's quiet ring current. I - Data and standard model

    NASA Technical Reports Server (NTRS)

    Sheldon, R. B.; Hamilton, D. C.

    1993-01-01

    A study of the transport and loss of ions in the earth's quiet time ring current, in which the standard radial diffusion model developed for the high-energy radiation belt particles is compared with the measurements of the lower-energy ring current ions, is presented. The data set provides ionic composition information in an energy range that includes the bulk of the ring current energy density, 1-300 keV/e. Protons are found to dominate the quiet time energy density at all altitudes, peaking near L of about 4 at 60 keV/cu cm, with much smaller contributions from O(+) (1-10 percent), He(+) (1-5 percent), and He(2+) (less than 1 percent). A minimization procedure is used to fit the amplitude of the standard electric radial diffusion coefficient, yielding 5.8 x 10^-11 R_E^2/s. Fluctuating ionospheric electric fields are suggested as the source of the additional diffusion detected.

  2. Pharmacist perceptions of new competency standards

    PubMed Central

    Maitreemit, Pagamas; Pongcharoensuk, Petcharat; Kapol, Nattiya; Armstrong, Edward P.

    2008-01-01

    Objective To suggest revisions to the Thai pharmacy competency standards and determine the perceptions of Thai pharmacy practitioners and faculty about the proposed standards. Methods The current competency standards were revised in a brainstorming session with nine Thai pharmacy experts, according to their perceptions of society's pharmacy needs. The revised standards were then put to 574 pharmacy practitioners and faculty members for validation using a written questionnaire. The respondents were classified based on their practice setting. Results The revision proposed integrating and adding to the current competencies. Of 830 distributed questionnaires, 574 completed questionnaires were received (69.2% response rate). The proposed new competency standards contained 7 domains and 46 competencies. The majority of the respondents were supportive of all 46 proposed competencies. The highest ranked domain was Domain 1 (Practice Pharmacy within Laws, Professional Standards, and Ethics). The second and third highest expectations of pharmacy graduates were Domain 4 (Provide pharmaceutical care) and Domain 3 (Communicate and disseminate knowledge effectively). Conclusion Expectations for pharmacy graduates' competencies were high, and respondents encouraged additional growth in multidisciplinary efforts to improve patient care. PMID:25177401

  3. Purposes, Principles, and Standards for School Art Programs. Updated 2014

    ERIC Educational Resources Information Center

    National Art Education Association, 2014

    2014-01-01

    Fully updated to reflect current issues in the field of art education. Checklists embedded in charts allow users to indicate where their school or district stands in relation to the criteria, which have been expanded to include districtwide, elementary, middle, high school, and superior standards. The criteria within the checklists reflect…

  4. Please, Not Another Push to Get Tough on Student Retention

    ERIC Educational Resources Information Center

    Norton, M. Scott

    2011-01-01

    Standardized academic testing, under-performing schools, demands for high standards in America's schools and current levels of student dropouts have resulted in renewed calls for "getting tough on student retention." The push for student retention is demanded by school boards and others in spite of the overwhelming research evidence that…

  5. Industry Interests in the HDTV Debate.

    ERIC Educational Resources Information Center

    Neil, Suzanne Chambliss

    This analysis of the pattern of industrial interests in the current debate over high definition television systems argues that the debate involves more than just television; rather, it is an expression of a shift in the conceptualization of the nature of standards, one which conceives of standards as guidelines for the development of specific…

  6. Radiation safety standards and their application: international policies and current issues.

    PubMed

    González, Abel J

    2004-09-01

    This paper briefly describes the current policies of the United Nations Scientific Committee on the Effects of Atomic Radiation and the International Commission on Radiological Protection and how these policies are converted into international radiation safety standards by the International Atomic Energy Agency, which is the only global organization within the United Nations family of international agencies with a statutory mandate not only to establish such standards but also to provide for their application. It also summarizes the current status of the established corpus of such international standards, and of its foreseeable evolution, as well as of legally binding undertakings by countries around the world that are linked to these standards. Moreover, this paper also reviews some major current global issues related to the application of international standards, including the following: strengthening of national infrastructures for radiation safety, including technical cooperation programs for assisting developing countries; occupational radiation safety challenges, including the protection of pregnant workers and their unborn children, dealing with working environments with high natural radiation levels, and occupational attributability of health effects (probability of occupational causation); restricting discharges of radioactive substances into the environment: reviewing current international policies vis-a-vis the growing concern about the radiation protection of the "environment;" radiological protection of patients undergoing radiodiagnostic and radiotherapeutic procedures: the current International Action Plan; safety and security of radiation sources: post-11 September developments; preparedness and response to radiation emergencies: enhancing the international network; safe transport of radioactive materials: new apprehensions; safety of radioactive waste management: concerns and connections with radiation protection; and radioactive residues remaining
after the termination of activities: radiation protection response to the forthcoming wave of decommissioning of installations with radioactive materials. The ultimate aim of this paper is to encourage information exchange, cooperation, and collaboration within the radiation protection professional community. In particular, the paper tries to facilitate consolidation of the growing international regime on radiation safety, including the expansion of legally binding undertakings by countries, the strengthening of the current corpus of international radiation safety standards, and the development of international provisions for ensuring the proper worldwide application of these standards, such as a system of international appraisals by peer review.

  7. An accurate online calibration system based on combined clamp-shape coil for high voltage electronic current transformers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhen-hua; Li, Hong-bin; Zhang, Zhi

    Electronic transformers are widely used in power systems because of their wide bandwidth and good transient performance. However, as an emerging technology, the failure rate of electronic transformers is higher than that of traditional transformers. As a result, the calibration period needs to be shortened. Traditional calibration methods require that the power of the transmission line be cut off, which results in complicated operation and power-off loss. This paper proposes an online calibration system which can calibrate electronic current transformers without power off. In this work, the high accuracy standard current transformer and the online operation method are the key techniques. Based on the clamp-shape iron-core coil and the clamp-shape air-core coil, a combined clamp-shape coil is designed as the standard current transformer. By analyzing the output characteristics of the two coils, the combined clamp-shape coil can achieve verification of the accuracy, so the accuracy of the online calibration system can be guaranteed. Moreover, by employing the earth potential working method and using two insulating rods to connect the combined clamp-shape coil to the high voltage bus, the operation becomes simple and safe. Tests at the China National Center for High Voltage Measurement and field experiments show that the proposed system has a high accuracy of up to the 0.05 class.
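
    The 0.05 accuracy class reported above means the ratio error of the transformer under test must stay within ±0.05% of the standard's reading. A minimal sketch of that check, with hypothetical current readings (the function names and values are illustrative, not from the paper):

```python
def ratio_error_percent(i_dut, i_standard):
    # ratio error of the transformer under test relative to the standard, in percent
    return 100.0 * (i_dut - i_standard) / i_standard

def within_class(error_percent, accuracy_class=0.05):
    # a 0.05-class instrument must keep ratio error within +/-0.05 percent
    return abs(error_percent) <= accuracy_class

err = ratio_error_percent(1000.3, 1000.0)  # hypothetical bus currents in amperes
print(round(err, 3), within_class(err))    # -> 0.03 True
```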

  8. Combinatorial electrochemical cell array for high throughput screening of micro-fuel-cells and metal/air batteries.

    PubMed

    Jiang, Rongzhong

    2007-07-01

    An electrochemical cell array was designed that contains a common air electrode and 16 microanodes for high throughput screening of both fuel cells (based on a polymer electrolyte membrane) and metal/air batteries (based on a liquid electrolyte). Electrode materials can easily be coated on the anodes of the electrochemical cell array and screened by switching a graphite probe from one cell to the others. The electrochemical cell array was used to study direct methanol fuel cells (DMFCs), including high throughput screening of electrode catalysts and determination of optimum operating conditions. For screening of DMFCs, there is about 6% relative standard deviation (percentage of standard deviation versus mean value) for discharge currents from 10 to 20 mA/cm². The electrochemical cell array was also used to study tin/air batteries. The effect of Cu content in the anode electrode on the discharge performance of the tin/air battery was investigated. The relative standard deviations for screening of metal/air batteries (based on zinc/air) are 2.4%, 3.6%, and 5.1% for discharge currents at 50, 100, and 150 mA/cm², respectively.
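
    The screening metric above, relative standard deviation, is the sample standard deviation expressed as a percentage of the mean. A short sketch with hypothetical discharge-current readings (not the study's data):

```python
from statistics import mean, stdev

def relative_std_dev(samples):
    # relative standard deviation: sample standard deviation as a percentage of the mean
    return 100.0 * stdev(samples) / mean(samples)

# hypothetical discharge current densities (mA/cm^2) from four cells of the array
currents = [50.0, 51.5, 49.0, 50.5]
print(round(relative_std_dev(currents), 2))  # -> 2.07
```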

  9. Advanced Imaging Technologies for the Detection of Dysplasia and Early Cancer in Barrett Esophagus

    PubMed Central

    Espino, Alberto; Cirocco, Maria; DaCosta, Ralph

    2014-01-01

    Advanced esophageal adenocarcinomas arising from Barrett esophagus (BE) are tumors with an increasing incidence and poor prognosis. The aim of endoscopic surveillance of BE is to detect dysplasia, particularly high-grade dysplasia and intramucosal cancers that can subsequently be treated endoscopically before progression to invasive cancer with lymph node metastases. Current surveillance practice standards require the collection of random 4-quadrant biopsy specimens over every 1 to 2 cm of BE (Seattle protocol) to detect dysplasia with the assistance of white light endoscopy, in addition to performing targeted biopsies of recognizable lesions. This approach is labor-intensive but should currently be considered state of the art. Chromoendoscopy, virtual chromoendoscopy (e.g., narrow band imaging), and confocal laser endomicroscopy, in addition to high-definition standard endoscopy, might increase the diagnostic yield for the detection of dysplastic lesions. Until these modalities have been demonstrated to enhance efficiency or cost effectiveness, the standard protocol will remain careful examination using conventional off-the-shelf high-resolution endoscopes, combined with a longer inspection time, which is associated with increased detection of dysplasia. PMID:24570883

  10. Parallels in Arts Education and CTE: Some Guiding Reflections

    ERIC Educational Resources Information Center

    Hull, Bradley J.

    2010-01-01

    Many forces shape the current national conversation regarding career and technical education (CTE). Perkins IV guides the discussion through concepts such as challenging academic and technical standards; high skill, high wage, or high demand occupations; and programs of study. Workforce development and training, the economic recession,…

  11. Study on Oxygen Supply Standard for Physical Health of Construction Personnel of High-Altitude Tunnels.

    PubMed

    Guo, Chun; Xu, Jianfeng; Wang, Mingnian; Yan, Tao; Yang, Lu; Sun, Zhitao

    2015-12-22

    The low atmospheric pressure and low oxygen content in high-altitude environments have great impacts on the functions of the human body. Especially for personnel engaged in complicated physical labor such as tunnel construction, high altitude can cause a series of adverse physiological reactions, which may result in multiple high-altitude diseases and even death in severe cases. Artificial oxygen supply is required to ensure the health and safety of construction personnel in hypoxic environments. However, there are no provisions for an oxygen supply standard for tunnel construction personnel in high-altitude areas in current tunnel construction specifications. This paper has therefore theoretically studied the impacts of the high-altitude environment on human bodies, analyzed the relationship between labor intensity and oxygen consumption in high-altitude areas, and determined the critical oxygen-supply altitude values for tunnel construction based on two different standard evaluation systems, i.e., variation of air density and equivalent PIO₂. Finally, it determines the oxygen supply standard for construction personnel in high-altitude areas based on the relationship between construction labor intensity and oxygen consumption.
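
    One of the two evaluation systems mentioned above, equivalent PIO₂, can be sketched from first principles: inspired oxygen partial pressure is the oxygen fraction times barometric pressure minus the water vapour pressure at body temperature. The barometric formula, scale height, and altitude below are textbook approximations chosen for illustration, not values from the paper:

```python
import math

def barometric_pressure_mmhg(altitude_m):
    # isothermal barometric formula with a ~7,990 m scale height (an approximation)
    return 760.0 * math.exp(-altitude_m / 7990.0)

def inspired_po2_mmhg(pb_mmhg, fio2=0.2095):
    # PIO2 = FIO2 * (Pb - 47); 47 mmHg is water vapour pressure at 37 degrees C
    return fio2 * (pb_mmhg - 47.0)

print(round(inspired_po2_mmhg(760.0), 1))                        # sea level: -> 149.4
print(round(inspired_po2_mmhg(barometric_pressure_mmhg(4000))))  # roughly 87 at 4,000 m
```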

  12. High efficiency video coding for ultrasound video communication in m-health systems.

    PubMed

    Panayides, A; Antoniou, Z; Pattichis, M S; Pattichis, C S; Constantinides, A G

    2012-01-01

    Emerging high efficiency video compression methods and wider availability of wireless network infrastructure will significantly advance existing m-health applications. For medical video communications, the emerging video compression and network standards support low-delay and high-resolution video transmission at the clinically acquired resolution and frame rates. Such advances are expected to further promote the adoption of m-health systems for remote diagnosis and emergency incidents in daily clinical practice. This paper compares the performance of the emerging high efficiency video coding (HEVC) standard to the current state-of-the-art H.264/AVC standard. The experimental evaluation, based on five atherosclerotic plaque ultrasound videos encoded at QCIF, CIF, and 4CIF resolutions, demonstrates that 50% reductions in bitrate requirements are possible for equivalent clinical quality.

  13. Animal Health and Welfare Issues Facing Organic Production Systems

    PubMed Central

    Sutherland, Mhairi A.; Webster, Jim; Sutherland, Ian

    2013-01-01

    Simple Summary The demand for organically grown, animal-derived produce is increasing due to a growing desire for consumer products that have minimal chemical inputs and high animal welfare standards. Evaluation of the scientific literature suggests that a major challenge facing organic animal production systems is the management and treatment of health-related issues. However, implementation of effective management practices can help organic animal producers achieve and maintain high standards of health and welfare, which is necessary to assure consumers that organic animal-based food and fibre has not only been produced with minimal or no chemical input, but under high standards of animal welfare. Abstract The demand for organically grown produce is increasing worldwide, with one of the drivers being an expectation among consumers that animals have been farmed to a high standard of animal welfare. This review evaluates whether this expectation is in fact being met, by describing the current level of science-based knowledge of animal health and welfare in organic systems. The primary welfare risk in organic production systems appears to be related to animal health. Organic farms use a combination of management practices, alternative and complementary remedies, and conventional medicines to manage the health of their animals, and in many cases these are at least as effective as management practices employed by non-organic producers. However, in contrast to non-organic systems, there is still a lack of scientifically evaluated, organically acceptable therapeutic treatments that organic animal producers can use when current management practices are not sufficient to maintain the health of their animals. The development of such treatments is necessary to assure consumers that organic animal-based food and fibre has not only been produced with minimal or no chemical input, but under high standards of animal welfare. PMID:26479750

  14. The History of Preconception Care: Evolving Guidelines and Standards

    PubMed Central

    Moos, Merry-K.; Curtis, Michele

    2006-01-01

    This article explores the history of the preconception movement in the United States and the current status of professional practice guidelines and standards. Professionals with varying backgrounds (nurses, nurse practitioners, family practice physicians, pediatricians, nurse midwives, obstetricians/gynecologists) are in a position to provide preconception health services; standards and guidelines for numerous professional organizations, therefore, are explored. The professional nursing organization with the most highly developed preconception health standards is the American Academy of Nurse Midwives (ACNM); for physicians, it is the American College of Obstetricians and Gynecologists (ACOG). These guidelines and standards are discussed in detail. PMID:16710764

  15. [Practical considerations for high resolution anoscopy in patients infected with human immunodeficiency virus].

    PubMed

    Iribarren-Díaz, Mauricio; Ocampo Hermida, Antonio; González-Carreró Fojón, Joaquín; Alonso-Parada, María; Rodríguez-Girondo, Mar

    2014-12-01

    Anal cancer is uncommon in the general population; however, its incidence is increasing significantly in certain risk groups, mainly in men who have sex with men, and particularly those infected with human immunodeficiency virus. The high resolution anoscopy technique is currently considered the standard in the diagnosis of anal intraepithelial neoplasia, but at present there is no agreed standard method between health areas. High resolution anoscopy is an affordable technique that can be critical in the screening for anal carcinoma and its precursor lesions, but it is not without difficulties. We are currently studying the most effective strategy for managing premalignant anal lesions, and with this article we attempt to encourage other groups interested in reducing the incidence of an increasingly common neoplasia. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  16. Pseudo-differential CMOS analog front-end circuit for wide-bandwidth optical probe current sensor

    NASA Astrophysics Data System (ADS)

    Uekura, Takaharu; Oyanagi, Kousuke; Sonehara, Makoto; Sato, Toshiro; Miyaji, Kousuke

    2018-04-01

    In this paper, we present a pseudo-differential analog front-end (AFE) circuit for a novel optical probe current sensor (OPCS) aimed at high-frequency power electronics. It employs a regulated cascode transimpedance amplifier (RGC-TIA) to achieve a high gain and a large bandwidth without using an extremely high performance operational amplifier. The AFE circuit is designed in a 0.18 µm standard CMOS technology, achieving a high transimpedance gain of 120 dBΩ and a high cutoff frequency of 16 MHz. The measured slew rate is 70 V/µs and the input-referred current noise is 1.02 pA/√Hz. The magnetic resolution and bandwidth of the OPCS are estimated to be 1.29 mTrms and 16 MHz, respectively; the bandwidth is higher than that of the reported Hall effect current sensor.
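
    The 120 dBΩ transimpedance gain quoted above converts to ohms via the 20·log10 convention, and a white input-referred noise density integrates over bandwidth as density × √bandwidth. A quick sketch of both conversions (the flat-noise assumption is an idealization, not a claim about the actual circuit):

```python
import math

def dbohm_to_ohms(gain_db):
    # transimpedance gain: dB(ohm) -> ohms, 20*log10 convention
    return 10.0 ** (gain_db / 20.0)

def integrated_noise_amps(density_a_per_rthz, bandwidth_hz):
    # total input-referred current noise for a flat (white) density
    return density_a_per_rthz * math.sqrt(bandwidth_hz)

print(dbohm_to_ohms(120.0))                                   # -> 1000000.0 (1 Mohm)
print(round(integrated_noise_amps(1.02e-12, 16e6) * 1e9, 2))  # -> 4.08 (nA rms)
```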

  17. New Directions in Space Operations Services in Support of Interplanetary Exploration

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.

    2005-01-01

    To gain access to the necessary operational processes and data in support of NASA's Lunar/Mars Exploration Initiative, new services, adequate levels of computing cycles and access to myriad forms of data must be provided to onboard spacecraft and ground based personnel/systems (earth, lunar and Martian) to enable interplanetary exploration by humans. These systems, cycles and access to vast amounts of development, test and operational data will be required to provide a new level of services not currently available to existing spacecraft, on board crews and other operational personnel. Although current voice, video and data systems in support of current space based operations have been adequate, new highly reliable and autonomous processes and services will be necessary for future space exploration activities. These services will range from the more mundane voice in LEO to voice in interplanetary travel, which, because of the high latencies, will require new voice processes and standards. New services, like component failure predictions based on data mining of significant quantities of data located at disparate locations, will be required. 3D or holographic representation of onboard components, systems or family members will greatly improve maintenance, operations and service restoration, not to mention crew morale. Current operational systems and standards, like the Internet Protocol, will not be able to provide the level of service required end to end from an end point on the Martian surface, like a scientific instrument, to a researcher at a university. Ground operations, whether earth, lunar or Martian, and in-flight operations to the moon and especially to Mars will require significant autonomy, which in turn will require access to highly reliable processing capabilities and data storage based on network storage technologies. Significant processing cycles will be needed onboard but could be borrowed from other locations, either ground based or onboard other spacecraft.
Reliability will be a key factor, with onboard and distributed backup processing an absolutely necessary requirement. Current cluster processing/Grid technologies may provide the basis for providing these services. An overview of existing services, future services that will be required and the technologies and standards required to be developed will be presented. The purpose of this paper will be to initiate a technological roadmap, albeit at a high level, from current voice, video, data and network technologies and standards (which show promise for adaptation or evolution) to those that need to be redefined or adjusted, and areas where new ones require development. The roadmap should begin the differentiation between unmanned and manned processes/services where applicable. The paper will be based in part on the activities of the CCSDS Monitor and Control working group, which is beginning the process of standardizing these processes. Another element of the paper will be based on an analysis of current technologies supporting space flight processes and services at JSC, MSFC, GSFC and, to a lesser extent, at KSC. Work being accomplished in areas such as Grid computing, data mining and network storage at ARC, IBM and the University of Alabama at Huntsville will be researched and analyzed.

  18. Programmable high-output-impedance, large-voltage compliance, microstimulator for low-voltage biomedical applications.

    PubMed

    Farahmand, Sina; Maghami, Mohammad Hossein; Sodagar, Amir M

    2012-01-01

    This paper reports on the design of a programmable, high output impedance, large voltage compliance microstimulator for low-voltage biomedical applications. A 6-bit binary-weighted digital to analog converter (DAC) is used to generate biphasic stimulus current pulses. A compact current mirror with large output voltage compliance and high output resistance conveys the current pulses to the target tissue. Designed and simulated in a standard 0.18 µm CMOS process, the microstimulator circuit is capable of delivering a maximum stimulation current of 160 µA to a 10-kΩ resistive load. Operated at a 1.8-V supply voltage, the output stage exhibits a voltage compliance of 1.69 V and an output resistance of 160 MΩ at full-scale stimulus current. Layout of the core microelectrode circuit measures 25.5 µm × 31.5 µm.
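
    A binary-weighted DAC of the kind described produces an output current proportional to its digital input code; with 6 bits and a 160 µA full scale, one LSB is 160/63 ≈ 2.54 µA. A minimal sketch of that code-to-current mapping (the linear mapping is a standard assumption, not a detail taken from the paper):

```python
def dac_output_ua(code, full_scale_ua=160.0, bits=6):
    # binary-weighted DAC: output = code / (2^bits - 1) * full-scale current
    if not 0 <= code < 2 ** bits:
        raise ValueError("code out of range for a %d-bit DAC" % bits)
    return full_scale_ua * code / (2 ** bits - 1)

print(round(dac_output_ua(1), 2))  # -> 2.54 (uA per LSB)
print(dac_output_ua(63))           # -> 160.0 (uA at full scale)
```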

  19. High-current-density electrodeposition using pulsed and constant currents to produce thick CoPt magnetic films on silicon substrates

    NASA Astrophysics Data System (ADS)

    Ewing, Jacob; Wang, Yuzheng; Arnold, David P.

    2018-05-01

    This paper investigates methods for electroplating thick (>20 μm), high-coercivity CoPt films using high current densities (up to 1 A/cm2) and elevated bath temperatures (70 °C). Correlations are made tying the current-density and temperature process parameters to the plating rate, elemental ratio, and magnetic properties of the deposited CoPt films. It also investigates how pulsed currents can increase the plating rate and film-to-substrate adhesion. Using 500 mA/cm2 and constant current, high-quality, dense CoPt films were successfully electroplated up to 20 μm thick in 1 hr on silicon substrates (0.35 μm/min plating rate). After standard thermal treatment (675 °C, 30 min) to achieve the ordered L10 crystalline phase, strong magnetic properties were measured: coercivities up to 850 kA/m, remanences >0.5 T, and maximum energy products up to 46 kJ/m3.

  20. Intensified treatment with high dose Rifampicin and Levofloxacin compared to standard treatment for adult patients with Tuberculous Meningitis (TBM-IT): protocol for a randomized controlled trial

    PubMed Central

    2011-01-01

    Background Tuberculous meningitis is the most severe form of tuberculosis. Mortality for untreated tuberculous meningitis is 100%. Despite the introduction of antibiotic treatment for tuberculosis, the mortality rate for tuberculous meningitis remains high: approximately 25% for HIV-negative and 67% for HIV-positive patients, with most deaths occurring within one month of starting therapy. The high mortality rate in tuberculous meningitis reflects the severity of the condition but also the poor antibacterial activity of current treatment regimens and the relatively poor penetration of these drugs into the central nervous system. Improving the antitubercular activity of current therapy in the central nervous system may help improve outcomes. Increasing the dose of rifampicin, a key drug with known poor cerebrospinal fluid penetration, may lead to higher drug levels at the site of infection and may improve survival. Of the second-generation fluoroquinolones, levofloxacin may have the optimal pharmacological features, including cerebrospinal fluid penetration, with a ratio of area under the curve (AUC) in cerebrospinal fluid to AUC in plasma of >75% and strong bactericidal activity against Mycobacterium tuberculosis. We propose a randomized controlled trial to assess the efficacy of an intensified anti-tubercular treatment regimen in tuberculous meningitis patients, comparing current standard tuberculous meningitis treatment regimens with standard treatment intensified with high-dose rifampicin and additional levofloxacin. Methods/Design A randomized, double-blind, placebo-controlled trial with two parallel arms, comparing standard Vietnamese national guideline treatment for tuberculous meningitis with standard treatment plus an increased dose of rifampicin (to 15 mg/kg/day total) and additional levofloxacin. The study will include 750 patients (375 per treatment group), including a minimum of 350 HIV-positive patients. The calculation assumes an overall mortality of 40% vs. 30% in the two arms, respectively (corresponding to a target hazard ratio of 0.7), a power of 80%, and a two-sided significance level of 5%. The randomization ratio is 1:1. The primary endpoint is overall survival, i.e., time from randomization to death during a follow-up period of 9 months. Secondary endpoints are: neurological disability at 9 months, time to new neurological event or death, time to new or recurrent AIDS-defining illness or death (in HIV-positive patients only), severe adverse events, and rate of treatment interruption for adverse events. Discussion Currently very few options are available for the treatment of TBM, and the mortality rate remains unacceptably high, with severe disabilities seen in many of the survivors. This trial is based on the hypothesis that current anti-mycobacterial treatment schedules for TBM are not potent enough and that outcomes will be improved by increasing the CSF-penetrating power of this regimen by optimising dosage and using additional drugs with better CSF penetration. Trial registration International Standard Randomised Controlled Trial Number ISRCTN61649292 PMID:21288325

  1. Intensified treatment with high dose rifampicin and levofloxacin compared to standard treatment for adult patients with tuberculous meningitis (TBM-IT): protocol for a randomized controlled trial.

    PubMed

    Heemskerk, Dorothee; Day, Jeremy; Chau, Tran Thi Hong; Dung, Nguyen Huy; Yen, Nguyen Thi Bich; Bang, Nguyen Duc; Merson, Laura; Olliaro, Piero; Pouplin, Thomas; Caws, Maxine; Wolbers, Marcel; Farrar, Jeremy

    2011-02-02

    Tuberculous meningitis is the most severe form of tuberculosis. Mortality for untreated tuberculous meningitis is 100%. Despite the introduction of antibiotic treatment for tuberculosis, the mortality rate for tuberculous meningitis remains high: approximately 25% for HIV-negative and 67% for HIV-positive patients, with most deaths occurring within one month of starting therapy. The high mortality rate in tuberculous meningitis reflects the severity of the condition but also the poor antibacterial activity of current treatment regimens and the relatively poor penetration of these drugs into the central nervous system. Improving the antitubercular activity of current therapy in the central nervous system may help improve outcomes. Increasing the dose of rifampicin, a key drug with known poor cerebrospinal fluid penetration, may lead to higher drug levels at the site of infection and may improve survival. Of the second-generation fluoroquinolones, levofloxacin may have the optimal pharmacological features, including cerebrospinal fluid penetration, with a ratio of area under the curve (AUC) in cerebrospinal fluid to AUC in plasma of >75% and strong bactericidal activity against Mycobacterium tuberculosis. We propose a randomized controlled trial to assess the efficacy of an intensified anti-tubercular treatment regimen in tuberculous meningitis patients, comparing current standard tuberculous meningitis treatment regimens with standard treatment intensified with high-dose rifampicin and additional levofloxacin. A randomized, double-blind, placebo-controlled trial with two parallel arms, comparing standard Vietnamese national guideline treatment for tuberculous meningitis with standard treatment plus an increased dose of rifampicin (to 15 mg/kg/day total) and additional levofloxacin. The study will include 750 patients (375 per treatment group), including a minimum of 350 HIV-positive patients. The calculation assumes an overall mortality of 40% vs. 30% in the two arms, respectively (corresponding to a target hazard ratio of 0.7), a power of 80%, and a two-sided significance level of 5%. The randomization ratio is 1:1. The primary endpoint is overall survival, i.e., time from randomization to death during a follow-up period of 9 months. Secondary endpoints are: neurological disability at 9 months, time to new neurological event or death, time to new or recurrent AIDS-defining illness or death (in HIV-positive patients only), severe adverse events, and rate of treatment interruption for adverse events. Currently very few options are available for the treatment of TBM, and the mortality rate remains unacceptably high, with severe disabilities seen in many of the survivors. This trial is based on the hypothesis that current anti-mycobacterial treatment schedules for TBM are not potent enough and that outcomes will be improved by increasing the CSF-penetrating power of this regimen by optimising dosage and using additional drugs with better CSF penetration. International Standard Randomised Controlled Trial Number ISRCTN61649292.
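
The protocol's power claim can be sanity-checked with Schoenfeld's approximation for the log-rank test, using only the figures quoted above. This is a sketch, not the trial statisticians' actual calculation; the function name and 1:1 allocation assumption are illustrative.

```python
from math import log
from statistics import NormalDist

def events_needed(hazard_ratio, alpha=0.05, power=0.80, ratio=1.0):
    """Schoenfeld approximation: total deaths required for a log-rank
    test to detect the given hazard ratio at the stated alpha and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    p = ratio / (1 + ratio)  # fraction of patients allocated to one arm
    return (z_alpha + z_beta) ** 2 / (p * (1 - p) * log(hazard_ratio) ** 2)

deaths_required = events_needed(0.7)        # target HR from the protocol
expected_deaths = 750 * (0.40 + 0.30) / 2   # 40% vs 30% mortality, 375 per arm
```

With a hazard ratio of 0.7 this gives roughly 247 required deaths, while the assumed mortality rates in 750 patients yield about 262 expected deaths, consistent with the stated 80% power.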

  2. Shatter resistance of spectacle lenses.

    PubMed

    Vinger, P F; Parver, L; Alfaro, D V; Woods, T; Abrams, B S

    1997-01-08

    To evaluate the relative strength and shatter resistance of spectacle lenses currently used in sunglasses and dress, sports, and industrial eyewear. Seven lenses that met the US American National Standards Institute (ANSI) Z80 standards for dress glasses (made of high-index plastic, allyl resin plastic, heat tempered glass, chemically tempered glass, and polycarbonate, and with center thickness ranging from 1 mm to 2.2 mm) and 4 lenses that met ANSI Z87 standards for industrial safety eyewear (allyl resin plastic, heat-tempered glass, chemically tempered glass, and polycarbonate, all with 3.0-mm center thickness) were tested for impact resistance to 5 projectiles (air gun pellets, golf balls, tennis balls, lacrosse balls, and baseballs). Impact energy required to shatter spectacle lenses. Based on 348 lens impacts, dress and industrial lenses made from glass, allyl resin plastic, and high-index plastic shattered at impact energies less than those expected to be encountered from the test projectiles during their routine use. Polycarbonate lenses demonstrated resistance to impact for all tested projectiles exceeding the impact potential expected during routine use. Under the test conditions of this study, polycarbonate lenses demonstrated greater impact resistance than other commonly used spectacle lenses that conform to prevailing eyewear standards. These findings suggest that current ANSI Z80 and ANSI Z87 standards should be reevaluated.

  3. MREIT experiments with 200 µA injected currents: a feasibility study using two reconstruction algorithms, SMM and harmonic B(Z).

    PubMed

    Arpinar, V E; Hamamura, M J; Degirmenci, E; Muftuler, L T

    2012-07-07

    Magnetic resonance electrical impedance tomography (MREIT) is a technique that produces images of conductivity in tissues and phantoms. In this technique, electrical currents are applied to an object, the resulting magnetic flux density is measured using magnetic resonance imaging (MRI), and the conductivity distribution is reconstructed from these MRI data. Currently, the technique is used in research environments, primarily studying phantoms and animals. In order to translate MREIT to clinical applications, strict safety standards need to be established, especially for safe current limits. However, there are currently no standards for safe current limits specific to MREIT. Until such standards are established, human MREIT applications need to conform to existing electrical safety standards in medical instrumentation, such as IEC601. This protocol limits patient auxiliary currents to 100 µA for low frequencies. However, published MREIT studies have utilized currents 10-400 times larger than this limit, bringing into question whether the clinical applications of MREIT are attainable under current standards. In this study, we investigated the feasibility of MREIT to accurately reconstruct the relative conductivity of a simple agarose phantom using 200 µA total injected current and tested the performance of two MREIT reconstruction algorithms. The reconstruction algorithms used are the iterative sensitivity matrix method (SMM) by Ider and Birgul (1998 Elektrik 6 215-25) with Tikhonov regularization and the harmonic B(Z) proposed by Oh et al (2003 Magn. Reson. Med. 50 875-8). The reconstruction techniques were tested at both 200 µA and 5 mA injected currents to investigate their noise sensitivity at low and high current conditions. It should be noted that 200 µA total injected current into a cylindrical phantom generates only 14.7 µA of current in the imaging slice. Similarly, 5 mA total injected current results in 367 µA in the imaging slice. Total acquisition time for the 200 µA and 5 mA experiments was about 1 h and 8.5 min, respectively. The results demonstrate that conductivity imaging is possible at low currents using the suggested imaging parameters and reconstructing the images using iterative SMM with Tikhonov regularization, which appears to be more tolerant of noisy data than harmonic B(Z).
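
At its core, the Tikhonov-regularized SMM step is a regularized least-squares solve. The sketch below shows one such solve in its normal-equations form; it is a minimal illustration, not the authors' full iterative implementation.

```python
import numpy as np

def tikhonov_step(A, b, lam):
    """One regularized least-squares solve: minimise
    ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy example: with A = I the solution is b / (1 + lam), showing how
# the regularizer shrinks the estimate toward zero as noise grows.
x = tikhonov_step(np.eye(2), np.array([2.0, 4.0]), 1.0)
```

Larger `lam` trades fidelity to the noisy data for stability, which is why this family of methods tolerates the low-current (low-SNR) regime better.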

  4. GIS Education in Taiwanese Senior High Schools: A National Survey among Geography Teachers

    ERIC Educational Resources Information Center

    Wang, Yao-Hui; Chen, Che-Ming

    2013-01-01

    Following the integration of GIS into the national curriculum standards of senior high school geography, Taiwan has systematically implemented GIS education for over a decade. However, the effectiveness of this implementation is currently unclear. Therefore, this study investigates the status of GIS education in Taiwanese senior high schools. A…

  5. High School Physics: An Interactive Instructional Approach That Meets the Next Generation Science Standards

    ERIC Educational Resources Information Center

    Huang, Shaobo; Mejia, Joel Alejandro; Becker, Kurt; Neilson, Drew

    2015-01-01

    Improving high school physics teaching and learning is important to the long-term success of science, technology, engineering, and mathematics (STEM) education. Efforts are currently in place to develop an understanding of science among high school students through formal and informal educational experiences in engineering design activities…

  6. Thermal-Error Regime in High-Accuracy Gigahertz Single-Electron Pumping

    NASA Astrophysics Data System (ADS)

    Zhao, R.; Rossi, A.; Giblin, S. P.; Fletcher, J. D.; Hudson, F. E.; Möttönen, M.; Kataoka, M.; Dzurak, A. S.

    2017-10-01

    Single-electron pumps based on semiconductor quantum dots are promising candidates for the emerging quantum standard of electrical current. They can transfer discrete charges with part-per-million (ppm) precision in nanosecond time scales. Here, we employ a metal-oxide-semiconductor silicon quantum dot to experimentally demonstrate high-accuracy gigahertz single-electron pumping in the regime where the number of electrons trapped in the dot is determined by the thermal distribution in the reservoir leads. In a measurement with traceability to primary voltage and resistance standards, the averaged pump current over the quantized plateau, driven by a 1-GHz sinusoidal wave in the absence of a magnetic field, is equal to the ideal value of e f within a measurement uncertainty as low as 0.27 ppm.
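
The quantized plateau level follows directly from I = n·e·f; a quick check with the exact SI value of the elementary charge reproduces the currents quoted in these records (~160 pA at 1 GHz, over 1 nA at 6.5 GHz).

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def pump_current(f_hz, n=1):
    """Ideal quantized pump current I = n * e * f, in amperes."""
    return n * E_CHARGE * f_hz

i_1ghz = pump_current(1e9)     # ~1.602e-10 A, the ~160 pA plateau
i_65ghz = pump_current(6.5e9)  # ~1.04e-9 A, i.e. more than 1 nA
```

Measurement uncertainty is then naturally expressed as the fractional deviation of the measured plateau current from this ideal value, in parts per million.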

  7. Low-threshold, high-T0 constricted double-heterojunction AlGaAs diode lasers

    NASA Technical Reports Server (NTRS)

    Botez, D.; Connolly, J. C.

    1980-01-01

    Constricted double heterojunction diode lasers with relatively low CW thresholds (28-40 mA) are obtained by growing structures that maximize the amount of current flow into the lasing spot. These values are obtained while still using standard 10-micron-wide oxide-defined stripe contacts. Over the 20-70 C temperature interval, threshold-current temperature coefficients as high as 320 C and a virtually constant external differential quantum efficiency are found.

  8. True random bit generators based on current time series of contact glow discharge electrolysis

    NASA Astrophysics Data System (ADS)

    Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain

    2018-05-01

    Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time-series signals. However, the proper functioning of such physical systems is bound by specific constraints that can make them weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, which is a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences over a wide range of high dc voltages. The current signal is quantized into a binary stream by first applying a simple moving-average function, which centers the distribution around zero, and then applying logical operations, which enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power-supply attacks has been examined and verified.
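
The described post-processing (moving-average detrend, threshold, then a logical operation) can be sketched as follows. The window size and the XOR-of-neighbours whitening step are illustrative choices standing in for the paper's exact pipeline, and the drifting Gaussian signal stands in for the glow-discharge current trace.

```python
import random

def binarize(samples, window=8):
    """Subtract a simple moving average to centre the signal at zero,
    threshold the residual into raw bits, then XOR adjacent bit pairs
    as a simple whitening step (illustrative, not the paper's exact
    logical operations)."""
    raw = []
    for i in range(window, len(samples)):
        moving_avg = sum(samples[i - window:i]) / window
        raw.append(1 if samples[i] > moving_avg else 0)
    return [a ^ b for a, b in zip(raw[::2], raw[1::2])]

# Noisy signal with slow drift, mimicking a dc-powered entropy source.
random.seed(1)
bits = binarize([random.gauss(0.0, 1.0) + 0.01 * i for i in range(2000)])
```

The moving average removes the slow drift that would otherwise bias the bit stream, and the XOR step suppresses residual correlation between neighbouring bits.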

  9. Statistical parametric mapping of LORETA using high density EEG and individual MRI: application to mismatch negativities in schizophrenia.

    PubMed

    Park, Hae-Jeong; Kwon, Jun Soo; Youn, Tak; Pae, Ji Soo; Kim, Jae-Jin; Kim, Myung-Sun; Ha, Kyoo-Seob

    2002-11-01

    We describe a method for the statistical parametric mapping of low resolution electromagnetic tomography (LORETA) using high-density electroencephalography (EEG) and individual magnetic resonance images (MRI) to investigate the characteristics of the mismatch negativity (MMN) generators in schizophrenia. LORETA, using a realistic head model of the boundary element method derived from the individual anatomy, estimated the current density maps from the scalp topography of the 128-channel EEG. From the current density maps that covered the whole cortical gray matter (up to 20,000 points), volumetric current density images were reconstructed. Intensity normalization of the smoothed current density images was used to reduce the confounding effect of subject-specific global activity. After transforming each image into a standard stereotaxic space, we carried out statistical parametric mapping of the normalized current density images. We applied this method to the source localization of MMN in schizophrenia. The MMN generators, produced by a deviant tone of 1,200 Hz (5% of 1,600 trials) under the standard tone of 1,000 Hz, 80 dB binaural stimuli with 300 msec of inter-stimulus interval, were measured in 14 right-handed schizophrenic subjects and 14 age-, gender-, and handedness-matched controls. We found that the schizophrenic group exhibited significant current density reductions of MMN in the left superior temporal gyrus and the left inferior parietal gyrus (P < 0.0005). This study is the first voxel-by-voxel statistical mapping of current density using individual MRI and high-density EEG. Copyright 2002 Wiley-Liss, Inc.
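
The intensity-normalization step described above amounts to scaling each current-density image by a measure of its own global activity; a minimal sketch, assuming mean-based scaling (the authors' exact normalization may differ):

```python
import numpy as np

def intensity_normalize(image):
    """Divide a current-density image by its global mean so that
    subject-specific global activity no longer confounds
    voxel-by-voxel group comparisons."""
    return image / image.mean()

# Toy 2x2 "image": after normalization its global mean is exactly 1.
img = np.array([[2.0, 4.0], [6.0, 8.0]])
norm = intensity_normalize(img)
```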

  10. Race to the Top Leaves Children and Future Citizens behind: The Devastating Effects of Centralization, Standardization, and High Stakes Accountability

    ERIC Educational Resources Information Center

    Onosko, Joe

    2011-01-01

    President Barack Obama's Race to the Top (RTT) is a profoundly flawed educational reform plan that increases standardization, centralization, and test-based accountability in our nation's schools. Following a brief summary of the interest groups supporting the plan, who is currently participating in this race, why so many states voluntarily…

  11. What Would He Say? Harold O. Rugg and Contemporary Issues in Social Studies Education

    ERIC Educational Resources Information Center

    Boyle-Baise, Marilynne; Goodman, Jesse

    2009-01-01

    The purpose of this paper is to consider the continued saliency of the ideas of Harold O. Rugg, particularly for social studies education. Given the conservative political times in which we work, and the current educational emphases on academic standards, high-stakes standardized testing, and mastery of specified knowledge, and the impact of these…

  12. Can informed consent to research be adapted to risk?

    PubMed

    Bromwich, Danielle; Rid, Annette

    2015-07-01

    The current ethical and regulatory framework for research is often charged with burdening investigators and impeding socially valuable research. To address these concerns, a growing number of research ethicists argue that informed consent should be adapted to the risks of research participation. This would require less rigorous consent standards in low-risk research than in high-risk research. However, the current discussion is restricted to cases of research in which the risks of research participation are outweighed by the potential clinical benefits for the individual research participant. Furthermore, current proposals do not address the concern that risk-adapted informed consent may result in enrolling participants into research without their autonomous authorisation. In this paper, we show how the standard view of informed consent (consent as autonomous authorisation) can be adapted to risk even when the research does not have a favourable risk-benefit profile for the participant. Our argument has two important implications: first, that current and proposed consent standards are not adequately calibrated to risk; and second, that consent standards also need to be adapted to factors other than risk.

  13. Paperbacks Expand High School Collections.

    ERIC Educational Resources Information Center

    Epstein, Connie C.

    1983-01-01

    Lists current nonfiction paperback titles of special interest to high school readers submitted by publishers such as Bantam, Harper and Row, MacMillan, and Scholastic from best of titles on their lists. Arrangement is alphabetical by publisher and citations provide author, title, price, publication date, international standard book number, and…

  14. Accountability Strain, College Readiness Drain: Sociopolitical Tensions Involved in Maintaining a College-Going Culture in a High "Minority", High Poverty, Texas High School

    ERIC Educational Resources Information Center

    Welton, Anjale; Williams, Montrischa

    2015-01-01

    Currently school reform discourse encourages states to adopt college readiness standards. Meanwhile, federal and state accountability and related mandated reforms remain a policy concern. As such, it is important to examine the interplay between accountability and the establishment of a college-going culture in high "minority", high…

  15. A New Standard Pulsar Magnetosphere

    NASA Technical Reports Server (NTRS)

    Contopoulos, Ioannis; Kalapotharakos, Constantinos; Kazanas, Demosthenes

    2014-01-01

    In view of recent efforts to probe the physical conditions in the pulsar current sheet, we revisit the standard solution that describes the main elements of the ideal force-free pulsar magnetosphere. The simple physical requirement that the electric current contained in the current layer consist of the local electric charge moving outward at close to the speed of light yields a new solution for the pulsar magnetosphere that is ideal force-free everywhere except in the current layer. The main elements of the new solution are as follows: (1) the pulsar spindown rate of the aligned rotator is 23% larger than that of the orthogonal vacuum rotator; (2) only 60% of the magnetic flux that crosses the light cylinder opens up to infinity; (3) the electric current closes along the other 40%, which gradually converges to the equator; (4) this transfers 40% of the total pulsar spindown energy flux into the equatorial current sheet, where it is then dissipated in the acceleration of particles and in high-energy electromagnetic radiation; and (5) there is no separatrix current layer. Our solution is a minimum-free-parameter solution in that the equatorial current layer is electrostatically supported against collapse and thus does not require a thermal particle population. In this respect, it is one more step toward the development of a new standard solution. We discuss the implications for intermittent pulsars and long-duration gamma-ray bursts. We conclude that the physical conditions in the equatorial current layer determine the global structure of the pulsar magnetosphere.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arndt, T.E., Fluor Daniel Hanford

    A previous evaluation, documented in report WHC-SD-GN-RPT-30005, Rev. 0, titled "Evaluation on Self-Contained High Efficiency Particulate Filters," revealed that the SCHEPA filters do not have the required documentation to be in compliance with the design, testing, and fabrication standards required in ASME N-509, ASME N-510, and MIL-F-51068. These standards are required by DOE Order 6430.1A. Without this documentation, filter adequacy cannot be verified. The existing SCHEPA filters can be removed and replaced with new filters and filter housings which meet current codes and standards.

  17. Fabrication of 12% {sup 240}Pu calorimetry standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, S.M.; Hildner, S.; Gutierrez, D.

    1995-08-01

    Throughout the DOE complex, laboratories are performing calorimetric assays on items containing high-burnup plutonium. These materials span a higher isotopic range and higher wattages than materials previously encountered in vault holdings. Currently, measurement control standards have been limited to 6% {sup 240}Pu standards. The lower isotopic and wattage value standards do not complement the measurement of the higher burnup material. Participants of the Calorimetry Exchange (CALEX) Program have identified the need for new calorimetric assay standards with a higher wattage and isotopic range. This paper describes the fabrication and verification measurements of the new CALEX standard containing 12% {sup 240}Pu oxide with a wattage of about 6 to 8 watts.
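
The heat output of such a standard follows from the isotopic mass fractions and published specific-power values. The sketch below uses approximate specific powers and a purely hypothetical 12% {sup 240}Pu high-burnup composition; it illustrates the arithmetic, not the actual CALEX standard's composition.

```python
# Nominal specific powers in mW per gram of isotope (approximate
# published values; treat as illustrative, not reference data).
SPECIFIC_POWER_MW_PER_G = {
    "Pu238": 567.57, "Pu239": 1.9288, "Pu240": 7.0824,
    "Pu241": 3.412, "Pu242": 0.1159,
}

def effective_power(mass_fractions):
    """Watts per gram of plutonium for a given isotopic composition."""
    return sum(SPECIFIC_POWER_MW_PER_G[iso] * f
               for iso, f in mass_fractions.items()) / 1000.0

# Hypothetical 12% Pu-240 composition (mass fractions sum to 1).
comp = {"Pu238": 0.001, "Pu239": 0.862, "Pu240": 0.12,
        "Pu241": 0.015, "Pu242": 0.002}
p_eff = effective_power(comp)   # W per gram of Pu
mass_for_7_w = 7.0 / p_eff      # grams of Pu needed for a ~7 W standard
```

Under these assumptions the effective specific power comes out near 3 mW/g, so a 6-8 W standard implies roughly 2-2.5 kg of plutonium, a kilogram-scale item.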

  18. Examining a Public Montessori School's Response to the Pressures of High-Stakes Accountability

    ERIC Educational Resources Information Center

    Block, Corrie Rebecca

    2015-01-01

    A public Montessori school is expected to demonstrate high student scores on standardized assessments to succeed in the current school accountability era. A problem for a public Montessori elementary school is how to make sense of the school's high-stakes assessment scores in terms of Montessori's unique educational approach. This case study…

  19. Fermilab Today

    Science.gov Websites


  20. Teacher Leadership and High Standards in a Summer Middle School.

    ERIC Educational Resources Information Center

    Kelleher, James

    2003-01-01

    Notes that summer school has been affected by current curricular reform and high stakes testing. Describes an innovative summer school program, created through transformational teacher leadership, that developed a new vision for integrated curriculum--one that revolved around rebuilding a boat. Presents implications for both an integrated academic…

  1. Multidimensional Scaling of High School Students' Perceptions of Academic Dishonesty

    ERIC Educational Resources Information Center

    Schmelkin, Liora Pedhazur; Gilbert, Kimberly A.; Silva, Rebecca

    2010-01-01

    Although cheating on tests and other forms of academic dishonesty are considered rampant, no standard definition of academic dishonesty exists. The current study was conducted to investigate the perceptions of academic dishonesty in high school students, utilizing an innovative methodology, multidimensional scaling (MDS). Two methods were used to…

  2. Re-engineering Nascom's network management architecture

    NASA Technical Reports Server (NTRS)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 kbps) were developed following existing standards; but there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist, and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as the X Window System, Motif, and the Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network.
The MSS, CAP, MACS, and Ecom projects have indicated the potential value of commercial-off-the-shelf (COTS) and standards through reduced cost and high quality. The FARM will allow the application of the lessons learned from these projects to all future Nascom systems.

  3. The early universe history from contraction-deformation of the Standard Model

    NASA Astrophysics Data System (ADS)

    Gromov, N. A.

    2017-03-01

    The evolution of elementary particles in the early Universe, from the Planck time up to several milliseconds, is presented. The developed theory is based on the high-temperature (high-energy) limit of the Standard Model, which is generated by the contractions of its gauge groups. At infinite temperature all particles lose their masses. Only massless neutral -bosons, massless Z-quarks, neutrinos, and photons survive in this limit. The weak interactions become long-range and are mediated by neutral currents, and quarks have only one color degree of freedom.

  4. High Temperature Gas Reactors: Assessment of Applicable Codes and Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDowell, Bruce K.; Nickolaus, James R.; Mitchell, Mark R.

    2011-10-31

    Current interest expressed by industry in HTGR plants, particularly modular plants with power up to about 600 MW(e) per unit, has prompted NRC to task PNNL with assessing the currently available literature related to codes and standards applicable to HTGR plants, the operating history of past and present HTGR plants, and with evaluating the proposed designs of RPV and associated piping for future plants. Considering these topics in the order they are arranged in the text, first the operational histories of five shut-down and two currently operating HTGR plants are reviewed, leading the authors to conclude that while small, simple prototype HTGR plants operated reliably, some of the larger plants, particularly Fort St. Vrain, had poor availability. Safety and radiological performance of these plants has been considerably better than LWR plants. Petroleum processing plants provide some applicable experience with materials similar to those proposed for HTGR piping and vessels. At least one currently operating plant - HTR-10 - has performed and documented a leak before break analysis that appears to be applicable to proposed future US HTGR designs. Current codes and standards cover some HTGR materials, but not all materials are covered to the high temperatures envisioned for HTGR use. Codes and standards, particularly ASME Codes, are under development for proposed future US HTGR designs. A 'roadmap' document has been prepared for ASME Code development; a new subsection to section III of the ASME Code, ASME BPVC III-5, is scheduled to be published in October 2011. The question of terminology for the cross-duct structure between the RPV and power conversion vessel is discussed, considering the differences in regulatory requirements that apply depending on whether this structure is designated as a 'vessel' or as a 'pipe'. We conclude that designing this component as a 'pipe' is the more appropriate choice, but that the ASME BPVC allows the owner of the facility to select the preferred designation, and that either designation can be acceptable.

  5. High-definition vs. standard-definition colonoscopy in the characterization of small colonic polyps: results from a randomized trial.

    PubMed

    Longcroft-Wheaton, G; Brown, J; Cowlishaw, D; Higgins, B; Bhandari, P

    2012-10-01

    The resolution of endoscopes has increased in recent years. Modern Fujinon colonoscopes have a charge-coupled device (CCD) pixel density of 650,000 pixels compared with the 410,000-pixel CCD in standard-definition scopes. Acquiring high-definition scopes represents a significant capital investment and their clinical value remains uncertain. The aim of the current study was to investigate the impact of high-definition endoscopes on the in vivo histology prediction of colonic polyps. Colonoscopy procedures were performed using Fujinon colonoscopes and the EPX-4400 processor. Procedures were randomized to be performed using either a standard-definition EC-530 colonoscope or high-definition EC-530 and EC-590 colonoscopes. Polyps of <10 mm were assessed using both white light imaging (WLI) and flexible spectral imaging color enhancement (FICE), and the predicted diagnosis was recorded. Polyps were removed and sent for histological analysis by a pathologist who was blinded to the endoscopic diagnosis. The predicted diagnosis was compared with the histology to calculate the accuracy, sensitivity, and specificity of in vivo assessment using either standard- or high-definition scopes. A total of 293 polyps of <10 mm were examined: 150 polyps using the standard-definition colonoscope and 143 polyps using high-definition colonoscopes. There was no difference in sensitivity, specificity, or accuracy between the two scopes when WLI was used (standard vs. high: accuracy 70% [95% CI 62-77] vs. 73% [95% CI 65-80]; P=0.61). When FICE was used, high-definition colonoscopes showed a sensitivity of 93% compared with 83% for standard-definition colonoscopes (P=0.048); specificity was 81% and 82%, respectively. There was no difference between high- and standard-definition colonoscopes when white light was used, but FICE significantly improved the in vivo diagnosis of small polyps when high-definition scopes were used compared with standard definition.
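
The reported sensitivity, specificity, and accuracy figures come from a standard confusion-matrix calculation; the counts below are hypothetical, chosen only to illustrate the arithmetic behind numbers like the 93% sensitivity and 81% specificity quoted above.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-performance metrics from a confusion matrix:
    tp/fp/tn/fn are true/false positive/negative counts."""
    return {
        "sensitivity": tp / (tp + fn),            # true positive rate
        "specificity": tn / (tn + fp),            # true negative rate
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts: 100 adenomas and 100 non-adenomatous polyps.
m = diagnostic_metrics(tp=93, fp=19, tn=81, fn=7)
```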

  6. Historical review of lung counting efficiencies for low energy photon emitters

    DOE PAGES

    Jeffers, Karen L.; Hickman, David P.

    2014-03-01

    This publication reviews the measured efficiency and its variability over time of a high-purity planar germanium in vivo lung count system for multiple photon energies, using increasingly thick overlays with the Lawrence Livermore Torso Phantom. Furthermore, the measured variations in efficiency are compared with the current requirement for in vivo bioassay performance as defined by the applicable American National Standards Institute (ANSI) standard.

  7. High-Voltage-Input Level Translator Using Standard CMOS

    NASA Technical Reports Server (NTRS)

    Yager, Jeremy A.; Mojarradi, Mohammad M.; Vo, Tuan A.; Blalock, Benjamin J.

    2011-01-01

    The proposed integrated circuit would translate (1) a pair of input signals having a low differential potential and a possibly high common-mode potential into (2) a pair of output signals having the same low differential potential and a low common-mode potential. As used here, "low" and "high" refer to potentials that are, respectively, below or above the nominal supply potential (3.3 V) at which standard complementary metal oxide/semiconductor (CMOS) integrated circuits are designed to operate. The input common-mode potential could lie between 0 and 10 V; the output common-mode potential would be 2 V. This translation would make it possible to process the pair of signals by use of standard 3.3-V CMOS analog and/or mixed-signal (analog and digital) circuitry on the same integrated-circuit chip. A schematic of the circuit is shown in the figure. Standard 3.3-V CMOS circuitry cannot withstand input potentials greater than about 4 V. However, there are many applications that involve low-differential-potential, high-common-mode-potential input signal pairs and in which standard 3.3-V CMOS circuitry, which is relatively inexpensive, would be the most appropriate circuitry for performing other functions on the integrated-circuit chip that handles the high-potential input signals. Thus, there is a need to combine high-voltage input circuitry with standard low-voltage CMOS circuitry on the same integrated-circuit chip. The proposed circuit would satisfy this need. In the proposed circuit, the input signals would be coupled into both a level-shifting pair and a common-mode-sensing pair of CMOS transistors. The output of the level-shifting pair would be fed as input to a differential pair of transistors. The resulting differential current output would pass through six standoff transistors to be mirrored into an output branch by four heterojunction bipolar transistors.
The mirrored differential current would be converted back to potential by a pair of diode-connected transistors, which, by virtue of being identical to the input transistors, would reproduce the input differential potential at the output.

  8. Research and Construction of DC Energy Measurement Traceability Technology

    NASA Astrophysics Data System (ADS)

    Zhi, Wang; Maotao, Yang; Jing, Yang

    2018-02-01

    With the implementation of energy-saving and emission-reduction policies, DC energy metering has come into wide use in many fields. In view of the lack of a DC energy measurement traceability system, and building on the process of downward measurement transfer for DC charger-based field calibration and for DC energy meter and shunt calibration, the paper proposes DC fast-charging, high-DC-current, and small-DC-voltage output and measurement technologies, and builds a time-based scheme that converts the high DC voltage into a low voltage and the high current first into a low current and then into a low voltage. This leaves DC energy traceable to national standards of voltage, current, and time, filling the gap in DC energy measurement traceability.
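    The traceability chain described above reduces to scaling the high DC voltage and current down to low-voltage signals (via a calibrated divider and shunt) and integrating their product over a timebase-derived interval. A sketch under illustrative assumptions; the function name, ratios, and sample values are invented for this example, not taken from the paper:

    ```python
    # Energy E = sum(V * I * dt): the high DC voltage is reconstructed from a
    # calibrated divider output, the high current from a calibrated shunt's
    # voltage drop, and dt comes from the time standard.

    def dc_energy(v_divider_out, shunt_out, divider_ratio, shunt_ohms, dt):
        """Accumulate energy (joules) from sampled low-voltage readings.

        v_divider_out : divider output samples (V)
        shunt_out     : shunt voltage-drop samples (V)
        divider_ratio : V_high / V_low of the calibrated divider
        shunt_ohms    : resistance of the calibrated shunt (ohm)
        dt            : sample interval from the time standard (s)
        """
        energy = 0.0
        for v_lo, vs in zip(v_divider_out, shunt_out):
            v_high = v_lo * divider_ratio   # reconstruct the high DC voltage
            i_high = vs / shunt_ohms        # reconstruct the high current
            energy += v_high * i_high * dt
        return energy

    # Illustrative case: a constant 500 V at 100 A for 10 s (1000 samples of
    # 10 ms each) should integrate to 500 kJ.
    n = 1000
    e = dc_energy([0.5] * n, [0.01] * n, divider_ratio=1000,
                  shunt_ohms=1e-4, dt=0.01)
    ```

    The point of the sketch is that each factor in the product is traceable separately: the divider ratio and shunt resistance to the voltage and resistance standards, and dt to the time standard.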

  9. Designing display primaries with currently available light sources for UHDTV wide-gamut system colorimetry.

    PubMed

    Masaoka, Kenichiro; Nishida, Yukihiro; Sugawara, Masayuki

    2014-08-11

    A wide-gamut system colorimetry has been standardized for ultra-high definition television (UHDTV). The chromaticities of the primaries are designed to lie on the spectral locus to cover the major standard system colorimetries and real object colors. Although monochromatic light sources are required for a display to perfectly fulfill the system colorimetry, highly saturated emission colors from recent quantum dot technology may effectively achieve the wide gamut. This paper presents simulation results on the chromaticities of highly saturated non-monochromatic light sources and on gamut coverage of real object colors, to be considered in designing wide-gamut displays with color filters for UHDTV.

  10. DESIGN NOTE: New apparatus for haze measurement for transparent media

    NASA Astrophysics Data System (ADS)

    Yu, H. L.; Hsiao, C. C.; Liu, W. C.

    2006-08-01

    Precise measurement of luminous transmittance and haze of transparent media is increasingly important to the LCD industry. Currently there are at least three documentary standards for measuring transmission haze. Unfortunately, none of those standard methods by itself can obtain precise values for the diffuse transmittance (DT), total transmittance (TT), and haze. This note presents a new apparatus capable of precisely measuring all three variables simultaneously. Compared with current structures, the proposed design contains one more compensatory port. In the optimal design, the light trap absorbs the beam completely, light scattered by the instrument is zero, and the interior surface of the integrating sphere, the baffle, and the reflectance standard have identical reflectance characteristics. Accurate values of the TT, DT, and haze can be obtained using the new apparatus. Even if the design is not optimal, the measurement errors of the new apparatus are smaller than those of other methods, especially for high sphere reflectance. Therefore, the sphere can be made of a high-reflectance material for the new apparatus to increase the signal-to-noise ratio.
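    The three variables are tied together by the basic haze relation, haze (%) = 100 × DT/TT. As a sketch, here is the widely used four-reading stray-light correction of ASTM D1003-type methods (an assumption about which documentary standard is meant; the readings are illustrative, not from this design note):

    ```python
    # Four integrating-sphere readings: T1 = incident beam, T2 = total
    # transmitted, T3 = instrument stray light, T4 = sample diffuse + stray.

    def haze_percent(t1, t2, t3, t4):
        """Return (total transmittance, diffuse transmittance, haze %)."""
        total = t2 / t1                          # TT
        diffuse = t4 / t1 - (t3 / t1) * total    # DT, corrected for stray light
        return total, diffuse, 100.0 * diffuse / total

    # Illustrative readings for a lightly hazy sheet:
    tt, dt, haze = haze_percent(t1=100.0, t2=92.0, t3=0.5, t4=3.0)
    ```

    With these numbers TT = 0.92, DT ≈ 0.0254, and haze ≈ 2.8%; the note's point is that a single conventional method cannot deliver all three values with equal precision, which the extra compensatory port addresses.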

  11. Current HDTV overview in the United States, Japan, and Europe

    NASA Astrophysics Data System (ADS)

    Cripps, Dale E.

    1991-08-01

    Vast resources are being spent on three continents, preparing for the commercialization of HDTV. The forces that together will launch this new industry are moving at dizzying speeds. This paper covers the highlights of events past and present and offers some predictions for the future. Difficult standards problems keep the brakes on the industry, and they will continue to do so for some time to come. Standards committees have been set up around the world and are hard at work. It is a job with considerable technical and political challenges. By the time major plans and resources come together for commercialization of HDTV, one can trust that there will be adequately stable standards. But to observe the current status is to see a mess. High definition is not only consumer television. Because of its versatility, it is much more likely to find its way first into areas offering high returns such as medicine, education, printing, corporate communications, military and space, and even criminal control. HDTV is very likely to deliver movies and cultural events to theaters, and may also become the platform for a new generation of computers.

  12. Standard cell electrical and physical variability analysis based on automatic physical measurement for design-for-manufacturing purposes

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan; Parag, Allon; Khmaisy, Hafez; Krispil, Uri; Adan, Ofer; Levi, Shimon; Latinski, Sergey; Schwarzband, Ishai; Rotstein, Israel

    2011-04-01

    A fully automated system for process variability analysis of high-density standard cells was developed. The system consists of layout analysis with device mapping: device type, location, configuration, and more. The mapping step was created by a simple DRC run-set. This database was then used as an input for choosing locations for SEM images and for specific layout parameter extraction, used by SPICE simulation. This method was used to analyze large arrays of standard cell blocks, manufactured using the Tower TS013LV (Low Voltage for high-speed applications) platform. Variability of physical parameters such as Lgate and line-width roughness, as well as of electrical parameters such as drive current (Ion) and off current (Ioff), was calculated and statistically analyzed in order to understand the variability root cause. Comparison between transistors having the same W/L but different layout configurations and different layout environments (around the transistor) was made in terms of performance as well as process variability. We successfully defined "robust" and "less-robust" transistor configurations, and updated guidelines for Design-for-Manufacturing (DfM).

  13. Localized Fire Protection Assessment for Vehicle Compressed Hydrogen Containers

    DOT National Transportation Integrated Search

    2010-03-01

    Industry has identified localized flame impingement on high pressure composite storage cylinders as an area requiring research due to several catastrophic failures in recent years involving compressed natural gas (CNG) vehicles. Current standards and...

  14. Fuel inspection and reconstitution experience at Surry Power Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brookmire, T.A.

    Surry Power Station, located on the James River near Williamsburg, Virginia, has two Westinghouse pressurized water reactors. Unit 2 consistently sets a high standard of fuel performance (no indication of fuel failures in recent cycles), while unit 1, since cycle 6, has been plagued with numerous fuel failures. Both Surry units operate with Westinghouse standard 15 x 15 fuel. Virginia Power management set goals to reduce the coolant activity, thus reducing person-rem exposure and the associated costs of high coolant activity. To achieve this goal, extensive fuel examination campaigns were undertaken that included high-magnification video inspections, debris cleaning, wet and vacuum fuel sipping, fuel rod ultrasonic testing, and eddy current examination. In the summer of 1985, during cycle 8 operation, Kraftwerk Union reconstituted (repaired) the damaged, once-burned assemblies from cycles 6 and 7 by replacing failed fuel rods with solid Zircaloy-4 rods. Currently, cycle 9 has operated for 5 months without any indication of fuel failure (the cycle 9 core has two reconstituted assemblies).

  15. Ultrastable low-noise current amplifier: a novel device for measuring small electric currents with high accuracy.

    PubMed

    Drung, D; Krause, C; Becker, U; Scherer, H; Ahlers, F J

    2015-02-01

    An ultrastable low-noise current amplifier (ULCA) is presented. The ULCA is a non-cryogenic instrument based on specially designed operational amplifiers and resistor networks. It involves two stages, the first providing a 1000-fold current gain and the second performing a current-to-voltage conversion via an internal 1 MΩ reference resistor or, optionally, an external standard resistor. The ULCA's transfer coefficient is highly stable versus time, temperature, and current amplitude within the full dynamic range of ±5 nA. The low noise level of 2.4 fA/√Hz helps to keep averaging times short at small input currents. A cryogenic current comparator is used to calibrate both input current gain and output transresistance, providing traceability to the quantum Hall effect. Within one week after calibration, the uncertainty contribution from short-term fluctuations and drift of the transresistance is about 0.1 parts per million (ppm). The long-term drift is typically 5 ppm/yr. A high-accuracy variant is available that shows improved stability of the input gain at the expense of a higher noise level of 7.5 fA/√Hz. The ULCA also allows the traceable generation of small electric currents or the calibration of high-ohmic resistors.
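    The two-stage arrangement described above implies a total transresistance of 1000 × 1 MΩ = 1 GΩ, i.e., 1 V of output per nA of input. A back-of-the-envelope sketch of the signal chain and of the averaging time needed at a given noise level; the σ ≈ (noise density)/√t white-noise approximation is an illustrative assumption, not a ULCA specification:

    ```python
    # ULCA-style signal chain: 1000-fold current gain, then I-to-V conversion
    # in a 1 MOhm reference resistor.
    CURRENT_GAIN = 1000            # first stage
    R_IV = 1.0e6                   # internal reference resistor (ohm)
    A_TR = CURRENT_GAIN * R_IV     # total transresistance: 1e9 V/A

    def output_voltage(i_in):
        """Output voltage (V) for an input current (A)."""
        return A_TR * i_in

    def averaging_time(i_in, rel_uncert, noise_density=2.4e-15):
        """Rough seconds of averaging for the white-noise contribution to
        fall to rel_uncert of the signal, using sigma ~ density / sqrt(t)."""
        sigma_target = rel_uncert * i_in
        return (noise_density / sigma_target) ** 2

    v = output_voltage(100e-12)        # 100 pA input -> 0.1 V output
    t = averaging_time(100e-12, 1e-6)  # time to reach ~1 ppm at 100 pA
    ```

    Under this rough model, reaching a 1 ppm statistical uncertainty at 100 pA takes on the order of ten minutes of averaging, which is why the 2.4 fA/√Hz noise floor matters for keeping calibration times practical.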

  16. Ultrastable low-noise current amplifier: A novel device for measuring small electric currents with high accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drung, D.; Krause, C.; Becker, U.

    2015-02-15

    An ultrastable low-noise current amplifier (ULCA) is presented. The ULCA is a non-cryogenic instrument based on specially designed operational amplifiers and resistor networks. It involves two stages, the first providing a 1000-fold current gain and the second performing a current-to-voltage conversion via an internal 1 MΩ reference resistor or, optionally, an external standard resistor. The ULCA’s transfer coefficient is highly stable versus time, temperature, and current amplitude within the full dynamic range of ±5 nA. The low noise level of 2.4 fA/√Hz helps to keep averaging times short at small input currents. A cryogenic current comparator is used to calibrate both input current gain and output transresistance, providing traceability to the quantum Hall effect. Within one week after calibration, the uncertainty contribution from short-term fluctuations and drift of the transresistance is about 0.1 parts per million (ppm). The long-term drift is typically 5 ppm/yr. A high-accuracy variant is available that shows improved stability of the input gain at the expense of a higher noise level of 7.5 fA/√Hz. The ULCA also allows the traceable generation of small electric currents or the calibration of high-ohmic resistors.

  17. Ultrastable low-noise current amplifier: A novel device for measuring small electric currents with high accuracy

    NASA Astrophysics Data System (ADS)

    Drung, D.; Krause, C.; Becker, U.; Scherer, H.; Ahlers, F. J.

    2015-02-01

    An ultrastable low-noise current amplifier (ULCA) is presented. The ULCA is a non-cryogenic instrument based on specially designed operational amplifiers and resistor networks. It involves two stages, the first providing a 1000-fold current gain and the second performing a current-to-voltage conversion via an internal 1 MΩ reference resistor or, optionally, an external standard resistor. The ULCA's transfer coefficient is highly stable versus time, temperature, and current amplitude within the full dynamic range of ±5 nA. The low noise level of 2.4 fA/√Hz helps to keep averaging times short at small input currents. A cryogenic current comparator is used to calibrate both input current gain and output transresistance, providing traceability to the quantum Hall effect. Within one week after calibration, the uncertainty contribution from short-term fluctuations and drift of the transresistance is about 0.1 parts per million (ppm). The long-term drift is typically 5 ppm/yr. A high-accuracy variant is available that shows improved stability of the input gain at the expense of a higher noise level of 7.5 fA/√Hz. The ULCA also allows the traceable generation of small electric currents or the calibration of high-ohmic resistors.

  18. Perspectives on setting limits for RF contact currents: a commentary.

    PubMed

    Tell, Richard A; Tell, Christopher A

    2018-01-15

    Limits for exposure to radiofrequency (RF) contact currents are specified in the two dominant RF safety standards and guidelines developed by the Institute of Electrical and Electronics Engineers (IEEE) and the International Commission on Non-Ionizing Radiation Protection (ICNIRP). These limits are intended to prevent RF burns, caused by high local tissue current densities, when contacting RF-energized objects. We explain what contact currents are and review some history of the relevant limits, with an emphasis on so-called "touch" contacts, i.e., contact between a person and a contact current source via a very small contact area. Contact current limits were originally set on the basis of controlling the specific absorption rate resulting from the current flowing through regions of small conductive cross section within the body, such as the wrist or ankle. More recently, contact current limits have been based on thresholds of perceived heating. In the latest standard from the IEEE developed for NATO, contact current limits have been based on two research studies in which thresholds for perception of thermal warmth or thermal pain were measured. Importantly, these studies maximized conductive contact between the subject and the contact current source. This factor was found to dominate the response to heating: high-resistance contact, such as from dry skin, can result in local heating many times that from a highly conductive contact. Other factors such as electrode size and shape, frequency of the current, and the physical force associated with contact introduce uncertainty in threshold values when comparing data across multiple studies. Relying on studies in which the contact current is minimized for a given threshold does not result in conservative protection limits.
Future efforts to develop limits on contact currents should include consideration of (1) the basis for the limits (perception, pain, tissue damage); (2) understanding of the practical conditions of real-world exposure to contact currents, such as contact resistance, size and shape of the contact electrode, and applied force at the point of contact; (3) consistency in how contact currents are applied in research studies across different researchers; and (4) the effects of frequency.

  19. Performance of the NASA Digitizing Core-Loss Instrumentation

    NASA Technical Reports Server (NTRS)

    Schwarze, Gene E. (Technical Monitor); Niedra, Janis M.

    2003-01-01

    The standard method of magnetic core loss measurement was implemented on a high-frequency digitizing oscilloscope in order to explore the limits to accuracy when characterizing high-Q cores at frequencies up to 1 MHz. This method computes core loss from the cycle mean of the product of the exciting current in a primary winding and the induced voltage in a separate flux-sensing winding. It is pointed out that even 20 percent accuracy for a Q of 100 core material requires a phase angle accuracy of 0.1° between the voltage and current measurements. Experiment shows that at 1 MHz, even high-quality, high-frequency current sensing transformers can introduce phase errors of a degree or more. Because the Q of some quasilinear core materials can exceed 300 at frequencies below 100 kHz, phase angle errors can be a problem even at 50 kHz. Hence great care is necessary with current sensing and ground loops when measuring high-Q cores. The best high-frequency current sensing accuracy was obtained from a fabricated 0.1-ohm coaxial resistor, differentially sensed. Sample high-frequency core loss data taken with the setup for a permeability-14 MPP core is presented.
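    The quoted sensitivity to phase error follows from a small-angle estimate: for a core of quality factor Q, the measured loss goes as cos(φ) with φ near 90°, so a phase error Δφ shifts the apparent loss by roughly a factor Q·Δφ (in radians). A sketch of that estimate, as an approximation rather than the paper's exact analysis:

    ```python
    import math

    def loss_error_fraction(q, phase_error_deg):
        """Approximate relative core-loss error caused by a phase
        measurement error, for a core of quality factor q."""
        return q * math.radians(phase_error_deg)

    # A Q = 100 core measured with 0.1 deg phase accuracy gives roughly a
    # 17% loss error, consistent with the ~20% figure in the abstract.
    err = loss_error_fraction(q=100, phase_error_deg=0.1)
    ```

    The same estimate explains the warning about lower frequencies: at Q = 300, even a 0.1° error already corresponds to a loss error of about 50%, so phase fidelity of the current sensor dominates the error budget.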

  20. Evaluation of LOINC for Representing Constitutional Cytogenetic Test Result Reports

    PubMed Central

    Heras, Yan Z.; Mitchell, Joyce A.; Williams, Marc S.; Brothman, Arthur R.; Huff, Stanley M.

    2009-01-01

    Genetic testing is becoming increasingly important to medical practice. Integrating genetics and genomics data into electronic medical records is crucial in translating genetic discoveries into improved patient care. Information technology, especially Clinical Decision Support Systems, holds great potential to help clinical professionals take full advantage of genomic advances in their daily medical practice. However, issues relating to standard terminology and information models for exchanging genetic testing results remain relatively unexplored. This study evaluates whether the current LOINC standard is adequate to represent constitutional cytogenetic test result reports using sample result reports from ARUP Laboratories. The results demonstrate that current standard terminology is insufficient to support the needs of coding cytogenetic test results. The terminology infrastructure must be developed before clinical information systems will be able to handle the high volumes of genetic data expected in the near future. PMID:20351857

  1. Evaluation of LOINC for representing constitutional cytogenetic test result reports.

    PubMed

    Heras, Yan Z; Mitchell, Joyce A; Williams, Marc S; Brothman, Arthur R; Huff, Stanley M

    2009-11-14

    Genetic testing is becoming increasingly important to medical practice. Integrating genetics and genomics data into electronic medical records is crucial in translating genetic discoveries into improved patient care. Information technology, especially Clinical Decision Support Systems, holds great potential to help clinical professionals take full advantage of genomic advances in their daily medical practice. However, issues relating to standard terminology and information models for exchanging genetic testing results remain relatively unexplored. This study evaluates whether the current LOINC standard is adequate to represent constitutional cytogenetic test result reports using sample result reports from ARUP Laboratories. The results demonstrate that current standard terminology is insufficient to support the needs of coding cytogenetic test results. The terminology infrastructure must be developed before clinical information systems will be able to handle the high volumes of genetic data expected in the near future.

  2. Applying Registry Services to Spaceflight Technologies to Aid in the Assignment of Assigned Numbers to Disparate Systems and their Technologies to Further Enable Interoperability

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Nichols, Kelvin F.; Witherspoon, Keith R.

    2006-01-01

    To date, very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering for the many processes, schemas, vehicles, robots, space suits, and technologies (e.g., versions), to name a few, in the highly complex Constellation initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would take only one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations, and operators to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, registry support in terms of setup and operations, resolution of conflicts between registries, and other issues will need to be addressed. Registries should not be confused with repositories: no end-user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation initiative and other, unrelated space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability.
The first set of technologies will be those that are currently in need of expansion namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and Birds of a Feather (BoF) activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another standards body or technologies that are currently not standardized. For activities one through three, we will provide the analysis by either discipline or technology with rationale, identification and brief description of requirements and precedence. For activity four, we will provide a list of current standards bodies e.g. IETF and a list of potential candidates.

  3. Searching for new physics at the frontiers with lattice quantum chromodynamics.

    PubMed

    Van de Water, Ruth S

    2012-07-01

    Numerical lattice-quantum chromodynamics (QCD) simulations, when combined with experimental measurements, allow the determination of fundamental parameters of the particle-physics Standard Model and enable searches for physics beyond the Standard Model. We present the current status of lattice-QCD weak matrix element calculations needed to obtain the elements and phase of the Cabibbo-Kobayashi-Maskawa (CKM) matrix and to test the Standard Model in the quark-flavor sector. We then discuss evidence that may hint at the presence of new physics beyond the Standard Model CKM framework. Finally, we discuss two opportunities where we expect lattice QCD to play a pivotal role in searching for, and possibly discovering, new physics at upcoming high-intensity experiments: rare decays and the muon anomalous magnetic moment. The next several years may witness the discovery of new elementary particles at the Large Hadron Collider (LHC). The interplay between lattice QCD, high-energy experiments at the LHC, and high-intensity experiments will be needed to determine the underlying structure of whatever physics beyond the Standard Model is realized in nature. © 2012 New York Academy of Sciences.

  4. A new standard pulsar magnetosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Contopoulos, Ioannis; Kalapotharakos, Constantinos; Kazanas, Demosthenes, E-mail: icontop@academyofathens.gr

    2014-01-20

    In view of recent efforts to probe the physical conditions in the pulsar current sheet, we revisit the standard solution that describes the main elements of the ideal force-free pulsar magnetosphere. The simple physical requirement that the electric current contained in the current layer consist of the local electric charge moving outward at close to the speed of light yields a new solution for the pulsar magnetosphere that is ideal force-free everywhere except in the current layer. The main elements of the new solution are as follows: (1) the pulsar spindown rate of the aligned rotator is 23% larger than that of the orthogonal vacuum rotator; (2) only 60% of the magnetic flux that crosses the light cylinder opens up to infinity; (3) the electric current closes along the other 40%, which gradually converges to the equator; (4) this transfers 40% of the total pulsar spindown energy flux into the equatorial current sheet, where it is then dissipated in the acceleration of particles and in high-energy electromagnetic radiation; and (5) there is no separatrix current layer. Our solution is a minimum-free-parameter solution in that the equatorial current layer is electrostatically supported against collapse and thus does not require a thermal particle population. In this respect, it is one more step toward the development of a new standard solution. We discuss the implications for intermittent pulsars and long-duration gamma-ray bursts. We conclude that the physical conditions in the equatorial current layer determine the global structure of the pulsar magnetosphere.

  5. The Effect of High School Literacy Programs on Standardized Test Scores

    ERIC Educational Resources Information Center

    Brock, Kathryn

    2013-01-01

    Current National Assessment of Educational Progress results continued their 40-year pattern with two-thirds of U.S. 8th graders not proficient in reading, yet formal reading and literacy instruction ends in elementary school. Lack of reading proficiency can undermine academic progress in high school. Elementary literacy instruction provides…

  6. Beyond Standards: Excellence in the High School English Classroom.

    ERIC Educational Resources Information Center

    Jago, Carol

    Each student is capable of achieving excellence, but it requires a nurturing, vigorous classroom environment. To help current and future high school English teachers create and maintain this kind of environment, this book offers concrete ways to reconceive what it means to foster excellent performance in the classroom and vivid examples of student…

  7. Patch testing custom isocyanate materials from the workplace.

    PubMed

    Burrows, Dianne; Houle, Marie-Claude; Holness, D Linn; DeKoven, Joel; Skotnicki, Sandy

    2015-01-01

    Patch testing with standard trays of commercially available allergens is the current practice for investigating suspected cases of isocyanate-induced allergic contact dermatitis (ACD). In some facilities, these standard trays are further supplemented with custom preparations of isocyanate-containing materials. The aim was to determine whether there is added value in patch testing patients with custom isocyanate preparations in suspected cases of ACD. We performed a retrospective analysis of 11 patients referred to our specialty clinic between January 2003 and March 2011 for suspected ACD who had custom testing with isocyanate materials from their workplace. In addition to standard trays of allergens, all patients were patch tested with custom isocyanate materials from their workplaces. Three (27%) of 11 patients showed an added value in testing with custom isocyanate allergens. Of these 3 patients, one had a reaction that reinforced positive reactions to the standard isocyanate tray, but the other 2 (18%) had no reactions to any of the commercially available allergens. Because of the high proportion of reactions (27%), we recommend the use of custom testing with workplace isocyanate products as a supplement to current standard patch testing procedures.

  8. Resistance factors for 100% dynamic testing, with and without static load tests.

    DOT National Transportation Integrated Search

    2011-05-01

    Current department of transportation (DOT) and Federal Highway Administration (FHWA) practice has highly variable load and resistance factor design (LRFD) resistance factors, φ, for driven piles from design (e.g., Standard Penetration Tests (SPT...

  9. SpaceFibre Discussion

    NASA Technical Reports Server (NTRS)

    Rakow, Glenn

    2007-01-01

    This viewgraph presentation discusses the future use of SpaceFibre, a high-speed optical extension to SpaceWire, for NASA and DOD missions. NASA and US industries would like to work with the European developers currently working on this standard.

  10. MFC Communications Infrastructure Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Cannon; Terry Barney; Gary Cook

    2012-01-01

    Unprecedented growth of required telecommunications services and telecommunications applications is changing the way the INL does business today. High-speed connectivity, coupled with high demand for telephony and network services, requires a robust communications infrastructure. The current state of the MFC communication infrastructure limits growth opportunities for current and future communication infrastructure services. This limitation is largely due to equipment capacity issues, aging cabling infrastructure (external/internal fiber and copper cable), and inadequate space for telecommunication equipment. While some communication infrastructure improvements have been implemented over time, they have been completed without a clear overall plan and technology standard. This document identifies critical deficiencies in the current state of the communication infrastructure in operation at the MFC facilities and provides an analysis to identify needs and deficiencies to be addressed in order to achieve target architectural standards as defined in STD-170. The intent of STD-170 is to provide a robust, flexible, long-term solution to make communications capabilities align with the INL mission and fit the various programmatic growth and expansion needs.

  11. The Residual Risk Reduction Initiative: a call to action to reduce residual vascular risk in patients with dyslipidemia.

    PubMed

    Fruchart, Jean-Charles; Sacks, Frank; Hermans, Michel P; Assmann, Gerd; Brown, W Virgil; Ceska, Richard; Chapman, M John; Dodson, Paul M; Fioretto, Paola; Ginsberg, Henry N; Kadowaki, Takashi; Lablanche, Jean-Marc; Marx, Nikolaus; Plutzky, Jorge; Reiner, Zeljko; Rosenson, Robert S; Staels, Bart; Stock, Jane K; Sy, Rody; Wanner, Christoph; Zambon, Alberto; Zimmet, Paul

    2008-11-17

    Despite achieving targets for low-density lipoprotein (LDL) cholesterol, blood pressure, and glycemia in accordance with current standards of care, patients with dyslipidemia remain at high residual risk of vascular events. Atherogenic dyslipidemia, characterized by elevated triglycerides and low levels of high-density lipoprotein (HDL) cholesterol, often with elevated apolipoprotein B and non-HDL cholesterol, is common in patients with established cardiovascular disease (CVD), type 2 diabetes mellitus, or metabolic syndrome and contributes to both macrovascular and microvascular residual risk. However, atherogenic dyslipidemia is largely underdiagnosed and undertreated in clinical practice. The Residual Risk Reduction Initiative (R3i) was established to address this highly relevant clinical issue. The aims of this position paper are (1) to highlight evidence that atherogenic dyslipidemia is associated with residual macrovascular and microvascular risk in patients at high risk for CVD, despite current standards of care for dyslipidemia and diabetes; and (2) to recommend therapeutic intervention for reducing this residual vascular risk supported by evidence and expert consensus. Lifestyle modification with nutrition and exercise is an important, effective, and underutilized first step in reducing residual vascular risk. Therapeutic intervention aimed at achievement of all lipid targets is also often required. Combination lipid-modifying therapy, with the addition of niacin, a fibrate, or omega-3 fatty acids to statin therapy, increases the probability of achieving all lipid goals. Outcomes studies are in progress to evaluate whether these combination treatment strategies translate to a clinical benefit greater than that achieved with statins alone. 
The R3i highlights the need to address with lifestyle and/or pharmacotherapy the high level of residual risk of CVD events and microvascular complications among patients with dyslipidemia receiving therapy for high levels of LDL cholesterol and for diabetes in accordance with current standards of care.

  12. Precision measurement of the weak charge of the proton.

    PubMed

    2018-05-01

    Large experimental programmes in the fields of nuclear and particle physics search for evidence of physics beyond that explained by current theories. The observation of the Higgs boson completed the set of particles predicted by the standard model, which currently provides the best description of fundamental particles and forces. However, this theory's limitations include a failure to predict fundamental parameters, such as the mass of the Higgs boson, and the inability to account for dark matter and energy, gravity, and the matter-antimatter asymmetry in the Universe, among other phenomena. These limitations have inspired searches for physics beyond the standard model in the post-Higgs era through the direct production of additional particles at high-energy accelerators, which have so far been unsuccessful. Examples include searches for supersymmetric particles, which connect bosons (integer-spin particles) with fermions (half-integer-spin particles), and for leptoquarks, which mix the fundamental quarks with leptons. Alternatively, indirect searches using precise measurements of well predicted standard-model observables allow highly targeted alternative tests for physics beyond the standard model because they can reach mass and energy scales beyond those directly accessible by today's high-energy accelerators. Such an indirect search aims to determine the weak charge of the proton, which defines the strength of the proton's interaction with other particles via the well known neutral electroweak force. Because parity symmetry (invariance under the spatial inversion (x, y, z) → (-x, -y, -z)) is violated only in the weak interaction, it provides a tool with which to isolate the weak interaction and thus to measure the proton's weak charge [1].
Here we report the value 0.0719 ± 0.0045, where the uncertainty is one standard deviation, derived from our measured parity-violating asymmetry in the scattering of polarized electrons on protons, which is -226.5 ± 9.3 parts per billion (the uncertainty is one standard deviation). Our value for the proton's weak charge is in excellent agreement with the standard model [2] and sets multi-teraelectronvolt-scale constraints on any semi-leptonic parity-violating physics not described within the standard model. Our results show that precision parity-violating measurements enable searches for physics beyond the standard model that can compete with direct searches at high-energy accelerators and, together with astronomical observations, can provide fertile approaches to probing higher mass scales.

  13. High-Precision Measurement of the Ne19 Half-Life and Implications for Right-Handed Weak Currents

    NASA Astrophysics Data System (ADS)

    Triambak, S.; Finlay, P.; Sumithrarachchi, C. S.; Hackman, G.; Ball, G. C.; Garrett, P. E.; Svensson, C. E.; Cross, D. S.; Garnsworthy, A. B.; Kshetri, R.; Orce, J. N.; Pearson, M. R.; Tardiff, E. R.; Al-Falou, H.; Austin, R. A. E.; Churchman, R.; Djongolov, M. K.; D'Entremont, R.; Kierans, C.; Milovanovic, L.; O'Hagan, S.; Reeve, S.; Sjue, S. K. L.; Williams, S. J.

    2012-07-01

    We report a precise determination of the Ne19 half-life to be T1/2 = 17.262 ± 0.007 s. This result disagrees with the most recent precision measurements and is important for placing bounds on predicted right-handed interactions that are absent in the current standard model. We are able to identify and disentangle two competing systematic effects that influence the accuracy of such measurements. Our findings prompt a reassessment of results from previous high-precision lifetime measurements that used similar equipment and methods.

  14. High-precision measurement of the 19Ne half-life and implications for right-handed weak currents.

    PubMed

    Triambak, S; Finlay, P; Sumithrarachchi, C S; Hackman, G; Ball, G C; Garrett, P E; Svensson, C E; Cross, D S; Garnsworthy, A B; Kshetri, R; Orce, J N; Pearson, M R; Tardiff, E R; Al-Falou, H; Austin, R A E; Churchman, R; Djongolov, M K; D'Entremont, R; Kierans, C; Milovanovic, L; O'Hagan, S; Reeve, S; Sjue, S K L; Williams, S J

    2012-07-27

    We report a precise determination of the (19)Ne half-life to be T(1/2)=17.262±0.007 s. This result disagrees with the most recent precision measurements and is important for placing bounds on predicted right-handed interactions that are absent in the current standard model. We are able to identify and disentangle two competing systematic effects that influence the accuracy of such measurements. Our findings prompt a reassessment of results from previous high-precision lifetime measurements that used similar equipment and methods.
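
    The quoted half-life fixes the 19Ne decay constant, and its uncertainty, directly. A quick conversion using only the numbers from the abstract:

```python
import math

# 19Ne half-life from the abstract: T1/2 = 17.262 +/- 0.007 s
t_half = 17.262   # s
dt_half = 0.007   # s (one standard deviation)

# lambda = ln(2) / T1/2; its relative uncertainty equals that of T1/2
lam = math.log(2) / t_half
dlam = lam * dt_half / t_half

print(f"lambda = {lam:.6f} +/- {dlam:.6f} s^-1")  # ~0.040155 +/- 0.000016 s^-1
```

    The 0.04% relative precision on the half-life is what makes this measurement sensitive enough to constrain right-handed weak currents.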

  15. Multijunction high voltage concentrator solar cells

    NASA Technical Reports Server (NTRS)

    Valco, G. J.; Kapoor, V. J.; Evans, J. C.; Chai, A.-T.

    1981-01-01

    Standard integrated circuit technology was used to design and fabricate new planar multijunction solar cell chips for concentrated-sunlight applications. This 1 cm x 1 cm cell consisted of several voltage-generating regions, called unit cells, which were internally connected in series within a single chip, resulting in high open-circuit voltages. Typical open-circuit voltages of 3.6 V and short-circuit currents of 90 mA were obtained at 80 AM1 suns. A dramatic increase in both short-circuit current and open-circuit voltage with increased light level was observed.

  16. Alkoxybenzothiadiazole-Based Fullerene and Nonfullerene Polymer Solar Cells with High Shunt Resistance for Indoor Photovoltaic Applications.

    PubMed

    Park, Song Yi; Li, Yuxiang; Kim, Jaewon; Lee, Tack Ho; Walker, Bright; Woo, Han Young; Kim, Jin Young

    2018-01-31

    We synthesized three semicrystalline polymers (PTTBT-BO, PDTBT-BO, and P2FDTBT-BO) by modulating the intra- and intermolecular noncovalent Coulombic interactions and investigated their photovoltaic characteristics under various light intensities. Low series (Rs) and high shunt (Rsh) resistances are essential prerequisites for good device properties under standard illumination (100 mW cm⁻²). Considering these factors, among the three polymers, PDTBT-BO polymer solar cells (PSCs) exhibited the most desirable characteristics, with peak power conversion efficiencies (PCE) of 7.52 and 9.60% when blended with PC71BM under standard and dim light (2.5 mW cm⁻²), respectively. P2FDTBT-BO PSCs exhibited a low PCE of 3.69% under standard light due to significant charge recombination with a high Rs (9.42 Ω cm²). However, the PCE improved remarkably, by a factor of 2.3 (to 8.33%), under dim light, showing a negligible decrease in open-circuit voltage and a remarkable increase in fill factor, owing to an exceptionally high Rsh of over 1000 kΩ cm². Rs is less significant under dim light because the generated current is too small to cause noticeable Rs-induced voltage losses. Instead, a high Rsh becomes more important to avoid leakage currents. This work provides important tips for further optimizing PSCs for indoor applications with low-power electronic devices such as Internet of Things sensors.
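
    The trade-off described here, series resistance dominating losses under bright light and shunt resistance dominating under dim light, can be illustrated with back-of-the-envelope loss terms. Only the two resistances below come from the abstract; the photocurrent densities and operating voltage are hypothetical placeholders:

```python
r_s = 9.42    # ohm*cm^2, the high series resistance quoted for P2FDTBT-BO
r_sh = 1e6    # ohm*cm^2, a shunt resistance "over 1000 kOhm cm^2"
v_op = 0.6    # V, hypothetical operating voltage

# Hypothetical photocurrent densities (A/cm^2) at the two light levels
for label, j in [("standard light (100 mW/cm^2)", 15e-3),
                 ("dim light (2.5 mW/cm^2)", 0.4e-3)]:
    series_drop = j * r_s       # voltage lost across the series resistance
    shunt_leak = v_op / r_sh    # current leaking through the shunt resistance
    print(f"{label}: Rs drop = {series_drop*1e3:.1f} mV, "
          f"shunt leakage = {100*shunt_leak/j:.3f}% of photocurrent")
```

    With these placeholder numbers the Rs drop is roughly 140 mV under full illumination but only a few millivolts in dim light, while the fractional shunt leakage grows by about the same factor as the light level falls, which is why a very high Rsh pays off indoors.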

  17. Singlino resonant dark matter and 125 GeV Higgs boson in high-scale supersymmetry.

    PubMed

    Ishikawa, Kazuya; Kitahara, Teppei; Takimoto, Masahiro

    2014-09-26

    We consider a singlino dark matter (DM) scenario in a singlet extension of the minimal supersymmetric standard model, the so-called nearly minimal supersymmetric standard model. We find that, with high-scale supersymmetry breaking, the singlino can acquire a sizable radiative correction to its mass, which opens a window for a DM scenario with resonant annihilation via exchange of the Higgs boson. We show that the current DM relic abundance and the Higgs boson mass can be explained simultaneously. This scenario can be fully probed by XENON1T.

  18. Perception Matters for Clinical Perfectionism and Social Anxiety

    PubMed Central

    Levinson, Cheri A.; Rodebaugh, Thomas L.; Shumaker, Erik A.; Menatti, Andrew R.; Weeks, Justin W.; White, Emily K.; Heimberg, Richard G.; Warren, Cortney S.; Blanco, Carlos; Schneier, Franklin; Liebowitz, Michael R.

    2014-01-01

    Despite research documenting a relationship between social anxiety and perfectionism, very little research has examined the relationship between social anxiety and clinical perfectionism, defined as the combination of high personal standards and high maladaptive perfectionistic evaluative concern. In the current studies we examined whether clinical perfectionism predicted social anxiety in a large sample of undergraduates (N = 602), in a clinical sample of participants diagnosed with social anxiety disorder (SAD; N = 180), and by using a variance decomposition model of self- and informant-report of perfectionism (N = 134). Using self-report, we found that an interaction of personal standards and evaluative concern predicted both social interaction anxiety and fear of scrutiny, but not in the theorized direction. Specifically, we found that self-report of low standards and high evaluative concern was associated with the highest levels of social anxiety, suggesting that when individuals with SAD hold low expectations for themselves combined with high concerns about evaluation, social anxiety symptoms may increase. Alternatively, when an informant's perspective was considered, and more consistent with the original theory, we found that the interaction of informant-only report of personal standards and shared report (between both primary participant and informant) of concern over mistakes was associated with self-reported social anxiety, such that high concern over mistakes and high personal standards predicted the highest levels of social anxiety. Theoretical, clinical, and measurement implications for clinical perfectionism are discussed. PMID:25486087

  19. Sharp burnout failure observed in high current-carrying double-walled carbon nanotube fibers

    NASA Astrophysics Data System (ADS)

    Song, Li; Toth, Geza; Wei, Jinquan; Liu, Zheng; Gao, Wei; Ci, Lijie; Vajtai, Robert; Endo, Morinobu; Ajayan, Pulickel M.

    2012-01-01

    We report on the current-carrying capability and the high-current-induced thermal burnout failure modes of 5-20 µm diameter double-walled carbon nanotube (DWNT) fibers made by an improved dry-spinning method. The electrical conductivity and maximum current-carrying capability of these DWNT fibers reach up to 5.9 × 10⁵ S m⁻¹ and over 1 × 10⁵ A cm⁻² in air. In comparison, standard carbon fiber tended to oxidize and burn out into a cheese-like morphology when the maximum current was reached, while DWNT fiber showed a much slower breakdown behavior due to gradual burnout of individual nanotubes. Electron microscopy observations further confirmed that the failure process of DWNT fibers occurs at localized positions and that, while the individual nanotubes burn, they also become aligned due to the local high temperature and electrostatic field. In addition, a finite element model was constructed to gain a better understanding of the failure behavior of DWNT fibers.
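
    For perspective, the quoted current density corresponds to quite small absolute currents at these diameters. A quick check for a fiber in the middle of the quoted range (the 10 µm diameter choice is ours):

```python
import math

j_max = 1e5    # A/cm^2, maximum current density from the abstract
sigma = 5.9e5  # S/m, electrical conductivity from the abstract
d = 10e-6      # m, fiber diameter (mid-range of the quoted 5-20 um)

area_cm2 = math.pi * (d * 100 / 2) ** 2   # cross-section in cm^2
area_m2 = math.pi * (d / 2) ** 2          # cross-section in m^2

i_max = j_max * area_cm2                  # maximum current, A
r_per_m = 1 / (sigma * area_m2)           # resistance per meter, ohm/m

print(f"I_max ~ {i_max*1e3:.1f} mA, R ~ {r_per_m/1e3:.1f} kohm/m")
```

    So a 10 µm fiber at the quoted current density carries on the order of 80 mA, with a resistance per unit length of roughly 20 kΩ/m.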

  20. Evaluation of a new approach to compute intervertebral disc height measurements from lateral radiographic views of the spine.

    PubMed

    Allaire, Brett T; DePaolis Kaluza, M Clara; Bruno, Alexander G; Samelson, Elizabeth J; Kiel, Douglas P; Anderson, Dennis E; Bouxsein, Mary L

    2017-01-01

    Current standard methods to quantify disc height, namely distortion compensated Roentgen analysis (DCRA), have been mostly utilized in the lumbar and cervical spine and have strict exclusion criteria. Specifically, discs adjacent to a vertebral fracture are excluded from measurement, thus limiting the use of DCRA in studies that include older populations with a high prevalence of vertebral fractures. Thus, we developed and tested a modified DCRA algorithm that does not depend on vertebral shape. Participants included 1186 men and women from the Framingham Heart Study Offspring and Third Generation Multidetector CT Study. Lateral CT scout images were used to place 6 morphometry points around each vertebra at 13 vertebral levels in each participant. Disc heights were calculated from these morphometry points using DCRA methodology and our modified version of DCRA, which requires information from fewer morphometry points than the standard DCRA. Modified DCRA and standard DCRA measures of disc height are highly correlated, with concordance correlation coefficients above 0.999. Both measures demonstrate good inter- and intra-operator reproducibility. 13.9% of available disc heights were not evaluable or were excluded using the standard DCRA algorithm, while only 3.3% were not evaluable using our modified DCRA algorithm. Using our modified DCRA algorithm, it is not necessary to exclude vertebrae with fracture or other deformity from disc height measurements, as in the standard DCRA. Modified DCRA also yields virtually identical measurements to the standard DCRA. Thus, the use of modified DCRA for quantitative assessment of disc height will lead to less missing data without any loss of accuracy, making it a preferred alternative to the current standard methodology.
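
    In very simplified form, a disc-height measure of this family reduces to the distance between the midpoints of the two facing vertebral endplates. The sketch below is an illustrative reduction using hypothetical corner points, not the published DCRA or modified-DCRA algorithm:

```python
def midpoint(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

def disc_height(lower_endplate, upper_endplate):
    """lower_endplate: (anterior, posterior) corner points of the superior
    endplate of the lower vertebra; upper_endplate: the same for the inferior
    endplate of the vertebra above. Points are (x, y) in image units."""
    a = midpoint(*lower_endplate)
    b = midpoint(*upper_endplate)
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

# Hypothetical coordinates (image units):
lower = ((0.0, 10.0), (30.0, 10.0))  # superior endplate of lower vertebra
upper = ((0.0, 16.0), (30.0, 16.0))  # inferior endplate of upper vertebra
print(disc_height(lower, upper))     # -> 6.0
```

    Because only the endplate points bounding the disc enter this computation, a deformity elsewhere in the vertebral body need not invalidate the measurement, which is the intuition behind relaxing the fracture exclusion.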

  1. Bringing Engineering Design into High School Science Classrooms: The Heating/Cooling Unit

    ERIC Educational Resources Information Center

    Apedoe, Xornam S.; Reynolds, Birdy; Ellefson, Michelle R.; Schunn, Christian D.

    2008-01-01

    Infusing engineering design projects in K-12 settings can promote interest and attract a wide range of students to engineering careers. However, the current climate of high-stakes testing and accountability to standards leaves little room to incorporate engineering design into K-12 classrooms. We argue that design-based learning, the combination…

  2. Test Anxiety Associated with High-Stakes Testing among Elementary School Children: Prevalence, Predictors, and Relationship to Student Performance

    ERIC Educational Resources Information Center

    Segool, Natasha Katherine

    2009-01-01

    The current study explored differences in test anxiety on high-stakes standardized achievement testing and classroom testing among elementary school children. This is the first study to directly examine differences in student test anxiety across two testing conditions with different stakes among young children. Three hundred and thirty-five…

  3. How We Define Success: Holding Values in an Era of High Stakes Accountability

    ERIC Educational Resources Information Center

    Gasoi, Emily

    2009-01-01

    In the current climate of high stakes testing and tough love rhetoric, many educational stakeholders have become increasingly reliant on standardized test scores to determine whether or not individual students, teachers, and schools--and even entire districts and states--are successful. In contrast to the black and white picture that test-driven…

  4. Business Policies and Procedures of High School Newspapers.

    ERIC Educational Resources Information Center

    Campbell, Laurence R.

    The purpose of this inquiry was to identify the current business policies and procedures of high school newspapers in the United States and to determine whether such an appraisal could be used to achieve higher standards in both education and journalism. Most of the data was gathered in early 1968 by questionnaires sent to 548 public and 68…

  5. Instructional Design Implications about Comprehension of Listening to Music before and during Reading

    ERIC Educational Resources Information Center

    Hinrichs, Amy F.

    2013-01-01

    Low reading levels and lack of comprehension are current problems in high school classrooms confirmed by low standardized test scores and employer feedback as comprehension problems move into the workplace with students who do not have the necessary reading skills on the job. Midwestern high school science club students served as participants in…

  6. Measurements of Aircraft Wake Vortex Separation at High Arrival Rates and a Proposed New Wake Vortex Separation Philosophy

    NASA Technical Reports Server (NTRS)

    Rutishauser, David; Donohue, George L.; Haynie, Rudolph C.

    2003-01-01

    This paper presents data and a proposed new aircraft wake vortex separation standard that argues for a fundamental re-thinking of international practice. The current static standard, under certain atmospheric conditions, presents an unnecessary restriction on system capacity. A new approach, that decreases aircraft separation when atmospheric conditions dictate, is proposed based upon the availability of new instrumentation and a better understanding of wake physics.

  7. [Objectives and limits of test standards].

    PubMed

    Kaddick, C; Blömer, W

    2014-06-01

    Test standards are developed worldwide by extremely committed expert groups working mostly in an honorary capacity and have substantially contributed to the currently achieved safety standards in reconstructive orthopedics. Regardless of its distribution and quality, a test specification cannot replace the specialist knowledge of the user or a well-founded risk analysis and, if applied unthinkingly, can lead to a false estimation of safety. The limits of standardization are reached where new indications or highly innovative products are concerned. In this case the manufacturer must take the time- and cost-intensive route of a self-developed testing procedure, which in the ideal case leads to a further testing standard. Test standards make a substantial contribution to implant safety but cannot replace the expert knowledge of the user. Tests as an end in themselves reduce the actual objectives of standardization to absurdity.

  8. Thermoelectric converters for alternating current standards

    NASA Astrophysics Data System (ADS)

    Anatychuk, L. I.; Taschuk, D. D.

    2012-06-01

    Thermoelectric converters remain the instruments of choice for alternating current standards. This work presents the design and manufacture of an alternating current converter for a military standard of alternating current in Ukraine. Results of simulations of the temperature distribution in the converter elements and ways of optimizing the converter to improve the accuracy of alternating current signal reproduction are presented, and results of metrological trials are given. The quality of a thermoelectric material created specially for alternating current metrology is verified. The converter was used in an alternating current standard for the frequency range from 10 Hz to 30 MHz. The efficiency of using thermoelectric signal converters in measuring instruments is confirmed.

  9. Portable precision dc voltage-current transfer standard for electrometer calibration

    USGS Publications Warehouse

    Landis, G.; Godwin, M.

    1982-01-01

    A circuit design is presented for an instrument providing a highly stable and fully adjustable voltage and current in the ranges 0-1.999 V or 0-199.9 mV and 10⁻¹¹-10⁻¹⁵ A. This instrument is used to verify the calibration and performance of dc and vibrating-reed electrometers and chart recorders on mass spectrometers of the USGS Isotope Laboratories in Denver.

  10. Endometrial Cancer Screening (PDQ®)—Patient Version

    Cancer.gov

    Endometrial cancer screening is currently not recommended because no standard or routine screening test has been shown to be effective. Endometrial cancer is usually found early due to symptoms and survival rates are high. Learn more in this expert-reviewed summary.

  11. ASTM and VAMAS activities in titanium matrix composites test methods development

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.; Harmon, D. M.; Bartolotta, P. A.; Russ, S. M.

    1994-01-01

    Titanium matrix composites (TMC's) are being considered for a number of aerospace applications ranging from high performance engine components to airframe structures in areas that require high stiffness-to-weight ratios at temperatures up to 400 °C. TMC's exhibit unique mechanical behavior due to fiber-matrix interface failures, matrix cracks bridged by fibers, thermo-viscoplastic behavior of the matrix at elevated temperatures, and the development of significant thermal residual stresses in the composite due to fabrication. Standard testing methodology must be developed to reflect the uniqueness of this type of material system. The purpose of this paper is to review the current activities in ASTM and the Versailles Project on Advanced Materials and Standards (VAMAS) that are directed toward the development of standard test methodology for titanium matrix composites.

  12. Search for light gauge bosons of the dark sector at the Mainz Microtron.

    PubMed

    Merkel, H; Achenbach, P; Ayerbe Gayoso, C; Bernauer, J C; Böhm, R; Bosnar, D; Debenjak, L; Denig, A; Distler, M O; Esser, A; Fonvieille, H; Friščić, I; Middleton, D G; Müller, U; Nungesser, L; Pochodzalla, J; Rohrbeck, M; Sánchez Majos, S; Schlimme, B S; Schoth, M; Sirca, S; Weinriefer, M

    2011-06-24

    A new exclusion limit for the electromagnetic production of a light U(1) gauge boson γ' decaying to e+e− was determined by the A1 Collaboration at the Mainz Microtron. Such light gauge bosons appear in several extensions of the standard model and are also discussed as candidates for the interaction of dark matter with standard model matter. In electron scattering from a heavy nucleus, the existing limits for a narrow state coupling to e+e− were reduced by nearly an order of magnitude in the range of the lepton pair mass of 210 MeV/c²

  13. Is the Physical Being Taken out of Physical Education? On the Possible Effects of High-Stakes Testing on an Embattled Profession's Curriculum Goals

    ERIC Educational Resources Information Center

    Seymour, Clancy; Garrison, Mark

    2015-01-01

    Building on recent discussions regarding how current national standards for physical education promote cognitive outcomes over physical outcomes, the authors explore how a new era in high-stakes testing is also contributing to an emphasis on the cognitive, over the physical. While high-stakes testing has been linked to reducing the amount of…

  14. Control of short-channel effects in InAlN/GaN high-electron mobility transistors using graded AlGaN buffer

    NASA Astrophysics Data System (ADS)

    Han, Tiecheng; Zhao, Hongdong; Peng, Xiaocan; Li, Yuhai

    2018-04-01

    A graded AlGaN buffer is designed to realize a p-type buffer by inducing polarization-doped holes. Using a two-dimensional device simulator, the effects of the graded AlGaN buffer on the direct-current (DC) and radio-frequency (RF) performance of short-gate InAlN/GaN high-electron-mobility transistors (HEMTs) are investigated theoretically. Compared to a standard HEMT, enhanced electron confinement and good control of short-channel effects (SCEs) are demonstrated in the graded AlGaN buffer HEMT. Accordingly, the pinch-off behavior and the gate modulation capability are significantly improved, and no serious SCEs are observed in the graded AlGaN buffer HEMT with an aspect ratio (LG/tch) of about 6.7, much lower than that of the standard HEMT (LG/tch = 13). In addition, for a 70-nm gate length, a peak current-gain cutoff frequency (fT) of 171 GHz and a power-gain cutoff frequency (fmax) of 191 GHz are obtained in the graded-buffer HEMT, which are higher than those of the standard one with the same gate length.

  15. A comparison of the imaging characteristics of the new Kodak Hyper Speed G film with the current T-MAT G/RA film and the CR 9000 system.

    PubMed

    Monnin, P; Gutierrez, D; Bulling, S; Lepori, D; Verdun, F R

    2005-10-07

    Three standard radiation qualities (RQA 3, RQA 5 and RQA 9) and two screens, Kodak Lanex Regular and Insight Skeletal, were used to compare the imaging performance and dose requirements of the new Kodak Hyper Speed G and the current Kodak T-MAT G/RA medical x-ray films. The noise equivalent quanta (NEQ) and detective quantum efficiencies (DQE) of the four screen-film combinations were measured at three gross optical densities and compared with the characteristics for the Kodak CR 9000 system with GP (general purpose) and HR (high resolution) phosphor plates. The new Hyper Speed G film has double the intrinsic sensitivity of the T-MAT G/RA film and a higher contrast in the high optical density range for comparable exposure latitude. By providing both high sensitivity and high spatial resolution, the new film significantly improves the compromise between dose and image quality. As expected, the new film has a higher noise level and a lower signal-to-noise ratio than the standard film, although in the high frequency range this is compensated for by a better resolution, giving better DQE results--especially at high optical density. Both screen-film systems outperform the phosphor plates in terms of MTF and DQE for standard imaging conditions (Regular screen at RQA 5 and RQA 9 beam qualities). At low energy (RQA 3), the CR system has a comparable low-frequency DQE to screen-film systems when used with a fine screen at low and middle optical densities, and a superior low-frequency DQE at high optical density.
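
    The two figures of merit used here are linked through the incident fluence q: DQE(f) = NEQ(f)/q, i.e. the fraction of incident quanta the detector effectively uses at each spatial frequency. A toy calculation with hypothetical values, purely to show the relation:

```python
q = 2.5e5                        # incident fluence, quanta/mm^2 (hypothetical)
neq = {0.5: 9.0e4, 2.0: 3.0e4}   # NEQ at two spatial frequencies (hypothetical)

# DQE(f) = NEQ(f) / q, dimensionless, between 0 and 1
for f, n in neq.items():
    print(f"f = {f} lp/mm: DQE = {n / q:.2f}")
```

    An ideal detector would have DQE = 1 at all frequencies; the falloff with frequency in the hypothetical numbers above mirrors the behavior compared across the screen-film and CR systems in the abstract.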

  16. A review of the quantum current standard

    NASA Astrophysics Data System (ADS)

    Kaneko, Nobu-Hisa; Nakamura, Shuji; Okazaki, Yuma

    2016-03-01

    The electric current, voltage, and resistance standards are the most important standards related to electricity and magnetism. Of these three standards, only the ampere, which is the unit of electric current, is an International System of Units (SI) base unit. However, even with modern technology, relatively large uncertainty exists regarding the generation and measurement of current. As a result of various innovative techniques based on nanotechnology and novel materials, new types of junctions for quantum current generation and single-electron current sources have recently been proposed. These newly developed methods are also being used to investigate the consistency of the three quantum electrical effects, i.e. the Josephson, quantum Hall, and single-electron tunneling effects, which are also known as ‘the quantum metrology triangle’. This article describes recent research and related developments regarding current standards and quantum-metrology-triangle experiments.
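
    The single-electron current sources mentioned above deliver one electron charge per cycle of the drive frequency, so the generated current is simply I = e·f:

```python
e = 1.602176634e-19  # elementary charge in coulombs (exact in the revised SI)

for f_ghz in (1.0, 2.0, 6.5):
    i_pa = e * f_ghz * 1e9 * 1e12  # convert Hz -> A, then A -> pA
    print(f"f = {f_ghz} GHz -> I = e*f = {i_pa:.1f} pA")
```

    At 1 GHz this gives about 160 pA and at 6.5 GHz just over 1 nA, which is why sub-ppm accuracy at such small plateau currents is metrologically demanding.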

  17. Development and Pilot Testing of a Standardized Training Program for a Patient-Mentoring Intervention to Increase Adherence to Outpatient HIV Care

    PubMed Central

    Mignogna, Joseph; Stanley, Melinda A.; Davila, Jessica; Wear, Jackie; Amico, K. Rivet; Giordano, Thomas P.

    2012-01-01

    Although peer interventionists have been successful in medication treatment-adherence interventions, their role in complex behavior-change approaches to promote entry and reentry into HIV care requires further investigation. The current study sought to describe and test the feasibility of a standardized peer-mentor training program used for MAPPS (Mentor Approach for Promoting Patient Self-Care), a study designed to increase engagement and attendance at HIV outpatient visits among high-risk HIV inpatients using HIV-positive peer interventionists to deliver a comprehensive behavioral change intervention. Development of MAPPS and its corresponding training program included collaborations with mentors from a standing outpatient mentor program. The final training program included (1) a half-day workshop; (2) practice role-plays; and (3) formal, standardized patient role-plays, using trained actors with “real-time” video observation (and ratings from trainers). Mentor training occurred over a 6-week period and required demonstration of adherence and skill, as rated by MAPPS trainers. Although time intensive, ultimate certification of mentors suggested the program was both feasible and effective. Survey data indicated mentors thought highly of the training program, while objective rating data from trainers indicated mentors were able to understand and display standards associated with intervention fidelity. Data from the MAPPS training program provide preliminary evidence that peer mentors can be trained to levels necessary to ensure intervention fidelity, even within moderately complex behavioral-change interventions. Although additional research is needed due to limitations of the current study (e.g., limited generalizability due to sample size and limited breadth of clinical training opportunities), data from the current trial suggest that training programs such as MAPPS appear both feasible and effective. PMID:22248331

  18. Effectiveness of Unmanned Surface Vehicles in Anti-submarine Warfare with the Goal of Protecting a High Value Unit

    DTIC Science & Technology

    2015-06-01

    ...are positioned on the outer ASW screen to protect an HVU from submarine attacks. This baseline scenario provides a standardized benchmark on current...

  19. The effect of secular trends in the classroom furniture mismatch: support for continuous update of school furniture standards.

    PubMed

    Castellucci, H I; Arezes, P M; Molenbroek, J F M; Viviani, C

    2015-01-01

    In order to create safer schools, the Chilean authorities published a Standard regarding school furniture dimensions. The aims of this study are twofold: to verify the existence of positive secular trend within the Chilean student population and to evaluate the potential mismatch between the anthropometric characteristics and the school furniture dimensions defined by the mentioned standard. The sample consists of 3078 subjects. Eight anthropometric measures were gathered, together with six furniture dimensions from the mentioned standard. There is an average increase for some dimensions within the Chilean student population over the past two decades. Accordingly, almost 18% of the students will find the seat height to be too high. Seat depth will be considered as being too shallow for 42.8% of the students. It can be concluded that the Chilean student population has increased in stature, which supports the need to revise and update the data from the mentioned Standard. Positive secular trend resulted in high levels of mismatch if furniture is selected according to the current Chilean Standard which uses data collected more than 20 years ago. This study shows that school furniture standards need to be updated over time.

  20. Principals' Portfolios: A Reflective Process for Displaying Professional Competencies, Personal Qualities and Job Accomplishments

    ERIC Educational Resources Information Center

    Green, James E.

    2004-01-01

    The current emphasis on high-stakes testing is leaving an unmistakable imprint on all aspects of education. Our curriculum, our instructional methods and materials and even our understanding of the purpose of public education are being reshaped by the standardized tests. Another area where the impact of high-stakes testing can be felt is in the…

  1. Attending High School Algebra I: In Search of Well-Managed, Engaging, Culturally Relevant, and Caring Classrooms

    ERIC Educational Resources Information Center

    Gannett, Cassandra Dunn

    2012-01-01

The inequities in learning between the rich and the poor have become pervasive in the United States. This is evidenced by the high school graduation rates, college attendance percentages, and employment statistics. Upon another wave of reform, the Common Core State Standards in mathematics are currently being adopted in hopes of increasing learning…

  2. Streamflow and nutrient dependence of temperature effects on dissolved oxygen in low-order forest streams

    Treesearch

    April Mason; Y. Jun Xu; Philip Saksa; Adrienne Viosca; Johnny M. Grace; John Beebe; Richard Stich

    2007-01-01

    Low dissolved oxygen (DO) concentrations in streams can be linked to both natural conditions and human activities. In Louisiana, natural stream conditions such as low flow, high temperature and high organic content, often result in DO levels already below current water quality criteria, making it difficult to develop standards for Best Management Practices (BMPs)....

  3. The Role of Districts in Fostering Instructional Improvement Lessons from Three Urban Districts Partnered with the Institute for Learning

    ERIC Educational Resources Information Center

    Marsh, Julie A.; Kerr, Kerri A.; Ikemoto, Gina S.; Darilek, Hilary; Suttorp, Marika; Zimmer, Ron W.; Barney, Heather

    2005-01-01

    The current high-stakes accountability environment brought on by the federal No Child Left Behind Act (NCLB) places great pressure on school districts to demonstrate success by meeting yearly progress goals for student achievement and eventually demonstrating that all students achieve at high standards. In particular, many urban school…

  4. Can high-dose fotemustine reverse MGMT resistance in glioblastoma multiforme?

    PubMed

    Gallo, Chiara; Buonerba, Carlo; Di Lorenzo, Giuseppe; Romeo, Valeria; De Placido, Sabino; Marinelli, Alfredo

    2010-11-01

Glioblastoma multiforme (GBM), the highest grade malignant glioma, is associated with a grim prognosis: median overall survival is in the range 12-15 months, despite optimum treatment. Surgery to the maximum possible extent, external beam radiotherapy, and systemic temozolomide chemotherapy are current standard treatments for newly diagnosed GBM, with intracerebral delivery of carmustine wafers (Gliadel). Unfortunately, the effectiveness of chemotherapy can be hampered by the DNA repair enzyme O6-methylguanine methyltransferase (MGMT), which confers resistance both to temozolomide and nitrosoureas, for example fotemustine and carmustine. MGMT activity can be measured by PCR and immunohistochemistry, with the former being the current validated technique. High-dose chemotherapy can deplete MGMT levels in GBM cells and has proved feasible in various trials on temozolomide, in both newly diagnosed and recurrent GBM. Here we report the unique case of a GBM patient, with high MGMT expression by immunohistochemistry, who underwent an experimental, high-dose fotemustine schedule after surgery and radiotherapy. Although treatment caused two episodes of grade 3-4 thrombocytopenia, a complete response and survival of more than three years were achieved, with a 30% increase in dose intensity compared with the standard fotemustine schedule.

  5. An Integrated Power-Efficient Active Rectifier With Offset-Controlled High Speed Comparators for Inductively Powered Applications

    PubMed Central

    Lee, Hyung-Min; Ghovanloo, Maysam

    2011-01-01

We present an active full-wave rectifier with offset-controlled high speed comparators in standard CMOS that provides high power conversion efficiency (PCE) in the high frequency (HF) range for inductively powered devices. This rectifier provides much lower dropout voltage and far better PCE compared to passive on-chip or off-chip rectifiers. The built-in offset-control functions in the comparators compensate for both turn-on and turn-off delays in the main rectifying switches, thus maximizing the forward current delivered to the load and minimizing the back current to improve the PCE. We have fabricated this active rectifier in a 0.5-μm 3M2P standard CMOS process, occupying 0.18 mm² of chip area. With 3.8 V peak ac input at 13.56 MHz, the rectifier provides 3.12 V dc output to a 500 Ω load, resulting in a PCE of 80.2%, which is the highest measured at this frequency. In addition, overvoltage protection (OVP) as a safety measure and built-in back telemetry capabilities have been incorporated in our design using detuning and load shift keying (LSK) techniques, respectively, and tested. PMID:22174666
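The output and efficiency figures quoted in this abstract imply a particular input power; the following is a quick arithmetic check using only the values stated above (3.12 V dc across a 500 Ω load, 80.2% PCE):

```python
# Sanity check of the reported rectifier figures (values taken from the
# abstract; this is an illustrative calculation, not the authors' analysis).
V_OUT = 3.12      # dc output voltage (V)
R_LOAD = 500.0    # load resistance (ohm)
PCE = 0.802       # reported power conversion efficiency

p_out = V_OUT ** 2 / R_LOAD   # power delivered to the load (W)
p_in = p_out / PCE            # implied ac input power (W)

print(f"P_out = {p_out * 1e3:.2f} mW")  # ~19.47 mW
print(f"P_in  = {p_in * 1e3:.2f} mW")   # ~24.27 mW
```

The ~4.8 mW difference between input and output is the power dissipated in the rectifier itself, which is what the offset-controlled comparators are designed to minimize.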

  6. Beneficial effects of semen purification with magnetic nanoparticles

    USDA-ARS?s Scientific Manuscript database

Current techniques for sperm quality evaluation are mostly informative. They become useful when ejaculates of high index males not meeting quality standards are still discarded. Here we developed molecular-based magnetic conjugates allowing selective elimination of damaged spermatozoa from semen ej...

  7. Image-guided diagnosis of prostate cancer can increase detection of tumors

    Cancer.gov

    In the largest prospective study to date of image-guided technology for identifying suspicious regions of the prostate to biopsy, researchers compared the ability of this technology to detect high-risk prostate cancer with that of the current standard of

  8. User's guide for LTGSTD24 program, Version 2. 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanlon, R.L.; Connell, L.M.

    1993-05-01

On January 30, 1989, the US Department of Energy (DOE) promulgated an interim rule entitled "Energy Conservation Voluntary Performance Standards for New Commercial and Multi-Family High Rise Residential Buildings; Mandatory for New Federal Buildings" (10 CFR Part 435, Subpart A). These standards require federal agencies to design all future federal commercial and multifamily high-rise residential buildings in accordance with the standards, or demonstrate that their current requirements already meet or exceed the energy-efficiency requirements of the standards. Although these newly enacted standards do not regulate the design of non-federal buildings, the DOE recommends that all design professionals use the standards as guidelines for designing energy-conserving buildings. To encourage private sector use, the DOE published the standards in the January 30, 1989, Federal Register in the format typical of commercial standards. The Pacific Northwest Laboratory developed several computer programs for the DOE to make it easier for designers to comply with the standards. One of the programs, LTGSTD24 (Version 2.4), is detailed in this user's guide and is provided on the accompanying diskettes. The program will facilitate the designer's use of the standards dealing specifically with building lighting design. Using this program will greatly simplify the designer's task of performing the calculations needed to determine if a design complies with the standards.

  9. Pediatric Drowning: A Standard Operating Procedure to Aid the Prehospital Management of Pediatric Cardiac Arrest Resulting From Submersion.

    PubMed

    Best, Rebecca R; Harris, Benjamin H L; Walsh, Jason L; Manfield, Timothy

    2017-05-08

    Drowning is one of the leading causes of death in children. Resuscitating a child following submersion is a high-pressure situation, and standard operating procedures can reduce error. Currently, the Resuscitation Council UK guidance does not include a standard operating procedure on pediatric drowning. The objective of this project was to design a standard operating procedure to improve outcomes of drowned children. A literature review on the management of pediatric drowning was conducted. Relevant publications were used to develop a standard operating procedure for management of pediatric drowning. A concise standard operating procedure was developed for resuscitation following pediatric submersion. Specific recommendations include the following: the Heimlich maneuver should not be used in this context; however, prolonged resuscitation and therapeutic hypothermia are recommended. This standard operating procedure is a potentially useful adjunct to the Resuscitation Council UK guidance and should be considered for incorporation into its next iteration.

  10. Ankle arthritis: review of diagnosis and operative management.

    PubMed

    Grunfeld, Robert; Aydogan, Umur; Juliano, Paul

    2014-03-01

    The diagnostic and therapeutic options for ankle arthritis are reviewed. The current standard of care for nonoperative options include the use of nonsteroidal antiinflammatory drugs, corticosteroid injections, orthotics, and ankle braces. Other modalities lack high-quality research studies to delineate their appropriateness and effectiveness. The gold standard for operative intervention in end-stage degenerative arthritis remains arthrodesis, but evidence for the superiority in functional outcomes of total ankle arthroplasty is increasing. The next few years will enable more informed decisions and, with more prospective high-quality studies, the most appropriate patient population for total ankle arthroplasty can be identified. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Perception matters for clinical perfectionism and social anxiety.

    PubMed

    Levinson, Cheri A; Rodebaugh, Thomas L; Shumaker, Erik A; Menatti, Andrew R; Weeks, Justin W; White, Emily K; Heimberg, Richard G; Warren, Cortney S; Blanco, Carlos; Schneier, Franklin; Liebowitz, Michael R

    2015-01-01

Despite research documenting a relationship between social anxiety and perfectionism, very little research has examined the relationship between social anxiety and clinical perfectionism, defined as the combination of high personal standards and high maladaptive perfectionistic evaluative concern. In the current studies we examined whether clinical perfectionism predicted social anxiety in a large sample of undergraduates (N=602), in a clinical sample of participants diagnosed with social anxiety disorder (SAD; N=180), and by using a variance decomposition model of self- and informant-report of perfectionism (N=134). Using self-report, we found that an interaction of personal standards and evaluative concern predicted both social interaction anxiety and fear of scrutiny, but not in the theorized direction. Specifically, we found that self-report of low standards and high evaluative concern was associated with the highest levels of social anxiety, suggesting that when individuals with SAD hold low expectations for themselves combined with high concerns about evaluation, social anxiety symptoms may increase. Alternatively, when an informant's perspective was considered, and more consistent with the original theory, we found that the interaction of informant-only report of personal standards and shared report (between both primary participant and informant) of concern over mistakes was associated with self-reported social anxiety, such that high concern over mistakes and high personal standards predicted the highest levels of social anxiety. Theoretical, clinical, and measurement implications for clinical perfectionism are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Cost-effectiveness Analysis of Nutritional Support for the Prevention of Pressure Ulcers in High-Risk Hospitalized Patients.

    PubMed

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2016-06-01

    To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. Modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition simulated during their hospital stay and up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with standard hospital diet. In addition to the standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost saving at AU $425 per patient and marginally more effective with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
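The dominance conclusion reported above (nutritional support both saves money and gains QALYs relative to standard care) can be made concrete with the figures from the abstract; this is an illustrative calculation, not the authors' full probabilistic model:

```python
# Incremental cost-effectiveness comparison using the point estimates quoted
# in the abstract (AU$425 saved per patient, 0.005 QALYs gained).
delta_cost = -425.0   # incremental cost in AU$ (negative = cost saving)
delta_qaly = 0.005    # incremental quality-adjusted life years gained

# An intervention that is both cheaper and more effective "dominates" the
# comparator: no willingness-to-pay threshold is needed to recommend it.
dominant = delta_cost < 0 and delta_qaly > 0
print(f"Dominant over standard care: {dominant}")
```

When an intervention is not dominant, the usual summary statistic is the incremental cost-effectiveness ratio (delta_cost / delta_qaly) compared against a willingness-to-pay threshold; here that ratio is not needed because both increments favor the intervention.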

  13. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  14. A single blue nanorod light emitting diode.

    PubMed

    Hou, Y; Bai, J; Smith, R; Wang, T

    2016-05-20

We report a light emitting diode (LED) consisting of a single InGaN/GaN nanorod fabricated by a cost-effective top-down approach from a standard LED wafer. The device demonstrates high performance with a reduced quantum confined Stark effect compared with a standard planar counterpart fabricated from the same wafer, confirmed by optical and electrical characterization. Current density as high as 5414 A cm⁻² is achieved without significant damage to the device due to the high internal quantum efficiency. The efficiency droop is mainly ascribed to Auger recombination, which was studied by an ABC model. Our work provides a potential method for fabricating compact light sources for advanced photonic integrated circuits without involving expensive or time-consuming fabrication facilities.
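The ABC model mentioned above attributes efficiency droop to the cubic Auger term. A minimal sketch of that model follows; the coefficients here are order-of-magnitude assumptions typical for InGaN, not the fitted values from this paper:

```python
# Illustrative ABC recombination model: internal quantum efficiency (IQE) as a
# function of carrier density n. A, B, C below are assumed, typical values.
A = 1e7      # Shockley-Read-Hall (defect) coefficient, 1/s
B = 1e-11    # radiative coefficient, cm^3/s
C = 1e-30    # Auger coefficient, cm^6/s

def iqe(n):
    """IQE = radiative rate / total rate at carrier density n (cm^-3)."""
    return B * n**2 / (A * n + B * n**2 + C * n**3)

# IQE rises with injection, peaks, then "droops" as the Auger term C*n^3
# overtakes the radiative term B*n^2 at high carrier density.
for n in (1e17, 1e18, 1e19, 1e20):
    print(f"n = {n:.0e} cm^-3 -> IQE = {iqe(n):.3f}")
```

With these coefficients the peak IQE occurs near n = sqrt(A/C); droop at higher injection is exactly the behavior the abstract ascribes to Auger recombination.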

  15. Application of the Critical Success Factor Methodology to DoD Organization.

    DTIC Science & Technology

    1984-09-01

high technology manufacturing, banking, airline, insurance, railway, and automobile. Sullen (6t22-25) lists the current CSFs of the automobile ...industry as image, quality dealer system, cost control, and meeting energy standards. However, in 1981 the automobile CSFs included only styling, quality...bearing on current car purchases as well as future car buys. And finally cost control influenced the auto industry as a CSF, since profit per automobile had

  16. Lower currents: a new choice for routine testing.

    PubMed

    Backes, John

    2007-01-01

U.S. NFPA and AAMI standards both recommend a 10 A ground bond test and, as described above, both 25 A and 200 mA are also recommended internationally as valid test currents for the in-service testing and inspection of medical electrical equipment. The reality is that both high and low test currents are of value to biomedical engineers and technicians in different circumstances. For benchtop testing in a workshop environment, where the required test currents can be applied safely, high current testing will likely remain the preferred option. However, for in-service test applications, where the portability and versatility of the tester is a key requirement, modern electronic technology means that low current testing can now be applied effectively and safely. In summary, by using a low-energy, high current pulse prior to the 200 mA test current, the lower test current is preferred for routine field maintenance, as this can mean: increased safety of the operator; reduced risk of damage to the in-service medical equipment; smaller test instruments that include valid ground bond measurements; battery-operated test equipment; increased flexibility of the test engineer due to lightweight test equipment; cost reduction due to reduced downtime of medical equipment; and more economical availability of test equipment.

  17. Therapeutic Antibodies for Myeloid Neoplasms—Current Developments and Future Directions

    PubMed Central

    Schürch, Christian M.

    2018-01-01

    Therapeutic monoclonal antibodies (mAbs) such as antibody–drug conjugates, ligand–receptor antagonists, immune checkpoint inhibitors and bispecific T cell engagers have shown impressive efficacy in the treatment of multiple human cancers. Numerous therapeutic mAbs that have been developed for myeloid neoplasms, including acute myeloid leukemia (AML) and myelodysplastic syndrome (MDS), are currently investigated in clinical trials. Because AML and MDS originate from malignantly transformed hematopoietic stem/progenitor cells—the so-called leukemic stem cells (LSCs) that are highly resistant to most standard drugs—these malignancies frequently relapse and have a high disease-specific mortality. Therefore, combining standard chemotherapy with antileukemic mAbs that specifically target malignant blasts and particularly LSCs or utilizing mAbs that reinforce antileukemic host immunity holds great promise for improving patient outcomes. This review provides an overview of therapeutic mAbs for AML and MDS. Antibody targets, the molecular mechanisms of action, the efficacy in preclinical leukemia models, and the results of clinical trials are discussed. New developments and future studies of therapeutic mAbs in myeloid neoplasms will advance our understanding of the immunobiology of these diseases and enhance current therapeutic strategies. PMID:29868474

  18. Current and Emerging Therapies for Lupus Nephritis

    PubMed Central

    Parikh, Samir V.

    2016-01-01

    The introduction of corticosteroids and later, cyclophosphamide dramatically improved survival in patients with proliferative lupus nephritis, and combined administration of these agents became the standard-of-care treatment for this disease. However, treatment failures were still common and the rate of progression to ESRD remained unacceptably high. Additionally, treatment was associated with significant morbidity. Therefore, as patient survival improved, the goals for advancing lupus nephritis treatment shifted to identifying therapies that could improve long-term renal outcomes and minimize treatment-related toxicity. Unfortunately, progress has been slow and the current approaches to the management of lupus nephritis continue to rely on high-dose corticosteroids plus a broad-spectrum immunosuppressive agent. Over the past decade, an improved understanding of lupus nephritis pathogenesis fueled several clinical trials of novel drugs, but none have been found to be superior to the combination of a cytotoxic agent and corticosteroids. Despite these trial failures, efforts to translate mechanistic advances into new treatment approaches continue. In this review, we discuss current therapeutic strategies for lupus nephritis, briefly review recent advances in understanding the pathogenesis of this disease, and describe emerging approaches developed on the basis of these advances that promise to improve upon the standard-of-care lupus nephritis treatments. PMID:27283496

  19. Onset of magnetic reconnection in a weakly collisional, high-β plasma

    NASA Astrophysics Data System (ADS)

    Alt, Andrew; Kunz, Matthew

    2017-10-01

In a magnetized, weakly collisional plasma, the magnetic moment of the constituent particles is an adiabatic invariant. An increase of the magnetic-field strength in such a plasma thus leads to an increase in the thermal pressure perpendicular to the field lines. Above a β-dependent threshold, this pressure anisotropy drives the mirror instability, which produces strong distortions in the field lines and traps particles on ion-Larmor scales. The impact of this instability on magnetic reconnection is investigated using simple analytical and numerical models for the formation of a current sheet and the associated production of pressure anisotropy. The difficulty in maintaining an isotropic, Maxwellian particle distribution during the formation and subsequent thinning of a current sheet in a weakly collisional plasma, coupled with the low threshold for the mirror instability in a high-β plasma, imply that the topology of reconnecting magnetic fields can radically differ from the standard Harris-sheet profile often used in kinetic simulations of collisionless reconnection. Depending on the rate of current-sheet formation, this mirror-induced disruption may occur before standard tearing modes are able to develop. This work was supported by U.S. DOE contract DE-AC02-09CH11466.
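The β-dependent threshold referred to above follows the standard mirror-instability criterion, which can be sketched as below; this is the textbook form of the criterion, not the authors' specific model:

```python
# Standard mirror-instability criterion: the plasma is unstable when the
# pressure anisotropy p_perp/p_par - 1 exceeds 1/beta_perp, where
# beta_perp = perpendicular thermal pressure / magnetic pressure.
def mirror_unstable(p_perp, p_par, beta_perp):
    """True if the pressure anisotropy exceeds the mirror threshold."""
    return (p_perp / p_par - 1.0) > 1.0 / beta_perp

# At high beta the threshold is low: a 2% anisotropy already destabilizes
# a beta = 100 plasma, while the same anisotropy is stable at beta = 10.
print(mirror_unstable(1.02, 1.0, 100.0))  # True  (0.02 > 0.01)
print(mirror_unstable(1.02, 1.0, 10.0))   # False (0.02 < 0.1)
```

This 1/β scaling is why even the modest anisotropy generated during current-sheet thinning can trigger the instability in a high-β plasma, as the abstract argues.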

  20. Growth and Neurodevelopmental Outcomes of Early, High-Dose Parenteral Amino Acid Intake in Very Low Birth Weight Infants: A Randomized Controlled Trial.

    PubMed

    Balakrishnan, Maya; Jennings, Alishia; Przystac, Lynn; Phornphutkul, Chanika; Tucker, Richard; Vohr, Betty; Stephens, Bonnie E; Bliss, Joseph M

    2017-03-01

    Administration of high-dose parenteral amino acids (AAs) to premature infants within hours of delivery is currently recommended. This study compared the effect of lower and higher AA administration starting close to birth on short-term growth and neurodevelopmental outcomes at 18-24 months corrected gestational age (CGA). Infants <1250 g birth weight (n = 168) were randomly assigned in a blinded fashion to receive parenteral nutrition providing 1-2 g/kg/d AA and advancing daily by 0.5 g/kg/d to a goal of 4 g/kg/d (standard AA) or 3-4 g/kg/d and advancing to 4 g/kg/d by day 1. The primary outcome was neurodevelopmental outcomes measured by the Bayley Scales of Infant and Toddler Development, Third Edition at 18-24 months CGA. Secondary outcomes were growth parameters at 36 weeks CGA among infants surviving to hospital discharge, serum bicarbonate, serum urea nitrogen, creatinine, AA profiles in the first week of life, and incidence of major morbidities and mortality. No differences in neurodevelopmental outcome were detected between the high and low AA groups. Infants in the high AA group had significantly lower mean weight, length, and head circumference percentiles than those in the standard AA group at 36 weeks CGA and at hospital discharge. These differences did not persist after controlling for birth growth parameters, except for head circumference. Infants in the high AA group had higher mean serum urea nitrogen than the standard group on each day throughout the first week. Current recommendations for high-dose AA starting at birth are not associated with improved growth or neurodevelopmental outcomes.

  1. A modified beam-to-earth transformation to measure short-wavelength internal waves with an acoustic Doppler current profiler

    USGS Publications Warehouse

    Scotti, A.; Butman, B.; Beardsley, R.C.; Alexander, P.S.; Anderson, S.

    2005-01-01

The algorithm used to transform velocity signals from beam coordinates to earth coordinates in an acoustic Doppler current profiler (ADCP) relies on the assumption that the currents are uniform over the horizontal distance separating the beams. This condition may be violated by (nonlinear) internal waves, which can have wavelengths as small as 100-200 m. In this case, the standard algorithm combines velocities measured at different phases of a wave and produces horizontal velocities that increasingly differ from true velocities with distance from the ADCP. Observations made in Massachusetts Bay show that currents measured with a bottom-mounted upward-looking ADCP during periods when short-wavelength internal waves are present differ significantly from currents measured by point current meters, except very close to the instrument. These periods are flagged with high error velocities by the standard ADCP algorithm. In this paper measurements from the four spatially diverging beams and the backscatter intensity signal are used to calculate the propagation direction and celerity of the internal waves. Once this information is known, a modified beam-to-earth transformation that combines appropriately lagged beam measurements can be used to obtain current estimates in earth coordinates that compare well with pointwise measurements. © 2005 American Meteorological Society.
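The homogeneity assumption underlying the standard transform can be illustrated with the usual four-beam Janus geometry; sign conventions and beam pairing vary by instrument, so the sketch below is an illustrative assumption, not the specific algorithm of this paper:

```python
# Sketch of the standard beam-to-instrument velocity conversion for a
# four-beam Janus ADCP. It assumes the flow is identical where each beam
# samples it -- the assumption short internal waves violate.
import math

THETA = math.radians(20.0)  # beam angle from vertical (20 degrees is common)

def beam_to_instrument(b1, b2, b3, b4, theta=THETA):
    """Along-beam radial velocities -> (u, v, w) in instrument coordinates."""
    u = (b1 - b2) / (2.0 * math.sin(theta))             # opposing pair 1-2
    v = (b4 - b3) / (2.0 * math.sin(theta))             # opposing pair 3-4
    w = -(b1 + b2 + b3 + b4) / (4.0 * math.cos(theta))  # mean vertical
    return u, v, w

# Uniform horizontal flow of 0.5 m/s along the 1-2 beam axis projects
# equal and opposite radial components onto beams 1 and 2:
b = 0.5 * math.sin(THETA)
u, v, w = beam_to_instrument(b, -b, 0.0, 0.0)
print(u, v, w)
```

When a short internal wave puts the beams at different wave phases, b1 and b2 no longer sample the same flow, and the differencing above mixes phases into a spurious u; the paper's modification lags the beam records to realign phases before combining them.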

  2. Magnetic Johnson Noise Constraints on Electron Electric Dipole Moment Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munger, C.

    2004-11-18

Magnetic fields from statistical fluctuations in currents in conducting materials broaden atomic linewidths by the Zeeman effect. The constraints so imposed on the design of experiments to measure the electric dipole moment of the electron are analyzed. Contrary to the predictions of Lamoreaux [S.K. Lamoreaux, Phys. Rev. A 60, 1717 (1999)], the standard material for high-permeability magnetic shields proves to be as significant a source of broadening as an ordinary metal. A scheme that would replace this standard material with ferrite is proposed.

  3. Pulmonary Nodule Management in Lung Cancer Screening: A Pictorial Review of Lung-RADS Version 1.0.

    PubMed

    Godoy, Myrna C B; Odisio, Erika G L C; Truong, Mylene T; de Groot, Patricia M; Shroff, Girish S; Erasmus, Jeremy J

    2018-05-01

    The number of screening-detected lung nodules is expected to increase as low-dose computed tomography screening is implemented nationally. Standardized guidelines for image acquisition, interpretation, and screen-detected nodule workup are essential to ensure a high standard of medical care and that lung cancer screening is implemented safely and cost effectively. In this article, we review the current guidelines for pulmonary nodule management in the lung cancer screening setting. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. [Current situation of the standardization of acupuncture and moxibustion in Taiwan].

    PubMed

    Pan, Li-Jia; Cui, Rui; Zhan, Bi-Yu; Liao, Cai-Yan; Cao, Qi-Hua; Li, Gui-Lan; Guo, Yi

    2012-09-01

The current situation of the standardization of acupuncture and moxibustion in the Taiwan region is introduced in this paper from three aspects: the state of development of acupuncture and moxibustion standards in Taiwan, the implementation of Taiwan regional standards, and the standardization of acupuncture and moxibustion in Taiwan. At present, the relevant standards of acupuncture and moxibustion in Taiwan include only the standard operation procedure of acupuncture and moxibustion, the reference guideline for safe operation in traditional Chinese medicine service centers, and the faculty standard of Chinese medicine hospitals, etc. It is concluded that the current situation reflects weak awareness of the standardization of acupuncture and moxibustion within the industry, insufficient enterprise standards, few implemented standards, and narrow coverage.

  5. Building an Evaluation Scale using Item Response Theory.

    PubMed

    Lalor, John P; Wu, Hao; Yu, Hong

    2016-11-01

    Evaluation of NLP methods requires testing against a previously vetted gold-standard test set and reporting standard metrics (accuracy/precision/recall/F1). The current assumption is that all items in a given test set are equal with regards to difficulty and discriminating power. We propose Item Response Theory (IRT) from psychometrics as an alternative means for gold-standard test-set generation and NLP system evaluation. IRT is able to describe characteristics of individual items - their difficulty and discriminating power - and can account for these characteristics in its estimation of human intelligence or ability for an NLP task. In this paper, we demonstrate IRT by generating a gold-standard test set for Recognizing Textual Entailment. By collecting a large number of human responses and fitting our IRT model, we show that our IRT model compares NLP systems with the performance in a human population and is able to provide more insight into system performance than standard evaluation metrics. We show that a high accuracy score does not always imply a high IRT score, which depends on the item characteristics and the response pattern.
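The item characteristics that IRT estimates (difficulty and discriminating power) are commonly expressed with the two-parameter logistic (2PL) form; the sketch below uses that standard model, which may differ from the exact parameterization in the paper:

```python
# Two-parameter logistic (2PL) IRT item model: the probability that a
# subject of ability theta answers an item correctly, given the item's
# discrimination a and difficulty b.
import math

def p_correct(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A hard (b = 1.0), highly discriminating (a = 2.0) item separates
# high-ability from average subjects sharply:
print(round(p_correct(theta=2.0, a=2.0, b=1.0), 3))  # 0.881
print(round(p_correct(theta=0.0, a=2.0, b=1.0), 3))  # 0.119
```

This is why, as the abstract notes, raw accuracy and IRT-estimated ability can disagree: two systems with the same number of correct answers earn different ability estimates depending on which items (easy vs. hard, low vs. high discrimination) they got right.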

  6. Building an Evaluation Scale using Item Response Theory

    PubMed Central

    Lalor, John P.; Wu, Hao; Yu, Hong

    2016-01-01

    Evaluation of NLP methods requires testing against a previously vetted gold-standard test set and reporting standard metrics (accuracy/precision/recall/F1). The current assumption is that all items in a given test set are equal with regard to difficulty and discriminating power. We propose Item Response Theory (IRT) from psychometrics as an alternative means for gold-standard test-set generation and NLP system evaluation. IRT is able to describe characteristics of individual items - their difficulty and discriminating power - and can account for these characteristics in its estimation of human intelligence or ability for an NLP task. In this paper, we demonstrate IRT by generating a gold-standard test set for Recognizing Textual Entailment. By collecting a large number of human responses and fitting our IRT model, we show that our IRT model compares NLP systems to the performance of a human population and is able to provide more insight into system performance than standard evaluation metrics. We show that a high accuracy score does not always imply a high IRT score, which depends on the item characteristics and the response pattern. PMID:28004039

  7. Human System Drivers for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Steinberg, Susan; Charles, John B.

    2010-01-01

    Evaluation of DRM4 in terms of the human system includes the ability to meet NASA standards, the inclusion of the human system in the design trade space, preparation for future missions and consideration of a robotic precursor mission. Ensuring both the safety and the performance capability of the human system depends upon satisfying NASA Space Flight Human System Standards.1 These standards in turn drive the development of program-specific requirements for Near-Earth Object (NEO) missions. In evaluating DRM4 in terms of these human system standards, the currently existing risk models, technologies and biological countermeasures were used. A summary of this evaluation is provided below in a structure that supports mission architecture planning activities. 1. Unacceptable Level of Risk The duration of the DRM4 mission leads to an unacceptable level of risk for two aspects of human system health: A. The permissible exposure limit for space flight radiation exposure (a human system standard) would be exceeded by DRM4. B. The risk of visual alterations and abnormally high intracranial pressure would be too high.

  8. Neither Fair nor Accurate: Research-Based Reasons Why High-Stakes Tests Should Not Be Used to Evaluate Teachers

    ERIC Educational Resources Information Center

    Au, Wayne

    2011-01-01

    Current and former leaders of many major urban school districts, including Washington, D.C.'s Michelle Rhee and New Orleans' Paul Vallas, have sought to use tests to evaluate teachers. In fact, the use of high-stakes standardized tests to evaluate teacher performance in the manner of value-added measurement (VAM) has become one of the cornerstones…

  9. Skin cancer and inorganic arsenic: uncertainty-status of risk.

    PubMed

    Brown, K G; Guo, H R; Kuo, T L; Greene, H L

    1997-02-01

    The current U.S. EPA standard for inorganic arsenic in drinking water is 50 ppb (microgram/L), dating to the National Interim Primary Drinking Water Regulation of 1976. The current EPA risk analysis predicts an increased lifetime skin cancer risk on the order of 3 or 4 per 1000 from chronic exposure at that concentration. Revision of the standard to only a few ppb, perhaps even less than 1 ppb, may be indicated by the EPA analysis to reduce the lifetime risk to an acceptable level. The cost to water utilities, and ultimately to their consumers, to conform to such a large reduction in the standard could easily reach several billion dollars, so it is particularly important to assess accurately the current risk and the risk reduction that would be achieved by a lower standard. This article addresses the major sources of uncertainty in the EPA analysis with respect to this objective. Specifically, it focuses on uncertainty and variability in the exposure estimates for the landmark study of Tseng and colleagues in Taiwan, analyzed using a reconstruction of their exposure data. It is concluded that while the available dataset is suitable to establish the hazard of skin cancer, it is too highly summarized for reliable dose-response assessment. A new epidemiologic study is needed, designed for the requirements of dose-response assessment.

  10. The FBI compression standard for digitized fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.

    1996-10-01

    The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
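As an illustration of the general idea of uniform scalar quantization of wavelet subbands, here is a toy one-level Haar decomposition with a fixed quantizer step; this is only a sketch of the concept, not the actual FBI WSQ filter bank or its adaptive bin widths:

```python
def haar_1d(x):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    approx = [(x[i] + x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    return approx, detail

def quantize(coeffs, step):
    """Uniform scalar quantization: map each coefficient to an integer
    bin index (the symbol that would then be entropy coded)."""
    return [int(round(c / step)) for c in coeffs]

def dequantize(indices, step):
    """Decoder side: map bin indices back to reconstruction levels."""
    return [i * step for i in indices]

signal = [10.0, 12.0, 9.0, 7.0, 14.0, 15.0, 3.0, 1.0]
approx, detail = haar_1d(signal)
q_detail = quantize(detail, 1.0)
print("detail coefficients: ", detail)
print("quantized indices:   ", q_detail)
print("reconstructed detail:", dequantize(q_detail, 1.0))
```

Compression comes from the quantizer collapsing small detail coefficients toward zero, which the entropy coder then represents cheaply; a larger step discards more detail for a higher ratio.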

  11. [How to write an andrological paper: standardization, mechanics and techniques].

    PubMed

    Huang, Yu-Feng; Lu, Jin-Chun

    2010-12-01

    Andrological research papers not only reflect the current status and academic level of andrology, but also constitute an important communication platform for researchers and clinicians engaged in this field and contribute significantly to the development of andrology. It would be made easier to write a high-quality andrological paper once the author observes the basic requirements of research papers, adheres to the use of standard scientific terminology, knows the special writing mechanics, and equips himself with some essential writing techniques. Based on the long experience of editorship, we present a detailed introduction of the standardization, mechanics and techniques of writing an andrological paper.

  12. On Leakage Current Measured at High Cell Voltages in Lithium-Ion Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vadivel, Nicole R.; Ha, Seungbum; He, Meinan

    2017-01-01

    In this study, parasitic side reactions in lithium-ion batteries were examined experimentally using a potentiostatic hold at high cell voltage. The experimental leakage current measured during the potentiostatic hold was compared to the Tafel expression and showed poor agreement with the expected transfer coefficient values, indicating that a more complicated expression could be needed to accurately capture the physics of this side reaction. Here we show that cross-talk between the electrodes is the primary contribution to the observed leakage current after the relaxation of concentration gradients has ceased. This cross-talk was confirmed with experiments using a lithium-ion conducting glass ceramic (LICGC) separator, which has high conductance only for lithium cations. The cells with LICGC separators showed significantly less leakage current during the potentiostatic hold test compared to cells with standard microporous separators where cross-talk is present. In addition, direct-current pulse power tests show an impedance rise for cells held at high potentials and for cells held at high temperatures, which could be attributed to film formation from the parasitic side reaction. Based on the experimental findings, a phenomenological mechanism is proposed for the parasitic side reaction which accounts for cross-talk and mass transport of the decomposition products across the separator.

  13. Applying Registry Services to Spaceflight Technologies to Aid in the Assignment of Assigned Numbers to Disparate Systems and Their Technologies to Further Enable Interoperability

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Nichols, Kelvin F.

    2006-01-01

    To date very little effort has been made to provide interoperability between various space agency projects. To effectively get to the Moon and beyond, systems must interoperate. To provide interoperability, standardization and registries of various technologies will be required. These registries will be created as they relate to space flight. With the new NASA Moon/Mars initiative, a requirement to standardize and control the naming conventions of very disparate systems and technologies is emerging. The need to provide numbering to the many processes, schemas, vehicles, robots, space suits and technologies (e.g. versions), to name a few, in the highly complex Constellation Initiative is imperative. The number of corporations, developer personnel, system interfaces, and people interfaces will require standardization and registries on a scale not currently envisioned. It would only take one exception (stove-piped system development) to weaken, if not destroy, interoperability. To start, a standardized registry process must be defined that allows many differing engineers, organizations and operators the ability to easily access disparate registry information across numerous technological and scientific disciplines. Once registries are standardized, the need to provide registry support in terms of setup and operations, resolution of conflicts between registries and other issues will need to be addressed. Registries should not be confused with repositories. No end user data is "stored" in a registry, nor is it a configuration control system. Once a registry standard is created and approved, the technologies that should be registered must be identified and prioritized. In this paper, we will identify and define a registry process that is compatible with the Constellation Initiative and other non-related space activities and organizations. We will then identify and define the various technologies that should use a registry to provide interoperability.
The first set of technologies will be those that are currently in need of expansion namely the assignment of satellite designations and the process which controls assignments. Second, we will analyze the technologies currently standardized under the Consultative Committee for Space Data Systems (CCSDS) banner. Third, we will analyze the current CCSDS working group and birds of a feather activities to ascertain registry requirements. Lastly, we will identify technologies that are either currently under the auspices of another

  14. Dependability of technical items: Problems of standardization

    NASA Astrophysics Data System (ADS)

    Fedotova, G. A.; Voropai, N. I.; Kovalev, G. F.

    2016-12-01

    This paper is concerned with problems that arose in the development of a new version of the Interstate Standard GOST 27.002 "Industrial product dependability. Terms and definitions". This Standard covers a wide range of technical items and is used in numerous regulations, specifications, and standard and technical documentation. The currently effective State Standard GOST 27.002-89 was introduced in 1990. Its development involved scientists and experts from different technical areas, and its draft was debated in different audiences and constantly refined, so it was a high-quality document. However, after 25 years of its application it has become necessary to develop a new version of the Standard that reflects the current understanding of industrial dependability, accounting for the changes taking place in Russia in the production, management and development of various technical systems and facilities. The development of a new version of the Standard makes it possible to generalize, on a terminological level, the knowledge and experience in the area of reliability of technical items accumulated over a quarter of a century in different industries and reliability research schools, and to account for domestic and foreign experience of standardization. Working on the new version of the Standard, we have faced a number of issues and problems concerning harmonization with the International Standard IEC 60050-192, caused first of all by different approaches to the use of terms and by differences in the mentalities of experts from different countries. The paper focuses on the problems related to the chapter "Maintenance, restoration and repair", whose term definitions proved difficult for the developers to harmonize both with experts and with the International Standard, mainly because the Russian concept and practice of maintenance and repair differ from foreign ones.

  15. From Capstones to Touchstones: Preparative Assessment and Its Use in Teacher Education

    ERIC Educational Resources Information Center

    Brock, Patricia Ann

    2004-01-01

    Assessment of teacher competence follows current educational trends in rubrics, standards, and high-stakes testing. Simultaneously, the traditional preservice education classroom is expanding into cyberspace; many teacher preparation programs are being offered through distance learning. As preservice education students complete required courses…

  16. Issues with fruit dietary supplements in the US - authentication by anthocyanin

    USDA-ARS?s Scientific Manuscript database

    Current fruit-based dietary supplements in the US marketplace have no obligation to meet any fruit-component concentration requirement. For example, berry supplements might be promoted for their high anthocyanin content, but they actually have no standard or minimum anthocyanin threshold for legal s...

  17. FRACTIONAL AEROSOL FILTRATION EFFICIENCY OF IN-DUCT VENTILATION AIR CLEANERS

    EPA Science Inventory

    The filtration efficiency of ventilation air cleaners is highly particle-size dependent over the 0.01 to 3 μm diameter size range. Current standardized test methods, which determine only overall efficiencies for ambient aerosol or other test aerosols, provide data of limited util...

  18. Predictions for the drive capabilities of the RancheroS Flux Compression Generator into various load inductances using the Eulerian AMR Code Roxane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watt, Robert Gregory

    The Ranchero Magnetic Flux Compression Generator (FCG) has been used to create current pulses in the 10-100 MA range for driving both “static” low inductance (0.5 nH) loads1 for generator demonstration purposes and high inductance (10-20 nH) imploding liner loads2 for ultimate use in physics experiments at very high energy density. Simulations of the standard Ranchero generator have recently shown that it had a design issue that could lead to flux trapping in the generator, and a non-robust predictability in its use in high energy density experiments. A re-examination of the design concept for the standard Ranchero generator, prompted by the possible appearance of an aneurism at the output glide plane, has led to a new generation of Ranchero generators designated the RancheroS (for swooped). This generator has removed the problematic output glide plane and replaced it with a region of constantly increasing diameter in the output end of the FCG cavity in which the armature is driven outward under the influence of an additional HE load not present in the original Ranchero. The resultant RancheroS generator, to be tested in LA43S-L13, probably in early FY17, has a significantly increased initial inductance and may be able to drive a somewhat higher load inductance than the standard Ranchero. This report will use the Eulerian AMR code Roxane to study the ability of the new design to drive static loads, with a goal of providing a database corresponding to the load inductances for which the generator might be used and the anticipated peak currents such loads might produce in physics experiments.
Such a database, combined with a simple analytic model of an ideal generator, where d(LI)/dt = 0, and supplemented by earlier estimates of losses in actual use of the standard Ranchero, scaled to estimate the increase in losses due to the longer current carrying perimeter in the RancheroS, can then be used to bound the expectations for the current drive one may apply to any load assembly in future experiments.
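The ideal-generator model quoted in the abstract, d(LI)/dt = 0, is simply flux conservation: (L_gen + L_load)·I stays constant as the generator inductance collapses. A minimal sketch of the resulting peak-current bound, using hypothetical inductance and seed-current values (none taken from the report):

```python
def ideal_fcg_peak_current(l_gen0_nH, l_res_nH, l_load_nH, i_seed_A):
    """Ideal flux-compression generator bound: with d(LI)/dt = 0 the
    flux (L_gen + L_load)*I is conserved, so peak current follows from
    the initial and residual generator inductances and the load."""
    return (l_gen0_nH + l_load_nH) * i_seed_A / (l_res_nH + l_load_nH)

# Hypothetical numbers: 1000 nH initial generator inductance, 2 nH
# residual inductance at burnout, 0.1 MA seed current, swept over the
# static-load range mentioned in the abstract.
for l_load in (0.5, 10.0, 20.0):
    i_peak = ideal_fcg_peak_current(1000.0, 2.0, l_load, 1.0e5)
    print(f"L_load = {l_load:5.1f} nH -> ideal I_peak = {i_peak / 1e6:6.1f} MA")
```

A real generator delivers less, which is why the report supplements this bound with scaled loss estimates; the lossless curve still shows the key trend that higher load inductance sharply reduces the achievable current.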

  19. MATLAB implementation of a dynamic clamp with bandwidth >125 KHz capable of generating INa at 37°C

    PubMed Central

    Clausen, Chris; Valiunas, Virginijus; Brink, Peter R.; Cohen, Ira S.

    2012-01-01

    We describe the construction of a dynamic clamp with bandwidth >125 kHz that utilizes a high-performance, yet low-cost, standard home/office PC interfaced with a high-speed (16 bit) data acquisition module. High bandwidth is achieved by exploiting recently available software advances (code-generation technology, optimized real-time kernel). Dynamic-clamp programs are constructed using Simulink, a visual programming language. Blocks for computation of membrane currents are written in the high-level MATLAB language; no programming in C is required. The instrument can be used in single- or dual-cell configurations, with the capability to modify programs while experiments are in progress. We describe an algorithm for computing the fast transient Na+ current (INa) in real time, and test its accuracy and stability using rate constants appropriate for 37°C. We then construct a program capable of supplying three currents to a cell preparation: INa, the hyperpolarizing-activated inward pacemaker current (If), and an inward-rectifier K+ current (IK1). The program corrects for the IR drop due to electrode current flow, and also records all voltages and currents. We tested this program on dual patch-clamped HEK293 cells, where the dynamic clamp controls a current-clamp amplifier and a voltage-clamp amplifier controls membrane potential, and on current-clamped HEK293 cells, where the dynamic clamp produces spontaneous pacing behavior exhibiting Na+ spikes in otherwise passive cells. PMID:23224681

  20. Superconducting technology for overcurrent limiting in a 25 kA current injection system

    NASA Astrophysics Data System (ADS)

    Heydari, Hossein; Faghihi, Faramarz; Sharifi, Reza; Poursoltanmohammadi, Amir Hossein

    2008-09-01

    Current injection transformer (CIT) systems are a major element of standard type testing of high-current equipment in the electrical industry, so their performance is very important. When designing high current systems, there are many factors to be considered, among which their overcurrent protection must be ensured. The output of a CIT is wholly dependent on the impedance of the equipment under test (EUT); therefore, current flow beyond the allowable limit can occur. The present state of the art provides an important guide to developing current limiters not only for grid application but also for industrial equipment. This paper reports the state of the art in the technology available that could be developed into an application of superconductivity for high current equipment (CIT) protection with no test disruption. This will result in greater market choice, reduced costs and improved system reliability for equipment protection solutions. The paper will also push the state of the art by using two distinctive circuits, closed-core and open-core, for overcurrent protection of a 25 kA CIT system, based on a flux-lock-type superconducting fault current limiter (SFCL) and the magnetic properties of high temperature superconducting (HTS) elements. An appropriate location of the HTS element will enhance the rate of limitation with the help of the magnetic field generated by the CIT output busbars. The calculation of the HTS parameters for overcurrent limiting is also performed to suit the required current levels of the CIT.

  1. Development of tearing instability in a current sheet forming by sheared incompressible flow

    NASA Astrophysics Data System (ADS)

    Tolman, Elizabeth A.; Loureiro, Nuno F.; Uzdensky, Dmitri A.

    2018-02-01

    Sweet-Parker current sheets in high Lundquist number plasmas are unstable to tearing, suggesting they will not form in physical systems. Understanding magnetic reconnection thus requires study of the stability of a current sheet as it forms. Formation can occur due to sheared, sub-Alfvénic incompressible flows which narrow the sheet. Standard tearing theory (Furth et al. Phys. Fluids, vol. 6 (4), 1963, pp. 459-484, Rutherford, Phys. Fluids, vol. 16 (11), 1973, pp. 1903-1908, Coppi et al. Fizika Plazmy, vol. 2, 1976, pp. 961-966) is not immediately applicable to such forming sheets for two reasons: first, because the flow introduces terms not present in the standard calculation; second, because the changing equilibrium introduces time dependence to terms which are constant in the standard calculation, complicating the formulation of an eigenvalue problem. This paper adapts standard tearing mode analysis to confront these challenges. In an initial phase when any perturbations are primarily governed by ideal magnetohydrodynamics, a coordinate transformation reveals that the flow compresses and stretches perturbations. A multiple scale formulation describes how linear tearing mode theory (Furth et al. Phys. Fluids, vol. 6 (4), 1963, pp. 459-484, Coppi et al. Fizika Plazmy, vol. 2, 1976, pp. 961-966) can be applied to an equilibrium changing under flow, showing that the flow affects the separable exponential growth only implicitly, by making the standard scalings time dependent. In the nonlinear Rutherford stage, the coordinate transformation shows that standard theory can be adapted by adding to the stationary rates time dependence and an additional term due to the strengthening equilibrium magnetic field. Overall, this understanding supports the use of flow-free scalings with slight modifications to study tearing in a forming sheet.

  2. Eddy-Current Reference Standard

    NASA Technical Reports Server (NTRS)

    Ambrose, H. H., Jr.

    1985-01-01

    Magnetic properties of metallic reference standards duplicated and stabilized for eddy-current coil measurements over long times. Concept uses precisely machined notched samples of known annealed materials as reference standards.

  3. Standard on microbiological management of fluids for hemodialysis and related therapies by the Japanese Society for Dialysis Therapy 2008.

    PubMed

    Kawanishi, Hideki; Akiba, Takashi; Masakane, Ikuto; Tomo, Tadashi; Mineshima, Michio; Kawasaki, Tadayuki; Hirakata, Hideki; Akizawa, Tadao

    2009-04-01

    The Committee of Scientific Academy of the Japanese Society for Dialysis Therapy (JSDT) proposes a new standard on microbiological management of fluids for hemodialysis and related therapies. This standard is within the scope of the International Organization for Standardization (ISO), which is currently under revision. This standard is to be applied to the central dialysis fluid delivery systems (CDDS), which are widely used in Japan. In this standard, microbiological qualities for dialysis water and dialysis fluids are clearly defined by endotoxin level and bacterial count. The qualities of dialysis fluids were classified into three levels: standard, ultrapure, and online prepared substitution fluid. In addition, the therapeutic application of each dialysis fluid is clarified. Since high-performance dialyzers are frequently used in Japan, the standard recommends that ultrapure dialysis fluid be used for all dialysis modalities at all dialysis facilities. It also recommends that the dialysis equipment safety management committee at each facility should validate the microbiological qualities of online prepared substitution fluid.

  4. A fiber-optic current sensor for aerospace applications

    NASA Technical Reports Server (NTRS)

    Patterson, Richard L.; Rose, A. H.; Tang, D.; Day, G. W.

    1990-01-01

    A robust, accurate, broad-band, alternating current sensor using fiber optics is being developed for space applications at power frequencies as high as 20 kHz. It can also be used in low and high voltage 60 Hz terrestrial power systems and in 400 Hz aircraft systems. It is intrinsically electromagnetic interference (EMI) immune and has the added benefit of excellent isolation. The sensor uses the Faraday effect in optical fiber and standard polarimetric measurements to sense electrical current. The primary component of the sensor is a specially treated coil of single-mode optical fiber, through which the current carrying conductor passes. Improved precision is accomplished by temperature compensation by means of signals from a novel fiber-optic temperature sensor embedded in the sensing head. The technology contained in the sensor is examined and the results of precision tests conducted at various temperatures within the wide operating range are given. The results of early EMI tests are also given.
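The polarimetric principle described above can be sketched numerically; the Verdet constant, turn count and ideal detector model below are illustrative assumptions, not values from the paper:

```python
import math

def rotation_angle(verdet_rad_per_A, n_turns, current_A):
    """Faraday rotation accumulated by light in a fiber coil that
    encircles the conductor N times: theta = V * N * I."""
    return verdet_rad_per_A * n_turns * current_A

def polarimeter_current(p1, p2, verdet_rad_per_A, n_turns):
    """Recover current from two polarimeter detector powers via the
    standard normalized difference (P1 - P2)/(P1 + P2) = sin(2*theta)."""
    theta = 0.5 * math.asin((p1 - p2) / (p1 + p2))
    return theta / (verdet_rad_per_A * n_turns)

# Hypothetical values: Verdet constant of order 2.6e-6 rad/A for fused
# silica in the visible, 25 fiber turns, 100 A conductor current.
V, N, I = 2.6e-6, 25, 100.0
theta = rotation_angle(V, N, I)
p1 = 0.5 * (1.0 + math.sin(2.0 * theta))  # ideal, loss-free detector outputs
p2 = 0.5 * (1.0 - math.sin(2.0 * theta))
print(f"rotation = {theta * 1e3:.3f} mrad, "
      f"recovered I = {polarimeter_current(p1, p2, V, N):.2f} A")
```

The normalized-difference readout is what makes the scheme insensitive to common-mode optical power drift; the temperature compensation described in the abstract addresses the residual temperature dependence of the sensing coil itself.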

  5. A fiber-optic current sensor for aerospace applications

    NASA Technical Reports Server (NTRS)

    Patterson, Richard L.; Rose, A. H.; Tang, D.; Day, G. W.

    1990-01-01

    A robust, accurate, broadband, alternating current sensor using fiber optics is being developed for space applications at power frequencies as high as 20 kHz. It can also be used in low and high voltage 60-Hz terrestrial power systems and in 400-Hz aircraft systems. It is intrinsically electromagnetic interference (EMI) immune and has the added benefit of excellent isolation. The sensor uses the Faraday effect in optical fiber and standard polarimetric measurements to sense electrical current. The primary component of the sensor is a specially treated coil of single-mode optical fiber, through which the current carrying conductor passes. Improved precision is accomplished by temperature compensation by means of signals from a novel fiber-optic temperature sensor embedded in the sensing head. The technology used in the sensor is examined and the results of precision tests conducted at various temperatures within the wide operating range are given. The results of early EMI tests are also given.

  6. A fiber-optic current sensor for aerospace applications

    NASA Technical Reports Server (NTRS)

    Patterson, Richard L.; Rose, A. H.; Tang, D.; Day, G. W.

    1990-01-01

    A robust, accurate, broadband, alternating current sensor using fiber optics is being developed for space applications at power frequencies as high as 20 kHz. It can also be used in low- and high-voltage 60-Hz terrestrial power systems and in 400-Hz aircraft systems. It is intrinsically EMI (electromagnetic interference) immune and has the added benefit of excellent isolation. The sensor uses the Faraday effect in optical fiber and standard polarimetric measurements to sense electrical current. The primary component of the sensor is a specially treated coil of single-mode optical fiber, through which the current carrying conductor passes. Improved precision is accomplished by temperature compensation by means of signals from a fiber-optic temperature sensor embedded in the sensing head. The authors report on the technology contained in the sensor and also relate the results of precision tests conducted at various temperatures within the wide operating range. The results of early EMI tests are shown.

  7. Low dark current and high speed ZnO metal–semiconductor–metal photodetector on SiO{sub 2}/Si substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Çalışkan, Deniz, E-mail: dcaliskan@fen.bilkent.edu.tr; Department of Nanotechnology and Nanomedicine, Hacettepe University, 06800 Beytepe, Ankara; Bütün, Bayram

    2014-10-20

    ZnO thin films are deposited by radio-frequency magnetron sputtering on thermally grown SiO{sub 2} on Si substrates. Pt/Au contacts are fabricated by standard photolithography and lift-off in order to form a metal-semiconductor-metal (MSM) photodetector. The dark current of the photodetector is measured as 1 pA at 100 V bias, corresponding to 100 pA/cm{sup 2} current density. Spectral photoresponse measurement showed the usual spectral behavior and 0.35 A/W responsivity at a 100 V bias. The rise and fall times for the photocurrent are measured as 22 ps and 8 ns, respectively, which are the lowest values to date. Scanning electron microscope images show high-aspect-ratio, dense grains indicating high surface area. The low dark current density and high-speed response are attributed to a high number of recombination centers due to film morphology, as deduced from photoluminescence measurements. These results show that as-deposited ZnO thin film MSM photodetectors can be used for applications requiring low-light-level detection and fast operation.

  8. Carbon nanotube feedback-gate field-effect transistor: suppressing current leakage and increasing on/off ratio.

    PubMed

    Qiu, Chenguang; Zhang, Zhiyong; Zhong, Donglai; Si, Jia; Yang, Yingjun; Peng, Lian-Mao

    2015-01-27

    Field-effect transistors (FETs) based on moderate or large diameter carbon nanotubes (CNTs) usually suffer from ambipolar behavior, large off-state current and small current on/off ratio, which are highly undesirable for digital electronics. To overcome these problems, a feedback-gate (FBG) FET structure is designed and tested. This FBG FET differs from a normal top-gate FET by an extra feedback gate, which is connected directly to the drain electrode of the FET. It is demonstrated that a FBG FET based on a semiconducting CNT with a diameter of 1.5 nm may exhibit a low off-state current of about 1 × 10(-13) A, a high current on/off ratio of larger than 1 × 10(8), negligible drain-induced off-state leakage current, and a good subthreshold swing of 75 mV/dec even at large source-drain bias and room temperature. The FBG structure is promising for CNT FETs to meet the standard for low-static-power logic electronics applications, and could also be utilized for building FETs using other small band gap semiconductors to suppress leakage current.

  9. Analysis and elimination method of the effects of cables on LVRT testing for offshore wind turbines

    NASA Astrophysics Data System (ADS)

    Jiang, Zimin; Liu, Xiaohao; Li, Changgang; Liu, Yutian

    2018-02-01

    The current state, characteristics and necessity of low voltage ride through (LVRT) on-site testing for grid-connected offshore wind turbines are first introduced. Then the effects of submarine cables on LVRT testing are analysed based on the equivalent circuit of the testing system. A scheme for eliminating the effects of cables in the proposed LVRT testing method is presented. The specified voltage dips are guaranteed to comply with the testing standards by adjusting the ratio between the current-limiting impedance and the short-circuit impedance according to the steady-state voltage relationship derived from the equivalent circuit. Finally, simulation results demonstrate that the voltage dips at the high-voltage side of the wind turbine transformer satisfy the requirements of the testing standards.
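The impedance-ratio adjustment the abstract describes reduces, in the simplest lossless case, to a voltage divider. A minimal sketch with hypothetical impedance magnitudes and dip targets (not values from the paper, which works with the full equivalent circuit including cables):

```python
def dip_voltage_pu(z_limit_ohm, z_sc_ohm):
    """Residual voltage (per unit) at the test point of an LVRT
    divider: short-circuit branch against current-limiting branch."""
    return z_sc_ohm / (z_limit_ohm + z_sc_ohm)

def z_sc_for_dip(z_limit_ohm, target_pu):
    """Short-circuit impedance needed for a target residual voltage."""
    return z_limit_ohm * target_pu / (1.0 - target_pu)

# Hypothetical 10-ohm current-limiting impedance; dip targets of the
# kind grid codes specify.
for target in (0.2, 0.5, 0.75):
    z_sc = z_sc_for_dip(10.0, target)
    print(f"target {target:.2f} pu -> Z_sc = {z_sc:.2f} ohm "
          f"(check: {dip_voltage_pu(10.0, z_sc):.2f} pu)")
```

The paper's contribution is precisely that the cable impedance perturbs this simple ratio, so the divider must be retuned from the measured equivalent circuit rather than from nameplate values.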

  10. Zero bias thermally stimulated currents in synthetic diamond

    NASA Astrophysics Data System (ADS)

    Mori, R.; Miglio, S.; Bruzzi, M.; Bogani, F.; De Sio, A.; Pace, E.

    2009-06-01

    Zero bias thermally stimulated currents (ZBTSCs) have been observed in single crystal high pressure high temperature (HPHT) and polycrystalline chemical vapor deposited (pCVD) diamond films. The ZBTSC technique is characterized by an increased sensitivity with respect to a standard TSC analysis. Due to the absence of the thermally activated background current, new TSC peaks have been observed in both HPHT and pCVD diamond films, related to shallow activation energies usually obscured by the emission of the dominant impurities. The ZBTSC peaks are explained in terms of defect discharge in the nonequilibrium potential distribution created by nonuniform trap filling at the metal-diamond junctions. The electric field due to the charged defects has been estimated in a quasi-zero-bias TSC experiment by applying an external bias.

  11. Fiber-channel audio video standard for military and commercial aircraft product lines

    NASA Astrophysics Data System (ADS)

    Keller, Jack E.

    2002-08-01

    Fibre channel is an emerging high-speed digital network technology that continues to make inroads into the avionics arena. The suitability of fibre channel for such applications is largely due to its flexibility in several key areas: network topologies can be configured as point-to-point, arbitrated-loop, or switched-fabric connections; the physical layer supports either copper or fiber-optic implementations with a bit error rate of less than 10^-12; multiple classes of service are available; multiple upper-level protocols are supported; and multiple high-speed data rates offer open-ended growth paths, with speed negotiation within a single network. Current speeds supported by commercially available hardware are 1 and 2 Gbps, providing effective data rates of 100 and 200 MBps, respectively. Such networks lend themselves well to the transport of digital video and audio data. This paper summarizes an ANSI standard currently in the final approval cycle of the InterNational Committee for Information Technology Standards (INCITS). This standard defines a flexible mechanism whereby digital video, audio, and ancillary data are systematically packaged for transport over a fibre channel network. The basic mechanism, called a container, houses audio and video content functionally grouped as elements of the container called objects. Featured in this paper is a specific container mapping called Simple Parametric Digital Video (SPDV), developed particularly to address digital video in avionics systems. SPDV provides pixel-based video, with associated ancillary data typically sourced by various sensors, to be processed and/or distributed in the cockpit for presentation on high-resolution displays. Also highlighted in this paper is a streamlined Upper Level Protocol (ULP) called Frame Header Control Procedure (FHCP), targeted at avionics systems where the functionality of a more complex ULP is not required.
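
    The quoted 100 and 200 MBps effective rates follow from Fibre Channel's 8b/10b line coding; a minimal sketch of that arithmetic, assuming the standard 1.0625 Gbaud signaling rate for 1 Gbps-class links:

    ```python
    # Why a 1 Gbps-class Fibre Channel link delivers roughly 100 MBps of
    # payload: the link signals at 1.0625 Gbaud with 8b/10b line coding,
    # so every 10 transmitted bits carry 8 data bits.

    def effective_mb_per_s(gbaud: float) -> float:
        data_bits_per_s = gbaud * 1e9 * 8 / 10   # strip 8b/10b overhead
        return data_bits_per_s / 8 / 1e6         # bits -> bytes -> MB/s

    print(effective_mb_per_s(1.0625))  # 106.25
    print(effective_mb_per_s(2.125))   # 212.5
    ```

    The marketing figures of 100 and 200 MBps are these values rounded down.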

  12. DWPF STARTUP FRIT VISCOSITY MEASUREMENT ROUND ROBIN RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crum, Jarrod V.; Edwards, Tommy B.; Russell, Renee L.

    2012-07-31

    A viscosity standard is needed to replace the National Institute of Standards and Technology (NIST) glasses currently being used to calibrate viscosity measurement equipment. The current NIST glasses are either unavailable or less than ideal for calibrating equipment to measure the viscosity of high-level waste glasses. This report documents the results of a viscosity round robin study conducted on the Defense Waste Processing Facility (DWPF) startup frit. DWPF startup frit was selected because its viscosity-temperature relationship is similar to most DWPF and Hanford high-level waste glass compositions. The glass underwent grinding and blending to homogenize the large (100 lb) batch. Portions of the batch were supplied to the laboratories (named A through H) for viscosity measurements following a specified temperature schedule with a temperature range of 1150 °C to 950 °C and with an option to measure viscosity at lower temperatures if their equipment was capable of measuring at the higher viscosities. Results were used to fit the Vogel-Tamman-Fulcher and Arrhenius equations to viscosity as a function of temperature for the entire temperature range of 460 °C through 1250 °C as well as the limited temperature interval of approximately 950 °C through 1250 °C. The standard errors for confidence and prediction were determined for the fitted models.
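
    The Vogel-Tamman-Fulcher form mentioned above, log10(η) = A + B/(T − T0), can be fit by grid-searching T0 and solving A and B by linear least squares at each candidate. The sketch below uses synthetic data, not the round-robin measurements:

    ```python
    # Fit the VTF equation log10(eta) = A + B/(T - T0) to viscosity data by
    # grid-searching T0; at each T0 the model is linear in 1/(T - T0).

    def fit_vtf(temps_c, log_eta):
        """Return (A, B, T0) minimizing squared error over an integer T0 grid."""
        best = None
        for t0 in range(0, 400):                       # T0 well below all data points
            x = [1.0 / (t - t0) for t in temps_c]
            n = len(x)
            mx, my = sum(x) / n, sum(log_eta) / n
            sxx = sum((xi - mx) ** 2 for xi in x)
            sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, log_eta))
            b = sxy / sxx                              # slope
            a = my - b * mx                            # intercept
            sse = sum((a + b * xi - yi) ** 2 for xi, yi in zip(x, log_eta))
            if best is None or sse < best[0]:
                best = (sse, a, b, t0)
        return best[1], best[2], best[3]

    # Synthetic data generated from A = -2.5, B = 4500, T0 = 150:
    temps = [950, 1000, 1050, 1100, 1150]
    logs = [-2.5 + 4500.0 / (t - 150) for t in temps]
    a, b, t0 = fit_vtf(temps, logs)
    print(round(a, 2), round(b), t0)  # -2.5 4500 150
    ```

    A real analysis would also propagate the standard errors of the fit, as the report does for its confidence and prediction intervals.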

  13. Commercial Motor Vehicle Driver Obstructive Sleep Apnea Screening and Treatment in the United States: An Update and Recommendation Overview.

    PubMed

    Colvin, Loretta J; Collop, Nancy A

    2016-01-01

    No regulatory mandate exists in the United States (U.S.) for comprehensive obstructive sleep apnea (OSA) risk assessment and stratification for commercial motor vehicle (CMV) drivers. Current Federal Motor Carrier Safety Administration (FMCSA) requirements are outdated and depend largely on subjective report, a less reliable strategy in an occupational setting. Without FMCSA standards, sleep specialists, occupational medical examiners and employers rely on a collection of medical consensus recommendations to establish standards of care. These recommendations advise OSA risk assessment through a combination of focused medical history, physical examination, questionnaires, and accident history, which increase OSA detection compared to current FMCSA standards. For those diagnosed with OSA, consensus-based risk stratification helps identify CMV drivers who may benefit from OSA treatment and establish minimum standards for assessing treatment efficacy and adherence. Unfortunately no consolidated recommendation exists; rather, publications span medical and governmental literature in a patchwork fashion that no longer fully reflect current practice due to subsequent advances in OSA diagnosis, treatment, and technology. Based on searches of medical literature, internet materials, and reference lists from existing publications, an overview and discussion of key published recommendations regarding OSA assessment and treatment in CMV operators is provided. Suggestions for incorporating these recommendations into clinical sleep medicine practice in the U.S. are presented. The challenge for sleep specialists is maintaining the delicate balance between recommendations impacting standard of care and associated medico-legal impact with stakeholder interests from medical, regulatory, industry and public perspectives while providing high quality and efficient care. © 2016 American Academy of Sleep Medicine.

  14. Current situation of International Organization for Standardization/Technical Committee 249 international standards of traditional Chinese medicine.

    PubMed

    Liu, Yu-Qi; Wang, Yue-Xi; Shi, Nan-Nan; Han, Xue-Jie; Lu, Ai-Ping

    2017-05-01

    To review the current situation and progress of traditional Chinese medicine (TCM) international standards, standard projects and proposals in the International Organization for Standardization (ISO)/Technical Committee (TC) 249. ISO/TC 249 standards and standard projects on the ISO website were searched, and information on new standard proposals was collected from the ISO/TC 249 National Mirror Committee in China. All the available data were then summarized in 5 closely related items, including proposed time, proposing country, assigned working group (WG), current stage and classification. In ISO/TC 249, there were 2 international standards, 18 standard projects and 24 new standard proposals proposed in 2014. These 44 standard subjects have increased year by year since 2011. Twenty-nine of them were proposed by China, 15 were assigned to WG 4, 36 were in the preliminary and preparatory stages, and 8 were categorized into 4 fields, 7 groups and sub-groups based on the International Classification for Standards. A rapid and steady development of international standardization in TCM can be observed in ISO/TC 249.

  15. Combinatorial approach toward high-throughput analysis of direct methanol fuel cells.

    PubMed

    Jiang, Rongzhong; Rong, Charles; Chu, Deryn

    2005-01-01

    A 40-member array of direct methanol fuel cells (with stationary fuel and convective air supplies) was generated by electrically connecting the fuel cells in series. High-throughput analysis of these fuel cells was realized by fast screening of the voltages between the two terminals of a fuel cell at constant current discharge. A large number of voltage-current curves (200) were obtained by screening the voltages through multiple small-current steps. A Gaussian distribution was used to statistically analyze the large number of experimental data. The standard deviation (σ) of the voltages of these fuel cells increased linearly with discharge current. The voltage-current curves at various fuel concentrations were simulated with an empirical equation of voltage versus current and a linear equation of σ versus current. The simulated voltage-current curves fitted the experimental data well. With increasing methanol concentration from 0.5 to 4.0 M, the Tafel slope of the voltage-current curves (at σ = 0.0) changed from 28 to 91 mV·dec⁻¹, the cell resistance from 2.91 to 0.18 Ω, and the power output from 3 to 18 mW·cm⁻².
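
    The screening statistics described above can be sketched as follows: collect the cell voltages of the array at each discharge current and check that their spread (standard deviation) grows with current. The voltage readings below are synthetic, not the paper's data:

    ```python
    import statistics

    # Fast-screening summary for a series-connected fuel-cell array:
    # per-current mean and standard deviation of the individual cell voltages.

    def spread_by_current(readings):
        """readings: {current_mA: [cell voltages]} -> [(I, mean_V, stdev_V)]"""
        return [(i, statistics.mean(v), statistics.stdev(v))
                for i, v in sorted(readings.items())]

    # Synthetic screening data: the spread widens as the discharge current rises.
    readings = {
        10: [0.52, 0.53, 0.51, 0.52],
        50: [0.44, 0.47, 0.41, 0.44],
        100: [0.35, 0.41, 0.29, 0.35],
    }
    for current, mean_v, sd_v in spread_by_current(readings):
        print(current, round(mean_v, 3), round(sd_v, 3))
    ```

    Regressing the stdev column on current would recover the linear σ-versus-current relationship the paper reports.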

  16. Preliminary studies of cotton non-lint content identification by near-infrared spectroscopy

    USDA-ARS?s Scientific Manuscript database

    The high demand for cotton production worldwide has presented a need for its standardized classification. There currently exists trained classers and instrumentation to distinguish key cotton quality parameters, such as some trash types and content. However, it is of interest to develop a universal...

  17. Up Front: Students Are Chafing under "Test Stress."

    ERIC Educational Resources Information Center

    American School Board Journal, 2001

    2001-01-01

    Although President Bush favors continuous testing, headlines reflect an intense, growing antitesting sentiment. One standard does not fit all, current systems are malfunctioning, and kids are short-changed. A recent report says abstinence-only sex education is ineffective; high teen birth rates underline the need for comprehensive approaches. (MLH)

  18. Effects of new dietary ingredients used in artificial diet for screwworm larvae (Diptera: Calliphoridae)

    USDA-ARS?s Scientific Manuscript database

    Spray-dried whole bovine blood, dry poultry egg, and a dry milk substitute are the constituents of the standard artificial diet currently used for mass rearing screwworm larvae, Cochliomyia hominivorax (Coquerel) (Diptera: Calliphoridae). Due to high cost and uncertainty of the commercial supply of ...

  19. Adequate Funding for Educational Technology

    ERIC Educational Resources Information Center

    Angle, Jason B.

    2010-01-01

    Public schools are currently operating in a pressure-cooker of accountability systems in which they must teach students to high standards and meet ever increasing targets for student proficiency, or face increasingly severe sanctions. Into this mix is thrown educational technology and the funding for that technology. The literature espouses the…

  20. Help Seeking in Academic Settings: Goals, Groups, and Contexts

    ERIC Educational Resources Information Center

    Karabenick, Stuart A., Ed.; Newman, Richard S., Ed.

    2006-01-01

    Building on Karabenick's earlier volume on this topic and maintaining its high standards of scholarship and intellectual rigor, this book brings together contemporary work that is theoretically as well as practically important. It highlights current trends in the area and gives expanded attention to applications to teaching and learning. The…

  1. 10 Writing Opportunities to "Teach to the Test"

    ERIC Educational Resources Information Center

    DeFauw, Danielle L.

    2013-01-01

    Within the current political and educative context, where high-stakes standardized assessments create a pressure-filled experience for teachers to "teach to the test," time spent on writing instruction that supports students in transferring their learning between classroom and assessment contexts is crucial. Teachers who must use prompts to…

  2. Attitudes toward Elementary School Student Retention.

    ERIC Educational Resources Information Center

    Faerber, Kay; Van Dusseldorp, Ralph

    Nonpromotion of elementary school students is a highly controversial and emotional issue, and a vast amount of literature has been devoted to the topic. With the current emphasis on raising academic standards in public schools, more and more educators are viewing "social promotion" with disfavor. This study was conducted to determine current…

  3. Combining Learning and Assessment to Improve Science Education

    ERIC Educational Resources Information Center

    Linn, Marcia C.; Chiu, Jennifer

    2011-01-01

    High-stakes tests take time away from valuable learning activities, narrow the focus of instruction, and imply that science involves memorizing details rather than understanding the natural world. Current tests lead precollege instructors to postpone science inquiry activities until after the last standardized test is completed--often during the…

  4. THE PASSIVE OZONE NETWORK IN DALLAS (POND CONCEPT) - A MODELING OPPORTUNITY WITH COMMUNITY INVOLVEMENT

    EPA Science Inventory

    Despite tremendous efforts towards regulating and controlling tropospheric ozone (O3) formation, over 70 million people currently live in U.S. counties which exceed the National Ambient Air Quality Standard (NAAQS) set for O3. These high O3 concentrations alone cost the U.S. ap...

  5. Healthy Young Children: A Manual for Programs.

    ERIC Educational Resources Information Center

    Kendrick, Abby Shapiro, Ed.; And Others

    This manual, which was developed as a reference and resource guide for program directors and teachers of young children, describes high standards for health policies. Also provided are information based on current research and recommendations from experts in health and early childhood education. The manual contains 7 sections and 19 chapters.…

  6. The Philosophy and Foundations of Vocational Education.

    ERIC Educational Resources Information Center

    MSS Information Corp., New York, NY.

    The introductory volume in a new series on vocational education, the book surveys recent literature on the philosophy and foundations of this relatively new field. Opening papers deal with the objectives of vocational education departments in high schools, current standards of technological and industrial education, and models for comprehensive…

  7. The State of State Prekindergarten Standards in 2003.

    ERIC Educational Resources Information Center

    Neuman, Susan B.; Roskos, Kathleen; Vukelich, Carol; Clements, Douglas

    Currently, an increasing number of states support school readiness programs, recognizing that high quality early childhood education positively affects all children's success in school and the quality of their future. Recent federal initiatives, including Good Start Grow Smart, the revised guidance for the Child Care and Development Fund (CCDF)…

  8. Effectiveness of Existing Eye Safety Legislation in Arizona.

    ERIC Educational Resources Information Center

    Gillaspy, Roy Eugene

    This study was designed to ascertain the current practices of eye safety in Arizona high school industrial education laboratories, including the enforcement of eye safety legislation, the use of eye protection devices, whether the eyewear meets American National Standards Institute specifications, and the teachers' interpretations of the existing eye…

  9. EVALUATION OF A CRYPTOSPORIDIUM INTERNAL STANDARD FOR DETERMINING RECOVERY WITH ENVIRONMENTAL PROTECTION AGENCY METHOD 1623

    EPA Science Inventory

    The current benchmark method for detecting Cryptosporidium oocysts in water is the U.S. Environmental Protection Agency (U.S. EPA) Method 1623. Studies evaluating this method report that recoveries are highly variable and dependent upon laboratory, water sample, and analyst. Ther...

  10. The development of the time dependence of the nuclear EMP electric field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eng, C

    The nuclear electromagnetic pulse (EMP) electric field calculated with the legacy code CHAP is compared with the field given by an integral solution of Maxwell's equations, also known as the Jefimenko equation, to aid our current understanding of the factors that affect the time dependence of the EMP. For a fair comparison, the CHAP current density is used as the source in the Jefimenko equation. At first, the comparison is simplified by neglecting the conduction current and replacing the standard atmosphere with a constant-density air slab. The simplicity of the resultant current density aids in determining the factors that affect the rise, peak, and tail of the EMP electric field versus time. The three-dimensional nature of the radiating source, i.e., sources off the line-of-sight, and the time dependence of the derivative of the current density with respect to time are found to play significant roles in shaping the EMP electric field time dependence. These results are found to hold even when the conduction current and the standard atmosphere are properly accounted for. Comparison of the CHAP electric field with the Jefimenko electric field offers a direct validation of the high-frequency/outgoing-wave approximation.
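
    For reference, the integral solution referred to above has the standard textbook form (with retarded time t_r = t − R/c and R = |r − r'|; this is the generic Jefimenko equation, not transcribed from the report):

    ```latex
    \mathbf{E}(\mathbf{r},t) = \frac{1}{4\pi\varepsilon_0} \int
    \left[
      \frac{\rho(\mathbf{r}',t_r)}{R^{2}}\,\hat{\mathbf{R}}
      + \frac{\dot{\rho}(\mathbf{r}',t_r)}{cR}\,\hat{\mathbf{R}}
      - \frac{\dot{\mathbf{J}}(\mathbf{r}',t_r)}{c^{2}R}
    \right] d^{3}r'
    ```

    The last term, carrying the time derivative of the current density, is the radiative one, consistent with the abstract's finding that this derivative plays a significant role in shaping the EMP field.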

  11. Students' understanding of direct current resistive electrical circuits

    NASA Astrophysics Data System (ADS)

    Engelhardt, Paula Vetter; Beichner, Robert J.

    2004-01-01

    Both high school and university students' reasoning regarding direct current resistive electric circuits often differ from the accepted explanations. At present, there are no standard diagnostic tests on electric circuits. Two versions of a diagnostic instrument were developed, each consisting of 29 questions. The information provided by this test can provide instructors with a way of evaluating the progress and conceptual difficulties of their students. The analysis indicates that students, especially females, tend to hold multiple misconceptions, even after instruction. During interviews, the idea that the battery is a constant source of current was used most often in answering the questions. Students tended to focus on the current in solving problems and to confuse terms, often assigning the properties of current to voltage and/or resistance.

  12. Numerical Investigation of the Microscopic Heat Current Inside a Nanofluid System Based on Molecular Dynamics Simulation and Wavelet Analysis.

    PubMed

    Jia, Tao; Gao, Di

    2018-04-03

    Molecular dynamics simulation is employed to investigate the microscopic heat current inside an argon-copper nanofluid. Wavelet analysis of the microscopic heat current inside the nanofluid system is conducted. The signal of the microscopic heat current is decomposed into two parts: one is the approximation part; the other is the detail part. The approximation part is associated with the low-frequency part of the signal, and the detail part is associated with the high-frequency part of the signal. The probability distributions of both the high-frequency and the low-frequency parts of the signal demonstrate Gaussian-like characteristics. Curves were fitted to the probability-distribution data of the microscopic heat current, and their parameters, including the mean value and the standard deviation, change dramatically between the cases before and after adding copper nanoparticles to the argon base fluid.
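
    The approximation/detail split described above can be illustrated with the simplest wavelet. This is a generic single-level Haar decomposition, shown only to make the two-part split concrete; it is not claimed to be the wavelet family used in the paper:

    ```python
    import math

    # Single-level Haar analysis: pairwise averages give the low-frequency
    # approximation part, pairwise differences give the high-frequency
    # detail part. The step is exactly invertible (perfect reconstruction).

    def haar_step(signal):
        """One Haar analysis step; len(signal) must be even."""
        s = 1 / math.sqrt(2)
        approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
        detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
        return approx, detail

    def haar_reconstruct(approx, detail):
        """Invert haar_step exactly."""
        s = 1 / math.sqrt(2)
        out = []
        for a, d in zip(approx, detail):
            out.extend([(a + d) * s, (a - d) * s])
        return out

    x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
    ca, cd = haar_step(x)
    print([round(v, 6) for v in haar_reconstruct(ca, cd)])  # recovers x
    ```

    Histogramming `cd` over a long signal is the kind of analysis that yields the Gaussian-like distributions the paper reports for the high-frequency part.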

  13. Teacher Professional Development That Meets 21st Century Science Education Standards

    NASA Astrophysics Data System (ADS)

    van der Veen, Wil E.; Roelofsen Moody, T.

    2011-01-01

    The National Academies are working with several other groups to develop new National Science Education Standards, with the intention that they will be adopted by all states. It is critical that the science education community uses these new standards when planning teacher professional development and understands the potential implementation challenges. As a first step in developing these new standards, the National Research Council (NRC) recently published a draft Framework for Science Education. This framework describes the major scientific ideas and practices that all students should be familiar with by the end of high school. Following recommendations from the NRC report “Taking Science to School” (NRC, 2007), it emphasizes the importance of integrating science practices with the learning of science content. These same recommendations influenced the recently revised New Jersey Science Education Standards. Thus, the revised New Jersey standards can be valuable as a case study for curriculum developers and professional development providers. While collaborating with the New Jersey Department of Education on the development of these revised science standards, we identified two critical needs for successful implementation. First, we found that many currently used science activities must be adapted to meet the revised standards and that new activities must be developed. Second, teacher professional development is needed to model the integration of science practices with the learning of science content. With support from the National Space Grant Foundation we developed a week-long Astronomy Institute, which was presented in the summers of 2009 and 2010. We will briefly describe our professional development model and how it helped teachers to bridge the gap between the standards and their current classroom practice. We will provide examples of astronomy activities that were either adapted or developed to meet the new standards. Finally, we will briefly discuss the evaluation results.

  14. Haloperidol and Rimonabant Increase Delay Discounting in Rats Fed High-Fat and Standard-Chow Diets

    PubMed Central

    Boomhower, Steven R.; Rasmussen, Erin B.

    2016-01-01

    The dopamine and endocannabinoid neurotransmitter systems have been implicated in delay discounting, a measure of impulsive choice, and obesity. The current study was designed to determine the extent to which haloperidol and rimonabant affected delay discounting in rats fed standard-chow and high-fat diets. Sprague-Dawley rats were allowed to free-feed under a high-fat diet (4.73 kcal/g) or a standard-chow diet (3.0 kcal/g) for three months. Then, operant sessions began in which rats (n = 9 standard chow; n = 10 high-fat) chose between one sucrose pellet delivered immediately vs. three sucrose pellets after a series of delays. In another condition, carrot-flavored pellets replaced sucrose pellets. After behavior stabilized, acute injections of rimonabant (0.3-10 mg/kg) and haloperidol (0.003-0.1 mg/kg) were administered i.p. before some choice sessions in both pellet conditions. Haloperidol and rimonabant increased discounting in both groups of rats by decreasing percent choice for the larger reinforcer and area-under-the-curve (AUC) values. Rats in the high-fat diet condition demonstrated increased sensitivity to haloperidol compared to chow-fed controls: haloperidol increased discounting in both dietary groups in the sucrose condition, but only in the high-fat-fed rats in the carrot-pellet condition. These findings indicate that blocking D2 and CB1 receptors results in increased delay discounting, and that a high-fat diet may alter sensitivity to dopaminergic compounds using the delay-discounting task. PMID:25000488
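
    The AUC measure mentioned above is commonly computed with the trapezoidal normalization of Myerson et al. (2001): delays are scaled by the maximum delay, indifference values by the undiscounted amount, and the trapezoid areas are summed. The indifference points below are illustrative, not the study's data:

    ```python
    # Normalized area under the discounting curve (AUC): 1.0 means no
    # discounting at all, values near 0 mean steep discounting.

    def discounting_auc(delays, values, max_value):
        """delays must be sorted ascending; values are indifference points."""
        xs = [d / delays[-1] for d in delays]      # normalize delays to [0, 1]
        ys = [v / max_value for v in values]       # normalize values to [0, 1]
        return sum((x2 - x1) * (y1 + y2) / 2       # trapezoidal rule
                   for x1, x2, y1, y2 in zip(xs, xs[1:], ys, ys[1:]))

    delays = [0, 5, 10, 30, 60]          # delays to the larger reinforcer, s
    values = [3.0, 2.4, 1.8, 0.9, 0.45]  # indifference points, in pellets
    print(round(discounting_auc(delays, values, 3.0), 3))  # 0.396
    ```

    A drug that "increases discounting", as haloperidol and rimonabant did here, shows up as a drop in this AUC value.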

  15. Wheelchair transportation safety on school buses: stakeholder recommendations for priority issues and actions.

    PubMed

    Buning, Mary Ellen; Karg, Patricia E

    2011-01-01

    This paper presents results from and provides discussion of a state-of-the-science workshop in which highly informed stakeholders in wheelchair transportation safety for students on school buses were participants. The Nominal Group Technique was used to create a process in which the main issues preventing safe transportation of wheelchair-seated students and key strategies to overcome these issues were identified and ranked. These results, along with a synthesis of group discussion and recommendations for action, are presented along with consideration of current policies, regulations, and political realities. Critical safety shortcomings exist in this highly specialized enterprise that varies from state to state. Recommended strategies include implementing wheelchair requirements in federal transportation safety standards, creation of a clearinghouse for wheelchair transportation best practices and education, creation of national standards for training, practices, and monitoring, and increased "buy-in" to voluntary wheelchair standards by wheelchair manufacturers.

  16. The Coverage of Human Evolution in High School Biology Textbooks in the 20th Century and in Current State Science Standards

    ERIC Educational Resources Information Center

    Skoog, Gerald

    2005-01-01

    Efforts to eliminate or neutralize the coverage of evolution in high school biology textbooks in the United States have persisted with varying degrees of intensity and success since the 1920s. In particular, the coverage of human evolution has been impacted by these efforts. Evidence of the success of these efforts can be chronicled by the…

  17. High Standards for All Students: A Report from the National Assessment of Title I on Progress and Challenges since the 1994 Reauthorization.

    ERIC Educational Resources Information Center

    Chait, Robin; Hardcastle, Daphne; Kotzin, Stacy; LaPointe, Michelle; Miller, Meredith; Rimdzius, Tracy; Sanchez, Susan; Scott, Elois; Stullich, Stephanie; Thompson-Hoffman, Susan

    This report provides a comprehensive summary of the most recent data available from the National Assessment of Title I on the implementation of the Title I program and the academic performance of children in high poverty schools. Seven sections focus on: (1) "Policy Context for Title I" (provisions of the current Title I law and new…

  18. A Combination Therapy of JO-I and Chemotherapy in Ovarian Cancer Models

    DTIC Science & Technology

    2013-10-01

    which consists of a 3PAR storage backend and is sharing data via a highly available NetApp storage gateway and 2 high throughput commodity storage...Environment is configured as a self-service Enterprise cloud and currently hosts more than 700 virtual machines. The network infrastructure consists of...technology infrastructure and information system applications designed to integrate, automate, and standardize operations. These systems fuse state of

  19. Development of Geography Text Books Used by Senior High School Teachers Case Study at East Java-Indonesia

    ERIC Educational Resources Information Center

    Purwanto, Edy; Fatchan, Ach.; Purwanto; Soekamto, Hadi

    2016-01-01

    The aim of this study was to analyze the geography text book for: (1) identify and describe the errors in the organization of geography textbooks, and (2) identify and describe the content of the textbook standard errors of geography. The text book is currently being used by teachers of Senior High School in East Java. To analyze the contents of…

  20. Self-focused attention affects subsequent processing of positive (but not negative) performance appraisals.

    PubMed

    Holzman, Jacob B; Valentiner, David P

    2016-03-01

    Cognitive-behavioral models highlight the conjoint roles of self-focused attention (SFA), post-event processing (PEP), and performance appraisals in the maintenance of social anxiety. SFA, PEP, and biased performance appraisals are related to social anxiety; however, limited research has examined how SFA affects information processing following social events. The current study examined whether SFA affects the relationships between performance appraisals and PEP following a social event. 137 participants with high (n = 72) or low (n = 65) social anxiety were randomly assigned to conditions of high SFA or low SFA while engaging in a standardized social performance. Subsequent performance appraisals and PEP were measured. Immediate performance appraisals were not affected by SFA. High levels of SFA led to a stronger, inverse relationship between immediate positive performance appraisals and subsequent negative PEP. High levels of SFA also led to a stronger, inverse relationship between negative PEP and changes in positive performance appraisals. Future research should examine whether the current findings, which involved a standardized social performance event, extend to interaction events as well as to a clinical sample. These findings suggest that SFA affects the processing of positive information following a social performance event. SFA is particularly important for understanding how negative PEP undermines positive performance appraisals. Published by Elsevier Ltd.

  1. High-frequency ultrasound imaging for breast cancer biopsy guidance

    PubMed Central

    Cummins, Thomas; Yoon, Changhan; Choi, Hojong; Eliahoo, Payam; Kim, Hyung Ham; Yamashita, Mary W.; Hovanessian-Larsen, Linda J.; Lang, Julie E.; Sener, Stephen F.; Vallone, John; Martin, Sue E.; Kirk Shung, K.

    2015-01-01

    Image-guided core needle biopsy is the current gold standard for breast cancer diagnosis. Microcalcifications, an important radiographic finding on mammography suggestive of early breast cancer such as ductal carcinoma in situ, are usually biopsied under stereotactic guidance. This procedure, however, is uncomfortable for patients and requires the use of ionizing radiation. It would be preferable to biopsy microcalcifications under ultrasound guidance since it is a faster procedure, more comfortable for the patient, and requires no radiation. However, microcalcifications cannot reliably be detected with the current standard ultrasound imaging systems. This study is motivated by the clinical need for real-time high-resolution ultrasound imaging of microcalcifications, so that biopsies can be accurately performed under ultrasound guidance. We have investigated how high-frequency ultrasound imaging can enable visualization of microstructures in ex vivo breast tissue biopsy samples. We generated B-mode images of breast tissue and applied the Nakagami filtering technique to help refine image output so that microcalcifications could be better assessed during ultrasound-guided core biopsies. We describe the preliminary clinical results of high-frequency ultrasound imaging of ex vivo breast biopsy tissue with microcalcifications, with and without Nakagami filtering, and the correlation of these images with the pathology examination by hematoxylin and eosin stain and whole-slide digital scanning. PMID:26693167

  2. 49 CFR 238.230 - Safety appliances-new equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... contained in the current American Welding Society (AWS) Standard, the Canadian Welding Bureau (CWB) Standard... performed by an individual possessing the qualifications to be certified under the current AWS Standard, CWB...

  3. A programmable quantum current standard from the Josephson and the quantum Hall effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, W., E-mail: wilfrid.poirier@lne.fr; Lafont, F.; Djordjevic, S.

    We propose a way to realize a programmable quantum current standard (PQCS) from the Josephson voltage standard and the quantum Hall resistance standard (QHR) exploiting the multiple connection technique provided by the quantum Hall effect (QHE) and the exactness of the cryogenic current comparator. The PQCS could lead to breakthroughs in electrical metrology like the realization of a programmable quantum current source, a quantum ampere-meter, and a simplified closure of the quantum metrological triangle. Moreover, very accurate universality tests of the QHE could be performed by comparing PQCS based on different QHRs.
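
    The back-of-the-envelope behind such a standard: driving a quantum Hall resistance R_H = h/(i·e²) with a Josephson voltage V_J = n·f·h/(2e) yields a quantized current I = V_J/R_H = n·i·f·e/2, tied only to the elementary charge and the drive frequency. The scenario values below (n, f, i) are illustrative, not from the paper:

    ```python
    # Quantized current from combining the Josephson and quantum Hall effects.
    # h and e are exact by definition in the 2019 SI.

    H = 6.62607015e-34   # Planck constant, J*s
    E = 1.602176634e-19  # elementary charge, C

    def josephson_voltage(n_steps: int, f_hz: float) -> float:
        """Voltage of the n-th Shapiro step at microwave frequency f."""
        return n_steps * f_hz * H / (2 * E)

    def hall_resistance(plateau_i: int) -> float:
        """Quantum Hall resistance h/(i e^2) on the i-th plateau."""
        return H / (plateau_i * E**2)

    def quantum_current(n_steps: int, f_hz: float, plateau_i: int) -> float:
        """I = V_J / R_H = n * i * f * e / 2."""
        return josephson_voltage(n_steps, f_hz) / hall_resistance(plateau_i)

    # e.g. n = 1, f = 70 GHz, i = 2 plateau:
    i_amp = quantum_current(1, 70e9, 2)
    print(f"{i_amp * 1e9:.3f} nA")  # n*i*f*e/2 = 70e9 * e, about 11.215 nA
    ```

    The same quantities appear in a quantum metrological triangle test, which checks that the Josephson, quantum Hall, and single-electron representations of the ampere are mutually consistent.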

  4. Islet cell transplant: Update on current clinical trials

    PubMed Central

    Schuetz, Christian; Markmann, James F.

    2016-01-01

    In the last 15 years clinical islet transplantation has made the leap from experimental procedure to standard of care for a highly selective group of patients. Due to a risk-benefit calculation involving the required systemic immunosuppression the procedure is only considered in patients with type 1 diabetes, complicated by severe hypoglycemia or end stage renal disease. In this review we summarize current outcomes of the procedure and take a look at ongoing and future improvements and refinements of beta cell therapy. PMID:28451515

  5. Physics with CMS and Electronic Upgrades

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohlf, James W.

    2016-08-01

    The current funding is for continued work on the Compact Muon Solenoid (CMS) at the CERN Large Hadron Collider (LHC) as part of the Energy Frontier experimental program. The current budget year covers the first year of physics running at 13 TeV (Run 2). During this period we have concentrated on commissioning of the μTCA electronics, a new standard for distribution of CMS trigger and timing control signals and for high-bandwidth data acquisition, as well as participating in Run 2 physics.

  6. A 10 Kelvin Magnet for Space-Flight ADRs

    NASA Technical Reports Server (NTRS)

    Tuttle, James; Pourrahimi, Shahin; Shirron, Peter; Canavan, Edgar; DiPirro, Michael; Riall, Sara

    2003-01-01

    Future NASA missions will include detectors cooled by adiabatic demagnetization refrigerators (ADRs) coupled with mechanical cryocoolers. A lightweight, low-current 10 Kelvin magnet would allow the interface between these devices to be at temperatures as high as 10 Kelvin, adding flexibility to the instrument design. We report on the testing of a standard-technology Nb3Sn magnet and the development of a lightweight, low-current 10 Kelvin magnet. We also discuss the outlook for flying a 10 Kelvin magnet as part of an ADR system.

  7. Users guide for ENVSTD program Version 2. 0 and LTGSTD program Version 2. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawley, D.B.; Riesen, P.K.; Briggs, R.S.

    1989-02-01

    On January 30, 1989, the US Department of Energy (DOE) promulgated 10 CFR Part 435, Subpart A, an Interim Rule entitled "Energy Conservation Voluntary Performance Standards for New Commercial and Multi-Family High Rise Residential Buildings; Mandatory for New Federal Buildings." As a consequence, federal agencies must design all future federal commercial and multifamily high rise residential buildings in accordance with the Standards, or show that their current standards already meet or exceed the energy-efficiency requirements of the Standards. Although these newly enacted Standards do not regulate the design of nonfederal buildings, DOE recommends that all design professionals use the Standards as guidelines for designing energy-conserving buildings. To encourage private sector use, the Standards were presented in the January 30, 1989, Federal Register in the format typical of commercial standards rather than a federal regulation. As a further help, DOE supported the development of various microcomputer programs to ease the use of the Standards. Two of these programs, ENVSTD (Version 2.0) and LTGSTD (Version 2.0), are detailed in this users guide and provided on the accompanying diskette. This package, developed by Pacific Northwest Laboratory (PNL), is intended to facilitate the designer's use of the Standards dealing specifically with a building's envelope and lighting system designs. Using these programs will greatly simplify the designer's task of performing the sometimes complex calculations needed to determine a design's compliance with the Standards. 3 refs., 6 figs.

  8. Space Flyable Hg(sup +) Frequency Standards

    NASA Technical Reports Server (NTRS)

    Prestage, John D.; Maleki, Lute

    1994-01-01

    We discuss a design for a space based atomic frequency standard (AFS) based on Hg(sup +) ions confined in a linear ion trap. This newly developed AFS should be well suited for space borne applications because it can supply the ultra-high stability of a H-maser while its total mass is comparable to that of a NAVSTAR/GPS cesium clock, i.e., about 11 kg. This paper compares the proposed Hg(sup +) AFS to the present day GPS cesium standards to arrive at the 11 kg mass estimate. The proposed space borne Hg(sup +) standard is based upon the recently developed extended linear ion trap architecture, which has reduced the size of existing trapped Hg(sup +) standards to a physics package comparable in size to a cesium beam tube. The demonstrated frequency stability below 10(sup -15) of existing Hg(sup +) standards should be maintained or even improved upon in this new architecture. This clock would deliver far more frequency stability per kilogram than any current day space qualified standard.

  9. The most intense current sheets in the high-speed solar wind near 1 AU

    NASA Astrophysics Data System (ADS)

    Podesta, John J.

    2017-03-01

    Electric currents in the solar wind plasma are investigated using 92 ms fluxgate magnetometer data acquired in a high-speed stream near 1 AU. The minimum resolvable scale is roughly 0.18 s in the spacecraft frame or, using Taylor's "frozen turbulence" approximation, one proton inertial length di in the plasma frame. A new way of identifying current sheets is developed that utilizes a proxy for the current density J obtained from the derivatives of the three orthogonal components of the observed magnetic field B. The most intense currents are identified as 5σ events, where σ is the standard deviation of the current density. The observed 5σ events are characterized by an average scale size of approximately 3di along the flow direction of the solar wind, a median separation of around 50di or 100di along the flow direction of the solar wind, and a peak current density on the order of 0.5 pA/cm2. The associated current-carrying structures are consistent with current sheets; however, the planar geometry of these structures cannot be confirmed using single-point, single-spacecraft measurements. If Taylor's hypothesis continues to hold for the energetically dominant fluctuations at kinetic scales 1
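    The 5σ selection scheme described in this abstract can be sketched in a few lines. The synthetic data, sampling cadence, and function names below are illustrative assumptions, not the author's code:

    ```python
    import numpy as np

    def five_sigma_events(bx, by, bz, dt):
        """Flag intense-current events from single-spacecraft magnetometer data.

        A proxy for the current density is built from the time derivatives of the
        three orthogonal magnetic field components (per the method described in
        the abstract); events are flagged where the proxy exceeds 5 standard
        deviations. Field arrays are in nT, dt in seconds.
        """
        # Finite-difference derivatives of each field component
        dbx = np.diff(bx) / dt
        dby = np.diff(by) / dt
        dbz = np.diff(bz) / dt
        # Magnitude of the derivative vector serves as the current-density proxy
        j_proxy = np.sqrt(dbx**2 + dby**2 + dbz**2)
        sigma = j_proxy.std()
        return j_proxy, np.flatnonzero(j_proxy > 5 * sigma)

    # Synthetic example: a quiet field with one sharp rotation (a "current sheet")
    rng = np.random.default_rng(0)
    n = 10_000
    bx = rng.normal(0, 0.05, n).cumsum() * 0.01   # slow random-walk background
    by = np.zeros(n)
    bz = np.full(n, 5.0)
    by[5000:5003] = [2.0, 4.0, 2.0]               # abrupt field rotation
    j, events = five_sigma_events(bx, by, bz, dt=0.092)
    ```

    On the synthetic trace, only the samples spanning the abrupt rotation exceed the 5σ threshold; the quiet background never does.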

  10. Methods of Measurement of High Air Velocities by the Hot-wire Method

    NASA Technical Reports Server (NTRS)

    Weske, John R.

    1943-01-01

    Investigations of strengths of hot wires at high velocities were conducted with platinum, nickel, and tungsten at approximately 200 degrees Celsius hot-wire temperature. The results appear to disqualify platinum for velocities approaching the sonic range, whereas nickel withstands sound velocity, and tungsten may be used for supersonic velocities under standard atmospheric conditions. Hot wires must be supported by rigid prongs at high velocities to avoid wire breakage. Heating current measurements for constant temperature show agreement with King's relation.

  11. Potential impacts of climate change on the built environment: ASHRAE climate zones, building codes and national energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Kumar, Jitendra; Hoffman, Forrest M.

    Statement of the Problem: ASHRAE releases updates to 90.1, “Energy Standard for Buildings except Low-Rise Residential Buildings,” every three years, resulting in a 3.7%-17.3% increase in energy efficiency for buildings with each release. This standard is adopted by or informs building codes in nations across the globe, is the national standard for the US, and individual states elect which release year of the standard they will enforce. These codes are built upon Standard 169, “Climatic Data for Building Design Standards,” the latest 2017 release of which defines climate zones based on 8,118 weather stations throughout the world and data from the past 8-25 years. This data may not be indicative of the weather that new buildings built today will see during their upcoming 30-120 year lifespan. Methodology & Theoretical Orientation: Using modern, high-resolution datasets from climate satellites, IPCC climate models (PCM and HadGCM), high performance computing resources (Titan), and new capabilities for clustering and optimization, the authors analyzed different methods for redefining climate zones. The analysis was bottom-up, using multiple meteorological variables that subject-matter experts selected as important to energy consumption, rather than the heating/cooling degree days currently used. Findings: We analyzed the accuracy of the redefined climate zones compared to the current climate zones, examined how the climate zones moved under different climate change scenarios, and quantified the accuracy of these methods at a local level and at a national scale for the US. Conclusion & Significance: There is likely a significant annual, national energy and cost (billions USD) savings that could be realized by adjusting climate zones to take into account anticipated trends or scenarios in regional weather patterns.

  12. A comparison of high-speed links, their commercial support and ongoing R&D activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, H.L.; Barsotti, E.; Zimmermann, S.

    Technological advances and a demanding market have forced the development of higher bandwidth communication standards for networks, data links and busses. Most of these emerging standards are gathering enough momentum that their widespread availability and lower prices are anticipated. The hardware and software that support the physical media for most of these links are currently available, allowing the user community to implement fairly high-bandwidth data links and networks with commercial components. Also, the switches needed to support these networks are available or being developed. The commercial support of high-bandwidth data links, networks and switching fabrics provides a powerful base for the implementation of high-bandwidth data acquisition systems. A large data acquisition system like the one for the Solenoidal Detector Collaboration (SDC) at the SSC can benefit from links and networks that support an integrated systems engineering approach, for initialization, downloading, diagnostics, monitoring, hardware integration and event data readout. The issue that our current work addresses is the possibility of having a channel/network that satisfies the requirements of an integrated data acquisition system. In this paper we present a brief description of high-speed communication links and protocols that we consider of interest for high energy physics: the High Performance Parallel Interface (HIPPI), Serial HIPPI, Fibre Channel (FC) and the Scalable Coherent Interface (SCI). In addition, the initial work required to implement an SDC-like data acquisition system is described.

  14. High-throughput GPU-based LDPC decoding

    NASA Astrophysics Data System (ADS)

    Chang, Yang-Lang; Chang, Cheng-Chun; Huang, Min-Yu; Huang, Bormin

    2010-08-01

    Low-density parity-check (LDPC) code is a linear block code known to approach the Shannon limit via the iterative sum-product algorithm. LDPC codes have been adopted in most current communication systems such as DVB-S2, WiMAX, Wi-Fi and 10GBASE-T. The need for reliable and flexible communication links across a wide variety of communication standards and configurations has inspired demand for high-performance, flexible computing. Accordingly, finding a fast and reconfigurable development platform for designing high-throughput LDPC decoders has become important, especially for rapidly changing communication standards and configurations. In this paper, a new graphics-processing-unit (GPU) LDPC decoding platform with asynchronous data transfer is proposed to realize this practical implementation. Experimental results showed that the proposed GPU-based decoder achieved a 271x speedup compared to its CPU-based counterpart. It can serve as a high-throughput LDPC decoder.

  15. NCI's national environmental research data collection: metadata management built on standards and preparing for the semantic web

    NASA Astrophysics Data System (ADS)

    Wang, Jingbo; Bastrakova, Irina; Evans, Ben; Gohar, Kashif; Santana, Fabiana; Wyborn, Lesley

    2015-04-01

    National Computational Infrastructure (NCI) manages national environmental research data collections (10+ PB) as part of its specialized high performance data node of the Research Data Storage Infrastructure (RDSI) program. We manage 40+ data collections using NCI's Data Management Plan (DMP), which is compatible with the ISO 19100 series of metadata standards. We utilize ISO standards to make sure our metadata is transferable and interoperable for sharing and harvesting. The DMP is used along with metadata from the data itself to create a hierarchy of data collection, dataset and time series catalogues that is then exposed through GeoNetwork for standard discoverability. These catalogues are linked using parent-child relationships. The hierarchical infrastructure of our GeoNetwork catalogue system aims to address both discoverability and in-house administrative use-cases. At NCI, we are currently improving the metadata interoperability in our catalogue by linking with standardized community vocabulary services. These emerging vocabulary services are being established to help harmonise data from different national and international scientific communities. One such vocabulary service is currently being established by the Australian National Data Service (ANDS). Data citation is another important aspect of the NCI data infrastructure: it allows tracking of data usage and infrastructure investment, encourages data sharing, and increases trust in research that is reliant on these data collections. We incorporate the standard vocabularies into the data citation metadata so that data citations become machine readable and semantically friendly for web-search purposes as well. By standardizing our metadata structure across our entire data corpus, we are laying the foundation to enable the application of appropriate semantic mechanisms to enhance discovery and analysis of NCI's national environmental research data. We expect that this will further increase data discoverability and encourage data sharing and reuse within the community, increasing the value of the data well beyond its current use.

  16. Special Issue of Solid-State Electronics, dedicated to EUROSOI-ULIS 2016

    NASA Astrophysics Data System (ADS)

    Sverdlov, Viktor; Selberherr, Siegfried

    2017-02-01

    The current special issue of Solid-State Electronics includes 29 extended papers presented at the 2016 Second Joint International EUROSOI Workshop and International Conference on Ultimate Integration on Silicon (EUROSOI-ULIS 2016), held in Wien, Austria, on January 25-27, 2016. The papers in this special issue were selected by the EUROSOI-ULIS 2016 Technical Program Committee based on the excellence of the abstracts submitted and the presentations delivered at the conference. In order to comply with the high standards of Solid-State Electronics, the manuscripts went through the standard reviewing procedure.

  17. A novel frequency analysis method for assessing K(ir)2.1 and Na (v)1.5 currents.

    PubMed

    Rigby, J R; Poelzing, S

    2012-04-01

    Voltage clamping is an important tool for measuring individual currents from an electrically active cell. However, it is difficult to isolate individual currents without pharmacological or voltage inhibition. Herein, we present a technique that involves inserting a noise function into a standard voltage step protocol, which allows one to characterize the unique frequency response of an ion channel at different step potentials. Specifically, we compute the fast Fourier transform for a family of current traces at different step potentials for the inward rectifying potassium channel, K(ir)2.1, and the channel encoding the cardiac fast sodium current, Na(v)1.5. Each individual frequency magnitude, as a function of voltage step, is correlated to the peak current produced by each channel. The correlation coefficient vs. frequency relationship reveals that these two channels are associated with some unique frequencies with high absolute correlation. The individual IV relationship can then be recreated using only the unique frequencies with magnitudes of high absolute correlation. Thus, this study demonstrates that ion channels may exhibit unique frequency responses.
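    The frequency-domain correlation described in this abstract can be sketched as follows. The synthetic traces, step potentials, and helper names are illustrative assumptions rather than the authors' protocol:

    ```python
    import numpy as np

    def frequency_correlations(traces, peak_currents):
        """Correlate each FFT frequency magnitude with peak current across steps.

        traces: (n_steps, n_samples) array of current recordings, one per step
        potential; peak_currents: (n_steps,) peak current at each step.
        Returns the correlation coefficient of each frequency's magnitude with
        the peak current, following the approach described in the abstract.
        """
        mags = np.abs(np.fft.rfft(traces, axis=1))   # (n_steps, n_freqs)
        r = np.empty(mags.shape[1])
        for k in range(mags.shape[1]):
            r[k] = np.corrcoef(mags[:, k], peak_currents)[0, 1]
        return r

    # Synthetic family of traces: the amplitude of one sinusoidal component
    # (period 50 samples) scales with the peak current, plus broadband noise
    rng = np.random.default_rng(1)
    steps = np.linspace(-100e-3, 40e-3, 15)          # step potentials (V)
    peaks = np.abs(steps) * 1e-9                     # fake peak currents (A)
    t = np.arange(1000)
    traces = np.array([p * 1e9 * np.sin(2 * np.pi * t / 50)
                       + rng.normal(0, 0.01, t.size) for p in peaks])
    r = frequency_correlations(traces, peaks)
    best = int(np.argmax(np.abs(r)))                 # the "unique" frequency bin
    ```

    On this synthetic family, the single frequency carrying the voltage-dependent component (bin 1000/50 = 20) stands out with a correlation near 1, mirroring how the abstract's method isolates frequencies of high absolute correlation.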

  18. Electrothermal Action of the Pulse of the Current of a Short Artificial-Lightning Stroke on Test Specimens of Wires and Cables of Electric Power Objects

    NASA Astrophysics Data System (ADS)

    Baranov, M. I.; Rudakov, S. V.

    2018-03-01

    The authors have given results of investigations of the electrothermal action of aperiodic pulses of temporal shape 10/350 μs of the current of a short artificial-lightning stroke on test specimens of electric wires and cables with copper and aluminum cores and sheaths with polyvinylchloride and polyethylene insulations of power circuits of industrial electric power objects. It has been shown that the thermal stability of such wires and cables is determined by the action integral of the indicated current pulse. The authors have found the maximum permissible and critical densities of this pulse in copper and aluminum current-carrying parts of the wires and cables. High-current experiments conducted under high-voltage laboratory conditions on a unique generator of 10/350 μs pulses of an artificial-lightning current with amplitude-time parameters normalized according to the existing requirements of international and national standards and with tolerances on them have confirmed the reliability of the proposed calculated estimate for thermal lightning resistance of cabling and wiring products.

  19. Electrothermal Action of the Pulse of the Current of a Short Artificial-Lightning Stroke on Test Specimens of Wires and Cables of Electric Power Objects

    NASA Astrophysics Data System (ADS)

    Baranov, M. I.; Rudakov, S. V.

    2018-05-01

    The authors have given results of investigations of the electrothermal action of aperiodic pulses of temporal shape 10/350 μs of the current of a short artificial-lightning stroke on test specimens of electric wires and cables with copper and aluminum cores and sheaths with polyvinylchloride and polyethylene insulations of power circuits of industrial electric power objects. It has been shown that the thermal stability of such wires and cables is determined by the action integral of the indicated current pulse. The authors have found the maximum permissible and critical densities of this pulse in copper and aluminum current-carrying parts of the wires and cables. High-current experiments conducted under high-voltage laboratory conditions on a unique generator of 10/350 μs pulses of an artificial-lightning current with amplitude-time parameters normalized according to the existing requirements of international and national standards and with tolerances on them have confirmed the reliability of the proposed calculated estimate for thermal lightning resistance of cabling and wiring products.

  20. Impedance of an intense plasma-cathode electron source for tokamak startup

    NASA Astrophysics Data System (ADS)

    Hinson, E. T.; Barr, J. L.; Bongard, M. W.; Burke, M. G.; Fonck, R. J.; Perry, J. M.

    2016-05-01

    An impedance model is formulated and tested for the ~1 kV, 1 kA/cm^2 arc-plasma cathode electron source used for local helicity injection tokamak startup. A double layer sheath is established between the high-density arc plasma (n_arc ≈ 10^21 m^-3) within the electron source and the less dense external tokamak edge plasma (n_edge ≈ 10^18 m^-3) into which current is injected at the applied injector voltage, V_inj. Experiments on the Pegasus spherical tokamak show that the injected current, I_inj, increases with V_inj according to the standard double layer scaling I_inj ~ V_inj^(3/2) at low current and transitions to I_inj ~ V_inj^(1/2) at high currents. In this high-current regime, sheath expansion and/or space charge neutralization impose limits on the beam density n_b ~ I_inj/V_inj^(1/2). For low tokamak edge density n_edge and high I_inj, the inferred beam density n_b is consistent with the requirement n_b ≤ n_edge imposed by space-charge neutralization of the beam in the tokamak edge plasma. At sufficient edge density, n_b ~ n_arc is observed, consistent with a limit to n_b imposed by expansion of the double layer sheath. These results suggest that n_arc is a viable control actuator for the source impedance.
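    The two scaling regimes can be told apart by the fitted power-law exponent of the I-V characteristic. A minimal sketch on synthetic data (units and prefactors are arbitrary assumptions, not values from the experiment):

    ```python
    import numpy as np

    def regime_slope(v, i):
        """Fit the power-law exponent n in I ~ V^n from (V, I) samples,
        via a linear fit in log-log space."""
        n, _ = np.polyfit(np.log(v), np.log(i), 1)
        return n

    # Synthetic data for each regime (arbitrary units, purely illustrative)
    v = np.linspace(100.0, 1000.0, 50)
    i_low = 1e-3 * v ** 1.5     # space-charge-limited double layer: I ~ V^(3/2)
    i_high = 2.0 * v ** 0.5     # density-limited regime: I ~ V^(1/2)
    ```

    Applied to measured injector data, a fitted exponent near 3/2 would indicate the standard double layer scaling and one near 1/2 the high-current, density-limited regime.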

  1. Real-time image sequence segmentation using curve evolution

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Liu, Weisong

    2001-04-01

    In this paper, we describe a novel approach to image sequence segmentation and its real-time implementation. This approach uses the 3D structure tensor to produce a more robust frame difference signal and uses curve evolution to extract whole objects. Our algorithm is implemented on a standard PC running the Windows operating system with video capture from a USB camera that is a standard Windows video capture device. Using the Windows standard video I/O functionalities, our segmentation software is highly portable and easy to maintain and upgrade. In its current implementation on a Pentium 400, the system can perform segmentation at 5 frames/sec with a frame resolution of 160 by 120.

  2. Landsat Image Map Production Methods at the U. S. Geological Survey

    USGS Publications Warehouse

    Kidwell, R.D.; Binnie, D.R.; Martin, S.

    1987-01-01

    To maintain consistently high quality in satellite image map production, the U. S. Geological Survey (USGS) has developed standard procedures for the photographic and digital production of Landsat image mosaics, and for lithographic printing of multispectral imagery. This paper gives a brief review of the photographic, digital, and lithographic procedures currently in use for producing image maps from Landsat data. It is shown that consistency in the printing of image maps is achieved by standardizing the materials and procedures that affect the image detail and color balance of the final product. Densitometric standards are established by printing control targets using the pressplates, inks, pre-press proofs, and paper to be used for printing.

  3. Insulation Resistance and Leakage Currents in Low-Voltage Ceramic Capacitors with Cracks

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander A.

    2014-01-01

    Measurement of insulation resistance (IR) in multilayer ceramic capacitors (MLCCs) is considered a screening technique that ensures the dielectric is defect-free. This work analyzes the effectiveness of this technique for revealing cracks in ceramic capacitors. It is shown that absorption currents prevail over the intrinsic leakage currents during standard IR measurements at room temperature. Absorption currents, and consequently IR, have a weak temperature dependence, increase linearly with voltage (before saturation), and are not sensitive to the presence of mechanical defects. In contrast, intrinsic leakage currents increase super-linearly with voltage and exponentially with temperature (the activation energy is in the range from 0.6 eV to 1.1 eV). Leakage currents associated with the presence of cracks have a weaker dependence on temperature and voltage than the intrinsic leakage currents. For this reason, intrinsic leakage currents prevail at high temperatures and voltages, thus masking the presence of defects.
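    The exponential temperature dependence quoted above is why intrinsic leakage swamps the defect signature at high temperature. A minimal sketch of the Arrhenius growth factor, taking Ea = 0.8 eV from the middle of the reported 0.6-1.1 eV range (an illustrative choice, not a value singled out by the paper):

    ```python
    import math

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def arrhenius_ratio(ea_ev, t1_k, t2_k):
        """Factor by which a thermally activated current I(T) = I0*exp(-Ea/kT)
        grows when the temperature rises from t1_k to t2_k."""
        return math.exp(ea_ev / K_B * (1.0 / t1_k - 1.0 / t2_k))

    # Heating from 25 C (298 K) to 125 C (398 K) multiplies the intrinsic
    # leakage by roughly 2.5e3; the weakly activated absorption and
    # crack-related currents grow far less, so the intrinsic term dominates.
    growth = arrhenius_ratio(0.8, 298.0, 398.0)
    ```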

  4. Insulation Resistance and Leakage Currents in Low-Voltage Ceramic Capacitors with Cracks

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander A.

    2016-01-01

    Measurement of insulation resistance (IR) in multilayer ceramic capacitors (MLCCs) is considered a screening technique that ensures the dielectric is defect-free. This work analyzes the effectiveness of this technique for revealing cracks in ceramic capacitors. It is shown that absorption currents prevail over the intrinsic leakage currents during standard IR measurements at room temperature. Absorption currents, and consequently IR, have a weak temperature dependence, increase linearly with voltage (before saturation), and are not sensitive to the presence of mechanical defects. In contrast, intrinsic leakage currents increase super-linearly with voltage and exponentially with temperature (the activation energy is in the range from 0.6 eV to 1.1 eV). Leakage currents associated with the presence of cracks have a weaker dependence on temperature and voltage than the intrinsic leakage currents. For this reason, intrinsic leakage currents prevail at high temperatures and voltages, thus masking the presence of defects.

  5. Frequency optimization in the eddy current test for high purity niobium

    NASA Astrophysics Data System (ADS)

    Joung, Mijoung; Jung, Yoochul; Kim, Hyungjin

    2017-01-01

    The eddy current test (ECT) is frequently used as a non-destructive method to check for defects in the high purity niobium (RRR 300, where RRR is the residual resistivity ratio) of a superconducting radio frequency (SRF) cavity. Determining an optimal frequency corresponding to the specific material properties and probe specification is a very important step. ECT experiments on high purity Nb were performed to determine the optimal frequency, using a standard sample of high purity Nb containing artificial defects. The target depth was chosen to account for the treatment steps that the niobium receives as the SRF cavity material. The results were analysed in terms of selectivity, that is, how distinctly the response varies with the size of the defects. According to the results, the optimal frequency was determined to be 200 kHz, and a few features of the ECT for high purity Nb were observed.
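    Frequency selection in ECT is usually weighed against the standard depth of penetration, delta = 1/sqrt(pi*f*mu*sigma): higher frequencies concentrate the eddy currents nearer the surface. The sketch below assumes a room-temperature Nb resistivity of about 1.5e-7 ohm*m purely for illustration; neither the value nor the calculation is taken from the paper:

    ```python
    import math

    MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

    def skin_depth(resistivity_ohm_m, freq_hz, mu_r=1.0):
        """Standard eddy current depth of penetration,
        delta = sqrt(rho / (pi * f * mu_r * mu_0)), in metres."""
        return math.sqrt(resistivity_ohm_m / (math.pi * freq_hz * mu_r * MU_0))

    # Assumed room-temperature Nb resistivity (illustrative value)
    rho_nb = 1.5e-7
    for f in (50e3, 100e3, 200e3, 400e3):
        print(f"{f/1e3:>5.0f} kHz: delta = {skin_depth(rho_nb, f)*1e3:.2f} mm")
    ```

    Under these assumptions, 200 kHz puts the penetration depth in the sub-millimetre range, illustrating the trade-off between probing depth and defect selectivity that the frequency optimization addresses.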

  6. Development of a Hospital Outcome Measure Intended for Use With Electronic Health Records: 30-Day Risk-standardized Mortality After Acute Myocardial Infarction.

    PubMed

    McNamara, Robert L; Wang, Yongfei; Partovian, Chohreh; Montague, Julia; Mody, Purav; Eddy, Elizabeth; Krumholz, Harlan M; Bernheim, Susannah M

    2015-09-01

    Electronic health records (EHRs) offer the opportunity to transform quality improvement by using clinical data to compare hospital performance without the burden of chart abstraction. However, current performance measures using EHRs are lacking. With support from the Centers for Medicare & Medicaid Services (CMS), we developed an outcome measure of hospital risk-standardized 30-day mortality rates for patients with acute myocardial infarction for use with EHR data. As no appropriate source of EHR data was available, we merged clinical registry data from the Action Registry-Get With The Guidelines with claims data from CMS to develop the risk model (2009 data for development, 2010 data for validation). We selected candidate variables that could feasibly be extracted from current EHRs and that do not require changes to standard clinical practice or data collection. We used logistic regression with stepwise selection and bootstrapping simulation for model development. The final risk model included 5 variables available on presentation: age, heart rate, systolic blood pressure, troponin ratio, and creatinine level. The area under the receiver operating characteristic curve was 0.78. Hospital risk-standardized mortality rates ranged from 9.6% to 13.1%, with a median of 10.7%. The odds of mortality for a high-mortality hospital (+1 SD) were 1.37 times those for a low-mortality hospital (-1 SD). This measure represents the first outcome measure endorsed by the National Quality Forum for public reporting of hospital quality based on clinical data in the EHR. By being compatible with current clinical practice and existing EHR systems, this measure is a model for future quality improvement measures.
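    A logistic-regression risk model of the kind described above can be sketched on synthetic data. The cohort, the coefficients, and the plain gradient-descent fitter below are illustrative assumptions, not the published model or its estimates:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, iters=2000):
        """Plain gradient-descent logistic regression (an illustrative stand-in
        for the stepwise-selected model described in the abstract)."""
        X = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
        w = np.zeros(X.shape[1])
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
            w += lr * X.T @ (y - p) / len(y)        # ascend the log-likelihood
        return w

    # Synthetic cohort with the five presentation variables from the measure:
    # age, heart rate, systolic BP, troponin ratio, creatinine (standardized)
    rng = np.random.default_rng(2)
    X = rng.normal(size=(2000, 5))
    true_w = np.array([0.8, 0.4, -0.5, 1.0, 0.6])   # made-up coefficients
    logit = -2.0 + X @ true_w
    y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
    w = fit_logistic(X, y)                          # [intercept, 5 coefficients]
    ```

    On this synthetic cohort the fitter recovers the signs and rough magnitudes of the generating coefficients; a real development effort would add the stepwise selection and bootstrap validation the abstract describes.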

  7. Wavelet/scalar quantization compression standard for fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brislawn, C.M.

    1996-06-12

    US Federal Bureau of Investigation (FBI) has recently formulated a national standard for digitization and compression of gray-scale fingerprint images. Fingerprints are scanned at a spatial resolution of 500 dots per inch, with 8 bits of gray-scale resolution. The compression algorithm for the resulting digital images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition (wavelet/scalar quantization method). The FBI standard produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. The compression standard specifies a class of potential encoders and a universal decoder with sufficient generality to reconstruct compressed images produced by any compliant encoder, allowing flexibility for future improvements in encoder technology. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations.

  8. Cord Blood Banking Standards: Autologous Versus Altruistic

    PubMed Central

    Armitage, Sue

    2016-01-01

    Cord blood (CB) is either donated to public CB banks for use by any patient worldwide for whom it is a match or stored in a private bank for potential autologous or family use. It is a unique cell product that has potential for treating life-threatening diseases. The majority of CB products used today are for hematopoietic stem cell transplantation and are accessed from public banks. CB is still evolving as a hematopoietic stem cell source, developing as a source for cellular immunotherapy products, such as natural killer, dendritic, and T-cells, and fast emerging as a non-hematopoietic stem cell source in the field of regenerative medicine. This review explores the regulations, standards, and accreditation schemes that are currently available nationally and internationally for public and private CB banking. Currently, most private banking is under-regulated compared with public banking. Regulations and standards were initially developed to address the public arena. The medical field's early response to private CB banking was that, at the time, because of insufficient scientific data to support autologous banking and given the difficulty of making an accurate estimate of the need for autologous transplantation, private storage of CB as “biological insurance” should be discouraged (1, 2, 3). To ensure success and the true realization of the full potential of CB, whether for autologous or allogeneic use, it is essential that each and every product provided for current and future treatments meets high-quality, international standards. PMID:26779485

  9. Standardising Responsibility? The Significance of Interstitial Spaces.

    PubMed

    Wickson, Fern; Forsberg, Ellen-Marie

    2015-10-01

    Modern society is characterised by rapid technological development that is often socially controversial and plagued by extensive scientific uncertainty concerning its socio-ecological impacts. Within this context, the concept of 'responsible research and innovation' (RRI) is currently rising to prominence in international discourse concerning science and technology governance. As this emerging concept of RRI begins to be enacted through instruments, approaches, and initiatives, it is valuable to explore what it is coming to mean for and in practice. In this paper we draw attention to a realm that is often backgrounded in current discussions of RRI but which has a highly significant impact on scientific research, innovation and policy: namely, the interstitial space of international standardization. Drawing on the case of nanoscale sciences and technologies to make our argument, we present examples of how international standards are already entangled in the development of RRI and yet how the process of international standardization itself largely fails to embody the norms proposed as characterizing RRI. We suggest that although current models for RRI provide a promising attempt to make research and innovation more responsive to societal needs, ethical values and environmental challenges, such approaches will need to encompass and address a greater diversity of innovation system agents and spaces if they are to prove successful in their aims.

  10. Electronic Model of a Ferroelectric Field Effect Transistor

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat Duen; Russell, Larry (Technical Monitor)

    2001-01-01

    A pair of electronic models has been developed for a Ferroelectric Field Effect Transistor (FFET). These models can be used in standard electrical circuit simulation programs to simulate the main characteristics of the FFET. The models use the Schmitt trigger circuit as a basis for their design. One model uses bipolar junction transistors and one uses MOSFETs. Each model has the main characteristics of the FFET, which are the current hysteresis with different gate voltages and decay of the drain current when the gate voltage is off. The drain current from each model has similar values to an actual FFET that was measured experimentally. The input and output resistances in the models are also similar to those of the FFET. The models are valid for all frequencies below RF levels. No attempt was made to model the high frequency characteristics of the FFET. Each model can be used to design circuits using FFETs with standard electrical simulation packages. These circuits can be used in designing non-volatile memory circuits and logic circuits and are compatible with all SPICE-based circuit analysis programs. The models consist of only standard electrical components, such as BJTs, MOSFETs, diodes, resistors, and capacitors. Each model is compared to the experimental data measured from an actual FFET.
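
    The two FFET traits the models reproduce, gate-voltage hysteresis and drain-current decay, can be illustrated with a small behavioral sketch. The thresholds, currents, and time constant below are invented for illustration; this is not the paper's BJT/MOSFET circuit model.

```python
import math

# Toy behavioral sketch of the two FFET traits the models reproduce:
# gate-voltage hysteresis (Schmitt-trigger style, state-dependent
# threshold) and drain-current decay after the gate is switched off.
# All component values here are invented for illustration.

V_TH_UP, V_TH_DOWN = 1.5, 0.5    # rising / falling thresholds (V)
I_ON, I_OFF = 1e-3, 1e-6         # drain current in each state (A)
TAU = 10.0                       # retention decay time constant (s)

def drain_current(v_gate, state_on):
    """Schmitt-trigger hysteresis: the threshold depends on the state."""
    if state_on:
        state_on = v_gate > V_TH_DOWN   # stays on until V_g < 0.5 V
    else:
        state_on = v_gate > V_TH_UP     # turns on only above 1.5 V
    return (I_ON if state_on else I_OFF), state_on

def decayed_current(i0, t):
    """Drain current decay after the gate voltage is removed."""
    return i0 * math.exp(-t / TAU)

state = False
currents = []
for v_g in [0.0, 1.0, 1.6, 1.0, 0.4]:   # sweep up, then back down
    i_d, state = drain_current(v_g, state)
    currents.append(i_d)
# At V_g = 1.0 V the device is off on the way up but on on the way down,
# which is the hysteresis loop a voltage sweep traces out.
```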

  11. Global Risk Assessment of Aflatoxins in Maize and Peanuts: Are Regulatory Standards Adequately Protective?

    PubMed Central

    Wu, Felicia

    2013-01-01

    The aflatoxins are a group of fungal metabolites that contaminate a variety of staple crops, including maize and peanuts, and cause an array of acute and chronic human health effects. Aflatoxin B1 in particular is a potent liver carcinogen, and hepatocellular carcinoma (HCC) risk is multiplicatively higher for individuals exposed to both aflatoxin and chronic infection with hepatitis B virus (HBV). In this work, we sought to answer the question: do current aflatoxin regulatory standards around the world adequately protect human health? Depending upon the level of protection desired, the answer to this question varies. Currently, most nations have a maximum tolerable level of total aflatoxins in maize and peanuts ranging from 4 to 20 ng/g. If the level of protection desired is that aflatoxin exposures would not increase lifetime HCC risk by more than 1 in 100,000 cases in the population, then most current regulatory standards are not adequately protective even if enforced, especially in low-income countries where large amounts of maize and peanuts are consumed and HBV prevalence is high. At the protection level of 1 in 10,000 lifetime HCC cases in the population, however, almost all aflatoxin regulations worldwide are adequately protective, with the exception of several nations in Africa and Latin America. PMID:23761295
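
    A back-of-envelope sketch shows the risk arithmetic behind such assessments. The potency factors (~0.3 and ~0.01 extra HCC cases/year per 100,000 people per ng aflatoxin/kg body weight/day, for HBV carriers and non-carriers) are commonly cited JECFA-style estimates; the intake and body-weight figures are illustrative assumptions, not values from this paper.

```python
# Illustrative risk arithmetic: exposure (ng/kg bw/day) times a
# population-weighted cancer potency factor. Potency values are
# commonly cited JECFA-style estimates; intake figures are assumptions.

def hcc_cases_per_100k_year(aflatoxin_ng_g, intake_g_day, body_kg, hbv_prev):
    exposure = aflatoxin_ng_g * intake_g_day / body_kg   # ng/kg bw/day
    potency = hbv_prev * 0.3 + (1 - hbv_prev) * 0.01     # cases/yr/100k per unit
    return exposure * potency

# Maize at the 20 ng/g limit, 400 g/day intake, 60 kg adult, 10% HBV:
risk = hcc_cases_per_100k_year(20, 400, 60, 0.10)   # ~5 cases/yr per 100,000
```

    Accumulated over a roughly 70-year lifespan, an annual rate of this size dwarfs a 1-in-100,000 lifetime target, which is the abstract's point about high-intake, high-HBV-prevalence settings.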

  12. Unwrapping eddy current compensation: improved compensation of eddy current induced baseline shifts in high-resolution phase-contrast MRI at 9.4 Tesla.

    PubMed

    Espe, Emil K S; Zhang, Lili; Sjaastad, Ivar

    2014-10-01

    Phase-contrast MRI (PC-MRI) is a versatile tool allowing evaluation of in vivo motion, but is sensitive to eddy current induced phase offsets, causing errors in the measured velocities. In high-resolution PC-MRI, these offsets can be sufficiently large to cause wrapping in the baseline phase, rendering conventional eddy current compensation (ECC) inadequate. The purpose of this study was to develop an improved ECC technique (unwrapping ECC) able to handle baseline phase discontinuities. Baseline phase discontinuities are unwrapped by minimizing the spatiotemporal standard deviation of the static-tissue phase. Computer simulations were used for demonstrating the theoretical foundation of the proposed technique. The presence of baseline wrapping was confirmed in high-resolution myocardial PC-MRI of a normal rat heart at 9.4 Tesla (T), and the performance of unwrapping ECC was compared with conventional ECC. Areas of phase wrapping in static regions were clearly evident in high-resolution PC-MRI. The proposed technique successfully eliminated discontinuities in the baseline, and resulted in significantly better ECC than the conventional approach. We report the occurrence of baseline phase wrapping in PC-MRI, and provide an improved ECC technique capable of handling its presence. Unwrapping ECC offers improved correction of eddy current induced baseline shifts in high-resolution PC-MRI. Copyright © 2013 Wiley Periodicals, Inc.
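
    The unwrapping step can be sketched in one dimension: static-tissue baseline phases that wrapped past ±π are shifted by integer multiples of 2π so that the spread of the static-tissue phase is minimized. Real PC-MRI baselines are 2-D plus time; this toy uses the median as the reference, which is one simple way to minimize the spread, not necessarily the authors' exact algorithm.

```python
import math
import statistics

# 1-D sketch of the "unwrapping ECC" idea: shift each static-tissue
# phase sample by a multiple of 2*pi so the spread (standard deviation)
# of the baseline is minimized. Median-referencing is an assumption here.

def unwrap_to_reference(phases):
    ref = statistics.median(phases)
    out = []
    for p in phases:
        # choose the 2*pi multiple that brings p closest to the reference
        k = round((ref - p) / (2 * math.pi))
        out.append(p + 2 * math.pi * k)
    return out

# Static tissue should sit near 0.2 rad, but two samples wrapped by -2*pi:
wrapped = [0.21, 0.19, 0.21 - 2 * math.pi, 0.20, 0.18 - 2 * math.pi]
fixed = unwrap_to_reference(wrapped)
```

    After unwrapping, a smooth baseline surface can be fitted to the static-tissue phase and subtracted, as in conventional ECC.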

  13. Electric fence standards comport with human data and AC limits.

    PubMed

    Kroll, Mark W; Perkins, Peter E; Panescu, Dorin

    2015-08-01

    The ubiquitous electric fence is essential to modern agriculture and has saved lives by reducing the number of livestock automobile collisions. Modern safety standards such as IEC 60335-2-76 and UL 69 have played a role in this positive result. However, these standards are essentially based on energy and power (RMS current), which have limited direct relationship to cardiac effects. We compared these standards to bioelectrically more relevant units of charge and average current in view of recent work on VF (ventricular fibrillation) induction and to existing IEC AC current limits. There are 3 limits for normal (low) pulsing rate: IEC energy limit, IEC current limit, and UL current limit. We then calculated the delivered charge allowed for each pulse duration for these limits and then compared them to a charge-based safety model derived from published human ventricular-fibrillation induction data. Both the IEC and UL also allow for rapid pulsing for up to 3 minutes. We calculated maximum outputs for various pulse durations assuming pulsing at 10, 20, and 30 pulses per second. These were then compared to standard utility power safety (AC) limits via the conversion factor of 7.4 to convert average current to RMS current for VF risk. The outputs of TASER electrical weapons (typically < 100 μC and ~100 μs duration) were also compared. The IEC and UL electric fence energizer normal rate standards are conservative in comparison with actual human laboratory experiments. The IEC and UL electric fence energizer rapid-pulsing standards are consistent with accepted IEC AC current limits for commercially used pulse durations.
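
    The unit conversions used in the comparison can be sketched directly. The 7.4 scale factor (average current to VF-risk-equivalent RMS current) is the conversion quoted in the abstract; the pulse charge and repetition rate below are illustrative assumptions, loosely matching the electrical-weapon figures mentioned there (~100 μC per pulse).

```python
# Sketch of the unit conversions described above. The 7.4 factor is the
# abstract's average-to-RMS-equivalent conversion for VF risk; the pulse
# parameters are illustrative assumptions.

def average_current_A(charge_C, pulses_per_second):
    """Average current of a pulse train: charge per pulse times rate."""
    return charge_C * pulses_per_second

def rms_equivalent_A(avg_current_A, factor=7.4):
    """Scale average current to an RMS current of comparable VF risk."""
    return avg_current_A * factor

avg = average_current_A(100e-6, 19)   # 100 uC pulses at an assumed 19 pps
rms_eq = rms_equivalent_A(avg)        # compare this against IEC AC limits
```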

  14. LHC benchmark scenarios for the real Higgs singlet extension of the standard model

    DOE PAGES

    Robens, Tania; Stefaniak, Tim

    2016-05-13

    Here, we present benchmark scenarios for searches for an additional Higgs state in the real Higgs singlet extension of the Standard Model in Run 2 of the LHC. The scenarios are selected such that they fulfill all relevant current theoretical and experimental constraints, but can potentially be discovered at the current LHC run. We take into account the results presented in earlier work and update the experimental constraints from relevant LHC Higgs searches and signal rate measurements. The benchmark scenarios are given separately for the low mass and high mass region, i.e. the mass range where the additional Higgs state is lighter or heavier than the discovered Higgs state at around 125 GeV. They have also been presented in the framework of the LHC Higgs Cross Section Working Group.

  15. Scientific misconduct, the pharmaceutical industry, and the tragedy of institutions.

    PubMed

    Cohen-Kohler, Jillian Clare; Esmail, Laura C

    2007-09-01

    This paper examines how current legislative and regulatory models do not adequately govern the pharmaceutical industry towards ethical scientific conduct. In the context of a highly profit-driven industry, governments need to ensure ethical and legal standards are not only in place for companies but that they are enforceable. We demonstrate with examples from both industrialized and developing countries how without sufficient controls, there is a risk that corporate behaviour will transgress ethical boundaries. We submit that there is a critical need for urgent drug regulatory reform. There must be robust regulatory structures in place which enforce corporate governance mechanisms to ensure that pharmaceutical companies maintain ethical standards in drug research and development and the marketing of pharmaceuticals. What is also needed is for the pharmaceutical industry to adopt authentic "corporate social responsibility" policies as current policies and practices are insufficient.

  16. (GERD)].

    PubMed

    Olmos, Jorge A; Piskorz, María Marta; Vela, Marcelo F

    2016-06-01

    GERD is a highly prevalent disease in our country. It has a deep impact on patients' quality of life and imposes extremely high healthcare costs. A correct understanding of its pathophysiology is crucial for the rational use of diagnostic methods and the implementation of appropriate treatment tailored to each individual case. In this review we evaluate this disorder based on the best available evidence, focusing on pathophysiological mechanisms, epidemiology, modern diagnostic methods, and current management standards.

  17. Highlights from High Energy Neutrino Experiments at CERN

    NASA Astrophysics Data System (ADS)

    Schlatter, W.-D.

    2015-07-01

    Experiments with high energy neutrino beams at CERN provided early quantitative tests of the Standard Model. This article describes results from studies of the nucleon quark structure and of the weak current, together with the precise measurement of the weak mixing angle. These results have established a new quality for tests of the electroweak model. In addition, the measurements of the nucleon structure functions in deep inelastic neutrino scattering allowed first quantitative tests of QCD.

  18. Trends in noncompliance with milk quality standards for Dairy Herd Improvement herds in the United States

    USDA-ARS?s Scientific Manuscript database

    Frequency of herd noncompliance for somatic cell count (SCC) based on current US and European Union (EU) standards as well as for standards proposed by the National Milk Producers Federation (NMPF) was examined for US Dairy Herd Improvement (DHI) herds. For current US standards, regulatory action is...

  19. Reach for Reference. No Opposition Here! Opposing Viewpoints Resource Center Is a Very Good Database

    ERIC Educational Resources Information Center

    Safford, Barbara Ripp

    2004-01-01

    "Opposing Viewpoints" and "Opposing Viewpoints Juniors" have long been standard titles in upper elementary, middle level, and high school collections. "Opposing Viewpoints Juniors" should be required as information literacy/critical thinking curriculum tools as early as fifth grade as they use current controversies to teach students how to…

  20. Developing a User Oriented Design Methodology for Learning Activities Using Boundary Objects

    ERIC Educational Resources Information Center

    Fragou, Olga; Kameas, Achilles

    2013-01-01

    International standards in Higher and Open and Distance Education are used for developing Open Educational Resources (OERs). Current issues in the e-learning community are the specification of learning chunks and the definition of describing designs for different units of learning (activities, units, courses) in a generic though expandable format.…

  1. A Diploma Worth Having

    ERIC Educational Resources Information Center

    Wiggins, Grant

    2011-01-01

    High school is boring, writes the author, in part because lock-step diploma requirements crowd out personalized and engaged learning. It is also boring because current content standards are based on traditional, subject-area notions of curriculum instead of on the essential question, What do students need to be well prepared for their adult lives?…

  2. Experiences of Teacher Evaluation Systems on High School Physical Education Programs

    ERIC Educational Resources Information Center

    Phillips, Sharon R.; Mercier, Kevin; Doolittle, Sarah

    2017-01-01

    Primary objective: Teacher evaluation is being revamped by policy-makers. The marginalized status of physical education has protected this subject area from reform for many decades, but in our current era of system-wide, data-based decision-making, physical education is no longer immune. Standardized and local testing, together with structured…

  3. Multidimensional Perfectionism and Internalizing Problems: Do Teacher and Classmate Support Matter?

    ERIC Educational Resources Information Center

    Fredrick, Stephanie Secord; Demaray, Michelle Kilpatrick; Jenkins, Lyndsay N.

    2017-01-01

    Adolescent stressors coupled with environmental demands, such as pressures to achieve, might lead to negative outcomes for some students. Students who worry about their ability to meet high standards might be more at risk of internalizing problems. The current study investigated the relations among perfectionism, social support, and internalizing…

  4. On the Factorial Structure of the SAT and Implications for Next-Generation College Readiness Assessments

    ERIC Educational Resources Information Center

    Wiley, Edward W.; Shavelson, Richard J.; Kurpius, Amy A.

    2014-01-01

    The name "SAT" has become synonymous with college admissions testing; it has been dubbed "the gold standard." Numerous studies on its reliability and predictive validity show that the SAT predicts college performance beyond high school grade point average. Surprisingly, studies of the factorial structure of the current version…

  5. Determining RNA quality for NextGen sequencing: some exceptions to the gold standard rule of 23S to 16S rRNA ratio

    USDA-ARS?s Scientific Manuscript database

    Using next-generation-sequencing technology to assess entire transcriptomes requires high quality starting RNA. Currently, RNA quality is routinely judged using automated microfluidic gel electrophoresis platforms and associated algorithms. Here we report that such automated methods generate false-n...

  6. Patterns in Authoring of Adaptive Educational Hypermedia: A Taxonomy of Learning Styles

    ERIC Educational Resources Information Center

    Brown, Elizabeth; Cristea, Alexandra; Stewart, Craig; Brailsford, Tim

    2005-01-01

    This paper describes the use of adaptation patterns in the task of formulating standards for adaptive educational hypermedia (AEH) systems that is currently under investigation by the EU ADAPT project. Within this project, design dimensions for high granularity patterns have been established. In this paper we focus on detailing lower granularity…

  7. A Humanizing Pedagogy: Reinventing the Principles and Practice of Education as a Journey toward Liberation

    ERIC Educational Resources Information Center

    Salazar, Maria del Carmen

    2013-01-01

    Students and educators are constrained from finding meaning in the current educational system as a result of the tension between educators' pedagogical practices and systemic constraints, such as high-stakes standardized tests and district-mandated instructional curriculum. Such restrictive educational policies limit educators from developing…

  8. The Sonoma Water Evaluation Trial (SWET): A randomized drinking water intervention trial to reduce gastrointestinal illness in older adults

    EPA Science Inventory

    Objectives. We estimate the risk of highly credible gastrointestinal illness (HCGI) among adults 55 and older in a community drinking tap water meeting current U.S. standards. Methods. We conducted a randomized, triple-blinded, crossover trial in 714 households (988 indiv...

  9. Engineering Education in Russia in an Era of Changes

    ERIC Educational Resources Information Center

    Lukianenko, M. V.; Polezhaev, O. A.; Churliaeva, N. P.

    2013-01-01

    Engineering education in Russia is undergoing reforms, but the history of this form of higher education does not indicate that it will succeed in bringing it into line with current world standards, or even making it more able to contribute at a high level to Russian economic growth. (Contains 5 notes.)

  10. 'Muscle-sparing' statins: preclinical profiles and future clinical use.

    PubMed

    Pfefferkorn, Jeffrey A

    2009-03-01

    Coronary heart disease (CHD) is a leading cause of death in the US, and hypercholesterolemia is a key risk factor for this disease. The current standard of care for treating hypercholesterolemia is the use of HMG-CoA reductase inhibitors, also known as statins, which block the rate-limiting step of cholesterol biosynthesis. In widespread clinical use, statins have proven safe and effective for both primary prevention of CHD and secondary prevention of coronary events. Results from several recent clinical trials have demonstrated that increasingly aggressive cholesterol-lowering therapy might offer additional protection against CHD compared with less aggressive treatment standards. While higher doses of current statin therapies are capable of achieving these more aggressive treatment goals, in certain cases statin-induced myalgia, the muscle pain or weakness that sometimes accompanies high-dose statin therapy, limits patient compliance with a treatment regimen. To address this limitation, efforts have been undertaken to develop highly hepatoselective statins that are capable of delivering best-in-class efficacy with minimized risk of dose-limiting myalgia. In this review, the preclinical and early clinical data for these next generation statins are discussed.

  11. Chemotherapy and target therapy in the management of adult high-grade gliomas.

    PubMed

    Spinelli, Gian Paolo; Miele, Evelina; Lo Russo, Giuseppe; Miscusi, Massimo; Codacci-Pisanelli, Giovanni; Petrozza, Vincenzo; Papa, Anselmo; Frati, Luigi; Della Rocca, Carlo; Gulino, Alberto; Tomao, Silverio

    2012-10-01

    Adult high grade gliomas (HGG) are the most frequent and fatal primary central nervous system (CNS) tumors. Despite recent advances in the knowledge of the pathology and the molecular features of this neoplasm, its prognosis remains poor. In the last years temozolomide (TMZ) has dramatically changed the life expectancy of these patients: the association of this drug with radiotherapy (RT), followed by TMZ alone, is the current standard of care. However, malignant gliomas often remain resistant to chemotherapy (CHT). Therefore, preclinical and clinical research efforts have been directed at identifying and understanding the different mechanisms of chemo-resistance operating in this subset of tumors, in order to develop effective strategies to overcome resistance. Moreover, the evidence of alterations in signal transduction pathways underlying tumor progression has increased the number of trials investigating molecular target agents, such as anti-epidermal growth factor receptor (EGFR) and anti-vascular endothelial growth factor (VEGF) signaling. The purpose of this review is to point out the current standard of treatment and to explore newly available target therapies in HGG.

  12. Animal Health and Welfare Issues Facing Organic Production Systems.

    PubMed

    Sutherland, Mhairi A; Webster, Jim; Sutherland, Ian

    2013-10-31

    The demand for organically-grown produce is increasing worldwide, with one of the drivers being an expectation among consumers that animals have been farmed to a high standard of animal welfare. This review evaluates whether this expectation is in fact being met, by describing the current level of science-based knowledge of animal health and welfare in organic systems. The primary welfare risk in organic production systems appears to be related to animal health. Organic farms use a combination of management practices, alternative and complementary remedies and conventional medicines to manage the health of their animals, and in many cases these are at least as effective as management practices employed by non-organic producers. However, in contrast to non-organic systems, there is still a lack of scientifically evaluated, organically acceptable therapeutic treatments that organic animal producers can use when current management practices are not sufficient to maintain the health of their animals. The development of such treatments is necessary to assure consumers that organic animal-based food and fibre has not only been produced with minimal or no chemical input, but under high standards of animal welfare.

  13. SVG-Based Web Publishing

    NASA Astrophysics Data System (ADS)

    Gao, Jerry Z.; Zhu, Eugene; Shim, Simon

    2003-01-01

    With the increasing application of the Web in e-commerce, advertising, and publication, new technologies are needed to overcome the limitations of current Web graphics technology. SVG (Scalable Vector Graphics) is a revolutionary solution to the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphic formats, allowing us to present complicated graphics with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages, and reports a comparison study between SVG and other web graphics technologies.
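
    A minimal hand-written SVG fragment of the kind the paper describes, resolution-independent shapes and text expressed as plain-text commands, can be generated with a few lines; the file name and element values here are arbitrary examples.

```python
# A minimal SVG document: shapes and text as plain-text commands that
# render crisply at any zoom level. Element values are arbitrary examples.

svg = """<svg xmlns="http://www.w3.org/2000/svg" width="200" height="100">
  <rect x="10" y="10" width="80" height="40" fill="steelblue"/>
  <text x="10" y="80" font-family="serif" font-size="16">Scalable text</text>
</svg>"""

with open("demo.svg", "w") as f:
    f.write(svg)
```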

  14. A Tool Measuring Remaining Thickness of Notched Acoustic Cavities in Primary Reaction Control Thruster NDI Standards

    NASA Technical Reports Server (NTRS)

    Sun, Yushi; Sun, Changhong; Zhu, Harry; Wincheski, Buzz

    2006-01-01

    Stress corrosion cracking in the relief radius area of a space shuttle primary reaction control thruster is an issue of concern. The current approach for monitoring of potential crack growth is nondestructive inspection (NDI) of remaining thickness (RT) to the acoustic cavities using an eddy current or remote field eddy current probe. EDM manufacturers have difficulty in providing accurate RT calibration standards. Significant error in the RT values of NDI calibration standards could lead to a mistaken judgment of cracking condition of a thruster under inspection. A tool based on eddy current principle has been developed to measure the RT at each acoustic cavity of a calibration standard in order to validate that the standard meets the sample design criteria.

  15. The JPEG XT suite of standards: status and future plans

    NASA Astrophysics Data System (ADS)

    Richter, Thomas; Bruylants, Tim; Schelkens, Peter; Ebrahimi, Touradj

    2015-09-01

    The JPEG standard has seen enormous market adoption. Daily, billions of pictures are created, stored and exchanged in this format. The JPEG committee acknowledges this success and continues its efforts to maintain and expand the standard specifications. JPEG XT is a standardization effort targeting the extension of the JPEG features by enabling support for high dynamic range imaging, lossless and near-lossless coding, and alpha channel coding, while also guaranteeing backward and forward compatibility with the JPEG legacy format. This paper gives an overview of the current status of the JPEG XT standards suite. It discusses the JPEG legacy specification, and details how higher dynamic range support is facilitated both for integer and floating-point color representations. The paper shows how JPEG XT's support for lossless and near-lossless coding of low and high dynamic range images is achieved in combination with backward compatibility to JPEG legacy. In addition, the extensible box-based JPEG XT file format, on which all following and future extensions of JPEG will be based, is introduced. This paper also details how the lossy and lossless representations of alpha channels are supported to allow coding of transparency information and arbitrarily shaped images. Finally, we conclude by giving prospects on the upcoming JPEG standardization initiative JPEG Privacy & Security, and a number of other possible extensions in JPEG XT.

  16. Programmable, very low noise current source.

    PubMed

    Scandurra, G; Cannatà, G; Giusi, G; Ciofi, C

    2014-12-01

    We propose a new approach to the realization of very low noise programmable current sources, mainly intended for application in the field of low frequency noise measurements. The design is based on a low noise Junction Field Effect Transistor (JFET) acting as a high impedance current source, and programmability is obtained by means of a low noise, programmable floating voltage source that sets the sourced current to the desired value. The floating voltage source is obtained by exploiting the properties of a standard photovoltaic MOSFET driver. Proper filtering and a control network employing super-capacitors reduce the low frequency output noise to that of the low noise JFET down to frequencies as low as 100 mHz while allowing, at the same time, the desired current to be set by means of a standard DA converter with an accuracy better than 1%. A prototype of the system, capable of supplying currents from a few hundred μA up to a few mA, demonstrates the effectiveness of the approach we propose. When delivering a DC current of about 2 mA, the power spectral density of the current fluctuations at the output is found to be less than 25 pA/√Hz at 100 mHz and less than 6 pA/√Hz for f > 1 Hz, resulting in an RMS noise in the bandwidth from 0.1 to 10 Hz of less than 14 pA.
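
    The relation between a quoted spectral density and a band-limited RMS figure is worth making explicit: the RMS current noise in a band is the square root of the integral of the squared amplitude spectral density. The flicker-shaped spectrum below is an assumed illustration, not the measured spectrum of the source in the paper.

```python
import math

# RMS noise in a band from an amplitude spectral density S(f), here in
# pA/sqrt(Hz): i_rms = sqrt( integral of S(f)^2 df ). The 1/f-shaped
# spectrum below is an assumption for illustration.

def rms_from_psd(s, f_lo, f_hi, steps=100_000):
    """Midpoint-rule integration of S(f)^2 over [f_lo, f_hi]."""
    df = (f_hi - f_lo) / steps
    total = 0.0
    for i in range(steps):
        f = f_lo + (i + 0.5) * df
        total += s(f) ** 2 * df
    return math.sqrt(total)

# Flicker-type noise: 10 pA/sqrt(Hz) at 0.1 Hz, falling as 1/sqrt(f)
s = lambda f: 10.0 * math.sqrt(0.1 / f)
rms = rms_from_psd(s, 0.1, 10.0)   # RMS noise (pA) in the 0.1-10 Hz band
```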

  17. Programmable, very low noise current source

    NASA Astrophysics Data System (ADS)

    Scandurra, G.; Cannatà, G.; Giusi, G.; Ciofi, C.

    2014-12-01

    We propose a new approach to the realization of very low noise programmable current sources, mainly intended for application in the field of low frequency noise measurements. The design is based on a low noise Junction Field Effect Transistor (JFET) acting as a high impedance current source, and programmability is obtained by means of a low noise, programmable floating voltage source that sets the sourced current to the desired value. The floating voltage source is obtained by exploiting the properties of a standard photovoltaic MOSFET driver. Proper filtering and a control network employing super-capacitors reduce the low frequency output noise to that of the low noise JFET down to frequencies as low as 100 mHz while allowing, at the same time, the desired current to be set by means of a standard DA converter with an accuracy better than 1%. A prototype of the system, capable of supplying currents from a few hundred μA up to a few mA, demonstrates the effectiveness of the approach we propose. When delivering a DC current of about 2 mA, the power spectral density of the current fluctuations at the output is found to be less than 25 pA/√Hz at 100 mHz and less than 6 pA/√Hz for f > 1 Hz, resulting in an RMS noise in the bandwidth from 0.1 to 10 Hz of less than 14 pA.

  18. Nonvolatile, semivolatile, or volatile: redefining volatile for volatile organic compounds.

    PubMed

    Võ, Uyên-Uyén T; Morris, Michael P

    2014-06-01

    Although widely used in air quality regulatory frameworks, the term "volatile organic compound" (VOC) is poorly defined. Numerous standardized tests are currently used in regulations to determine VOC content (and thus volatility), but in many cases the tests do not agree with each other, nor do they always accurately represent actual evaporation rates under ambient conditions. The parameters (time, temperature, reference material, column polarity, etc.) used in the definitions and the associated test methods were created without a significant evaluation of volatilization characteristics in real world settings. Not only do these differences lead to varying VOC content results, but occasionally they conflict with one another. An ambient evaporation study of selected compounds and a few formulated products was conducted and the results were compared to several current VOC test methodologies: SCAQMD Method 313 (M313), ASTM Standard Test Method E 1868-10 (E1868), and U.S. EPA Reference Method 24 (M24). The ambient evaporation study showed a definite distinction between nonvolatile, semivolatile, and volatile compounds. Some low vapor pressure (LVP) solvents, currently considered exempt as VOCs by some methods, volatilize at ambient conditions nearly as rapidly as the traditional high-volatility solvents they are meant to replace. Conversely, bio-based and heavy hydrocarbons did not readily volatilize, though they often are calculated as VOCs in some traditional test methods. The study suggests that regulatory standards should be reevaluated to more accurately reflect real-world emissions from the use of VOC-containing products. The definition of VOC in current test methods may lead to regulations that exclude otherwise viable alternatives or allow substitutions of chemicals that may limit the environmental benefits sought in the regulation.
A study was conducted to examine volatility of several compounds and a few formulated products under several current VOC test methodologies and ambient evaporation. This paper provides ample evidence to warrant a reevaluation of regulatory standards and provides a framework for progressive developments based on reasonable and scientifically justifiable definitions of VOCs.

  19. Multilevel DC Link Inverter for Brushless Permanent Magnet Motors with Very Low Inductance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, G.J.

    2001-10-29

    Due to their long effective air gaps, permanent magnet motors tend to have low inductance. The use of ironless stator structure in present high power PM motors (several tens of kWs) reduces the inductance even further (< 100 µH). This low inductance imposes stringent current regulation demands for the inverter to obtain acceptable current ripple. An analysis of the current ripple for these low inductance brushless PM motors shows that a standard inverter with the most commonly used IGBT switching devices cannot meet the current regulation demands and will produce unacceptable current ripples due to the IGBT's limited switching frequency. This paper introduces a new multilevel dc link inverter, which can dramatically reduce the current ripple for brushless PM motor drives. The operating principle and design guidelines are included.
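
    The ripple problem can be sketched with the standard peak-to-peak ripple approximation for an inductive load under two-level PWM. The bus voltage, switching frequency, and four-level assumption below are illustrative, not values taken from the report.

```python
# Why low inductance is hard on a two-level inverter: the standard
# peak-to-peak ripple approximation for an inductive load under PWM is
#   dI = V_step * D * (1 - D) / (L * f_sw),  worst case at duty D = 0.5.
# Bus voltage, switching frequency, and level count are assumptions.

def ripple_pp(v_step_V, duty, inductance_H, f_sw_Hz):
    """Peak-to-peak inductor current ripple over one PWM cycle."""
    return v_step_V * duty * (1 - duty) / (inductance_H * f_sw_Hz)

L = 100e-6   # ironless-stator PM motor, < 100 uH per the abstract
two_level = ripple_pp(300, 0.5, L, 10e3)        # full bus per step
multilevel = ripple_pp(300 / 4, 0.5, L, 10e3)   # 1/4 of the bus per step
```

    Reducing the voltage step the motor sees per switching event, rather than raising the IGBT switching frequency, reduces the ripple proportionally, which is the motivation for the multilevel dc link.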

  20. A Simple Strategy for Implementing Standard Reference Terminologies in a Distributed Healthcare Delivery System with Minimal Impact to Existing Applications

    PubMed Central

    Bouhaddou, Omar; Lincoln, Michael J.; Maulden, Sarah; Murphy, Holli; Warnekar, Pradnya; Nguyen, Viet; Lam, Siew; Brown, Steven H; Frankson, Ferdinand J.; Crandall, Glen; Hughes, Carla; Sigley, Roger; Insley, Marcia; Graham, Gail

    2006-01-01

    The Veterans Administration (VA) has adopted an ambitious program to standardize its clinical terminology to comply with industry-wide standards. The VA is using commercially available tools and in-house software to create a high-quality reference terminology system. The terminology will be used by current and future applications with no planned disruption to operational systems. The first large customer of the group is the national VA Health Data Repository (HDR). Unique enterprise identifiers are assigned to each standard term, and a rich network of semantic relationships makes the resulting data not only recognizable, but highly computable and reusable in a variety of applications, including decision support and data sharing with partners such as the Department of Defense (DoD). This paper describes the specific methods and approaches that the VA has employed to develop and implement this innovative program in existing information systems. The goal is to share with others our experience with key issues that face our industry as we move toward an electronic health record for every individual. PMID:17238306

  1. Commercial Motor Vehicle Driver Obstructive Sleep Apnea Screening and Treatment in the United States: An Update and Recommendation Overview

    PubMed Central

    Colvin, Loretta J.; Collop, Nancy A.

    2016-01-01

    No regulatory mandate exists in the United States (U.S.) for comprehensive obstructive sleep apnea (OSA) risk assessment and stratification for commercial motor vehicle (CMV) drivers. Current Federal Motor Carrier Safety Administration (FMCSA) requirements are outdated and depend largely on subjective report, a less reliable strategy in an occupational setting. Without FMCSA standards, sleep specialists, occupational medical examiners and employers rely on a collection of medical consensus recommendations to establish standards of care. These recommendations advise OSA risk assessment through a combination of focused medical history, physical examination, questionnaires, and accident history, which increase OSA detection compared to current FMCSA standards. For those diagnosed with OSA, consensus-based risk stratification helps identify CMV drivers who may benefit from OSA treatment and establish minimum standards for assessing treatment efficacy and adherence. Unfortunately, no consolidated recommendation exists; rather, publications span the medical and governmental literature in a patchwork fashion that no longer fully reflects current practice, owing to subsequent advances in OSA diagnosis, treatment, and technology. Based on searches of medical literature, internet materials, and reference lists from existing publications, an overview and discussion of key published recommendations regarding OSA assessment and treatment in CMV operators is provided. Suggestions for incorporating these recommendations into clinical sleep medicine practice in the U.S. are presented. The challenge for sleep specialists is maintaining the delicate balance between recommendations impacting standard of care and associated medico-legal impact with stakeholder interests from medical, regulatory, industry and public perspectives while providing high-quality and efficient care. Citation: Colvin LJ, Collop NA. 
Commercial motor vehicle driver obstructive sleep apnea screening and treatment in the United States: an update and recommendation overview. J Clin Sleep Med 2016;12(1):113–125. PMID:26094916

  2. A Randomized, Controlled Trial of ZMapp for Ebola Virus Infection

    PubMed Central

    2016-01-01

    BACKGROUND Data from studies in nonhuman primates suggest that the triple monoclonal antibody cocktail ZMapp is a promising immune-based treatment for Ebola virus disease (EVD). METHODS Beginning in March 2015, we conducted a randomized, controlled trial of ZMapp plus the current standard of care as compared with the current standard of care alone in patients with EVD that was diagnosed in West Africa by polymerase-chain-reaction (PCR) assay. Eligible patients of any age were randomly assigned in a 1:1 ratio to receive either the current standard of care or the current standard of care plus three intravenous infusions of ZMapp (50 mg per kilogram of body weight, administered every third day). Patients were stratified according to baseline PCR cycle-threshold value for the virus (≤22 vs. >22) and country of enrollment. Oral favipiravir was part of the current standard of care in Guinea. The primary end point was mortality at 28 days. RESULTS A total of 72 patients were enrolled at sites in Liberia, Sierra Leone, Guinea, and the United States. Of the 71 patients who could be evaluated, 21 died, representing an overall case fatality rate of 30%. Death occurred in 13 of 35 patients (37%) who received the current standard of care alone and in 8 of 36 patients (22%) who received the current standard of care plus ZMapp. The observed posterior probability that ZMapp plus the current standard of care was superior to the current standard of care alone was 91.2%, falling short of the prespecified threshold of 97.5%. Frequentist analyses yielded similar results (absolute difference in mortality with ZMapp, −15 percentage points; 95% confidence interval, −36 to 7). Baseline viral load was strongly predictive of both mortality and duration of hospitalization in all age groups. 
CONCLUSIONS In this randomized, controlled trial of a putative therapeutic agent for EVD, although the estimated effect of ZMapp appeared to be beneficial, the result did not meet the prespecified statistical threshold for efficacy. (Funded by the National Institute of Allergy and Infectious Diseases and others; PREVAIL II ClinicalTrials.gov number, NCT02363322.) PMID:27732819
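The frequentist summary quoted above can be approximately reconstructed from the reported counts. The Wald interval below is an illustration only; the trial's actual analysis may have used a different interval method, so the bounds agree only roughly with the published range of −36 to 7 percentage points.

```python
import math

# Reconstructing the reported frequentist summary from the abstract's counts.
# A Wald interval is used here for illustration; the trial's actual method
# may differ slightly, so the bounds only roughly match the published interval.
deaths_soc, n_soc = 13, 35        # standard of care alone
deaths_zmapp, n_zmapp = 8, 36     # standard of care plus ZMapp

p1, p2 = deaths_soc / n_soc, deaths_zmapp / n_zmapp
diff = p2 - p1                    # ~ -15 percentage points, as reported
se = math.sqrt(p1 * (1 - p1) / n_soc + p2 * (1 - p2) / n_zmapp)
ci = (diff - 1.96 * se, diff + 1.96 * se)   # roughly (-0.36, +0.06)
```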

  3. Current Status of Multidisciplinary Care in Psoriatic Arthritis in Spain: NEXUS 2.0 Project.

    PubMed

    Queiro, Rubén; Coto, Pablo; Joven, Beatriz; Rivera, Raquel; Navío Marco, Teresa; de la Cueva, Pablo; Alvarez Vega, Jose Luis; Narváez Moreno, Basilio; Rodriguez Martínez, Fernando José; Pardo Sánchez, José; Feced Olmos, Carlos; Pujol, Conrad; Rodríguez, Jesús; Notario, Jaume; Pujol Busquets, Manel; García Font, Mercè; Galindez, Eva; Pérez Barrio, Silvia; Urruticoechea-Arana, Ana; Hergueta, Merce; López Montilla, M Dolores; Vélez García-Nieto, Antonio; Maceiras, Francisco; Rodríguez Pazos, Laura; Rubio Romero, Esteban; Rodríguez Fernandez Freire, Lourdes; Luelmo, Jesús; Gratacós, Jordi

    2018-02-26

    1) To analyze the implementation of multidisciplinary care models in psoriatic arthritis (PsA) patients; 2) To define minimum and excellent standards of care. A survey was sent to clinicians who already performed multidisciplinary care or were in the process of undertaking it, asking: 1) Type of multidisciplinary care model implemented; 2) Degree, priority and feasibility of the implementation of quality standards for the structure, process and results of care. In 6 regional meetings the results of the survey were presented and discussed, and the ultimate priority of quality standards for care was defined. In a nominal group meeting, 11 experts (rheumatologists and dermatologists) analyzed the results of the survey and the regional meetings. With this information, they defined which standards of care are currently considered minimum and which are excellent. The simultaneous and parallel models of multidisciplinary care are those most widely implemented, but the implementation of quality standards is highly variable. For structure standards implementation ranges from 22% to 74%, for process standards from 17% to 54%, and for results standards from 2% to 28%. Of the 25 original quality standards for care, 9 were considered only minimum, 4 were excellent, and 12 defined criteria both for a minimum level and for excellence. The definition of minimum and excellent quality standards for care will help achieve the goal of multidisciplinary care for patients with PsA, which is the best healthcare possible. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  4. Policies and practices of beach monitoring in the Great Lakes, USA: a critical review

    USGS Publications Warehouse

    Nevers, Meredith B.; Whitman, Richard L.

    2010-01-01

    Beaches throughout the Great Lakes are monitored for fecal indicator bacteria (typically Escherichia coli) in order to protect the public from potential sewage contamination. Currently, there is no universal standard for sample collection and analysis or results interpretation. Monitoring policies are developed by individual beach management jurisdictions, and applications are highly variable across and within lakes, states, and provinces. Extensive research has demonstrated that sampling decisions regarding time, depth, number of replicates, frequency of sampling, and laboratory analysis all influence the results, as do calculations of the mean and the interpretation of results in policy decisions. Additional shortcomings of current monitoring approaches include the appropriateness and reliability of currently used indicator bacteria and the overall goals of these monitoring programs. Current research is attempting to circumvent these complex issues by developing new tools and methods for beach monitoring. In this review, we highlight the variety of sampling routines used across the Great Lakes and the extensive body of research that challenges comparisons among beaches. We also assess the future of Great Lakes monitoring and the advantages and disadvantages of establishing standards that are evenly applied across all beaches.
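The review's point that the choice of mean calculation can change policy outcomes is easy to demonstrate. The counts below are hypothetical; 235 CFU/100 mL is a commonly used single-sample E. coli criterion, shown here only for illustration.

```python
import math

# Illustrative only: how the choice of mean can change a beach advisory
# decision. The counts are hypothetical; 235 CFU/100 mL is a commonly used
# single-sample E. coli criterion.
counts = [40, 60, 90, 150, 1500]  # replicate E. coli counts, CFU/100 mL
THRESHOLD = 235

arith = sum(counts) / len(counts)
geo = math.exp(sum(math.log(c) for c in counts) / len(counts))

exceeds_arith = arith > THRESHOLD   # True: one spike dominates the average
exceeds_geo = geo > THRESHOLD       # False: the geometric mean damps the spike
```

The same five samples trigger an advisory under an arithmetic mean but not under a geometric mean, which is exactly the kind of jurisdiction-dependent variability the review describes.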

  5. Validation of a quantized-current source with 0.2 ppm uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stein, Friederike; Fricke, Lukas, E-mail: lukas.fricke@ptb.de; Scherer, Hansjörg

    2015-09-07

    We report on high-accuracy measurements of quantized current, sourced by a tunable-barrier single-electron pump at frequencies f up to 1 GHz. The measurements were performed with an ultrastable picoammeter instrument, traceable to the Josephson and quantum Hall effects. Current quantization according to I = ef with e being the elementary charge was confirmed at f = 545 MHz with a total relative uncertainty of 0.2 ppm, improving the state of the art by about a factor of 5. The accuracy of a possible future quantum current standard based on single-electron transport was experimentally validated to be better than the best (indirect) realization of the ampere within the present SI.
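The quantization relation I = ef stated in the abstract is simple enough to check directly; a sketch, using the exact SI value of the elementary charge:

```python
# The pumped current follows I = n*e*f (n electrons per cycle, e the
# elementary charge). The frequencies below are the operating points
# mentioned in the abstracts.
E_CHARGE = 1.602176634e-19  # C, exact SI value of the elementary charge

def pump_current(f_hz, electrons_per_cycle=1):
    """Ideal quantized current in amperes for a single-electron pump."""
    return electrons_per_cycle * E_CHARGE * f_hz

i_545 = pump_current(545e6)  # ~87.3 pA, the validated 545 MHz plateau
i_1g = pump_current(1e9)     # ~160 pA at 1 GHz operation
```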

  6. Clinical utility of cerebrospinal fluid biomarkers in the diagnosis of early Alzheimer’s disease

    PubMed Central

    Blennow, Kaj; Dubois, Bruno; Fagan, Anne M.; Lewczuk, Piotr; de Leon, Mony J.; Hampel, Harald

    2015-01-01

    Several potential disease-modifying drugs for Alzheimer’s disease (AD) have failed to show any effect on disease progression in clinical trials, conceivably because the AD subjects are already too advanced to derive clinical benefit from treatment and because diagnosis based on clinical criteria alone introduces a high misdiagnosis rate. Thus, well-validated biomarkers for early detection and accurate diagnosis are crucial. Low cerebrospinal fluid (CSF) concentrations of the amyloid-β (Aβ1-42) peptide, in combination with high total tau and phosphorylated tau, are sensitive and specific biomarkers highly predictive of progression to AD dementia in patients with mild cognitive impairment. However, interlaboratory variations in the results seen with currently available immunoassays are of concern. Recent worldwide standardization efforts and quality control programs include standard operating procedures for both preanalytical (e.g., lumbar puncture and sample handling) and analytical (e.g., preparation of calibration curve) procedures. Efforts are also ongoing to develop highly reproducible assays on fully automated instruments. These global standardization and harmonization measures will provide the basis for the generalized international application of CSF biomarkers for both clinical trials and routine clinical diagnosis of AD. PMID:24795085

  7. A Microfluidic, High Throughput Protein Crystal Growth Method for Microgravity

    PubMed Central

    Carruthers Jr, Carl W.; Gerdts, Cory; Johnson, Michael D.; Webb, Paul

    2013-01-01

    The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high-throughput PCG method in microgravity. Using Protein BioSolutions’ microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of the regulator of glucose metabolism and adipogenesis, peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments, in a 3U NanoRacks NanoLab (1U = 10×10×10 cm). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared to 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. These positive results support bringing the low-sample-volume, high-experimental-density PCG standard to the microgravity environment, providing new opportunities for macromolecular samples that may crystallize poorly in standard laboratories. PMID:24278480

  8. Underwater sound radiation patterns of contemporary merchant ships

    NASA Astrophysics Data System (ADS)

    Gassmann, M.; Wiggins, S. M.; Hildebrand, J. A.

    2016-12-01

    Merchant ships radiate underwater sound as an unintended by-product of their operation and consequently contribute significantly to low-frequency, man-made noise in the ocean. Current measurement standards for the description of underwater sound from ships (ISO 17208-1:2016 and ANSI S12.64-2009) require hydrophones at nominal angles of 15°, 30°, and 45° below the horizontal on the starboard and port sides of the test vessel. To opportunistically study the underwater sound of contemporary merchant ships that were tracked by the Automatic Identification System (AIS), an array of seven high-frequency acoustic recording packages (HARPs) with a sampling frequency of 200 kHz was deployed in the Santa Barbara Channel in the primary outgoing shipping lane for the ports of Los Angeles and Long Beach. The vertical and horizontal aperture of the array allowed for starboard and port side measurements at all standard-required nominal angles, in addition to measurements taken at the keel aspect. Based on these measurements, frequency-dependent radiation patterns of contemporary merchant ships were estimated and used to evaluate current standards for computing ship source levels.
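The geometry and source-level computation behind such measurements can be sketched as follows. This is not the authors' processing chain; the spherical-spreading transmission loss model and all numbers below are assumptions used only to show the idea.

```python
import math

# Illustrative sketch (not the authors' processing chain): back-propagate a
# received level to a nominal monopole source level assuming spherical
# spreading, and compute the hydrophone aspect angle for a given geometry.
# All numbers below are assumed, not measured values from the study.
def source_level_db(received_db, range_m):
    """Source level, dB re 1 uPa @ 1 m, with TL = 20*log10(r / 1 m)."""
    return received_db + 20 * math.log10(range_m)

def aspect_angle_deg(horizontal_m, depth_m):
    """Angle below the horizontal from ship to hydrophone."""
    return math.degrees(math.atan2(depth_m, horizontal_m))

sl = source_level_db(110.0, 500.0)        # ~164 dB re 1 uPa @ 1 m
angle = aspect_angle_deg(500.0, 288.7)    # ~30 deg, one of the standard angles
```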

  9. On-line hemodiafiltration. Gold standard or top therapy?

    PubMed

    Passlick-Deetjen, Jutta; Pohlmeier, Robert

    2002-01-01

    In summary, on-line HDF is an extracorporeal blood purification therapy with increased convective removal of uremic toxins compared with the most frequently used low- or high-flux HD therapies. The clinical advantages of on-line HDF have been shown to be dose dependent, which makes on-line HDF superior to other therapies with less convective solute removal. Among the therapies with high convective solute removal, i.e. on-line HDF, on-line HF and double high-flux dialysis, it is difficult to decide definitively on the best therapy, as direct comparisons of these therapies have not been performed. Theoretical considerations, such as the lower achievable Kt/Vurea with on-line HF relative to on-line HDF, nevertheless support the statement that on-line HDF is the top therapy now available for patients with ESRD. A gold standard may be defined as that against which everything else in the respective field is compared. In order to declare on-line HDF the gold standard in renal replacement therapy, we need more direct comparisons of on-line HDF with other therapies, including mortality as an outcome parameter. However, based on our current knowledge, it does not seem too speculative that high-quality clinical studies will establish on-line HDF in the next few years as the new gold standard in renal replacement therapy.

  10. Field programmable analog array based on current differencing transconductance amplifiers and its application to high-order filter

    NASA Astrophysics Data System (ADS)

    He, Haizhen; Luo, Rongming; Hu, Zhenhua; Wen, Lei

    2017-07-01

    A current-mode field programmable analog array (FPAA) is presented in this paper. The proposed FPAA consists of 9 configurable analog blocks (CABs) based on current differencing transconductance amplifiers (CDTAs) and a trans-impedance amplifier (TIA). The CABs interconnect through global lines. These global lines contain bridge switches, which are used to reduce the parasitic capacitance effectively. High-order current-mode low-pass and band-pass filters with transmission zeros, based on the simulation of general passive RLC ladder prototypes, are proposed and mapped onto the FPAA structure to demonstrate its versatility. These filters exhibit good bandwidth performance: the cutoff frequency can be tuned from 1.2 MHz to 40 MHz. The proposed FPAA is simulated in a standard Chartered 0.18 μm CMOS process with a ±1.2 V power supply to confirm the presented theory, and the results agree well with the theoretical analysis.
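The quoted tuning range is consistent with the usual gm-C relation for transconductor-based integrator stages. The sketch below is a generic illustration, not the paper's circuit: the capacitance value and transconductance endpoints are assumptions chosen to land on the stated 1.2–40 MHz range.

```python
import math

# Hypothetical gm-C relation for a CDTA-based integrator stage,
# f_c = g_m / (2*pi*C). The capacitance is an assumed value used only to
# show how sweeping the transconductance (via bias current) tunes the
# cutoff over roughly the 1.2-40 MHz range quoted in the abstract.
def cutoff_hz(gm_s, c_f):
    """Integrator cutoff frequency in Hz from transconductance and capacitance."""
    return gm_s / (2 * math.pi * c_f)

C = 1e-12                       # assumed integrating capacitance, 1 pF
f_low = cutoff_hz(7.5e-6, C)    # ~1.2 MHz at low bias
f_high = cutoff_hz(250e-6, C)   # ~40 MHz at high bias
```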

  11. Integration experiences and performance studies of A COTS parallel archive systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same things, but at one or more orders of magnitude faster performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, with features such as more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high-volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. 
We have successfully applied our working COTS Parallel Archive System to the world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address the requirements of future archival storage systems.
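The core idea of moving a single large file as parallel streams can be sketched in a few lines. This is a toy illustration, not the paper's software: it splits one file into disjoint byte ranges and copies each range on its own worker thread, the way a striped parallel mover would.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Toy sketch of the core idea (not the paper's software): move one large
# file as several disjoint byte ranges in parallel, the way a striped
# parallel copy to or from an archive tier would.
def parallel_copy(src, dst, workers=4):
    size = os.path.getsize(src)
    with open(dst, "wb") as f:   # preallocate so workers can write in place
        f.truncate(size)
    if size == 0:
        return
    chunk = (size + workers - 1) // workers  # ceiling division

    def copy_range(offset):
        length = min(chunk, size - offset)
        with open(src, "rb") as fin, open(dst, "r+b") as fout:
            fin.seek(offset)
            fout.seek(offset)
            fout.write(fin.read(length))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(copy_range, range(0, size, chunk)))
```

A production mover would add checksums, retry logic, and direct I/O, but the range-partitioning structure is the same.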

  12. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same things, but at one or more orders of magnitude faster performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, with features such as more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high-volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. 
We have successfully applied our working COTS Parallel Archive System to the world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address the requirements of future archival storage systems.

  13. Frequency of malaria and glucose-6-phosphate dehydrogenase deficiency in Tajikistan.

    PubMed

    Rebholz, Cornelia E; Michel, Anette J; Maselli, Daniel A; Saipphudin, Karimov; Wyss, Kaspar

    2006-06-16

    During the Soviet era, malaria was close to eradication in Tajikistan. Since the early 1990s, the disease has been on the rise and has become endemic in large areas of southern and western Tajikistan. The standard national treatment for Plasmodium vivax is based on primaquine. This entails the risk of severe haemolysis for patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency. Seasonal and geographical distribution patterns as well as G6PD deficiency frequency were analysed with a view to improving understanding of the current malaria situation in Tajikistan. Spatial and seasonal distribution was analysed, applying a risk model that included key environmental factors such as temperature and the availability of mosquito breeding sites. The frequency of G6PD deficiency was studied at the health service level, including a cross-sectional sample of 382 adult men. Analysis revealed high rates of malaria transmission in most districts of the southern province of Khatlon, as well as in some zones in the northern province of Sughd. Three categories of risk areas were identified: (i) zones at relatively high malaria risk with high current incidence rates, where malaria control and prevention measures should be taken at all stages of the transmission cycle; (ii) zones at relatively high malaria risk with low current incidence rates, where malaria prevention measures are recommended; and (iii) zones at intermediate or low malaria risk with low current incidence rates where no particular measures appear necessary. The average prevalence of G6PD deficiency was 2.1% with apparent differences between ethnic groups and geographical regions. The study clearly indicates that malaria is a serious health issue in specific regions of Tajikistan. Transmission is mainly determined by temperature. Consequently, locations at lower altitude are more malaria-prone. 
G6PD deficiency frequency is too moderate to require fundamental changes in standard national treatment of cases of P. vivax.

  14. Improved high operating temperature MCT MWIR modules

    NASA Astrophysics Data System (ADS)

    Lutz, H.; Breiter, R.; Figgemeier, H.; Schallenberg, T.; Schirmacher, W.; Wollrab, R.

    2014-06-01

    High operating temperature (HOT) IR-detectors are a key factor in size, weight and power (SWaP) reduced IR-systems. Such systems are essential to provide infantrymen with low-weight handheld systems with increased battery lifetimes, or most compact clip-on weapon sights, in combination with the high electro-optical performance offered by cooled IR-technology. AIM's MCT standard n-on-p technology with vacancy doping has been optimized over many years, resulting in MWIR-detectors with excellent electro-optical performance up to operating temperatures of ~120 K. In recent years the effort has been intensified to improve this standard technology by introducing extrinsic doping with gold as an acceptor. As a consequence, the dark current could be considerably suppressed, allowing operation at ~140 K with good electro-optical performance. More detailed investigations showed that the limitation above 140 K is explained by rising dark current rather than by defective pixels. Recently, several crucial parameters were identified showing great promise for further optimization of HOT performance. Among these, the p-type concentration was successfully reduced from the mid-10¹⁶ cm⁻³ to the low-10¹⁵ cm⁻³ range. Since AIM is one of the leading manufacturers of split linear cryocoolers, an increase in operating temperature will directly lead to IR-modules with improved SWaP characteristics by making use of the miniature members of its SX cooler family with single piston and balancer technology. The paper will present recent progress in the development of HOT MWIR-detector arrays at AIM and show electro-optical performance data in comparison to focal plane arrays produced in the standard technology.
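The difficulty of raising operating temperature can be seen from a textbook diffusion-limited scaling of dark current, which goes roughly as n_i² ∝ T³·exp(−Eg/kT). This is a generic approximation, not AIM's device model, and the MWIR band gap below is an assumed value for a ~5 µm cutoff.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

# Textbook diffusion-limited approximation (not AIM's device model): dark
# current scales roughly with n_i^2 ~ T^3 * exp(-Eg/(k*T)). The band gap
# below is an assumed value for a ~5 um MWIR cutoff.
def dark_current_ratio(t1_k, t2_k, eg_ev):
    """Factor by which diffusion dark current grows from t1_k to t2_k."""
    def scale(t):
        return t**3 * math.exp(-eg_ev / (K_B_EV * t))
    return scale(t2_k) / scale(t1_k)

ratio = dark_current_ratio(120.0, 140.0, 0.25)  # roughly a 50x increase
```

Even a modest 20 K increase in operating temperature raises this idealized dark current by more than an order of magnitude, which is why doping and lifetime engineering are needed to hold performance at 140 K and beyond.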

  15. Treating an Established Episode of Delirium in Palliative Care: Expert Opinion and Review of the Current Evidence Base With Recommendations for Future Development

    PubMed Central

    Pereira, José L.; Davis, Daniel H.J.; Currow, David C.; Meagher, David; Rabheru, Kiran; Wright, David; Bruera, Eduardo; Hartwick, Michael; Gagnon, Pierre R.; Gagnon, Bruno; Breitbart, William; Regnier, Laura; Lawlor, Peter G.

    2014-01-01

    Context Delirium is a highly prevalent complication in patients in palliative care settings, especially in the end-of-life context. Objectives To review the current evidence base for treating episodes of delirium in palliative care settings and propose a framework for future development. Methods We combined multidisciplinary input from delirium researchers and other purposely selected stakeholders at an international delirium study planning meeting. This was supplemented by a literature search of multiple databases and relevant reference lists to identify studies regarding therapeutic interventions for delirium. Results The context of delirium management in palliative care is highly variable. The standard management of a delirium episode includes the investigation of precipitating and aggravating factors followed by symptomatic treatment with drug therapy. However, the intensity of this management depends on illness trajectory and goals of care in addition to the local availability of both investigative modalities and therapeutic interventions. Pharmacologically, haloperidol remains the practice standard by consensus for symptomatic control. Dosing schedules are derived from expert opinion and various clinical practice guidelines as evidence-based data from palliative care settings are limited. The commonly used pharmacologic interventions for delirium in this population warrant evaluation in clinical trials to examine dosing and titration regimens, different routes of administration, and safety and efficacy compared with placebo. Conclusion Delirium treatment is multidimensional and includes the identification of precipitating and aggravating factors. For symptomatic management, haloperidol remains the practice standard. Further high-quality collaborative research investigating the appropriate treatment of this complex syndrome is needed. PMID:24480529

  16. Why are we prolonging QT interval monitoring?

    PubMed

    Barrett, Trina

    2015-01-01

    At present, monitoring of the QT interval (QTI) is not a standard practice in the medical intensive care unit setting, where many drugs that prolong the QTI are administered. This literature review looked at the current research for evidence-based standards to support QTI monitoring of patients with risk factors for QTI prolongation, which can result in life-threatening arrhythmias such as torsade de pointes. The objective of this article is to establish the existence of evidence-based standards for monitoring of the QTI and to raise awareness in the nursing profession of the need for such monitoring among patients who are at high risk for prolonged QTI. To determine whether published standards for QTI monitoring exist, a search was conducted of the bibliographic databases CINAHL, EBSCOhost, Medline, PubMed, Google Scholar, and the Cochrane Library for the years 2013 and 2014. Also, a survey was conducted to determine whether practice standards for QTI monitoring are being implemented at 4 major hospitals in the Memphis area, including a level 1 trauma center. The database search established the existence of published guidelines that support the need for QTI monitoring. Results of the hospital survey indicated that direct care nurses were not aware of the need to identify high-risk patients, drugs with the potential to prolong QTI that were being administered to their patients, or evidence-based standards for QTI monitoring. Review of the research literature underscored the need for QTI monitoring among high-risk patients, that is, those with genetic conditions that predispose them to QTI prolongation, those with existing cardiac conditions being treated with antiarrhythmic medications, or those who are prescribed any new medication classified as high risk on the basis of clinical research. This need is especially crucial in intensive care unit settings, where many antiarrhythmic medications are administered.
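The QTI monitoring discussed above is usually based on a rate-corrected interval; Bazett's correction, QTc = QT/√RR with RR in seconds, is the standard bedside formula. The example values below are hypothetical, not drawn from the review; thresholds around 450 ms (men) and 470 ms (women) are commonly cited flags for prolongation.

```python
import math

# Standard Bazett correction, QTc = QT / sqrt(RR), with RR in seconds.
# The example values are hypothetical, not from the review; ~450/470 ms
# are commonly cited prolongation thresholds.
def qtc_bazett_ms(qt_ms, heart_rate_bpm):
    """Rate-corrected QT interval in milliseconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_s)

qtc = qtc_bazett_ms(400.0, 90.0)  # ~490 ms: flagged at this heart rate
```

Note how the same measured QT of 400 ms is unremarkable at 60 bpm but corrects to a concerning value at 90 bpm, which is why uncorrected QT alone is a poor monitoring target.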

  17. Minimal Unified Resolution to R_{K^{(*)}} and R(D^{(*)}) Anomalies with Lepton Mixing.

    PubMed

    Choudhury, Debajyoti; Kundu, Anirban; Mandal, Rusa; Sinha, Rahul

    2017-10-13

    It is a challenging task to explain, in terms of a simple and compelling new physics scenario, the intriguing discrepancies between the standard model expectations and the data for the neutral-current observables R_{K} and R_{K^{*}}, as well as the charged-current observables R(D) and R(D^{*}). We show that this can be achieved in an effective theory with only two unknown parameters. In addition, this class of models predicts some interesting signatures in the context of both B decays as well as high-energy collisions.

  18. Ultrahigh-resolution endoscopic optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Herz, Paul R.; Hsiung, Pei-Lin; Aguirre, Aaron D.; Mashimo, Hiroshi; Desai, Saleem; Pedrosa, Macos; Koski, Amanda; Schmitt, Joseph M.; Fujimoto, James G.

    2005-01-01

    Early detection of gastrointestinal cancer is essential for patient treatment and medical care. Endoscopically guided biopsy is currently the gold standard for the diagnosis of early esophageal cancer, but it can suffer from high false-negative rates due to sampling errors. Optical coherence tomography (OCT) is an emerging medical imaging technology that can generate high-resolution, cross-sectional images of tissue in situ and in real time, without the removal of a tissue specimen. Although endoscopic OCT has been used successfully to identify certain pathologies in the gastrointestinal tract, the resolution of current endoscopic OCT systems has been limited to 10-15 μm for clinical procedures. In this study, in vivo imaging of the gastrointestinal tract is demonstrated at a threefold higher resolution (<5 μm), using a portable, broadband Cr4+:Forsterite laser as the optical light source. Images acquired from the esophagus, gastro-esophageal junction, and colon in an animal model display tissue microstructures and architectural details at high resolution, and the features observed in the OCT images are well matched with histology. A clinical feasibility study was conducted by delivering the OCT imaging catheter through a standard endoscope. OCT images of normal esophagus, Barrett's esophagus, and esophageal cancers are demonstrated with distinct features. The ability of high-resolution endoscopic OCT to image tissue morphology at an unprecedented resolution in vivo would facilitate the development of OCT as a potential imaging modality for early detection of neoplastic changes.
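    The resolution gain from a broadband source follows from the standard OCT axial-resolution relation (general background, not a formula given in this abstract): for a Gaussian source spectrum with center wavelength $\lambda_0$ and bandwidth $\Delta\lambda$,

```latex
\Delta z = \frac{2 \ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda},
```

    so a broader $\Delta\lambda$ directly shortens the coherence length and hence improves axial resolution. Illustratively, $\lambda_0 \approx 1250$ nm with $\Delta\lambda \approx 250$ nm gives $\Delta z \approx 2.8$ μm, consistent with the sub-5-μm figure quoted above (these numerical values are assumptions for illustration).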

  19. A simplified, low power system for effective vessel sealing

    NASA Astrophysics Data System (ADS)

    Lyle, Allison B.; Kennedy, Jenifer S.; Schmaltz, Dale F.; Kennedy, Aaron S.

    2015-03-01

    The first bipolar vessel sealing system was developed nearly 15 years ago and has since become the standard of care in surgery. These systems deliver radio-frequency current between bipolar graspers to permanently seal arteries, veins, and tissue bundles. Conventional vessel sealing generators are based on traditional electrosurgery generator architecture and deliver high power (150-300 Watts) and high current, using complex control and sense algorithms to adjust the output for vessel sealing applications. In recent years, a need for small-scale surgical vessel sealers has developed as surgeons strive to further reduce their footprint on patients. There are many technical challenges associated with miniaturization of vessel sealing devices, including maintaining electrical isolation while delivering high current in a saline environment. Research into creating a small, 3 mm diameter vessel sealer revealed that a highly simplified generator system could achieve excellent results, and subsequently a low power vessel sealing system was developed. This system delivers 25 Watts constant power while limiting voltage (<= Vrms) and current (<= Amps) until an impedance endpoint is achieved, eliminating the use of complicated control and sensing software. The result is an optimized tissue effect: high seal strength is maintained (>360 mmHg) while seal times (1.7 +/- 0.7 s versus 4.1 +/- 0.7 s), thermal spread (<1 mm vs. <=2 mm), and total energy delivery are all reduced compared with an existing high power system.
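    A minimal sketch of the constant-power-with-limits output rule described above. The 25 W setpoint comes from the abstract; the voltage ceiling, current ceiling, and impedance endpoint are hypothetical placeholders, since the abstract elides the actual limit values.

```python
import math

def output_setpoint(impedance_ohms, p_set=25.0, v_max=100.0, i_max=1.5):
    """Constant-power output with voltage and current ceilings.
    p_set matches the 25 W figure above; v_max and i_max are
    illustrative placeholders (the abstract elides the real limits)."""
    v = math.sqrt(p_set * impedance_ohms)   # V = sqrt(P * R)
    i = math.sqrt(p_set / impedance_ohms)   # I = sqrt(P / R)
    if v > v_max:                  # high-impedance (voltage-limited) region
        v, i = v_max, v_max / impedance_ohms
    elif i > i_max:                # low-impedance (current-limited) region
        v, i = i_max * impedance_ohms, i_max
    return v, i

def seal_cycle(impedances, z_endpoint=500.0):
    """Drive until tissue impedance rises past the endpoint, then stop.
    z_endpoint is likewise an illustrative assumption."""
    applied = []
    for z in impedances:
        if z >= z_endpoint:
            break
        applied.append(output_setpoint(z))
    return applied
```

    Because the output is fully determined by the measured impedance and three constants, no adaptive control or sensing software is needed, which is the simplification the abstract highlights.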

  20. 36 CFR Appendix A to Part 1234 - Minimum Security Standards for Level III Federal Facilities

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... construction projects should be reviewed if possible, to incorporate current technology and blast standards... critical systems (alarm systems, radio communications, computer facilities, etc.) Required. Occupant... all exterior windows (shatter protection) Recommended. Review current projects for blast standards...

  1. The prevalence of urinary tract infection in children with severe acute malnutrition: a narrative review

    PubMed Central

    Uwaezuoke, Samuel N

    2016-01-01

    This article aims to review the current evidence which shows that the prevalence of urinary tract infection (UTI) has been increasing in children with severe acute malnutrition (SAM). UTI remains one of the most common causes of febrile illness in pediatric practice. Most studies conducted among hospitalized children with complicated SAM have reported high prevalence rates of UTI. Clearly, the knowledge of baseline risk of UTI can help clinicians to make informed diagnostic and therapeutic decisions in these children. From the global reports reviewed in this article, UTI prevalence rates range from as low as 6% to as high as 37% in developing countries, while the most common bacterial isolates from urine cultures are Gram-negative coliform organisms such as Escherichia coli and Klebsiella species. These findings form the basis for the current diagnostic and therapeutic guidelines for clinicians managing children with complicated SAM. With the reported high prevalence of UTI among these children and concerns over antibiotic resistance, more extensive data are required using standardized microbiological methods. Thus, the assessment of the performance of urine dipsticks and microscopy against the gold standard urine culture is an important step toward strengthening the evidence for the therapeutic guidelines for UTI in children with SAM. PMID:29388594

  2. ER-2 High Altitude Solar Cell Calibration Flights

    NASA Technical Reports Server (NTRS)

    Myers, Matthew; Wolford, David; Snyder, David; Piszczor, Michael

    2015-01-01

    Evaluation of space photovoltaics using ground-based simulators requires primary standard cells which have been characterized in a space or near-space environment. Due to the high cost inherent in testing cells in space, most primary standards are tested on high-altitude fixed-wing aircraft or balloons. The ER-2 test platform is the latest system developed by the Glenn Research Center (GRC) for near-space photovoltaic characterization. This system offers several improvements over GRC's current Learjet platform, including higher altitude, a larger testing area, onboard spectrometers, and a longer flight season. The ER-2 system was developed by GRC in cooperation with NASA's Armstrong Flight Research Center (AFRC) as well as partners at the Naval Research Laboratory and the Air Force Research Laboratory. The system was designed and built between June and September of 2014, with integration and first flights taking place at AFRC's Palmdale facility in October of 2014. Three flights were made, testing cells from GRC as well as commercial industry partners. Cell performance data and solar spectra were successfully collected on all three flights. The data were processed using a Langley extrapolation method, and performance results showed less than half a percent variation between flights and less than one percent variation from GRC's current Learjet test platform.

  3. Risk stratification of childhood medulloblastoma in the molecular era: the current consensus.

    PubMed

    Ramaswamy, Vijay; Remke, Marc; Bouffet, Eric; Bailey, Simon; Clifford, Steven C; Doz, Francois; Kool, Marcel; Dufour, Christelle; Vassal, Gilles; Milde, Till; Witt, Olaf; von Hoff, Katja; Pietsch, Torsten; Northcott, Paul A; Gajjar, Amar; Robinson, Giles W; Padovani, Laetitia; André, Nicolas; Massimino, Maura; Pizer, Barry; Packer, Roger; Rutkowski, Stefan; Pfister, Stefan M; Taylor, Michael D; Pomeroy, Scott L

    2016-06-01

    Historical risk stratification criteria for medulloblastoma rely primarily on clinicopathological variables pertaining to age, presence of metastases, extent of resection, histological subtypes and in some instances individual genetic aberrations such as MYC and MYCN amplification. In 2010, an international panel of experts established consensus defining four main subgroups of medulloblastoma (WNT, SHH, Group 3 and Group 4) delineated by transcriptional profiling. This has led to the current generation of biomarker-driven clinical trials assigning WNT tumors to a favorable prognosis group in addition to clinicopathological criteria including MYC and MYCN gene amplifications. However, outcome prediction of non-WNT subgroups is a challenge due to inconsistent survival reports. In 2015, a consensus conference was convened in Heidelberg with the objective to further refine the risk stratification in the context of subgroups and agree on a definition of risk groups of non-infant, childhood medulloblastoma (ages 3-17). Published and unpublished data over the past 5 years were reviewed, and a consensus was reached regarding the level of evidence for currently available biomarkers. The following risk groups were defined based on current survival rates: low risk (>90 % survival), average (standard) risk (75-90 % survival), high risk (50-75 % survival) and very high risk (<50 % survival) disease. The WNT subgroup and non-metastatic Group 4 tumors with whole chromosome 11 loss or whole chromosome 17 gain were recognized as low-risk tumors that may qualify for reduced therapy. High-risk strata were defined as patients with metastatic SHH or Group 4 tumors, or MYCN-amplified SHH medulloblastomas. Very high-risk patients are Group 3 with metastases or SHH with TP53 mutation. In addition, a number of consensus points were reached that should be standardized across future clinical trials. 
Although we anticipate new data will emerge from currently ongoing and recently completed clinical trials, this consensus can serve as an outline for prioritization of certain molecular subsets of tumors to define and validate risk groups as a basis for future clinical trials.

  4. Risk stratification of childhood medulloblastoma in the molecular era: The Current Consensus

    PubMed Central

    Ramaswamy, Vijay; Remke, Marc; Bouffet, Eric; Bailey, Simon; Clifford, Steven C.; Doz, Francois; Kool, Marcel; Dufour, Christelle; Vassal, Gilles; Milde, Till; Witt, Olaf; von Hoff, Katja; Pietsch, Torsten; Northcott, Paul A.; Gajjar, Amar; Robinson, Giles W.; Padovani, Laetitia; André, Nicolas; Massimino, Maura; Pizer, Barry; Packer, Roger; Rutkowski, Stefan; Pfister, Stefan M.; Taylor, Michael D.; Pomeroy, Scott L.

    2016-01-01

    Historical risk stratification criteria for medulloblastoma rely primarily on clinicopathological variables pertaining to age, presence of metastases, extent of resection, histological subtypes and in some instances individual genetic aberrations such as MYC and MYCN amplification. In 2010, an international panel of experts established consensus defining four main subgroups of medulloblastoma (WNT, SHH, Group 3 and Group 4) delineated by transcriptional profiling. This has led to the current generation of biomarker-driven clinical trials assigning WNT tumors to a favorable prognosis group in addition to clinicopathological criteria including MYC and MYCN gene amplifications. However, outcome prediction of non-WNT subgroups is a challenge due to inconsistent survival reports. In 2015, a consensus conference was convened in Heidelberg with the objective to further refine the risk stratification in the context of subgroups and agree on a definition of risk groups of non-infant, childhood medulloblastoma (ages 3–17). Published and unpublished data over the past five years were reviewed, and a consensus was reached regarding the level of evidence for currently available biomarkers. The following risk groups were defined based on current survival rates: low risk (>90% survival), average (standard) risk (75–90% survival), high risk (50–75% survival) and very high risk (<50% survival) disease. The WNT subgroup and non-metastatic Group 4 tumors with whole chromosome 11 loss or whole chromosome 17 gain were recognized as low risk tumors that may qualify for reduced therapy. High-risk strata were defined as patients with metastatic SHH or Group 4 tumors, or MYCN amplified SHH medulloblastomas. Very high-risk patients are Group 3 with metastases or SHH with TP53 mutation. In addition, a number of consensus points were reached that should be standardized across future clinical trials. 
Although we anticipate new data will emerge from currently ongoing and recently completed clinical trials, this consensus can serve as an outline for prioritization of certain molecular subsets of tumors to define and validate risk groups as a basis for future clinical trials. PMID:27040285

  5. Measuring What Matters: A Stronger Accountability Model for Teacher Education

    ERIC Educational Resources Information Center

    Crowe, Edward

    2010-01-01

    State oversight for teacher preparation programs mostly ignores the impact of graduates on the K-12 students they teach, and it gives little attention to where graduates teach or how long they remain in the profession. There is no evidence that current state policies hold programs to high standards in order to produce teachers who can help…

  6. "I Don't Come out with Big Words like Other People": Interviewing Adolescents as Part of Communication Profiling

    ERIC Educational Resources Information Center

    Spencer, Sarah; Clegg, Judy; Stackhouse, Joy

    2010-01-01

    Assessing adolescent language skills poses significant challenges due to the subtle nature of language proficiency at this age, along with the high linguistic demands both academically and socially. As with young children, the current range of language assessments designed specifically for adolescents mostly includes standardized tests. This…

  7. A Literature Review on Disciplinary Literacy: How Do Secondary Teachers Apprentice Students Into Mathematical Literacy?

    ERIC Educational Resources Information Center

    Hillman, Ann Marie

    2014-01-01

    Current adolescent literacy rates cause concerns at the number of students who graduate high school with basic or below-basic reading skills. The Common Core State Standards promote disciplinary literacy, which presents advanced literacy skills embedded in content area instruction. Disciplinary literacy is argued as a way to raise adolescent…

  8. Christian Teacher Education in a Culture of "Techne": Current Developments in Teacher Accreditation

    ERIC Educational Resources Information Center

    Schwarz, Gretchen

    2014-01-01

    Teacher education has long been a major mission of Christian colleges. Many Christian as well as public universities have teacher education programs accredited by CAEP, a national organization. Self-study for improvement is important. CAEP promises high standards, but when examined more closely, the CAEP system works at cross-purposes with teacher…

  9. Teaching Harry Potter: The Power of Imagination in Multicultural Classrooms. Secondary Education in a Changing World

    ERIC Educational Resources Information Center

    Belcher, Catherine L.; Stephenson, Becky Herr

    2011-01-01

    Given the current educational climate of high stakes testing, standardized curriculum, and "approved" reading lists, incorporating unauthorized, often controversial, popular literature into the classroom becomes a political choice. The authors examine why teachers choose to read "Harry Potter", how they use the books and incorporate new media, and…

  10. Health in Day Care: A Guide for Day Care Providers in Massachusetts.

    ERIC Educational Resources Information Center

    Kendrick, Abby Shapiro, Ed.; Messenger, Katherine P., Ed.

    This reference manual and resource guide describes high standards for health policies and day care procedures that reflect current research and recommendations of experts. Chapters 1 and 2, which concern day care's role in health, cover health education in day care and the basics relating to policies, providers, and records. Chapters 3-5 concern…

  11. Selective Citation Mars Conclusions about Test Validity and Predictive Bias

    ERIC Educational Resources Information Center

    Kuncel, Nathan R.; Sackett, Paul R.

    2007-01-01

    Comments on the article by Vasquez and Jones, in which they put forward the argument that standardized tests do not evaluate much of anything worthwhile and do not assess merit. The current authors argue that Vasquez and Jones support their argument only through highly selective citations from the literature, and they discuss Vasquez and Jones'…

  12. The Experience of Place in Childhood Literacy Life-Worlds: A Phenomenological Study of Readers as Place-Makers

    ERIC Educational Resources Information Center

    Fischer, Sarah B.

    2015-01-01

    Current educational policy in the United States supports the standardization of school curricula and promotes a high-stakes testing culture that reinforces the ideologies of market fundamentalism. This accountability movement has resulted in school curriculum that aims to transcend children's diverse lived experiences and the local contexts in…

  13. Maxine Greene and the Quest in Our Times: A Teacher Educator's Reflections on Imaginative Praxis

    ERIC Educational Resources Information Center

    Nicholson-Goodman, JoVictoria

    2012-01-01

    Given the culture of compliance produced at the intersection of standardization, high-stakes testing, and punitive measures for all who deviate from the hyperrationalized frenzy of accountability that currently prevails, how is it that I have the audacity to offer teacher-learners space to exercise their imaginative capacities, to envision and…

  14. Cycles of Nature. An Introduction to Biological Rhythms.

    ERIC Educational Resources Information Center

    Ahlgren, Andrew; Halberg, Franz

    This book is an outline for a short (1- to 2-week) study of chronobiology, a field of science that explores the relationships between time and biological functions. It develops, step by step, the reasoning that leads to the current scientific understanding of biological rhythms. The unit can be inserted into a standard middle or high school…

  15. Global risk management in type 2 diabetes: blood glucose, blood pressure, and lipids--update on the background of the current guidelines.

    PubMed

    Clemens, A; Siegel, E; Gallwitz, B

    2004-10-01

    Diabetes mellitus presents a significant public health burden based on its increased morbidity, mortality, and economic cost. The high comorbidity and prevalence of concomitant diseases like hypertension and dyslipidemia in diabetic patients underlie the high risk of developing secondary, cost-intensive late complications that are often disastrous for the patient (nephropathy, retinopathy, neuropathy, and cardiovascular disease). Therefore, patients with diabetes mellitus need global risk management that takes the various individual clinical problems into account. The current global standards of therapy in patients with diabetes mellitus focus on the control of glycemia, blood pressure, and lipid levels, as well as aspirin therapy and avoidance of smoking. There are a number of guidelines and recommendations for managing these global issues. Our review summarizes current recommendations and consolidates therapeutic goals and treatments that are of vital importance in the global risk management of diabetic patients.

  16. Vagus Nerve Stimulation Applied with a Rapid Cycle Has More Profound Influence on Hippocampal Electrophysiology Than a Standard Cycle.

    PubMed

    Larsen, Lars E; Wadman, Wytse J; Marinazzo, Daniele; van Mierlo, Pieter; Delbeke, Jean; Daelemans, Sofie; Sprengers, Mathieu; Thyrion, Lisa; Van Lysebettens, Wouter; Carrette, Evelien; Boon, Paul; Vonck, Kristl; Raedt, Robrecht

    2016-07-01

    Although vagus nerve stimulation (VNS) is widely used, its therapeutic mechanisms and optimal stimulation parameters remain elusive. In the present study, we investigated the effect of VNS on hippocampal field activity and compared the efficiency of different VNS paradigms. Hippocampal electroencephalography (EEG) and perforant-path dentate field evoked potentials were acquired before and during VNS in freely moving rats, using two VNS duty cycles, a rapid cycle (7 s on, 18 s off) and a standard cycle (30 s on, 300 s off), and various output currents. VNS modulated the evoked potentials, reduced the total power of the hippocampal EEG, and slowed the theta rhythm. In the hippocampal EEG, theta (4-8 Hz) and high gamma (75-150 Hz) activity displayed strong phase-amplitude coupling that was reduced by VNS. Rapid-cycle VNS had a greater effect than standard-cycle VNS on all outcome measures. Using rapid-cycle VNS, a maximal effect on EEG parameters was found at 300 μA, beyond which effects saturated. The findings suggest that rapid-cycle VNS produces a more robust outcome than standard-cycle VNS and support existing preclinical evidence that relatively low output currents are sufficient to produce changes in brain physiology and thus likely also therapeutic efficacy.
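    The two duty cycles compared above differ substantially in the fraction of time stimulation is actually delivered, which can be computed directly from the on/off times given in the abstract:

```python
def on_fraction(t_on_s, t_off_s):
    """Fraction of each stimulation cycle during which VNS is delivered."""
    return t_on_s / (t_on_s + t_off_s)

rapid = on_fraction(7, 18)        # rapid cycle: 7 s on, 18 s off
standard = on_fraction(30, 300)   # standard cycle: 30 s on, 300 s off
print(f"rapid:    {rapid:.1%}")    # 28.0%
print(f"standard: {standard:.1%}") # 9.1%
```

    The roughly threefold higher on-time fraction of the rapid cycle is one plausible reading of why it produced a more profound effect on every outcome measure.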

  17. Current challenges in diagnostic imaging of venous thromboembolism.

    PubMed

    Huisman, Menno V; Klok, Frederikus A

    2015-01-01

    Because the clinical diagnosis of deep-vein thrombosis and pulmonary embolism is nonspecific, integrated diagnostic approaches for patients with suspected venous thromboembolism have been developed over the years, involving both non-invasive bedside tools (clinical decision rules and D-dimer blood tests) for patients with low pretest probability and diagnostic techniques (compression ultrasound for deep-vein thrombosis and computed tomography pulmonary angiography for pulmonary embolism) for those with a high pretest probability. This combination has led to standardized diagnostic algorithms with proven safety for excluding venous thrombotic disease. At the same time, it has become apparent that, as a result of the natural history of venous thrombosis, there are special patient populations in which the current standard diagnostic algorithms are not sufficient. In this review, we present 3 evidence-based patient cases to underline recent developments in the imaging diagnosis of venous thromboembolism. © 2015 by The American Society of Hematology. All rights reserved.

  18. New recording package for VACM provides sensor flexibility

    USGS Publications Warehouse

    Strahle, William J.; Worrilow, S. E.; Fucile, S. E.; Martini, Marinna A.

    1994-01-01

    For the past three decades, the VACM has been a standard for ocean current measurements. A VACM is a true vector-averaging instrument that computes north and east current vectors and averages temperature continuously over a specified interval. It keeps a running total of rotor counts and records one-shot samples of compass, vane position, and time. Adding peripheral sensors to the data stream was easy. In today's economy, it seems imperative that operational centers concentrate on upgrading present inventory rather than purchasing newer instruments that often fall short of the flexible measurement platforms with high data capacities required by most researchers today. PCMCIA cards are rapidly becoming an industry standard with a wide range of storage capacities. By upgrading the VACM to a PCMCIA storage system with a flexible microprocessor, the VACM should continue to be a viable instrument into the next century.

  19. Exposure Patterns and Health Effects Associated with Swimming and Surfing in Polluted Marine Waters

    NASA Astrophysics Data System (ADS)

    Grant, S. B.

    2007-05-01

    Marine bathing beaches are closed to the public whenever water quality fails to meet State and Federal standards. In this talk I will explore the science (and lack thereof!) behind these beach closures, including the health effects data upon which standards are based, shortcomings of the current approach used for testing and notification, and the high degree of spatial and temporal heterogeneity associated with human exposure to pollutants in these systems. The talk will focus on examples from Huntington Beach, where the speaker has conducted research over the past several years.

  20. Archival-grade optical disc design and international standards

    NASA Astrophysics Data System (ADS)

    Fujii, Toru; Kojyo, Shinichi; Endo, Akihisa; Kodaira, Takuo; Mori, Fumi; Shimizu, Atsuo

    2015-09-01

    Optical discs currently on the market exhibit large variations in life span among discs, making them unsuitable for certain business applications. To assess and potentially mitigate this problem, we performed accelerated degradation testing under standard ISO conditions, determined the probable disc failure mechanisms, and identified the essential criteria necessary for a stable disc composition. With these criteria as necessary conditions, we analyzed the physical and chemical changes that occur in the disc components, on the basis of which we determined technological measures to reduce these degradation processes. By applying these measures to disc fabrication, we were able to develop highly stable optical discs.
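    Accelerated aging results of this kind are typically extrapolated to use conditions with an Arrhenius-type acceleration factor (standard background for ISO-style media life testing, not a formula given in this abstract): for activation energy $E_a$, Boltzmann constant $k$, stress temperature $T_s$, and use temperature $T_u$,

```latex
AF = \exp\!\left[\frac{E_a}{k}\left(\frac{1}{T_u} - \frac{1}{T_s}\right)\right],
```

    with humidity-dependent terms added in the Eyring-style variants commonly applied to optical media, so that failure times observed under elevated stress can be mapped to estimated archival life spans.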

  1. Current-Sensitive Path Planning for an Underactuated Free-Floating Ocean Sensorweb

    NASA Technical Reports Server (NTRS)

    Dahl, Kristen P.; Thompson, David R.; McLaren, David; Chao, Yi; Chien, Steve

    2011-01-01

    This work investigates multi-agent path planning in strong, dynamic currents using thousands of highly under-actuated vehicles. We address the specific task of path planning for a global network of ocean-observing floats. These submersibles are typified by the Argo global network consisting of over 3000 sensor platforms. They can control their buoyancy to float at depth for data collection or rise to the surface for satellite communications. Currently, floats drift at a constant depth regardless of the local currents. However, accurate current forecasts have become available which present the possibility of intentionally controlling floats' motion by dynamically commanding them to linger at different depths. This project explores the use of these current predictions to direct float networks to some desired final formation or position. It presents multiple algorithms for such path optimization and demonstrates their advantage over the standard approach of constant-depth drifting.
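    A toy illustration of the depth-selection idea: pick the depth whose forecast current vector makes the most progress toward the target. This is a greedy one-step heuristic under assumed forecast data, not the paper's actual optimization algorithms.

```python
import math

def pick_depth(position, target, currents_by_depth):
    """Greedy one-step heuristic: choose the drift depth whose forecast
    current vector makes the most progress toward the target position.

    currents_by_depth maps depth (m) -> (u, v) current in m/s."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy)

    def progress(uv):
        # projection of the current onto the unit vector toward the target
        return (uv[0] * dx + uv[1] * dy) / dist

    return max(currents_by_depth, key=lambda d: progress(currents_by_depth[d]))

# hypothetical forecast: the 500 m layer flows toward an eastward target,
# while the 1000 m layer flows away from it
forecast = {500: (0.2, 0.0), 1000: (-0.1, 0.05)}
print(pick_depth((0.0, 0.0), (10.0, 0.0), forecast))  # 500
```

    A real planner would additionally account for forecast uncertainty, the time cost of depth changes, and coordination across the whole float network, which is where the paper's algorithms go beyond this sketch.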

  2. Vascular endothelial growth factor inhibitor use and treatment approach for choroidal neovascularization secondary to pathologic myopia.

    PubMed

    Pakzad-Vaezi, Kaivon; Mehta, Hemal; Mammo, Zaid; Tufail, Adnan

    2016-07-01

    Myopic choroidal neovascularization (CNV) is the most common cause of CNV in those under 50 years of age. It is a significant cause of visual loss in those with pathologic myopia. The current standard of care involves therapy with intravitreal inhibitors of vascular endothelial growth factor (VEGF). The epidemiology of myopia, high myopia, pathologic myopia, and myopic CNV is reviewed, along with a brief discussion of historical treatments. The pharmacology of the three most commonly used anti-VEGF agents is discussed, with an emphasis on the licensed drugs, ranibizumab and aflibercept. A comprehensive clinical approach to diagnosis and treatment of myopic CNV is presented. The current standard of care for myopic CNV is intravitreal inhibition of VEGF, with ranibizumab and aflibercept licensed for intraocular use. The diagnosis, OCT features of disease activity, and retreatment algorithm for myopic CNV are different from those for wet age-related macular degeneration. In the long term, myopic CNV may be associated with gradual, irreversible visual loss due to progressive chorioretinal atrophy, for which there is currently no treatment.

  3. The evolving role of stereotactic radiosurgery and stereotactic radiation therapy for patients with spine tumors.

    PubMed

    Rock, Jack P; Ryu, Samuel; Yin, Fang-Fang; Schreiber, Faye; Abdulhak, Muwaffak

    2004-01-01

    Traditional management strategies for patients with spinal tumors have undergone considerable changes during the last 15 years. Significant improvements in digital imaging, computer processing, and treatment planning have provided the basis for applying stereotactic techniques, now the standard of care for intracranial pathology, to spinal pathology. In addition, certain of these improvements have allowed us to progress from frame-based to frameless systems, which now accurately ensure the delivery of high doses of radiation to a precisely defined target volume while sparing adjacent normal tissues from injury. In this article we will describe the evolution from yesterday's standards for radiation therapy to the current state of the art for the treatment of patients with spinal tumors. This presentation will include a discussion of radiation dosing and toxicity, the overall process of extracranial radiation delivery, and the current state of the art regarding CyberKnife, Novalis, and tomotherapy. Additional discussion relating to current research protocols and future directions for the management of benign tumors of the spine will also be presented.

  4. Development of a Standard Test Scenario to Evaluate the Effectiveness of Portable Fire Extinguishers on Lithium-ion Battery Fires

    NASA Technical Reports Server (NTRS)

    Juarez, Alfredo; Harper, Susan A.; Hirsch, David B.; Carriere, Thierry

    2013-01-01

    Many sources of fuel are present aboard current spacecraft, one especially hazardous source of stored energy being lithium-ion batteries. Lithium-ion batteries are hazardous because their combustion is self-sustaining once ignited, for example, by an external heat source. These batteries can become extremely energetic fire sources because their high-density electrochemical energy content may, under duress, be violently converted to thermal energy and fire in a thermal runaway. Lithium-ion batteries are currently the preferred battery type aboard international spacecraft and are therefore routinely installed, collectively forming a potentially devastating fire threat to a spacecraft and its crew. NASA is currently developing a fine-water-mist portable fire extinguisher for future use on international spacecraft. As its development proceeds, a standard means of evaluating various types of fire extinguishers against this threat is required to provide an unbiased comparison between fire extinguisher technologies and to rank them by performance.

  5. Role and Value of the Corporate Medical Director.

    PubMed

    Pawlecki, J Brent; Burton, Wayne N; Christensen, Cherryl; Crighton, K Andrew; Heron, Richard; Hudson, T Warner; Hymel, Pamela A; Roomes, David

    2018-05-01

    : The role of the corporate medical director (CMD) has evolved over the last 300 years since Ramazzini first identified diseases of Italian workers in the early 1700s. Since then, there has been a gradual blurring of the boundaries between private and workplace health concerns. Today's CMD must have intimate knowledge of their corporation's industry and the businesses that they support, particularly the occupational and environmental programs that comply with all local, state, and/or national standards and regulations. Leading companies not only measure compliance with such standards but also may hold programs to their own internal corporate global standards even if these go beyond local government requirements. This document will explore in greater depth the strength and importance that the CMD brings to the business operations to support a healthy, engaged, and high performing workforce. Part 1 describes the role and value of the CMD, while Part 2 provides collective wisdom for the new CMD from current and past highly experienced CMDs.

  6. Historical Precision of an Ozone Correction Procedure for AM0 Solar Cell Calibration

    NASA Technical Reports Server (NTRS)

    Snyder, David B.; Jenkins, Phillip; Scheiman, David

    2005-01-01

    In an effort to improve the accuracy of the high-altitude aircraft method for calibration of high band-gap solar cells, the ozone correction procedure has been revisited. The new procedure adjusts the measured short-circuit current, Isc, according to satellite-based ozone measurements and a model of the atmospheric ozone profile, then extrapolates the measurements to air mass zero, AM0. The purpose of this paper is to assess the precision of the revised procedure by applying it to historical data sets. The average Isc of a silicon cell for a flying season increased 0.5% and the standard deviation improved from 0.5% to 0.3%. The 12-year average Isc of a GaAs cell increased 1% and the standard deviation improved from 0.8% to 0.5%. The slight increase in measured Isc and the improvement in standard deviation suggest that the accuracy of the aircraft method may improve from 1% to nearly 0.5%.
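    The extrapolation step described above can be illustrated with a Langley-style fit. The sketch below assumes the ozone-corrected Isc falls off exponentially with air mass, so a linear fit of ln(Isc) versus air mass extrapolates to the AM0 intercept; the measurement points are invented for illustration and are not from the record.

    ```python
    import math

    # Hypothetical ozone-corrected aircraft calibration points:
    # (air mass M, measured Isc in mA). Values are illustrative only.
    measurements = [(0.30, 148.1), (0.25, 149.2), (0.20, 150.3), (0.15, 151.4)]

    # Langley-style model: Isc(M) = Isc(0) * exp(-k * M), so ln(Isc) is
    # linear in M and the intercept at M = 0 gives the AM0 value.
    xs = [m for m, _ in measurements]
    ys = [math.log(i) for _, i in measurements]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    isc_am0 = math.exp(intercept)
    print(f"Extrapolated AM0 Isc: {isc_am0:.1f} mA")
    ```

    The actual procedure's ozone adjustment depends on the satellite ozone column and the cell's spectral response; this sketch shows only the extrapolation arithmetic.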

  7. Leo Spacecraft Charging Design Guidelines: A Proposed NASA Standard

    NASA Technical Reports Server (NTRS)

    Hillard, G. B.; Ferguson, D. C.

    2004-01-01

    Over the past decade, Low Earth Orbiting (LEO) spacecraft have gradually required ever-increasing power levels. As a rule, this has been accomplished through the use of high voltage systems. Recent failures and anomalies on such spacecraft have been traced to various design practices and materials choices related to the high voltage solar arrays. NASA Glenn has studied these anomalies including plasma chamber testing on arrays similar to those that experienced difficulties on orbit. Many others in the community have been involved in a comprehensive effort to understand the problems and to develop practices to avoid them. The NASA Space Environments and Effects program, recognizing the timeliness of this effort, commissioned and funded a design guidelines document intended to capture the current state of understanding. This document, which was completed in the spring of 2003, has been submitted as a proposed NASA standard. We present here an overview of this document and discuss the effort to develop it as a NASA standard.

  8. Femtosecond Timekeeping: Slip-Free Clockwork for Optical Timescales

    NASA Astrophysics Data System (ADS)

    Herman, D.; Droste, S.; Baumann, E.; Roslund, J.; Churin, D.; Cingoz, A.; Deschênes, J.-D.; Khader, I. H.; Swann, W. C.; Nelson, C.; Newbury, N. R.; Coddington, I.

    2018-04-01

    The generation of true optical time standards will require the conversion of the highly stable optical-frequency output of an optical atomic clock to a high-fidelity time output. We demonstrate a comb-based clockwork that phase-coherently integrates ~7 × 10^20 optical cycles of an input optical frequency to create a coherent time output. We verify the underlying stability of the optical timing system by comparing two comb-based clockworks with a common input optical frequency and show <20 fs total time drift over the 37-day measurement period. Both clockworks also generate traditional timing signals, including an optical pulse per second and a 10-MHz rf reference. The optical pulse-per-second time outputs remain synchronized to 240 attoseconds at 1000 s. The phase-coherent 10-MHz rf outputs are stable to near a part in 10^19. Fault-free timekeeping from an optical clock at the femtosecond level over months is an important step toward replacing the current microwave time standard with an optical standard.
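    The quoted cycle count can be sanity-checked with simple arithmetic. The sketch below assumes a telecom-band optical clock frequency near 194 THz (an assumption, not stated in the record) integrated over the 37-day comparison period, which lands on the same order of magnitude as the ~7 × 10^20 cycles cited.

    ```python
    # Back-of-envelope check with an assumed optical frequency.
    f_optical = 194e12             # Hz, assumed telecom-band clock frequency
    t_total = 37 * 24 * 3600       # seconds in the 37-day measurement period
    cycles = f_optical * t_total   # total optical cycles integrated
    print(f"{cycles:.1e} cycles")  # on the order of 10^20-10^21
    ```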

  9. Metrology of airborne and liquid-borne nanoparticles: current status and future needs

    NASA Astrophysics Data System (ADS)

    Ehara, Kensei; Sakurai, Hiromu

    2010-04-01

    The current status and future needs of nanoparticle metrology are discussed, particularly with respect to measurements of size, size distribution and number concentration of airborne and liquid-borne nanoparticles. Possible classification of types of measurement standards is proposed, and the role of each type of standard, including the feasibility of its establishment, is examined. A desirable interplay between measurement standards and documentary standards in establishing the traceability chain in particle measurements is suggested. Particle-related calibration services currently provided by our laboratory at the National Institute of Advanced Industrial Science and Technology are also described.

  10. Development of high impedance measurement system for water leakage detection in implantable neuroprosthetic devices.

    PubMed

    Yousif, Aziz; Kelly, Shawn K

    2016-08-01

    There has been a push for a greater number of channels in implantable neuroprosthetic devices, but that number has largely been limited by current hermetic packaging technology. Microfabricated packaging is becoming a reality, but a standard testing system is needed to prepare these devices for clinical trials. Impedance measurements of electrodes built into the packaging layers may give an early warning of device failure and predict device lifetime. Because the impedance magnitudes of such devices can be on the order of gigaohms, a versatile system was designed to accommodate ultra-high impedances and allow future integrated-circuit implementation in current neural prosthetic technologies. Here we present the circuitry, control software, and preliminary testing results of our designed system.

  11. Evaluation of Esophageal Motor Function With High-resolution Manometry

    PubMed Central

    2013-01-01

    For several decades, esophageal manometry has been the test of choice to evaluate disorders of esophageal motor function. The recent introduction of high-resolution manometry for the study of esophageal motor function simplified performance of esophageal manometry and revealed previously unidentified patterns of normal and abnormal esophageal motor function. Presentation of pressure data as color contour plots, or esophageal pressure topography, led to the development of new tools for analyzing and classifying esophageal motor patterns. The current, still-evolving standard approach is the Chicago classification. While this methodical approach is improving our diagnosis of esophageal motor disorders, it does not currently address all motor abnormalities. We explore the Chicago classification and the disorders it does not address. PMID:23875094

  12. 77 FR 3720 - Determination of Failure To Attain the One-Hour Ozone Standard by 2007, Determination of Current...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... the area is currently attaining the eight-hour standard is finalized, 40 CFR 51.918 \\1\\ of EPA's ozone... CFR 51.905(e)(2)(i)(B). EPA's proposed determination that the area failed to attain the one-hour ozone... standard. For areas that attain the standard, section 51.918 of the implementation rule provides that, upon...

  13. Bivariate quadratic method in quantifying the differential capacitance and energy capacity of supercapacitors under high current operation

    NASA Astrophysics Data System (ADS)

    Goh, Chin-Teng; Cruden, Andrew

    2014-11-01

    Capacitance and resistance are the fundamental electrical parameters used to evaluate the electrical characteristics of a supercapacitor, namely the dynamic voltage response, energy capacity, state of charge, and health condition. The constant capacitance method of British Standards EN62391 and EN62576 can be further improved with a differential capacitance that more accurately describes the dynamic voltage response of supercapacitors. This paper presents a novel bivariate quadratic method to model the dynamic voltage response of supercapacitors under high-current charge-discharge cycling, and to enable the derivation of the differential capacitance and energy capacity directly from terminal measurements, i.e. voltage and current, rather than from multiple pulsed-current or excitation-signal tests across different bias levels. The estimation results are in close agreement with experimental measurements, within a relative error of 0.2% at various high current levels (25-200 A), more accurate than the constant capacitance method (4-7%). The archival value of this paper is the introduction of an improved quantification method for the electrical characteristics of supercapacitors, and the disclosure of distinct properties of supercapacitors: the nonlinear capacitance-voltage characteristic, capacitance variation between charging and discharging, and the distribution of energy capacity across the operating voltage window.
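    The idea of a differential capacitance derived from terminal measurements can be sketched simply: under a constant charge current I, C(V) = I / (dV/dt). The example below is not the paper's bivariate quadratic fit; it uses a synthetic voltage record (invented coefficients) purely to show the terminal-measurement derivation and the nonlinear capacitance-voltage behavior the abstract describes.

    ```python
    import numpy as np

    I = 100.0                          # A, assumed constant charge current
    t = np.linspace(0.0, 10.0, 101)    # s, sample times
    # Synthetic nonlinear voltage response: slope decreases as voltage rises,
    # i.e. the effective capacitance grows with voltage (coefficients assumed).
    v = 0.5 + 0.24 * t - 0.004 * t**2

    # Differential capacitance from terminal measurements only:
    dv_dt = np.gradient(v, t)          # numerical derivative of voltage
    c_diff = I / dv_dt                 # C(V) = I / (dV/dt), in farads
    print(f"C at start: {c_diff[0]:.0f} F, C at end: {c_diff[-1]:.0f} F")
    ```

    With real cycling data the same division is applied to measured voltage and current; the paper's contribution is fitting this surface as a bivariate quadratic rather than assuming a constant C.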

  14. Controllable Growth of Ga Film Electrodeposited from Aqueous Solution and Cu(In,Ga)Se2 Solar Cells.

    PubMed

    Bi, Jinlian; Ao, Jianping; Gao, Qing; Zhang, Zhaojing; Sun, Guozhong; He, Qing; Zhou, Zhiqiang; Sun, Yun; Zhang, Yi

    2017-06-07

    Electrodeposition of Ga film is very challenging due to the high standard reduction potential of Ga3+ (-0.53 V vs SHE). In this study, Ga film with a compact structure was successfully deposited on a Mo/Cu/In substrate by the pulse current electrodeposition (PCE) method using GaCl3 aqueous solution. A high deposition rate of Ga3+ and H+ can be achieved by applying the large overpotential induced by a high pulse current. Meanwhile, the concentration polarization induced by cation depletion can be minimized by adjusting the pulse frequency and duty cycle. Uniform and smooth Ga film was fabricated at a high deposition rate with a pulse current density of 125 mA/cm2, a pulse frequency of 5 Hz, and a duty cycle of 0.25. The Ga film was then selenized together with electrodeposited Cu and In films to form a CIGSe absorber film for solar cells. The solar cell based on this Ga film presents a conversion efficiency of 11.04%, a fill factor of 63.40%, and a Voc of 505 mV, much better than cells based on the inhomogeneous and rough Ga film prepared by the DCE method, indicating that the pulse current electrodeposition process is promising for the fabrication of CIGSe solar cells.
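    The pulse parameters quoted above relate straightforwardly: the period follows from the frequency, the on-time from the duty cycle, and the time-averaged current density is the peak density scaled by the duty cycle. A minimal sketch of that arithmetic, using the abstract's stated parameters:

    ```python
    # Pulse-plating parameters from the abstract.
    j_peak = 125.0   # mA/cm^2, pulse (peak) current density
    freq = 5.0       # Hz, pulse frequency
    duty = 0.25      # duty cycle (fraction of each period with current on)

    period = 1.0 / freq    # s per pulse cycle
    t_on = duty * period   # deposition time per cycle
    j_avg = j_peak * duty  # time-averaged current density

    print(f"on-time {t_on * 1e3:.0f} ms per cycle, "
          f"average current density {j_avg:.2f} mA/cm^2")
    ```

    The off-time between pulses is what lets the depleted cation concentration near the cathode recover, which is the polarization-mitigation mechanism the abstract invokes.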

  15. Characterization of three commercial Y-TZP ceramics produced for their high-translucency, high-strength and high-surface area.

    PubMed

    Tong, Hui; Tanaka, Carina B; Kaizer, Marina R; Zhang, Yu

    2016-01-01

    Developing yttria-stabilized tetragonal zirconia polycrystal (Y-TZP) with high strength and translucency could significantly widen the clinical indications of monolithic zirconia restorations. This study investigates the mechanical and optical properties of three Y-TZP ceramics: High-Translucency, High-Strength and High-Surface Area. The four-point bending strengths (mean ± standard error) for the three Y-TZP ceramics ( n = 10) were 990 ± 39, 1416 ± 33 and 1076 ± 32 MPa for High-Translucency, High-Strength and High-Surface Area, respectively. The fracture toughness values (mean ± standard error) for the three zirconias ( n = 10) were 3.24 ± 0.10, 3.63 ± 0.12 and 3.21 ± 0.14 MPa m 1/2 for High-Translucency, High-Strength and High-Surface Area, respectively. Both strength and toughness values of High-Strength zirconia were significantly higher than High-Surface Area and High-Translucency zirconias. Translucency parameter values of High-Translucency zirconia were considerably higher than High-Strength and High-Surface Area zirconias. However, all three zirconias became essentially opaque when their thickness reached 1 mm or greater. Our findings suggest that there exists a delicate balance between mechanical and optical properties of the current commercial Y-TZP ceramics.

  16. Poly(vinylidene fluoride-hexafluoropropylene) polymer electrolyte for paper-based and flexible battery applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aliahmad, Nojan; Shrestha, Sudhir; Varahramyan, Kody

    2016-06-15

    Paper-based batteries represent a new frontier in battery technology. However, the low flexibility and poor ionic conductivity of solid electrolytes have been major impediments to achieving practical, mechanically flexible batteries. This work discusses new highly ionically conductive polymer gel electrolytes for paper-based battery applications. In this paper, we present a poly(vinylidene fluoride-hexafluoropropylene) (PVDF-HFP) porous membrane electrolyte enhanced with lithium bis(trifluoromethane sulphone)imide (LiTFSI) and lithium aluminum titanium phosphate (LATP), with an ionic conductivity of 2.1 × 10{sup −3} S cm{sup −1}. Combining ceramic (LATP) with the gel structure of PVDF-HFP and LiTFSI ionic liquid harnesses the benefits of ceramic and gel electrolytes, providing flexible electrolytes with a high ionic conductivity. In a flexibility test experiment, bending the polymer electrolyte at 90° 20 times resulted in a 14% decrease in ionic conductivity; efforts to further improve the flexibility of the presented electrolyte are ongoing. Using this electrolyte, full-cell batteries with lithium titanium oxide (LTO) and lithium cobalt oxide (LCO) electrodes and (i) standard metallic current collectors and (ii) paper-based current collectors were fabricated and tested. The achieved specific capacities were (i) 123 mAh g{sup −1} for standard metallic current collectors and (ii) 99.5 mAh g{sup −1} for paper-based current collectors. Thus, the presented electrolyte has the potential to become a viable candidate for paper-based and flexible battery applications. Fabrication methods, experimental procedures, and test results for the polymer gel electrolyte and batteries are presented and discussed.
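    A conductivity figure like the reported 2.1 × 10⁻³ S cm⁻¹ is conventionally extracted from an impedance measurement of the membrane via σ = L / (R·A), where L is the membrane thickness, A the electrode area, and R the bulk resistance. The geometry below is assumed for illustration (the record does not give it); the resistance is chosen to reproduce the reported value, and the 14% bending loss is then applied.

    ```python
    # Assumed cell geometry (illustrative, not from the record).
    L = 0.01   # cm, membrane thickness (100 um)
    A = 1.0    # cm^2, electrode area

    # Bulk resistance that would reproduce the reported conductivity.
    R = L / (2.1e-3 * A)       # ohms
    sigma = L / (R * A)        # S/cm, recovers 2.1e-3 by construction
    print(f"R = {R:.2f} ohm -> sigma = {sigma:.1e} S/cm")

    # Reported 14% conductivity loss after bending at 90 degrees 20 times.
    sigma_bent = sigma * (1 - 0.14)
    print(f"after bending: {sigma_bent:.2e} S/cm")
    ```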

  17. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. 
PMID:26069574

  18. Poly(vinylidene fluoride-hexafluoropropylene) polymer electrolyte for paper-based and flexible battery applications

    NASA Astrophysics Data System (ADS)

    Aliahmad, Nojan; Shrestha, Sudhir; Varahramyan, Kody; Agarwal, Mangilal

    2016-06-01

    Paper-based batteries represent a new frontier in battery technology. However, the low flexibility and poor ionic conductivity of solid electrolytes have been major impediments to achieving practical, mechanically flexible batteries. This work discusses new highly ionically conductive polymer gel electrolytes for paper-based battery applications. In this paper, we present a poly(vinylidene fluoride-hexafluoropropylene) (PVDF-HFP) porous membrane electrolyte enhanced with lithium bis(trifluoromethane sulphone)imide (LiTFSI) and lithium aluminum titanium phosphate (LATP), with an ionic conductivity of 2.1 × 10^-3 S cm^-1. Combining ceramic (LATP) with the gel structure of PVDF-HFP and LiTFSI ionic liquid harnesses the benefits of ceramic and gel electrolytes, providing flexible electrolytes with a high ionic conductivity. In a flexibility test experiment, bending the polymer electrolyte at 90° 20 times resulted in a 14% decrease in ionic conductivity; efforts to further improve the flexibility of the presented electrolyte are ongoing. Using this electrolyte, full-cell batteries with lithium titanium oxide (LTO) and lithium cobalt oxide (LCO) electrodes and (i) standard metallic current collectors and (ii) paper-based current collectors were fabricated and tested. The achieved specific capacities were (i) 123 mAh g^-1 for standard metallic current collectors and (ii) 99.5 mAh g^-1 for paper-based current collectors. Thus, the presented electrolyte has the potential to become a viable candidate for paper-based and flexible battery applications. Fabrication methods, experimental procedures, and test results for the polymer gel electrolyte and batteries are presented and discussed.

  19. Current drive for stability of thermonuclear plasma reactor

    NASA Astrophysics Data System (ADS)

    Amicucci, L.; Cardinali, A.; Castaldo, C.; Cesario, R.; Galli, A.; Panaccione, L.; Paoletti, F.; Schettini, G.; Spigler, R.; Tuccillo, A.

    2016-01-01

    Producing, in a thermonuclear fusion reactor based on the tokamak concept, a sufficiently high fusion gain together with the stability necessary for operations represents a major challenge, which depends on the capability of driving non-inductive current in the hydrogen plasma. This requirement should be satisfied by radio-frequency (RF) power suitable for producing the lower hybrid current drive (LHCD) effect, recently demonstrated to occur successfully even at reactor-grade high plasma densities. An LHCD-based tool should in principle be capable of tailoring the plasma current density in the outer radial half of the plasma column, where other methods are much less effective, in order to ensure operation in the presence of unpredictable changes in the plasma pressure profiles. With high electron temperatures even at the periphery of the plasma column, as envisaged in the DEMO reactor, penetration of the coupled RF power into the plasma core was long believed to be problematic; only recently have numerical modelling results based on standard plasma wave theory shown that this problem can be solved by using suitable parameters for the antenna power spectrum. We present here further information on the newly understood dependence of the RF power deposition profile on antenna parameters, which supports the conclusion that current can be actively driven over a broad layer of the outer radial half of the plasma column, thus enabling the current profile control necessary for the stability of a reactor.

  20. Picture This... Developing Standards for Electronic Images at the National Library of Medicine

    PubMed Central

    Masys, Daniel R.

    1990-01-01

    New computer technologies have made it feasible to represent, store, and communicate high resolution biomedical images via electronic means. Traditional two dimensional medical images such as those on printed pages have been supplemented by three dimensional images which can be rendered, rotated, and “dissected” from any point of view. The library of the future will provide electronic access not only to words and numbers, but to pictures, sounds, and other nontextual information. There currently exist few widely-accepted standards for the representation and communication of complex images, yet such standards will be critical to the feasibility and usefulness of digital image collections in the life sciences. The National Library of Medicine is embarked on a project to develop a complete digital volumetric representation of an adult human male and female. This “Visible Human Project” will address the issue of standards for computer representation of biological structure.

  1. Setting the standards for signal transduction research.

    PubMed

    Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo

    2011-02-15

    Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated in mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.

  2. Quality standards for bone conduction implants.

    PubMed

    Gavilan, Javier; Adunka, Oliver; Agrawal, Sumit; Atlas, Marcus; Baumgartner, Wolf-Dieter; Brill, Stefan; Bruce, Iain; Buchman, Craig; Caversaccio, Marco; De Bodt, Marc T; Dillon, Meg; Godey, Benoit; Green, Kevin; Gstoettner, Wolfgang; Hagen, Rudolf; Hagr, Abdulrahman; Han, Demin; Kameswaran, Mohan; Karltorp, Eva; Kompis, Martin; Kuzovkov, Vlad; Lassaletta, Luis; Li, Yongxin; Lorens, Artur; Martin, Jane; Manoj, Manikoth; Mertens, Griet; Mlynski, Robert; Mueller, Joachim; O'Driscoll, Martin; Parnes, Lorne; Pulibalathingal, Sasidharan; Radeloff, Andreas; Raine, Christopher H; Rajan, Gunesh; Rajeswaran, Ranjith; Schmutzhard, Joachim; Skarzynski, Henryk; Skarzynski, Piotr; Sprinzl, Georg; Staecker, Hinrich; Stephan, Kurt; Sugarova, Serafima; Tavora, Dayse; Usami, Shin-Ichi; Yanov, Yuri; Zernotti, Mario; Zorowka, Patrick; de Heyning, Paul Van

    2015-01-01

    Bone conduction implants are useful in patients with conductive and mixed hearing loss for whom conventional surgery or hearing aids are no longer an option. They may also be used in patients affected by single-sided deafness. The aim of this work was to establish a consensus on the quality standards required for centers willing to create a bone conduction implant program. To ensure a consistently high level of service and to provide patients with the best possible solution, the members of the HEARRING network have established a set of quality standards for bone conduction implants. These standards constitute a realistic minimum attainable by all implant clinics and should be employed alongside current best practice guidelines. Fifteen items are thoroughly analyzed, including team structure, accommodation and clinical facilities, selection criteria, evaluation process, complete preoperative and surgical information, postoperative fitting and assessment, follow-up, device failure, clinical management, transfer of care, and patient complaints.

  3. Calibration of the NPL secondary standard radionuclide calibrator for 32P, 89Sr and 90Y

    NASA Astrophysics Data System (ADS)

    Woods, M. J.; Munster, A. S.; Sephton, J. P.; Lucas, S. E. M.; Walsh, C. Paton

    1996-02-01

    Pure beta particle emitting radionuclides have many therapeutic applications in nuclear medicine. The response of the NPL secondary standard radionuclide calibrator to 32P, 89Sr and 90Y has been measured using accurately calibrated solutions. For this purpose, high efficiency solid sources were prepared gravimetrically from dilute solutions of each radionuclide and assayed in a 4π proportional counter; the source activities were determined using known detection efficiency factors. Measurements were made of the current response (pA/MBq) of the NPL secondary standard radionuclide calibrator using the original concentrated solutions. Calibration figures have been derived for 2 and 5 ml British Standard glass ampoules and Amersham International plc P6 vials. Volume correction factors have also been determined. Gamma-ray emitting contaminants can have a disproportionate effect on the calibrator response and particular attention has been paid to this.

  4. Final Environmental Assessment (EA) for Replacement of the Kennel Facility (Building 949) GHLN 11-6002 F. E. Warren Air Force Base, Wyoming

    DTIC Science & Technology

    2012-01-01

    HYDROLOGIC FEATURES AND WETLAND LOCATIONS 23 FIGURE 3. PROPOSED KENNEL DESIGN 24 ... 8' (feet) x 10' (feet)). Inadequate pen size can lead to working dog injury. The current Kennel also fails to meet sanitation standards: it has a 3... that working dogs receive adequate rest to perform effectively when required for duty. Kennels should be located outside of highly populated, high

  5. Pneumocystis carinii pneumonia: a late presentation following treatment for stage IV neuroblastoma.

    PubMed

    Clarke, Edward; Glaser, Adam W; Picton, Susan V

    2003-09-01

    This report describes a child who developed Pneumocystis carinii pneumonia 7 months after high-dose chemotherapy for stage IV neuroblastoma. In addition to chemotherapy, the child had also been treated with abdominal radiotherapy and 13-cis-retinoic acid. Standard practice has been to treat patients with prophylactic co-trimoxazole for 3 months after high-dose therapy, but this report highlights the intensity and complexity of current treatment for stage IV neuroblastoma and the need to be aware of prolonged lymphopenia after such treatment.

  6. Selective perturbation of in vivo linear energy transfer using high- Z vaginal applicators for Cf-252 brachytherapy

    NASA Astrophysics Data System (ADS)

    Rivard, M. J.; Evans, K. E.; Leal, L. C.; Kirk, B. L.

    2004-01-01

    Californium-252 ( 252Cf) brachytherapy sources emit both neutrons and photons, and have the potential to vastly improve the current standard-of-practice for brachytherapy. While hydrogenous materials readily attenuate the 252Cf fission energy neutrons, high- Z materials are utilized to attenuate the 252Cf gamma-rays. These differences in shielding materials may be exploited when treating with a vaginal applicator to possibly improve patient survival through perturbation of the in vivo linear energy transfer radiation.

  7. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    PubMed

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.

  8. SciNews: Incorporating Science Current Events in 21st Century Classrooms

    NASA Astrophysics Data System (ADS)

    DiMaggio, E.

    2011-12-01

    Middle school students are instructed with the aid of textbooks, lectures, and activities that teach topics satisfying state standards. However, teaching materials created to convey standards-aligned science concepts often leave students asking how the content relates to their lives and why they should be learning it. Conveying relevance is important for student learning and retention, especially in science, where abstract concepts can be incorrectly perceived as irrelevant. One way to create an educational link between classroom content and everyday life is through the use of scientific current events. Students read, hear, and watch media coverage of natural events (such as the 2011 earthquake and tsunami in Japan), but do not necessarily relate the scientific information from media sources to classroom studies. Taking advantage of these brief "teachable moments," when student interest is high, provides a valuable opportunity to make classroom-to-everyday-life associations and to incorporate inquiry-based learning. To address this need, I create pre-packaged current-event materials for middle to high school teachers that align with state standards and are short, effective, and easy to implement in the classroom. Each lesson takes approximately 15-30 minutes to implement, allowing teachers time to facilitate brief but meaningful discussions. I assemble materials within approximately one week of the regional or global science event; they consist of short slide shows, maps, videos, pictures, and real-time data. I use a listserv to send biweekly emails to subscribed instructors containing the current event topic and a link to download the materials. All materials are hosted on the Arizona State University Education Outreach SciNews website (http://sese.asu.edu/teacher-resources) and are archived. Currently, 285 educators subscribe to the SciNews listserv, representing 36 states and 19 countries.
In order to assess the effectiveness and usefulness of SciNews materials, each lesson links to a brief online survey. I ask educators for basic information (grade level, number of students) as well as feedback on lesson content, accessibility of media types used, agreement with standards, and general comments on how to improve SciNews. Survey results show that SciNews lessons have been implemented in elementary through college classrooms. Comments express an overall agreement that Scinews lessons facilitate classroom discussion, heighten student interest in the topic, and that lessons are easy to use and modify. Current events help demonstrate to students that, unlike fact-filled textbooks suggest, science is not static and scientists are actively investigating many 'textbook' concepts. Showing students the process and progressive nature of scientific information reinforces critical thinking rather than pure memorization.

  9. Predictors of awareness of standard drink labelling and drinking guidelines to reduce negative health effects among Australian drinkers.

    PubMed

    Coomber, Kerri; Jones, Sandra C; Martino, Florentine; Miller, Peter G

    2017-03-01

    This study examined rates of awareness of standard drink labelling and drinking guidelines among Australian adult drinkers. Demographic predictors of these two outcomes were also explored. Online survey panel participants aged 18-45 years (n = 1061; mean age = 33.2 years) completed an online survey assessing demographics, alcohol consumption patterns, awareness of standard drink labels and the National Health and Medical Research Council (NHMRC) guidelines, and support for more detailed labels. The majority (80%) of participants had seen standard drink labels on alcohol products, with younger drinkers, those from a regional/rural location, and high-risk drinkers significantly more likely to have seen such labelling. Most respondents estimated at or below the maximum number of drinks stipulated in the NHMRC guidelines. However, their estimates of the levels for male drinkers were significantly higher than for female drinkers. High-risk drinkers were significantly less likely to provide accurate estimates, while those who had seen the standard drink logo were significantly more likely to provide accurate estimates of drinking levels to reduce the risk of long-term harms only. Just under three-quarters of respondents supported the inclusion of more information on labels regarding guidelines to reduce negative health effects. The current standard drink labelling approach fails to address high-risk drinkers. The inclusion of information about NHMRC guidelines on alcohol labels, and placing standard drink labelling on the front of products, could improve awareness of what constitutes a standard drink and safe levels of consumption among Australian drinkers. [Kerri Coomber, Sandra C. Jones, Florentine Martino, Peter G. Miller. Predictors of awareness of standard drink labelling and drinking guidelines to reduce negative health effects among Australian drinkers. Drug Alcohol Rev 2017;36:200-209]. © 2016 Australasian Professional Society on Alcohol and other Drugs.
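    The arithmetic behind the labelling the study examines is simple: an Australian standard drink contains 10 g of pure alcohol, and the alcohol mass in a beverage follows from its volume, ABV, and ethanol's density (about 0.789 g/mL). The helper below is a hedged illustration of that conversion; the beverage values in the example are assumptions, not data from the study.

    ```python
    # Australian standard drink = 10 g of pure alcohol.
    ETHANOL_DENSITY_G_PER_ML = 0.789

    def standard_drinks(volume_ml: float, abv_percent: float) -> float:
        """Convert beverage volume and ABV to Australian standard drinks."""
        ethanol_g = volume_ml * (abv_percent / 100.0) * ETHANOL_DENSITY_G_PER_ML
        return ethanol_g / 10.0

    # Example: a 375 mL full-strength beer at 4.8% ABV (illustrative values).
    print(f"{standard_drinks(375, 4.8):.1f} standard drinks")
    ```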

  10. In Vivo Evaluation of ¹⁸F-SiFAlin-Modified TATE: A Potential Challenge for ⁶⁸Ga-DOTATATE, the Clinical Gold Standard for Somatostatin Receptor Imaging with PET.

    PubMed

    Niedermoser, Sabrina; Chin, Joshua; Wängler, Carmen; Kostikov, Alexey; Bernard-Gauthier, Vadim; Vogler, Nils; Soucy, Jean-Paul; McEwan, Alexander J; Schirrmacher, Ralf; Wängler, Björn

    2015-07-01

    Radiolabeled peptides for tumor imaging with PET that can be produced with kits are currently in the spotlight of radiopharmacy and nuclear medicine. The diagnosis of neuroendocrine tumors in particular has been a prime example of the usefulness of peptides labeled with a variety of different radionuclides. Among those, (68)Ga and (18)F stand out because of the ease of radionuclide introduction ((68)Ga) or optimal nuclide properties for PET imaging (slightly favoring (18)F). The in vivo properties of newly developed, good-manufacturing-practice-compliant (18)F-SiFA- and (18)F-SiFAlin-modified (SiFA = silicon-fluoride acceptor) TATE derivatives, producible in a kitlike manner, were compared with the current clinical gold standard (68)Ga-DOTATATE for high-quality imaging of somatostatin receptor-bearing tumors. SiFA- and SiFAlin-derivatized somatostatin analogs were synthesized and radiolabeled using cartridge-based dried (18)F and purified via a C18 cartridge (radiochemical yield 49.8% ± 5.9% within 20-25 min) without high-performance liquid chromatography purification. Tracer lipophilicity and stability in human serum were tested in vitro. Competitive receptor binding affinity studies were performed using AR42J cells. The most promising tracers were evaluated in vivo in an AR42J xenograft mouse model by ex vivo biodistribution and in vivo PET/CT imaging studies to evaluate their pharmacokinetic profiles, and the results were compared with those of the current clinical gold standard (68)Ga-DOTATATE. Synthetically easily accessible (18)F-labeled silicon-fluoride acceptor-modified somatostatin analogs were developed. They exhibited high binding affinities to somatostatin receptor-positive tumor cells (1.88-14.82 nM). 
The most potent compound demonstrated pharmacokinetics comparable to (68)Ga-DOTATATE, an even slightly higher absolute tumor accumulation in ex vivo biodistribution studies, and higher tumor standardized uptake values in PET/CT imaging in vivo. The radioactivity uptake in nontumor tissue, however, was also higher than for (68)Ga-DOTATATE. The introduction of the novel SiFA building block SiFAlin and of hydrophilic auxiliaries enables a favorable in vivo biodistribution profile of the modified TATE peptides, resulting in high tumor-to-background ratios, although lower than those observed with (68)Ga-DOTATATE. As a further advantage, the SiFA methodology enables a kitlike labeling procedure for (18)F-labeled peptides, which is advantageous for routine clinical application. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  11. Evidence for current recommendations concerning the management of foot health for people with chronic long-term conditions: a systematic review.

    PubMed

    Edwards, Katherine; Borthwick, Alan; McCulloch, Louise; Redmond, Anthony; Pinedo-Villanueva, Rafael; Prieto-Alhambra, Daniel; Judge, Andrew; Arden, Nigel; Bowen, Catherine

    2017-01-01

    Research focusing on the management of foot health has become more evident over the past decade, especially in relation to chronic conditions such as diabetes. The level of methodological rigour across this body of work, however, is varied, and outputs do not appear to have been developed or translated into clinical practice. The aim of this systematic review was to assess the latest guidelines, standards of care and current recommendations relevant to people with chronic conditions, to ascertain the level of supporting evidence concerning the management of foot health. A systematic search of electronic databases (Medline, Embase, Cinahl, Web of Science, SCOPUS and The Cochrane Library) for literature on recommendations for foot health management for people with chronic conditions, covering 2000 to 2016, was performed using predefined criteria. Data from the included publications were synthesised via template analysis, employing a thematic organisation and structure. The methodological quality of all included publications was appraised using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument. A more in-depth analysis was carried out that specifically considered the levels of evidence underpinning the strength of their recommendations concerning management of foot health. The data collected revealed 166 publications, of which the majority (102) were guidelines, standards of care or recommendations related to the treatment and management of diabetes. We noted a trend towards a systematic year-on-year increase in guidelines, standards of care or recommendations related to the treatment and management of long-term conditions other than diabetes over the past decade. The most common recommendation is for preventive care or assessments (e.g. vascular tests), followed by clinical interventions such as foot orthoses, foot ulcer care and foot health education. 
Methodological quality was spread across the range of AGREE II scores, with 62 publications falling into the high-quality category (scores 6-7). The number of publications providing a recommendation in the context of a narrative but without an indication of the strength or quality of the underlying evidence was high (79 out of 166). It is clear that evidence needs to be generated and put in place to support the future of the podiatry workforce. Whilst high-level evidence for podiatry is currently low in quantity, its methodological quality is growing. Where levels of evidence have been given in high-quality guidelines, standards of care or recommendations, they also tend to be of strong to moderate quality, such that further strategically prioritised research, if performed, is likely to have an important impact in the field.

  12. Synthesizing 2D MoS2 Nanofins on carbon nanospheres as catalyst support for Proton Exchange Membrane Fuel Cells.

    PubMed

    Hu, Yan; Chua, Daniel H C

    2016-06-15

    Highly dense 2D MoS2 fin-like nanostructures on carbon nanospheres were fabricated and formed the main catalyst support structure for the oxygen reduction reaction (ORR) in polymer electrolyte membrane (PEM) fuel cells. These nanofins were observed growing perpendicular to the carbon nanosphere surface in random orientations, and high-resolution transmission electron microscopy confirmed their 2D layered structure. The PEM fuel cell test showed enhanced electrochemical activity with good stability, generating over 8.5 W.mgPt(-1) compared with 7.4 W.mgPt(-1) for standard carbon black under normal operating conditions. Electrochemical impedance spectroscopy confirmed that the performance improvement is largely due to the excellent water management of the MoS2 lamellar network, which facilitates water retention at low current density and flood prevention at high current density. Reliability tests further demonstrated that these nanofins are highly stable in the electrochemical reaction and are an excellent ORR catalyst support.

  13. Synthesizing 2D MoS2 Nanofins on carbon nanospheres as catalyst support for Proton Exchange Membrane Fuel Cells

    PubMed Central

    Hu, Yan; Chua, Daniel H. C.

    2016-01-01

    Highly dense 2D MoS2 fin-like nanostructures on carbon nanospheres were fabricated and formed the main catalyst support structure for the oxygen reduction reaction (ORR) in polymer electrolyte membrane (PEM) fuel cells. These nanofins were observed growing perpendicular to the carbon nanosphere surface in random orientations, and high-resolution transmission electron microscopy confirmed their 2D layered structure. The PEM fuel cell test showed enhanced electrochemical activity with good stability, generating over 8.5 W.mgPt−1 compared with 7.4 W.mgPt−1 for standard carbon black under normal operating conditions. Electrochemical impedance spectroscopy confirmed that the performance improvement is largely due to the excellent water management of the MoS2 lamellar network, which facilitates water retention at low current density and flood prevention at high current density. Reliability tests further demonstrated that these nanofins are highly stable in the electrochemical reaction and are an excellent ORR catalyst support. PMID:27302135

  14. Mental and social health in disasters: the Sphere standards and post-tsunami psychosocial interventions in Asia.

    PubMed

    Henderson, Silja E K; Elsass, Peter; Berliner, Peter

    2016-07-01

    The primary objective of this paper is to examine and inform the mental health and psychosocial support standards of the 2011 edition of the Sphere Project's Humanitarian Charter and Minimum Standards in Humanitarian Response. This is done through a qualitative analysis of internal evaluation documents, reflecting four long-term humanitarian psychosocial programmes in different countries in post-tsunami Asia. The analysis yielded three overall conclusions. First, the Sphere standards on mental health and psychosocial support generally are highly relevant to long-term psychosocial interventions after disasters such as the Indian Ocean tsunami of 26 December 2004, and their application in such settings may improve the quality of the response. Second, some of the standards in the current Sphere handbook may lack sufficient guidance to ensure the quality of humanitarian response required. Third, the long-term intervention approach poses specific challenges to programming, a problem that could be addressed by including additional guidance in the publication. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  15. Three-Dimensional Measurement Applied in Design Eye Point of Aircraft Cockpits.

    PubMed

    Wang, Yanyan; Guo, Xiaochao; Liu, Qingfeng; Xiao, Huajun; Bai, Yu

    2018-04-01

    An inappropriate design eye point (DEP) leads to nonstandard sitting postures, including nonneutral head positions and other uncomfortable postures, which are major risk factors for neck pain in fighter pilots exposed to high G forces. Therefore, the application of a 3D measurement method to collect data on eye position in the cruising sitting posture in the aircraft cockpit, to guide the design eye point, has been proposed. A total of 304 male fixed-wing aircraft pilots were divided into two groups. Subgroup A (N = 48) was studied to define the cruising posture during flight. Subgroup B (N = 256) was studied with Romer 3D measurement equipment to locate the cruising eye position of the pilots in a simulated cockpit. The 3D data were compared with DEP data for the current standard cockpit. According to the 3D measurement, the vertical distance from the cruising eye point to the neutral seat reference point was 759 mm, which is 36 mm lower than that of the Chinese standard DEP and also lower than the U.S. military standard. The horizontal distance was 131 mm, which is 24 mm shorter than that of the Chinese standard. The current DEP data cannot fulfill the needs of fighter pilots and should be amended according to the results of the 3D measurement so that pilots can acquire the optimal cruising posture in flight. This new method has practical value for investigating cockpit ergonomics, and the measurement data can guide DEP design. Wang Y, Guo X, Liu Q, Xiao H, Bai Y. Three-dimensional measurement applied in design eye point of aircraft cockpits. Aerosp Med Hum Perform. 2018; 89(4):371-376.
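    The offsets reported above determine the implied standard DEP coordinates by simple addition; a minimal arithmetic sketch (the reconstructed Chinese-standard values are inferred from the abstract's offsets, not quoted there directly):

```python
# Cruising eye position measured by 3D scanning (values from the abstract), in mm.
measured_vertical = 759     # distance above the neutral seat reference point
measured_horizontal = 131

# Reported offsets relative to the Chinese standard DEP.
vertical_offset = 36        # measured point sits 36 mm LOWER than the standard
horizontal_offset = 24      # measured point sits 24 mm closer than the standard

# Implied Chinese-standard DEP coordinates (inferred, not stated in the abstract).
standard_vertical = measured_vertical + vertical_offset        # 795 mm
standard_horizontal = measured_horizontal + horizontal_offset  # 155 mm
print(standard_vertical, standard_horizontal)  # → 795 155
```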

  16. Posturography and locomotor tests of dynamic balance after long-duration spaceflight.

    PubMed

    Cohen, Helen S; Kimball, Kay T; Mulavara, Ajitkumar P; Bloomberg, Jacob J; Paloski, William H

    2012-01-01

    The currently approved objective clinical measure of standing balance in astronauts after space flight is the Sensory Organization Test battery of computerized dynamic posturography. No tests of walking balance are currently approved for standard clinical testing of astronauts. This study determined the sensitivity and specificity of standing and walking balance tests for astronauts before and after long-duration space flight. Astronauts were tested on an obstacle-avoidance test known as the Functional Mobility Test (FMT) and on the Sensory Organization Test using sway-referenced support surface motion with eyes closed (SOT 5) before and after (n = 15) approximately six-month flights on the International Space Station. They were tested two to seven days after landing. Scores on SOT tests decreased and scores on FMT increased significantly from pre- to post-flight; in other words, post-flight scores were worse than pre-flight scores. SOT and FMT scores were not significantly related. ROC analyses indicated supra-clinical cut-points for SOT 5 and for FMT. The standard clinical cut-point for SOT 5 had low sensitivity to post-flight astronauts. Higher cut-points increased sensitivity to post-flight astronauts but decreased specificity to pre-flight astronauts. Using an FMT cut-point that was moderately-to-highly sensitive and highly specific plus SOT 5 at the standard clinical cut-point was no more sensitive than SOT 5 alone. FMT plus SOT 5 at higher cut-points was more specific and more sensitive. The proportion correctly classified was highest for FMT alone and for FMT plus SOT 5 at the highest cut-point. These findings indicate that standard clinical comparisons are not useful for identifying post-flight balance problems. Testing both standing and walking balance is more likely to identify balance deficits.

  17. SLAE–CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies

    PubMed Central

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-01-01

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE–CPS), based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to designing and implementing a CPS-based smart Jidoka system. Achieving this goal requires a comprehensive architecture and a set of standardized key technologies. Therefore, a distributed architecture that joins service-oriented architecture, agents, function blocks (FBs), cloud, and the Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE–CPS. Several standardized key techniques are then proposed under this architecture. The first is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second is a set of Jidoka scene rules, abstracted from an analysis of the operator, machine, material, quality, and other factors across different time dimensions. These Jidoka rules allow executive FBs to perform different Jidoka functions. Finally, supported by the integrated and standardized approach of the proposed engine, a case study is conducted to verify the current research results. The proposed SLAE–CPS can serve as a useful reference for combining the benefits of innovative technology and sound methodology. PMID:28657577

  18. Cellular effects of acute exposure to high peak power microwave systems: Morphology and toxicology.

    PubMed

    Ibey, Bennett L; Roth, Caleb C; Ledwig, Patrick B; Payne, Jason A; Amato, Alayna L; Dalzell, Danielle R; Bernhard, Joshua A; Doroski, Michael W; Mylacraine, Kevin S; Seaman, Ronald L; Nelson, Gregory S; Woods, Clifford W

    2016-03-15

    Electric fields produced by advanced pulsed microwave transmitter technology now readily exceed the Institute of Electrical and Electronics Engineers (IEEE) C95.1 peak E-field limit of 100 kV/m, highlighting a need for scientific validation of such a specific limit. Toward this goal, we exposed Jurkat Clone E-6 human lymphocyte preparations to 20 high peak power microwave (HPPM) pulses (120 ns duration) with a mean peak amplitude of 2.3 MV/m and a standard deviation of 0.1 MV/m, with the electric field at the cells predicted to range from 0.46 to 2.7 MV/m, well in excess of the current standard limit. We observed that membrane integrity and cell morphology remained unchanged 4 h after exposure, and cell survival 24 h after exposure was not statistically different from sham exposure or control samples. Using flow cytometry to analyze membrane disruption and morphological changes per exposed cell, no changes were observed in HPPM-exposed samples. The current IEEE C95.1-2005 standard for pulsed radiofrequency exposure limits the peak electric field to 100 kV/m for pulses shorter than 100 ms [IEEE (1995) PC95.1-Standard for Safety Levels with Respect to Human Exposure to Electric, Magnetic and Electromagnetic Fields, 0 Hz to 300 GHz, Institute of Electrical and Electronic Engineers: Piscataway, NJ, USA]. This may impose large exclusion zones that limit HPPM technology use. In this study, we offer evidence that the maximum permissible exposure of 100 kV/m for peak electric field may be unnecessarily restrictive for HPPM devices. Bioelectromagnetics. © 2016 Wiley Periodicals, Inc.
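    For scale, the applied fields can be compared directly against the IEEE peak limit; a one-line check using the abstract's numbers:

```python
# Mean peak amplitude of the applied HPPM pulses vs. the IEEE C95.1 peak limit.
peak_field = 2.3e6   # V/m (2.3 MV/m, from the abstract)
ieee_limit = 100e3   # V/m (100 kV/m peak E-field limit)

ratio = peak_field / ieee_limit
print(ratio)  # → 23.0  (exposures roughly 23x above the standard's peak limit)
```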

  19. SLAE-CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies.

    PubMed

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-06-28

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE-CPS), based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to designing and implementing a CPS-based smart Jidoka system. Achieving this goal requires a comprehensive architecture and a set of standardized key technologies. Therefore, a distributed architecture that joins service-oriented architecture, agents, function blocks (FBs), cloud, and the Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE-CPS. Several standardized key techniques are then proposed under this architecture. The first is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second is a set of Jidoka scene rules, abstracted from an analysis of the operator, machine, material, quality, and other factors across different time dimensions. These Jidoka rules allow executive FBs to perform different Jidoka functions. Finally, supported by the integrated and standardized approach of the proposed engine, a case study is conducted to verify the current research results. The proposed SLAE-CPS can serve as a useful reference for combining the benefits of innovative technology and sound methodology.
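    The paper's implementation is not reproduced here, but the pipeline it describes (normalize heterogeneous readings into uniform records, then match them against Jidoka scene rules that trigger an action) can be sketched as follows; all class, metric, and action names are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    """A sensor reading already normalized into a uniform service record."""
    source: str   # e.g. "machine", "material", "quality"
    metric: str
    value: float

@dataclass
class SceneRule:
    """One Jidoka scene rule: fire an action when a metric exceeds its limit."""
    metric: str
    limit: float
    action: str   # e.g. "stop_line", "alert_operator"

    def evaluate(self, reading: Reading) -> Optional[str]:
        if reading.metric == self.metric and reading.value > self.limit:
            return self.action
        return None

# Illustrative rules and readings (values are made up for the sketch).
rules = [SceneRule("vibration_rms", 4.5, "stop_line"),
         SceneRule("defect_rate", 0.02, "alert_operator")]
readings = [Reading("machine", "vibration_rms", 5.1),
            Reading("quality", "defect_rate", 0.01)]

actions = [a for r in readings for rule in rules
           if (a := rule.evaluate(r)) is not None]
print(actions)  # → ['stop_line']
```

In the architecture described above, the part played here by `rules` would be distributed to executive function blocks rather than evaluated in one loop.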

  20. Analysis of Heavy Metal Content (Pb) on Waters and Fish at The Floating Cages BPPP Ambon

    NASA Astrophysics Data System (ADS)

    Wattimena, Rachel L.; Selanno, Debby A. J.; Tuhumury, Semuel F.; Tuahatu, Juliana W.

    2018-02-01

    Coastal waters play important roles because they are rich in natural resources and support the development of environmental services. However, the intensity of natural resource utilization, environmental exploitation and settlement is high. Consequently, the environment and natural resources become degraded, as in Ambon Bay. One of the potential uses of Ambon Bay is the mariculture area, namely the floating cages (KJA) belonging to the Fisheries Education and Training Center (BPPP) Ambon. The research aimed to analyze the physical-chemical properties of the waters (temperature, pH, salinity and current speed), to analyze the heavy metal (Pb) concentration in water and fish from the floating cages (KJA), and to analyze the water pollution status at KJA BPPP Ambon. The average temperature at each floating cage ranged from 30.09 to 30.34°C, pH ranged from 8.03 to 8.44, salinity ranged from 31.36 to 33.34 PSU, and current speed at spring tide ranged from 0.5 to 55.8 cm/s, while at neap tide it ranged from 0.1 to 9.8 cm/s. The heavy metal (Pb) concentration in the waters was below the water quality standard, with an average concentration of 0.002 mg/l. The heavy metal (Pb) concentration in fish was likewise below the standard for floating cages 2-6, ranging from 0.05 to 0.17 mg/l. However, floating cage 1, at 0.31 mg/l, was above the maximum standard for fish food and its processing under SNI 7387:2009 (0.3 mg/l). The water pollution status at KJA BPPP Ambon belonged to class C and could be categorized as moderate based on the water quality standard issued by State Ministerial Decree for the Environment No. 51 Year 2004.
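    The pass/fail comparison against the SNI 7387:2009 maximum can be expressed in a few lines; the concentrations are taken from the abstract, while the dictionary keys are illustrative labels:

```python
# Measured Pb concentrations in fish (mg/l, from the abstract) vs. the
# SNI 7387:2009 maximum of 0.3 mg/l for fish food and its processing.
SNI_LIMIT = 0.3

pb_in_fish = {"cage_1": 0.31,
              "cages_2_to_6_low": 0.05,
              "cages_2_to_6_high": 0.17}

exceeding = [cage for cage, conc in pb_in_fish.items() if conc > SNI_LIMIT]
print(exceeding)  # → ['cage_1']  (only cage 1 is above the 0.3 mg/l maximum)
```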

  1. New integration concept of PIN photodiodes in 0.35μm CMOS technologies

    NASA Astrophysics Data System (ADS)

    Jonak-Auer, I.; Teva, J.; Park, J. M.; Jessenig, S.; Rohrbacher, M.; Wachmann, E.

    2012-06-01

    We report on a new and very cost-effective way to integrate PIN photodetectors into a standard CMOS process. Starting with lowly p-doped (intrinsic) EPI, we need just one additional mask and ion implantation in order to provide doping concentrations very similar to standard CMOS substrates in areas outside the photoactive regions. Thus full functionality of the standard CMOS logic can be guaranteed, while the photodetectors benefit greatly from the low doping concentrations of the intrinsic EPI. The major advantage of this integration concept is that the complete modularity of the CMOS process remains untouched by the implementation of PIN photodiodes. Functionality of the implanted region as a host for logic components was confirmed by electrical measurements of relevant standard transistors as well as ESD protection devices. We also succeeded in establishing an EPI deposition process in austriamicrosystems' 200 mm wafer fabrication which guarantees the formation of very lowly p-doped intrinsic layers, which major semiconductor vendors could not provide. With our EPI deposition process we achieve doping levels as low as 1×10¹² cm⁻³. In order to maintain those doping levels during CMOS processing we employed special surface-protection techniques. After complete CMOS processing, doping concentrations were about 4×10¹³ cm⁻³ at the EPI surface, while the bulk EPI kept its original low doping concentration. Photodiode parameters could be further improved by bottom antireflective coatings and a special implant to reduce dark currents. For 100×100 μm² photodiodes in 20 μm thick intrinsic EPI on highly p-doped substrates, we achieved responsivities of 0.57 A/W at λ = 675 nm, capacitances of 0.066 pF and dark currents of 0.8 pA at 2 V reverse voltage.
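    As a rough plausibility check on the reported capacitance, the RC-limited bandwidth can be estimated; the 50 Ω load resistance below is an assumption for illustration, not a figure from the paper:

```python
import math

# RC-limited 3 dB bandwidth implied by the reported 0.066 pF capacitance.
# The 50-ohm load resistance is an assumed value, not stated in the abstract.
C = 0.066e-12   # junction capacitance in farads (from the abstract)
R = 50.0        # assumed load resistance in ohms

f_3db = 1.0 / (2.0 * math.pi * R * C)
print(f"{f_3db / 1e9:.1f} GHz")  # ~48 GHz: capacitance is far from the speed bottleneck
```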

  2. The Call for Colloquialisms: Origins and Remedies

    ERIC Educational Resources Information Center

    Salah, Reem

    2015-01-01

    This research aims at discovering the gap between Standard Arabic and the current spoken varieties of Arabic due to social, educational, political, colonial, and media factors. The researcher will also try to analyse the causes of the current gap and suggest remedies. Standard Arabic (SA) or FuSha (the Arabic term for "standard Arabic")…

  3. 75 FR 39023 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-07

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  4. 75 FR 27348 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  5. 75 FR 9229 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  6. 75 FR 5088 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-01

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  7. 75 FR 154 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  8. 75 FR 45128 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-02

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the..., ``Certification of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  9. 75 FR 32950 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  10. 75 FR 55795 - Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-14

    ... Current List of Laboratories Which Meet Minimum Standards To Engage in Urine Drug Testing for Federal... Drug Testing Programs (Mandatory Guidelines). The Mandatory Guidelines were first published in the... of Laboratories Engaged in Urine Drug Testing for Federal Agencies,'' sets strict standards that...

  11. A 32-Channel Combined RF and B0 Shim Array for 3T Brain Imaging

    PubMed Central

    Stockmann, Jason P.; Witzel, Thomas; Keil, Boris; Polimeni, Jonathan R.; Mareyam, Azma; LaPierre, Cristen; Setsompop, Kawin; Wald, Lawrence L.

    2016-01-01

    Purpose We add user-controllable direct currents (DC) to the individual elements of a 32-channel radio-frequency (RF) receive array to provide B0 shimming ability while preserving the array’s reception sensitivity and parallel imaging performance. Methods Shim performance using constrained DC current (±2.5 A) is simulated for brain arrays ranging from 8 to 128 elements. A 32-channel 3-tesla brain array is realized using inductive chokes to bridge the tuning capacitors on each RF loop. The RF and B0 shimming performance is assessed in bench and imaging measurements. Results The addition of DC currents to the 32-channel RF array is achieved with minimal disruption of the RF performance and without negative side effects such as conductor heating or mechanical torques. The shimming results agree well with simulations and show performance superior to third-order spherical harmonic (SH) shimming. Imaging tests show the ability to reduce the standard frontal lobe susceptibility-induced fields and improve echo planar imaging geometric distortion. The simulation of 64- and 128-channel brain arrays suggests that even further shimming improvement is possible (equivalent to up to 6th-order SH shim coils). Conclusion Including user-controlled shim currents on the loops of a conventional highly parallel brain array coil is feasible with modest current levels and produces improved B0 shimming performance over standard second-order SH shimming. PMID:25689977
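    A minimal numerical sketch of the per-channel shimming idea: solve a least-squares problem for the loop currents that best cancel a measured field map, then clip to the ±2.5 A hardware limit quoted above. The field-per-unit-current matrix here is synthetic; in practice it would come from per-channel calibration field maps:

```python
import numpy as np

# Synthetic stand-in for the calibration data: field produced per ampere of DC
# current on each of 32 loops, sampled over 500 voxels (Hz/A), plus a measured
# off-resonance map b0 (Hz). Real data would replace both.
rng = np.random.default_rng(0)
n_voxels, n_channels = 500, 32
A = rng.normal(size=(n_voxels, n_channels))
b0 = rng.normal(scale=30.0, size=n_voxels)

# Unconstrained least-squares solution for currents that cancel b0,
# then clipped to the +/-2.5 A per-channel hardware limit.
currents, *_ = np.linalg.lstsq(A, -b0, rcond=None)
currents = np.clip(currents, -2.5, 2.5)

residual = b0 + A @ currents
print(np.std(b0), np.std(residual))  # residual spread should be smaller
```

A constrained solver (e.g. bounded least squares) would handle the current limit more gracefully than clipping, which is used here only to keep the sketch short.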

  12. The Importance of Item Wording: The Distinction Between Measuring High Standards versus Measuring Perfectionism and Why It Matters

    ERIC Educational Resources Information Center

    Blasberg, Jonathan S.; Hewitt, Paul L.; Flett, Gordon L.; Sherry, Simon B.; Chen, Chang

    2016-01-01

    In the current research, we illustrate the impact that item wording has on the content of personality scales and how differences in item wording influence empirical results. We present evidence indicating that items in certain scales used to measure "adaptive" perfectionism fail to capture the disabling all-or-nothing approach that is…

  13. A Comprehensive Analysis of the Efficacy of Non-Cognitive Measures: Predicting Academic Success in a Historically Black University in South Texas

    ERIC Educational Resources Information Center

    Lanham, B. Dean; Schauer, Edward J.; Osho, G. Solomon

    2011-01-01

    Universities have long used standardized American College Tests (ACT), Scholastic Aptitude Tests (SAT), and high school Grade Point Averages (HS GPA) for academic admission requirements. The current study of 127 minority college students in a Historically Black University in South Texas assesses an alternative measure, the Non-Cognitive…

  14. Paul Revere Rides through High School Government Class: Teacher Research and the Power of Discussion to Motivate Thinking

    ERIC Educational Resources Information Center

    Newstreet, Carmen

    2008-01-01

    Teachers in the secondary social studies classroom do not regularly take the time to practice structured reflection on their teaching methods. In our current standards-driven environment, social studies classrooms are often not seen as places of higher learning. To combat these stereotypes, the author presents a method for accomplishing reflection…

  15. A single tri-epitopic antibody virtually recapitulates the potency of a combination of three monoclonal antibodies in neutralization of Botulinum Neurotoxin Serotype A

    USDA-ARS?s Scientific Manuscript database

    Botulinum neurotoxins (BoNTs) are one of the six highest-risk threat agents for bioterrorism, due to their extreme potency and lethality, ease of production, and need for prolonged intensive care of intoxicated patients. The current standard of treatment, equine antitoxin, has a high incidence of al...

  16. Ranking Disciplinary Journals with the Google Scholar H-Index: A New Tool for Constructing Cases for Tenure, Promotion, and Other Professional Decisions

    ERIC Educational Resources Information Center

    Hodge, David R.; Lacasse, Jeffrey R.

    2011-01-01

    Given the importance of journal rankings to tenure, promotion, and other professional decisions, this study examines a new method for ranking social work journals. The Google Scholar h-index correlated highly with the current gold standard for measuring journal quality, Thomson Institute for Scientific Information (ISI) impact factors, but…
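    The h-index used in the record above has a simple definition: the largest h such that at least h publications have at least h citations each. A minimal sketch (not the authors' code, and the citation counts are made up):

    ```python
    def h_index(citations):
        """Largest h such that at least h items have >= h citations each."""
        h = 0
        for rank, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= rank:
                h = rank   # this publication still counts toward h
            else:
                break      # sorted descending, so no later item can qualify
        return h

    # A journal whose articles drew 10, 8, 5, 4 and 3 citations has h = 4:
    # four articles have at least 4 citations, but not five with at least 5.
    print(h_index([10, 8, 5, 4, 3]))  # → 4
    ```

    Google Scholar computes this over a journal's indexed articles; the study correlates the resulting ranks with ISI impact factors.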

  17. Combating WMD Journal. Issue 2

    DTIC Science & Technology

    2008-03-01

    can be conducted utilizing passive detectors such as thermoluminescent dosimeters (TLDs) or optically stimulated luminescent (OSL) dosimeters ...reasonable estimate of the dose. The challenge in high-energy bremsstrahlung fields is that current (standard) dosimeters do not provide for CPE...above a few MeV. CPE can be obtained by placing tissue-equivalent material (such as a build-up cap) around the dosimeter. This Dosimetry Needs

  18. An Examination of the Potential of Secondary Mathematics Curriculum Materials to Support Teacher and Student Learning of Probability and Statistics

    ERIC Educational Resources Information Center

    Williams, Joshua E.

    2016-01-01

    The Common Core State Standards for Mathematics (CCSSM) suggest many changes to secondary mathematics education including an increased focus on conceptual understanding and the inclusion of content and processes that are beyond what is currently taught to most high school students. To facilitate these changes, students will need opportunities to…

  19. Intratheater Airlift Functional Needs Analysis (FNA)

    DTIC Science & Technology

    2011-01-01

    information on reprint and linking permissions, please see RAND Permissions. Skip all front matter: Jump to Page 16 The RAND Corporation is a nonprofit...facing the public and private sectors. All RAND monographs undergo rigorous peer review to ensure high standards for research quality and...personnel. xii Intratheater Airlift Functional Needs Analysis all operating environments. The FNA assesses the ability of current assets to

  20. Analysis of Reading Fluency and Comprehension Measures for Sixth Grade Students. Technical Report #24

    ERIC Educational Resources Information Center

    Alonzo, Julie; Tindal, Gerald

    2004-01-01

    The No Child Left Behind Act of 2001 has increased the importance of assessment in K-12 education. Designed to ensure that all students meet high academic standards, the law currently requires states receiving Title I funds to test all children annually in reading and math in grades 3 through 8 and report student performance disaggregated by…

  1. Antibody-mediated delivery of therapeutics for cancer therapy.

    PubMed

    Parakh, Sagun; Parslow, Adam C; Gan, Hui K; Scott, Andrew M

    2016-01-01

    Antibody-conjugated therapies (ACTs) combine the specificity of monoclonal antibodies to target cancer cells directly with highly potent payloads, often resulting in superior efficacy and/or reduced toxicity. This represents a new approach to the treatment of cancer. There have been highly promising clinical trial results using this approach with improvements in linker and payload technology. The breadth of current trials examining ACTs in haematological malignancies and solid tumours indicate the potential for clinical impact. This review will provide an overview of ACTs currently in clinical development as well as the principles of antibody delivery and types of payloads used, including cytotoxic drugs, radiolabelled isotopes, nanoparticle-based siRNA particles and immunotoxins. The focus of much of the clinical activity in ACTs has, understandably, been on their use as a monotherapy or in combination with standard of care drugs. This will continue, as will the search for better targets, linkers and payloads. Increasingly, as these drugs enter routine clinical care, important questions will arise regarding how to optimise ACT treatment approaches, including investigation of resistance mechanisms, biomarker and patient selection strategies, understanding of the unique toxicities of these drugs, and combinatorial approaches with standard therapies as well as emerging therapeutic agents like immunotherapy.

  2. Stellar Presentations (Abstract)

    NASA Astrophysics Data System (ADS)

    Young, D.

    2015-12-01

    (Abstract only) The AAVSO is in the process of expanding its education, outreach and speakers bureau program. PowerPoint presentations prepared for specific target audiences such as AAVSO members, educators, students, the general public, and Science Olympiad teams, coaches, event supervisors, and state directors will be available online for members to use. The presentations range from specific and general content relating to stellar evolution and variable stars to specific activities for a workshop environment. A presentation—even with a general topic—that works for high school students will not work for educators, Science Olympiad teams, or the general public. Each audience is unique and requires a different approach. The current environment necessitates presentations that are captivating for a younger generation embedded in the highly visual, sound-bite world of social media, Twitter, YouTube, and mobile devices. For educators, presentations and workshops for themselves and their students must support the Next Generation Science Standards (NGSS), the Common Core Content Standards, and the Science, Technology, Engineering and Mathematics (STEM) initiative. Current best practices for developing relevant and engaging PowerPoint presentations to deliver information to a variety of targeted audiences will be presented along with several examples.

  3. Early Maladaptive Schemas in a Sample of Airline Pilots seeking Residential Substance Use Treatment: An Initial Investigation

    PubMed Central

    Shorey, Ryan C.; Brasfield, Hope; Anderson, Scott; Stuart, Gregory L.

    2014-01-01

    Background: Recent research has begun to examine the early maladaptive schemas of substance abusers, as it is believed that targeting these core beliefs in treatment may result in improved substance use outcomes. One special population that has received scant attention in the research literature, despite high levels of substance use, is airline pilots. Aims: The current study examined the early maladaptive schemas of a sample of airline pilots (n = 64) who were seeking residential treatment for alcohol dependence and whether they differed in early maladaptive schemas from non-pilot substance abusers who were also seeking residential treatment for alcohol dependence (n = 45). Method: Pre-existing medical records from patients of a residential substance abuse treatment facility were reviewed for the current study. Results: Of the 18 early maladaptive schemas, results demonstrated that pilots scored higher than non-pilots on the early maladaptive schema of unrelenting standards (high internalized standards of behavior), whereas non-pilots scored higher on insufficient self-control (low frustration tolerance and self-control). Conclusions: Early maladaptive schemas may be a relevant treatment target for substance abuse treatment-seeking pilots and non-pilots. PMID:24701252

  4. Study of superhydrophobic electrosprayed catalyst layers using a localized reference electrode technique

    NASA Astrophysics Data System (ADS)

    Chaparro, A. M.; Ferreira-Aparicio, P.; Folgado, M. A.; Brightman, E.; Hinds, G.

    2016-09-01

    The performance of electrosprayed cathode catalyst layers in a polymer electrolyte membrane fuel cell (PEMFC) is studied using a localized reference electrode technique. Single cells with an electrosprayed cathode catalyst layer show an increase of >20% in maximum power density under standard testing conditions, compared with identical cells assembled with a conventional, state-of-the-art gas diffusion cathode. When operated at high current density (1.2 A cm⁻²), the electrosprayed catalyst layers show a more homogeneous distribution of the localized cathode potential, with a standard deviation from inlet to outlet of <50 mV, compared with 79 mV for the conventional gas diffusion cathode. The higher performance and homogeneity of cell response is attributed to the superhydrophobic nature of the macroporous electrosprayed catalyst layer structure, which enhances the rate of expulsion of liquid water from the cathode. On the other hand, at low current densities (<0.5 A cm⁻²), the electrosprayed layers exhibit a more heterogeneous distribution of cathode potential than the conventional cathodes; this behavior is attributed to less favorable kinetics for oxygen reduction in very hydrophobic catalyst layers. The optimum performance may be obtained with electrosprayed catalyst layers employing a high Pt/C catalyst ratio.

  5. Comparison of compression efficiency between HEVC/H.265 and VP9 based on subjective assessments

    NASA Astrophysics Data System (ADS)

    Řeřábek, Martin; Ebrahimi, Touradj

    2014-09-01

    The increasing effort of broadcast providers to transmit UHD (Ultra High Definition) content is likely to increase demand for ultra high definition televisions (UHDTVs). To compress UHDTV content, several alternative encoding mechanisms exist. In addition to internationally recognized standards, open-access proprietary options, such as the VP9 video encoding scheme, have recently appeared and are gaining popularity. One of the main goals of these encoders is to efficiently compress video sequences beyond HDTV resolution for various scenarios, such as broadcasting or internet streaming. In this paper, a broadcast-scenario rate-distortion performance analysis and mutual comparison of one of the latest video coding standards, H.265/HEVC, with the recently released proprietary video coding scheme VP9 is presented. In addition, H.264/AVC, currently one of the most popular and widely deployed encoders, has been included in the evaluation to serve as a comparison baseline. The comparison is performed by means of subjective evaluations showing the actual differences between encoding algorithms in terms of perceived quality. The results indicate a general dominance of the HEVC-based encoding algorithm over the alternatives, with VP9 and AVC showing similar performance.

  6. Lasers in clinical urology: state of the art and new horizons.

    PubMed

    Marks, Andrew J; Teichman, Joel M H

    2007-06-01

    We present an overview of current and emerging lasers in urology. We begin with the Holmium:YAG (Ho:YAG) laser, the gold standard modality for endoscopic lithotripsy, which compares favorably with standard electrocautery transurethral resection of the prostate for benign prostatic hyperplasia (BPH). Laser technologies currently being studied include the frequency-doubled double-pulse Nd:YAG (FREDDY) and high-powered potassium-titanyl-phosphate (KTP) lasers. The FREDDY laser is an affordable and safe option for intracorporeal lithotripsy, but it does not fragment all stone compositions and has no soft-tissue applications. The high-power KTP laser shows promise in the ablative treatment of BPH. Initial experiments with the Erbium:YAG laser show improved lithotripsy efficiency and more precise ablative and incisional properties compared with Ho:YAG, but the lack of adequate optical fibers limits its use in urology. Thulium:YAG fiber lasers have also demonstrated tissue ablative and incision properties comparable to Ho:YAG. Lastly, the compact size, portability, and low maintenance requirements of fiber lasers may shape the way lasers are used by urologists in the future.

  7. Feasibility of an in situ measurement device for bubble size and distribution.

    PubMed

    Junker, Beth; Maciejak, Walter; Darnell, Branson; Lester, Michael; Pollack, Michael

    2007-09-01

    The feasibility of an in situ measurement device for bubble size and distribution was explored. A novel in situ probe measurement system, the EnviroCam, was developed. Where possible, this probe incorporated the strengths, and minimized the weaknesses, of historical and currently available real-time bubble measurement methods. The system was based on a digital, high-speed, high-resolution, modular camera attached to a stainless steel shroud compatible with standard Ingold ports on fermenters. Still frames and/or video were produced, capturing bubbles passing through the notch of the shroud. An LED light source was integral to the shroud. Bubbles were analyzed using customized, commercially available image analysis software and standard statistical methods. Using this system, bubble sizes were measured as a function of various operating parameters (e.g., agitation rate, aeration rate) and media properties (e.g., viscosity, antifoam, cottonseed flour, and microbial/animal cell broths) to demonstrate system performance and its limitations. For selected conditions, mean bubble size changes compared favorably, qualitatively, with published relationships. Current instrument measurement capabilities were limited primarily to clear solutions that did not contain large numbers of overlapping bubbles.

  8. Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Link, H.; Keller, J.; Guo, Y.

    2013-04-01

    Gearboxes in wind turbines have not been achieving their expected design life even though they commonly meet or exceed the design criteria specified in current design standards. One of the basic premises of the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is that the low gearbox reliability results from the absence of critical elements in the design process or insufficient design tools. Key goals of the GRC are to improve design approaches and analysis tools and to recommend practices and test methods resulting in improved design standards for wind turbine gearboxes that lower the cost of energy (COE) through improved reliability. The GRC uses a combined gearbox testing, modeling and analysis approach, along with a database of information from gearbox failures collected from overhauls and investigation of gearbox condition monitoring techniques to improve wind turbine operations and maintenance practices. This plan covers testing of Gearbox 2 (GB2) using the two-speed turbine controller that has been used in prior testing. This test series will investigate non-torque loads, high-speed shaft misalignment, and reproduction of field conditions in the dynamometer. It will also include vibration testing using an eddy-current brake on the gearbox's high-speed shaft.

  9. Noise producing toys and the efficacy of product standard criteria to protect health and education outcomes.

    PubMed

    McLaren, Stuart J; Page, Wyatt H; Parker, Lou; Rushton, Martin

    2013-12-19

    An evaluation of 28 commercially available toys imported into New Zealand revealed that 21% of these toys do not meet the acoustic criteria in the ISO standard, ISO 8124-1:2009 Safety of Toys, adopted by Australia and New Zealand as AS/NZS ISO 8124.1:2010. While overall the 2010 standard provided a greater level of protection than the earlier 2002 standard, there was one high risk toy category where the 2002 standard provided greater protection. A secondary set of toys from the personal collections of children known to display atypical methods of play with toys, such as those with autism spectrum disorders (ASD), was part of the evaluation. Only one of these toys cleanly passed the 2010 standard, with the remainder failing or showing a marginal-pass. As there is no tolerance level stated in the standards to account for interpretation of data and experimental error, a value of +2 dB was used. The findings of the study indicate that the current standard is inadequate in providing protection against excessive noise exposure. Amendments to the criteria have been recommended that apply to the recently adopted 2013 standard. These include the integration of the new approaches published in the recently amended European standard (EN 71) on safety of toys.

  10. Noise Producing Toys and the Efficacy of Product Standard Criteria to Protect Health and Education Outcomes

    PubMed Central

    McLaren, Stuart J.; Page, Wyatt H.; Parker, Lou; Rushton, Martin

    2013-01-01

    An evaluation of 28 commercially available toys imported into New Zealand revealed that 21% of these toys do not meet the acoustic criteria in the ISO standard, ISO 8124-1:2009 Safety of Toys, adopted by Australia and New Zealand as AS/NZS ISO 8124.1:2010. While overall the 2010 standard provided a greater level of protection than the earlier 2002 standard, there was one high risk toy category where the 2002 standard provided greater protection. A secondary set of toys from the personal collections of children known to display atypical methods of play with toys, such as those with autism spectrum disorders (ASD), was part of the evaluation. Only one of these toys cleanly passed the 2010 standard, with the remainder failing or showing a marginal-pass. As there is no tolerance level stated in the standards to account for interpretation of data and experimental error, a value of +2 dB was used. The findings of the study indicate that the current standard is inadequate in providing protection against excessive noise exposure. Amendments to the criteria have been recommended that apply to the recently adopted 2013 standard. These include the integration of the new approaches published in the recently amended European standard (EN 71) on safety of toys. PMID:24452254
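    The +2 dB allowance described above amounts to a three-way pass/marginal-pass/fail verdict against the standard's limit. A minimal sketch of that decision rule; the 85 dB limit below is illustrative only, not a figure from ISO 8124:

    ```python
    def classify_toy(measured_db, limit_db, tolerance_db=2.0):
        """Verdict with a +2 dB allowance for interpretation and experimental error."""
        if measured_db <= limit_db:
            return "pass"
        if measured_db <= limit_db + tolerance_db:
            return "marginal-pass"   # over the limit, but within the error allowance
        return "fail"

    # Hypothetical toy measurements against an assumed 85 dB limit:
    for level in (84.0, 86.5, 88.1):
        print(level, classify_toy(level, 85.0))
    ```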

  11. Quality control evaluation of Keshamasi, Keshanjana and Keshamasi eye ointment.

    PubMed

    Dhiman, Kartar Singh; Shukla, Vinay J; Bhalodia, Nayan R; Sharma, Vinay R

    2014-01-01

    Keshanjana (collyrium) is a well known Ayurvedic preparation prepared out of Keshamasi (ash prepared from scalp hairs) mixed with Goghrita (cow's ghee). This medicine is indicated for the treatment of Shushkakshipaka (dry eye syndrome) in the classical literature of Ayurveda; hence, it was undertaken for standardization and clinical evaluation in an extra-mural research project from the Central Council for Research in Ayurvedic Sciences, Department of AYUSH, New Delhi. To develop standard quality parameters for the Keshamasi, Keshanjana and Keshamasi ointment. Scalp hairs of males and females collected from salons were converted to the classical Masi Kalpana and mixed with cow ghee and petrolatum in the ratio of 1:5 to prepare the Keshanjana and Keshamasi ointment respectively. Standard Operating Procedures (SOPs) were adopted and recorded accordingly. The raw material, finished products and plain Goghrita were subjected to quality control parameters, i.e., physico-chemical evaluation, anti-microbial study, particle size analysis, and heavy metal analysis through inductively coupled plasma spectroscopy, with high performance thin layer chromatography fingerprints. Rancidity was negative in all the samples, indicating that the physico-chemical parameters are in the acceptable range. Lead and zinc were present in most of the samples, while all samples were free from microbial contamination. As no standards are available against which to compare the results of the current study, the observations cannot be compared. Thus the profile generated in the current study can be considered a standard of reference for future studies.

  12. Quality control evaluation of Keshamasi, Keshanjana and Keshamasi eye ointment

    PubMed Central

    Dhiman, Kartar Singh; Shukla, Vinay J.; Bhalodia, Nayan R.; Sharma, Vinay R.

    2014-01-01

    Background: Keshanjana (collyrium) is a well known Ayurvedic preparation prepared out of Keshamasi (ash prepared from scalp hairs) mixed with Goghrita (cow's ghee). This medicine is indicated for the treatment of Shushkakshipaka (dry eye syndrome) in the classical literature of Ayurveda; hence, it was undertaken for standardization and clinical evaluation in an extra-mural research project from the Central Council for Research in Ayurvedic Sciences, Department of AYUSH, New Delhi. Aim: To develop standard quality parameters for the Keshamasi, Keshanjana and Keshamasi ointment. Materials and Methods: Scalp hairs of males and females collected from salons were converted to the classical Masi Kalpana and mixed with cow ghee and petrolatum in the ratio of 1:5 to prepare the Keshanjana and Keshamasi ointment respectively. Standard Operating Procedures (SOPs) were adopted and recorded accordingly. The raw material, finished products and plain Goghrita were subjected to quality control parameters, i.e., physico-chemical evaluation, anti-microbial study, particle size analysis, and heavy metal analysis through inductively coupled plasma spectroscopy, with high performance thin layer chromatography fingerprints. Results: Rancidity was negative in all the samples, indicating that the physico-chemical parameters are in the acceptable range. Lead and zinc were present in most of the samples, while all samples were free from microbial contamination. Conclusion: As no standards are available against which to compare the results of the current study, the observations cannot be compared. Thus the profile generated in the current study can be considered a standard of reference for future studies. PMID:25364202

  13. Specific NIST projects in support of the NIJ Concealed Weapon Detection and Imaging Program

    NASA Astrophysics Data System (ADS)

    Paulter, Nicholas G.

    1998-12-01

    The Electricity Division of the National Institute of Standards and Technology is developing revised performance standards for hand-held (HH) and walk-through (WT) metal weapon detectors, test procedures and systems for these detectors, and a detection/imaging system for finding concealed weapons. The revised standards will replace the existing National Institute of Justice (NIJ) standards for HH and WT devices and will include detection performance specifications as well as system specifications (environmental conditions, mechanical strength and safety, response reproducibility and repeatability, quality assurance, test reporting, etc.). These system requirements were obtained from the Law Enforcement and Corrections Technology Advisory Council, an advisory council for the NIJ. Reproducible and repeatable test procedures and appropriate measurement systems will be developed for evaluating HH and WT detection performance. A guide to the technology and application of non-eddy-current-based detection/imaging methods (such as acoustic, passive millimeter-wave and microwave, active millimeter-wave and terahertz-wave, x-ray, etc.) will be developed. The Electricity Division is also researching the development of a high-frequency/high-speed (300 GHz to 1 THz) pulse-illuminated, stand-off, video-rate, concealed weapons/contraband imaging system.

  14. An Update on the CCSDS Optical Communications Working Group

    NASA Technical Reports Server (NTRS)

    Edwards, Bernard L.; Schulz, Klaus-Juergen; Hamkins, Jonathan; Robinson, Bryan; Alliss, Randall; Daddato, Robert; Schmidt, Christopher; Giggebach, Dirk; Braatz, Lena

    2017-01-01

    International space agencies around the world are currently developing optical communication systems for Near Earth and Deep Space applications for both robotic and human-rated spacecraft. These applications include both links between spacecraft and links between spacecraft and ground. The Interagency Operations Advisory Group (IOAG) has stated that there is a strong business case for international cross support of spacecraft optical links. It further concluded that in order to enable cross support the links must be standardized. This paper will overview the history and structure of the space communications international standards body, the Consultative Committee for Space Data Systems (CCSDS), that will develop the standards, and provide an update on the proceedings of the Optical Communications Working Group within CCSDS. This paper will also describe the set of optical communications standards being developed and outline some of the issues that must be addressed in the next few years. The paper will address in particular the ongoing work on application scenarios for deep space to ground called High Photon Efficiency, for LEO to ground called Low Complexity, and for inter-satellite and near Earth to ground called High Data Rate, as well as associated atmospheric measurement techniques and link operations concepts.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendell, Mark J.; Fisk, William J.

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies.
In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); the MVR required to meet each specific outcome threshold is estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework. For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
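    The two-step selection logic above can be sketched briefly; the outcome names come from the abstract, but every number below is illustrative, not from the study:

    ```python
    # Step 1: the MVR needed to keep each outcome within its threshold
    # (illustrative L/s per person; real values come from dose-response data).
    required_mvr = {
        "building-related symptoms": 7.0,
        "poor perceived indoor air quality": 9.0,
        "diminished work performance": 6.5,
        "chronic health outcomes": 8.0,
    }
    # The target MVR is the highest per-outcome requirement,
    # so all outcome thresholds are met simultaneously.
    target_mvr = max(required_mvr.values())

    # Step 2 is triggered only if the target is judged impractically high.
    practical_limit = 14.0  # assumed cutoff; then cost/benefit risk management applies
    needs_cost_benefit = target_mvr > practical_limit
    print(target_mvr, needs_cost_benefit)  # → 9.0 False
    ```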

  16. Role of particle radiotherapy in the management of head and neck cancer.

    PubMed

    Laramore, George E

    2009-05-01

    Modern imaging techniques and powerful computers allow a radiation oncologist to design treatments delivering higher doses of radiation than previously possible. Dose distributions imposed by the physics of 'standard' photon and electron beams limit further dose escalation. Hadron radiotherapy offers advantages in either dose distribution and/or improved radiobiology that may significantly improve the treatment of certain head and neck malignancies. Clinical studies support the effectiveness of fast-neutron radiotherapy in the treatment of major and minor salivary gland tumors. Data show highly favorable outcomes with proton radiotherapy for skull-base malignancies and tumors near highly critical normal tissues compared with that expected with standard radiotherapy. Heavy-ion radiotherapy clinical studies are mainly being conducted with fully stripped carbon ions, and limited data seem to indicate a possible improvement over proton radiotherapy for the same subset of radioresistant tumors where neutrons show a benefit over photons. Fast-neutron radiotherapy has different radiobiological properties compared with standard radiotherapy but similar depth dose distributions. Its role in the treatment of head and neck cancer is currently limited to salivary gland malignancies and certain radioresistant tumors such as sarcomas. Protons have the same radiobiological properties as standard radiotherapy beams but more optimal depth dose distributions, making it particularly advantageous when treating tumors adjacent to highly critical structures. Heavy ions combine the radiobiological properties of fast neutrons with the physical dose distributions of protons, and preliminary data indicate their utility for radioresistant tumors adjacent to highly critical structures.

  17. Next Generation Flow for highly sensitive and standardized detection of minimal residual disease in multiple myeloma.

    PubMed

    Flores-Montero, J; Sanoja-Flores, L; Paiva, B; Puig, N; García-Sánchez, O; Böttcher, S; van der Velden, V H J; Pérez-Morán, J-J; Vidriales, M-B; García-Sanz, R; Jimenez, C; González, M; Martínez-López, J; Corral-Mateos, A; Grigore, G-E; Fluxá, R; Pontes, R; Caetano, J; Sedek, L; Del Cañizo, M-C; Bladé, J; Lahuerta, J-J; Aguilar, C; Bárez, A; García-Mateo, A; Labrador, J; Leoz, P; Aguilera-Sanz, C; San-Miguel, J; Mateos, M-V; Durie, B; van Dongen, J J M; Orfao, A

    2017-10-01

    Flow cytometry has become a highly valuable method to monitor minimal residual disease (MRD) and evaluate the depth of complete response (CR) in bone marrow (BM) of multiple myeloma (MM) after therapy. However, current flow-MRD has lower sensitivity than molecular methods and lacks standardization. Here we report on a novel next generation flow (NGF) approach for highly sensitive and standardized MRD detection in MM. An optimized 2-tube 8-color antibody panel was constructed in five cycles of design-evaluation-redesign. In addition, a bulk-lysis procedure was established for acquisition of ⩾10⁷ cells/sample, and novel software tools were constructed for automatic plasma cell gating. Multicenter evaluation of 110 follow-up BM samples from MM patients in very good partial response (VGPR) or CR showed a higher sensitivity for NGF-MRD vs conventional 8-color flow-MRD (MRD-positive rate of 47% vs 34%; P=0.003). Thus, 25% of patients classified as MRD-negative by conventional 8-color flow were MRD-positive by NGF, translating into a significantly longer progression-free survival for MRD-negative vs MRD-positive CR patients by NGF (75% progression-free survival not reached vs 7 months; P=0.02). This study establishes EuroFlow-based NGF as a highly sensitive, fully standardized approach for MRD detection in MM which overcomes the major limitations of conventional flow-MRD methods and is ready for implementation in routine diagnostics.
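    The sensitivity gain from bulk lysis follows from simple arithmetic: the limit of detection is set by the smallest reliably detectable cluster of aberrant cells divided by the number of acquired events. The cluster size of 20 cells below is an assumption for illustration, not a figure from the abstract:

    ```python
    acquired_events = 10_000_000   # >= 1e7 cells/sample after bulk lysis, per the abstract
    min_cluster = 20               # assumed smallest reliably detectable aberrant-cell cluster
    limit_of_detection = min_cluster / acquired_events
    print(f"{limit_of_detection:.0e}")  # → 2e-06, i.e. two tumour cells per million
    ```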

  18. AHCODA-DB: a data repository with web-based mining tools for the analysis of automated high-content mouse phenomics data.

    PubMed

    Koopmans, Bastijn; Smit, August B; Verhage, Matthijs; Loos, Maarten

    2017-04-04

    Systematic, standardized and in-depth phenotyping and data analyses of rodent behaviour empower gene-function studies, drug testing and therapy design. However, no data repositories are currently available for standardized quality control, data analysis and mining at the resolution of individual mice. Here, we present AHCODA-DB, a public data repository with standardized quality control and exclusion criteria aimed at enhancing the robustness of data, equipped with web-based mining tools for the analysis of individually and group-wise collected mouse phenotypic data. AHCODA-DB allows monitoring in vivo effects of compounds collected from conventional behavioural tests and from automated home-cage experiments assessing spontaneous behaviour, anxiety and cognition without human interference. AHCODA-DB includes such data from mutant mice (transgenics, knock-out, knock-in), (recombinant) inbred strains, and compound effects in wildtype mice and disease models. AHCODA-DB provides real-time statistical analyses with single-mouse resolution and a versatile suite of data presentation tools. On March 9th, 2017 AHCODA-DB contained 650,000 data points on 2419 parameters from 1563 mice. AHCODA-DB provides users with tools to systematically explore mouse behavioural data, with both positive and negative outcomes, published and unpublished, across time and experiments with single-mouse resolution. The standardized (automated) experimental settings and the large current dataset (1563 mice) in AHCODA-DB provide a unique framework for the interpretation of behavioural data and drug effects. The use of common ontologies allows data export to other databases such as the Mouse Phenome Database. Unbiased presentation of positive and negative data obtained under the highly standardized screening conditions increases the cost efficiency of publicly funded mouse screening projects and helps to reach consensus conclusions on drug responses and mouse behavioural phenotypes.
The website is publicly accessible through https://public.sylics.com and can be viewed in every recent version of all commonly used browsers.

  19. Application of HEC-RAS for flood forecasting in perched river-A case study of hilly region, China

    NASA Astrophysics Data System (ADS)

    Sun, Pingping; Wang, Shuqian; Gan, Hong; Liu, Bin; Jia, Ling

    2017-04-01

    Flooding in small and medium rivers seriously threatens the safety of human life and property. Simulation and forecasting of river floods and bank risk in hilly regions have gradually become a research hotspot. At present, there are few studies on the simulation of hilly perched rivers, especially when section flow data are lacking, and methods for determining the positions of levee breaches along the river bank remain limited. Based on the characteristics of the sections in a hilly perched river, this paper establishes a correlation between the flow profile computed by the HEC-RAS model and the river bank. A hilly perched river in Lingshi County, Shanxi Province of China, is taken as the study object, and the levee breach positions along the bank are simulated under four different design storms. The results show that the flood control standard of the upper reach is high, able to withstand the 100-year design storm, while the current standard of the lower reach is low, making it a channel that floods with high frequency. As the standard of the current channel between the 2nd and the 11th sections is low, the levee along that stretch of the river bank should be heightened and reinforced. The study results can provide technical support for flood proofing in hilly regions and a reference for the reinforcement of river banks.

  20. A summative, Objective, Structured, Clinical Examination in ENT used to assess postgraduate doctors after one year of ENT training, as part of the Diploma of Otorhinolaryngology, Head and Neck Surgery.

    PubMed

    Drake-Lee, A B; Skinner, D; Hawthorne, M; Clarke, R

    2009-10-01

    'High stakes' postgraduate medical examinations should conform to current educational standards. In the UK and Ireland, national assessments in surgery are devised and managed through the examination structure of the Royal Colleges of Surgeons. Their efforts are not reported in the medical education literature. In the current paper, we aim to clarify this process. To replace the clinical section of the Diploma of Otorhinolaryngology with an Objective, Structured, Clinical Examination, and to set the level of the assessment at one year of postgraduate training in the specialty. After 'blueprinting' against the whole curriculum, an Objective, Structured, Clinical Examination comprising 25 stations was divided into six clinical stations and 19 other stations exploring written case histories, instruments, test results, written communication skills and interpretation skills. The pass mark was set using a modified borderline method and other methods, and statistical analysis of the results was performed. The results of nine examinations between May 2004 and May 2008 are presented. The pass mark varied between 68 and 82 per cent. Internal consistency was good, with a Cronbach's alpha value of 0.99 for all examinations and split-half statistics varying from 0.96 to 0.99. Different standard settings gave similar pass marks. We have developed a summative, Objective, Structured, Clinical Examination for doctors training in otorhinolaryngology, reported herein. The objectives and standards of setting a high quality assessment were met.
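    The internal-consistency figures reported above (Cronbach's alpha of 0.99) can be illustrated with a short calculation. This is a minimal sketch assuming a score matrix laid out as examinees by stations; the scores below are synthetic, not the examination's data:

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix laid out as examinees x items."""
    k = len(scores[0])                              # number of items (stations)
    item_cols = list(zip(*scores))                  # transpose to item columns
    sum_item_var = sum(variance(col) for col in item_cols)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Synthetic scores for 3 examinees on 4 stations; the items rank the
# examinees identically, so internal consistency is at its maximum.
scores = [
    [9.0, 8.0, 9.0, 8.0],
    [5.0, 4.0, 5.0, 4.0],
    [2.0, 1.0, 2.0, 1.0],
]
alpha = cronbach_alpha(scores)
```

    Values near 1, as in the examination results above, indicate that the stations discriminate between candidates in a highly consistent way.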

  1. Results of the 1973 NASA/JPL balloon flight solar cell calibration program

    NASA Technical Reports Server (NTRS)

    Yasui, R. K.; Greenwood, R. F.

    1975-01-01

    High altitude balloon flights carried 37 standard solar cells for calibration above 99.5 percent of the earth's atmosphere. The cells were assembled into standard modules with appropriate resistors to load each cell at short circuit current. Each standardized module was mounted at the apex of the balloon on a sun tracker which automatically maintained normal incidence to the sun within 1.0 deg. The balloons were launched to reach a float altitude of approximately 36.6 km two hours before solar noon and remain at float altitude for two hours beyond solar noon. Telemetered calibration data on each standard solar cell was collected and recorded on magnetic tape. At the end of each float period the solar cell payload was separated from the balloon by radio command and descended via parachute to a ground recovery crew. Standard solar cells calibrated and recovered in this manner are used as primary intensity reference standards in solar simulators and in terrestrial sunlight for evaluating the performance of other solar cells and solar arrays with similar spectral response characteristics.

  2. The linguistic demands of the Common Core State Standards for reading and writing informational text in the primary grades.

    PubMed

    Roberts, Kathryn L

    2012-05-01

    Forty-five states and four U.S. territories have committed to implementing the new Common Core State Standards, with the goal of graduating students from our K-12 programs who are ready for college and careers. For many, the new standards represent a shift in genre focus, giving much more specific attention to informational genres. Beginning in the primary grades, the standards set high expectations for students' interaction with informational text, many of which are significantly more linguistically demanding than the standards that they replace. These increased demands are likely to pose difficulties not only for students currently receiving language support, but also for students without identified delays or disabilities. This article describes several of the kindergarten through fifth-grade standards related to informational text, highlighting the linguistic demands that each poses. In addition, instructional strategies are provided that teachers and speech-language pathologists can use to support the understanding and formulation of informational text for listening, reading, speaking, and writing. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  3. Silicosis among gold miners: exposure--response analyses and risk assessment.

    PubMed

    Steenland, K; Brown, D

    1995-10-01

    This study sought to estimate the risk of silicosis by cumulative exposure-years in a cohort of miners exposed to silica, as well as the lifetime risk of silicosis under the current Occupational Safety and Health Administration (OSHA) standard (0.09 mg/m3). In a cohort study of 3330 gold miners who worked at least 1 year underground from 1940 to 1965 (average 9 years) and were exposed to a median silica level of 0.05 mg/m3 (0.15 mg/m3 for those hired before 1930), 170 cases of silicosis were determined from either death certificates or two cross-sectional radiographic surveys. The risk of silicosis was less than 1% with a cumulative exposure under 0.5 mg/m3-years, increasing to 68% to 84% for the highest cumulative exposure category of more than 4 mg/m3-years. Cumulative exposure was the best predictor of disease, followed by duration of exposure and average exposure. After adjustment for competing risks of death, a 45-year exposure under the current OSHA standard would lead to a lifetime risk of silicosis of 35% to 47%. Almost 2 million US workers are currently exposed to silica. Our results add to a small but increasing body of literature that suggests that the current OSHA silica exposure level is unacceptably high.
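    The lifetime-risk claim rests on simple arithmetic: cumulative exposure is concentration multiplied by duration. A minimal sketch using the 45-year working lifetime and the 0.09 mg/m3 OSHA level given in the abstract:

```python
def cumulative_exposure(concentration_mg_m3, years):
    """Cumulative silica exposure in mg/m3-years (concentration x duration)."""
    return concentration_mg_m3 * years

# A 45-year working lifetime at the current OSHA standard of 0.09 mg/m3
# accumulates 4.05 mg/m3-years, which already falls in the study's highest
# exposure category (more than 4 mg/m3-years, where observed silicosis
# risk reached 68% to 84%).
lifetime_dose = cumulative_exposure(0.09, 45)
```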

  4. Non-Standard Interactions in propagation at the Deep Underground Neutrino Experiment

    DOE PAGES

    Coloma, Pilar

    2016-03-03

    Here, we study the sensitivity of current and future long-baseline neutrino oscillation experiments to the effects of dimension six operators affecting neutrino propagation through Earth, commonly referred to as Non-Standard Interactions (NSI). All relevant parameters entering the oscillation probabilities (standard and non-standard) are considered at once, in order to take into account possible cancellations and degeneracies between them. We find that the Deep Underground Neutrino Experiment will significantly improve over current constraints for most NSI parameters. Most notably, it will be able to rule out the so-called LMA-dark solution, still compatible with current oscillation data, and will be sensitive to off-diagonal NSI parameters at the level of ε ~ O(0.05-0.5). We also identify two degeneracies among standard and non-standard parameters, which could be partially resolved by combining T2HK and DUNE data.
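    The matter NSI considered in such analyses are conventionally folded into the propagation Hamiltonian. As a hedged sketch, this is the textbook parameterization (not reproduced from this paper's text), with $V_{CC} = \sqrt{2}\,G_F N_e$ the standard charged-current matter potential:

```latex
H = \frac{1}{2E}\,U
\begin{pmatrix} 0 & 0 & 0 \\ 0 & \Delta m^2_{21} & 0 \\ 0 & 0 & \Delta m^2_{31} \end{pmatrix}
U^{\dagger}
+ V_{CC}
\begin{pmatrix}
1 + \varepsilon_{ee} & \varepsilon_{e\mu} & \varepsilon_{e\tau} \\
\varepsilon_{e\mu}^{*} & \varepsilon_{\mu\mu} & \varepsilon_{\mu\tau} \\
\varepsilon_{e\tau}^{*} & \varepsilon_{\mu\tau}^{*} & \varepsilon_{\tau\tau}
\end{pmatrix},
\qquad V_{CC} = \sqrt{2}\,G_F N_e
```

    The off-diagonal entries $\varepsilon_{e\mu}$, $\varepsilon_{e\tau}$, $\varepsilon_{\mu\tau}$ are the parameters the abstract says DUNE can probe at the O(0.05-0.5) level.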

  5. Implementation of clinical research trials using web-based and mobile devices: challenges and solutions.

    PubMed

    Eagleson, Roy; Altamirano-Diaz, Luis; McInnis, Alex; Welisch, Eva; De Jesus, Stefanie; Prapavessis, Harry; Rombeek, Meghan; Seabrook, Jamie A; Park, Teresa; Norozi, Kambiz

    2017-03-17

    With the increasing implementation of web-based, mobile health interventions in clinical trials, it is crucial for researchers to address the security and privacy concerns of patient information according to high ethical standards. The full process of meeting these standards is often made more complicated due to the use of internet-based technology and smartphones for treatment, telecommunication, and data collection; however, this process is not well-documented in the literature. The Smart Heart Trial is a single-arm feasibility study that is currently assessing the effects of a web-based, mobile lifestyle intervention for overweight and obese children and youth with congenital heart disease in Southwestern Ontario. Participants receive telephone counseling regarding nutrition and fitness; and complete goal-setting activities on a web-based application. This paper provides a detailed overview of the challenges the study faced in meeting the high standards of our Research Ethics Board, specifically regarding patient privacy. We outline our solutions, successes, limitations, and lessons learned to inform future similar studies; and model much needed transparency in ensuring high quality security and protection of patient privacy when using web-based and mobile devices for telecommunication and data collection in clinical research.

  6. Physical appearance anxiety impedes the therapeutic effects of video feedback in high socially anxious individuals.

    PubMed

    Orr, Elizabeth M J; Moscovitch, David A

    2014-01-01

    Video feedback (VF) interventions effectively reduce social anxiety symptoms and negative self-perception, particularly when they are preceded by cognitive preparation (CP) and followed by cognitive review. In the current study, we re-examined data from a study on the efficacy of a novel VF intervention for individuals high in social anxiety to test the hypothesis that physical appearance anxiety would moderate the effects of VF. Data were analyzed from 68 socially anxious participants who performed an initial public speech, and were randomly assigned to an Elaborated VF condition (VF plus cognitive preparation and cognitive review), a Standard VF condition (VF plus cognitive preparation) or a No VF condition (exposure alone), and then performed a second speech. As hypothesized, when appearance concerns were low, both participants who received Elaborated and Standard VF were significantly less anxious during speech 2 than those in the No VF condition. However, when levels of appearance concern were high, neither Elaborated nor Standard VF reduced anxiety levels during speech 2 beyond the No VF condition. Results from our analog sample suggest the importance of tailoring treatment protocols to accommodate the idiosyncratic concerns of socially anxious patients.

  7. Role of imaging in testicular cancer: current and future practice.

    PubMed

    Barrisford, Glen W; Kreydin, Evgeniy I; Preston, Mark A; Rodriguez, Dayron; Harisighani, Mukesh G; Feldman, Adam S

    2015-09-01

    The article provides a summary of the epidemiologic and clinical aspects of testicular malignancy. Current standard imaging and novel techniques are reviewed. Present data and clinical treatment trends have favored surveillance protocols over adjuvant radiation or chemotherapy for low-stage testicular malignancy. This has resulted in increasing numbers of imaging studies and the potential for increased long-term exposure risks. Understanding imaging-associated risks, as well as strategies to minimize them, is of increasing importance. The development, validation and incorporation of alternative, lower-risk, highly efficacious and cost-effective imaging techniques are essential.

  8. Power loss in open cavity diodes and a modified Child-Langmuir law

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biswas, Debabrata; Kumar, Raghwendra; Puri, R.R.

    Diodes used in most high power devices are inherently open. It is shown that under such circumstances, there is a loss of electromagnetic radiation leading to a lower critical current as compared to closed diodes. The power loss can be incorporated in the standard Child-Langmuir framework by introducing an effective potential. The modified Child-Langmuir law can be used to predict the maximum power loss for a given plate separation and potential difference as well as the maximum transmitted current for this power loss. The effectiveness of the theory is tested numerically.
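    For context, the standard Child-Langmuir law gives the space-charge-limited current density in a planar diode with gap $d$ and potential difference $V$; the modification described above replaces $V$ with an effective potential that accounts for the radiated power loss. The standard form (not the modified law derived in the paper) is:

```latex
J_{\mathrm{CL}} = \frac{4\,\epsilon_0}{9}\,\sqrt{\frac{2e}{m}}\;\frac{V^{3/2}}{d^{2}}
```

    Here $e$ and $m$ are the electron charge and mass and $\epsilon_0$ is the vacuum permittivity; a lower effective potential therefore directly lowers the critical current, as the abstract describes for open cavities.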

  9. Development of reverse biased p-n junction electron emission

    NASA Technical Reports Server (NTRS)

    Fowler, P.; Muly, E. C.

    1971-01-01

    A cold cathode emitter of hot electrons for use as a source of electrons in vacuum gauges and mass spectrometers was developed using standard Norton electroluminescent silicon carbide p-n diodes operated under reverse bias conditions. Continued development, including variations in the geometry of these emitters, was carried out such that emitters with an emission efficiency (emitted current/junction current) as high as 3 x 10^-5 were obtained. Pulse measurements of the diode characteristics were made and showed that higher efficiency can be attained under pulse conditions, probably due to the lower temperatures resulting from such operation.

  10. Eddy current imaging for electrical characterization of silicon solar cells and TCO layers

    NASA Astrophysics Data System (ADS)

    Hwang, Byungguk; Hillmann, Susanne; Schulze, Martin; Klein, Marcus; Heuer, Henning

    2015-03-01

    Eddy current testing has mainly been used to detect defects in conductive materials and to measure wall thicknesses in heavy industries such as construction and aerospace. Recently, high-frequency eddy current imaging technology was developed. This enables the acquisition of information from different depth levels in conductive thin-film structures by selecting an appropriate standard penetration depth. In this paper, we summarize state-of-the-art applications focusing on the PV industry and extend the analysis by applying spatially resolved eddy current testing. Specific choices of frequency and complex phase-angle rotation reveal diverse defects from the front to the back side of silicon solar cells and characterize the homogeneity of sheet resistance in Transparent Conductive Oxide (TCO) layers. To verify technical feasibility, measurement results from the Multi Parameter Eddy Current Scanner (MPECS) are compared to results from electroluminescence imaging.
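    The standard penetration depth mentioned above is the skin depth, which sets how deep eddy currents reach at a given excitation frequency. A minimal sketch; the copper parameters are illustrative values, not taken from the paper:

```python
import math

def skin_depth(freq_hz, conductivity_s_m, mu_r=1.0):
    """Standard penetration (skin) depth in metres:
    delta = 1 / sqrt(pi * f * mu * sigma)."""
    mu0 = 4e-7 * math.pi  # vacuum permeability, H/m
    return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * mu0 * conductivity_s_m)

# Copper (sigma ~ 5.8e7 S/m) at 1 MHz: depth on the order of tens of
# micrometres, which is why high-frequency excitation can resolve
# different depth levels in thin conductive films.
delta = skin_depth(1e6, 5.8e7)
```

    Since the depth scales as 1/sqrt(f), sweeping the excitation frequency selects which layer of a thin-film stack dominates the response.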

  11. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
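    The core idea of a user-defined data distribution can be sketched as a mapping from a global index space to (locale, local index) pairs. This is a hedged Python illustration of a simple block distribution, not Chapel code and not the paper's implementation:

```python
def block_map(global_index, n, num_locales):
    """Map a global index in [0, n) to (locale, local_index) under a block
    distribution: contiguous chunks of ceil(n / num_locales) elements."""
    block = -(-n // num_locales)          # ceiling division
    locale = global_index // block
    return locale, global_index - locale * block

# Distributing 10 elements over 4 locales yields blocks of 3 (last one short).
owners = [block_map(i, 10, 4)[0] for i in range(10)]
```

    In the high-level approach the paper advocates, such a mapping is written once by the user and reused by the compiler and runtime, instead of being hand-coded into every message-passing kernel.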

  12. Advances in the vaccination of the elderly against influenza: role of a high-dose vaccine.

    PubMed

    Sullivan, Seth J; Jacobson, Robert; Poland, Gregory A

    2010-10-01

    On 23 December 2009, the US FDA approved Fluzone® High Dose, a high-dose formulation of the trivalent inactivated influenza vaccine, for prevention of influenza in people 65 years of age and older. As it was approved via an accelerated process designed to allow expeditious availability of safe and effective products with promise to treat or prevent serious or life-threatening diseases, the manufacturer is required to conduct further studies to demonstrate effectiveness. Although these studies are underway, a recently completed randomized, controlled trial demonstrated that this vaccine, containing four-times more hemagglutinin than standard-dose inactivated influenza vaccines, can produce an enhanced immunologic response in subjects of 65 years of age and older, while maintaining a favorable safety profile. This article introduces the vaccine, presents currently available safety and immunogenicity data, discusses current recommendations for use, and proposes what we can expect in the coming years.

  13. Computational modeling of transcranial direct current stimulation (tDCS) in obesity: Impact of head fat and dose guidelines

    PubMed Central

    Truong, Dennis Q.; Magerowski, Greta; Blackburn, George L.; Bikson, Marom; Alonso-Alonso, Miguel

    2013-01-01

    Recent studies show that acute neuromodulation of the prefrontal cortex with transcranial direct current stimulation (tDCS) can decrease food craving, attentional bias to food, and actual food intake. These data suggest potential clinical applications for tDCS in the field of obesity. However, optimal stimulation parameters in obese individuals are uncertain. One fundamental concern is whether a thick, low-conductivity layer of subcutaneous fat around the head can affect current density distribution and require dose adjustments during tDCS administration. The aim of this study was to investigate the role of head fat on the distribution of current during tDCS and evaluate whether dosing standards for tDCS developed for adult individuals in general are adequate for the obese population. We used MRI-derived high-resolution computational models that delineated fat layers in five human heads from subjects with body mass index (BMI) ranging from “normal-lean” to “super-obese” (20.9 to 53.5 kg/m2). Data derived from these simulations suggest that head fat influences tDCS current density across the brain, but its relative contribution is small when other components of head anatomy are added. Current density variability between subjects does not appear to have a direct and/or simple link to BMI. These results indicate that guidelines for the use of tDCS can be extrapolated to obese subjects without sacrificing efficacy and/or treatment safety; the recommended standard parameters can lead to the delivery of adequate current flow to induce neuromodulation of brain activity in the obese population. PMID:24159560

  14. Ground-water quality and geochemistry in Carson and Eagle Valleys, western Nevada and eastern California

    USGS Publications Warehouse

    Welch, Alan H.

    1994-01-01

    Aquifers in Carson and Eagle Valleys are an important source of water for human consumption and agriculture. Concentrations of major constituents in water from the principal aquifers on the west sides of Carson and Eagle Valleys appear to be a result of natural geochemical reactions with minerals derived primarily from plutonic rocks. In general, water from principal aquifers is acceptable for drinking when compared with current (1993) Nevada State drinking-water maximum contaminant level standards. Water was collected and analyzed for all inorganic constituents for which primary or secondary drinking-water standards have been established. About 3 percent of these sites had constituents that exceeded one or more primary standards, and water at about 10 percent of the sites had at least one constituent that surpassed a secondary standard. Arsenic exceeded the standard in water at less than 1 percent of the principal aquifer sites; nitrate surpassed its standard in water at 3 percent of 93 sites. Water from wells in the principal aquifer with high concentrations of nitrate was in areas where septic systems are used; these concentrations indicate that contamination may be entering the wells. Concentrations of naturally occurring radionuclides in water from the principal aquifers exceed the proposed Federal standards for some constituents, but were not found to be above current (1993) State standards. The uranium concentrations exceeded the proposed 20 micrograms per liter Federal standard at 10 percent of the sites. Of the sites analyzed for all of the inorganic constituents with primary standards plus uranium, 15 percent exceed one or more established standards.
If the proposed 20 micrograms per liter standard for uranium is applied to the sampled sites, then 23 percent would exceed the standard for uranium or some other constituent with a primary drinking water standard. This represents a 50-percent increase in the frequency of exceedance. Almost all water sampled from the principal aquifers exceeds the 300 picocuries per liter proposed standard for radon. Ground-water sampling sites with the highest radon activities in water are most commonly located in the upland aquifers in the Sierra Nevada and in the principal aquifers beneath the west sides of Carson and Eagle Valleys.

  15. Identification of pavement marking colors.

    DOT National Transportation Integrated Search

    2002-04-01

    Current pavement marking color specifications are given in terms of a single color with no indication of acceptable tolerances. Recently proposed standards include tolerances, but neither current nor proposed standards are based on psychophysical dat...

  16. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  17. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  18. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  19. 17 CFR 232.13 - Date of filing; adjustment of filing date.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed filed on.... Eastern Standard Time or Eastern Daylight Saving Time, whichever is currently in effect, shall be deemed... Daylight Savings Time, whichever is currently in effect, shall be deemed filed on the same business day. (4...

  20. Impact of host cell variation on the neutralization of HIV-1 in vitro.

    PubMed

    Polonis, Victoria R; Schuitemaker, Hanneke; Bunnik, Evelien M; Brown, Bruce K; Scarlatti, Gabriella

    2009-09-01

    In this review we present current advances in our understanding of HIV-1 neutralization assays that employ primary cell types, as compared with those that utilize cell lines and the newer, more standardized pseudovirus assays. A commentary on the challenges of standardizing in-vitro neutralization assays using primary cells is included. The data from reporter cell line neutralization assays may agree with results observed in primary cells; however, exceptions have recently been reported. Multiple variables exist in primary cell assays using peripheral blood mononuclear cells from HIV-seronegative donors; in-vitro neutralization titers can vary significantly based on the donor cells used for assay targets and for virus propagation. Thus, more research is required to achieve validated primary cell neutralization assays. HIV-vaccine-induced antibody performance in the current neutralization assays may function as a 'gatekeeper' for HIV-1 subunit vaccine advancement. Development of standardized platforms for reproducible measurement of in-vitro neutralization is therefore a high priority. Given the considerable variation in results obtained from some widely applied HIV neutralization platforms, parallel evaluation of new antibodies using different host cells for assay targets, as well as virus propagation, is recommended until immune correlates of protection are identified.

  1. Lightweight IMM PV Flexible Blanket Assembly

    NASA Technical Reports Server (NTRS)

    Spence, Brian

    2015-01-01

    Deployable Space Systems (DSS) has developed an inverted metamorphic multijunction (IMM) photovoltaic (PV) integrated modular blanket assembly (IMBA) that can be rolled or z-folded. This IMM PV IMBA technology enables a revolutionary flexible PV blanket assembly that provides high specific power, exceptional stowed packaging efficiency, and high-voltage operation capability. DSS's technology also accommodates standard third-generation triple junction (ZTJ) PV device technologies to provide significantly improved performance over the current state of the art. This SBIR project demonstrated prototype, flight-like IMM PV IMBA panel assemblies specifically developed, designed, and optimized for NASA's high-voltage solar array missions.

  2. Non-destructive diagnostics of irradiated materials using neutron scattering from pulsed neutron sources

    NASA Astrophysics Data System (ADS)

    Korenev, Sergey; Sikolenko, Vadim

    2004-09-01

    The advantage of neutron-scattering studies compared to standard X-ray techniques is the high penetration of neutrons, which allows us to study volume effects. The high resolution of neutron-scattering instrumentation allows measurement of lattice-structure parameters with high precision. We suggest the use of neutron scattering from pulsed neutron sources for the analysis of materials irradiated with pulsed high-current electron and ion beams. The results of preliminary tests of this method on Ni foils, studied by neutron diffraction at the IBR-2 (Pulsed Fast Reactor at the Joint Institute for Nuclear Research), are presented.

  3. Framework for Establishment of a Comprehensive and Standardized Administration System for Prevention and Control of Tuberculosis in College Student Community in China.

    PubMed

    Zhang, Shaoru; Li, Xiaohong; Zhang, Tianhua; Wang, Xiangni; Liu, Weiping; Ma, Xuexue; Li, Yuelu; Fan, Yahui

    2016-10-01

    The college student community is one at high risk of tuberculosis (TB). A systemic and standardized administration model for TB prevention and control is of significance in controlling the spread of TB in universities. Currently, universities in China have not established a comprehensive and standardized administration system for TB prevention and control in the college student community. Firstly, literature research and the brainstorming method (n=13) were used to construct the clause and sub-clause pool for the administration of TB prevention and control within the college student community in 2014. Secondly, a total of twenty experts in the field of TB prevention and control, representing the east, west, south and north of China, were selected and invited to participate in a Delphi letter inquiry. After two rounds of letter inquiry, the opinions of the experts reached a consensus and the framework for the administration system was constructed. The framework included 8 first-class indexes, 26 second-class indexes and 104 third-class indexes. The results are highly scientific and reliable, and can help improve the systemic and standardized administration of TB prevention and control in universities in China, and perhaps in other developing countries with a high TB burden as well.

  4. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  5. Framework for Establishment of a Comprehensive and Standardized Administration System for Prevention and Control of Tuberculosis in College Student Community in China

    PubMed Central

    ZHANG, Shaoru; LI, Xiaohong; ZHANG, Tianhua; WANG, Xiangni; LIU, Weiping; MA, Xuexue; LI, Yuelu; FAN, Yahui

    2016-01-01

    Background: The college student community is one at high risk of tuberculosis (TB). A systemic and standardized administration model for TB prevention and control is of significance in controlling the spread of TB in universities. Currently, universities in China have not established a comprehensive and standardized administration system for TB prevention and control in the college student community. Methods: Firstly, literature research and the brainstorming method (n=13) were used in 2014 to construct the clause and sub-clause pool for the administration of TB prevention and control within the college student community. Secondly, a total of twenty experts in the field of TB prevention and control, representing the east, west, south and north of China, were selected and invited to participate in the Delphi letter-inquiry. After two rounds of letter-inquiry, the opinions of the experts reached a consensus and the framework for the administration system was constructed. Results: The framework included 8 first-class indexes, 26 second-class indexes and 104 third-class indexes. Conclusion: The results are highly scientific and reliable and can help improve the systemic and standardized administration of TB prevention and control in universities in China, and perhaps in other developing countries with a high TB burden as well. PMID:27957436

  6. Fabrication of nanostructured transmissive optical devices on ITO-glass with UV1116 photoresist using high-energy electron beam lithography.

    PubMed

    Williams, Calum; Bartholomew, Richard; Rughoobur, Girish; Gordon, George S D; Flewitt, Andrew J; Wilkinson, Timothy D

    2016-12-02

    High-energy electron beam lithography for patterning nanostructures on insulating substrates can be challenging. For high resolution, conventional resists require large exposure doses, and for reasonable throughput, using typical beam currents leads to charge dissipation problems. Here, we use UV1116 photoresist (Dow Chemical Company), designed for photolithographic technologies, with a relatively low area dose at a standard operating current (80 kV, 40-50 μC cm⁻², 1 nA s⁻¹) to pattern over large areas on commercially coated ITO-glass cover slips. The minimum linewidth fabricated was ∼33 nm with 80 nm spacing; for isolated structures, ∼45 nm structural width with 50 nm separation. Due to the low beam dose and nA current, throughput is high. This work highlights the use of UV1116 photoresist as an alternative to conventional e-beam resists on insulating substrates. To evaluate suitability, we fabricate a range of transmissive optical devices that could find application as customized wire-grid polarisers and spectral filters for imaging, which operate based on the excitation of surface plasmon polaritons in nanosized geometries, with arrays encompassing areas of ∼0.25 cm².
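As a rough, hedged illustration (the arithmetic is mine; only the dose, current and area figures come from the abstract), the write time implied by an area dose D, patterned area A and beam current I is t = D·A/I:

```python
# Back-of-envelope e-beam write-time estimate, t = D * A / I, using figures
# quoted in the abstract; choosing the midpoint of the dose range is my own
# assumption.
dose = 45e-6       # C/cm^2, midpoint of the quoted 40-50 uC/cm^2 range
area = 0.25        # cm^2, array area quoted in the abstract
current = 1e-9     # A, ~1 nA beam current

t_seconds = dose * area / current
print(f"exposure time ~ {t_seconds:.0f} s ({t_seconds / 3600:.1f} h)")
# -> exposure time ~ 11250 s (3.1 h)
```

Estimates of this kind are why a low required dose matters for throughput over large areas.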

  7. Fabrication of nanostructured transmissive optical devices on ITO-glass with UV1116 photoresist using high-energy electron beam lithography

    NASA Astrophysics Data System (ADS)

    Williams, Calum; Bartholomew, Richard; Rughoobur, Girish; Gordon, George S. D.; Flewitt, Andrew J.; Wilkinson, Timothy D.

    2016-12-01

    High-energy electron beam lithography for patterning nanostructures on insulating substrates can be challenging. For high resolution, conventional resists require large exposure doses and for reasonable throughput, using typical beam currents leads to charge dissipation problems. Here, we use UV1116 photoresist (Dow Chemical Company), designed for photolithographic technologies, with a relatively low area dose at a standard operating current (80 kV, 40-50 μC cm⁻², 1 nA s⁻¹) to pattern over large areas on commercially coated ITO-glass cover slips. The minimum linewidth fabricated was ∼33 nm with 80 nm spacing; for isolated structures, ∼45 nm structural width with 50 nm separation. Due to the low beam dose, and nA current, throughput is high. This work highlights the use of UV1116 photoresist as an alternative to conventional e-beam resists on insulating substrates. To evaluate suitability, we fabricate a range of transmissive optical devices, that could find application for customized wire-grid polarisers and spectral filters for imaging, which operate based on the excitation of surface plasmon polaritons in nanosized geometries, with arrays encompassing areas ∼0.25 cm².

  8. Spacetime Curvature and Higgs Stability after Inflation.

    PubMed

    Herranen, M; Markkanen, T; Nurmi, S; Rajantie, A

    2015-12-11

    We investigate the dynamics of the Higgs field at the end of inflation in the minimal scenario consisting of an inflaton field coupled to the standard model only through the nonminimal gravitational coupling ξ of the Higgs field. Such a coupling is required by renormalization of the standard model in curved space, and in the current scenario also by vacuum stability during high-scale inflation. We find that for ξ≳1, rapidly changing spacetime curvature at the end of inflation leads to significant production of Higgs particles, potentially triggering a transition to a negative-energy Planck scale vacuum state and causing an immediate collapse of the Universe.

  9. Development of low-shock pyrotechnic separation nuts. [design performance of flight type nuts

    NASA Technical Reports Server (NTRS)

    Bement, L. J.; Neubert, V. H.

    1973-01-01

    Performance demonstrations and comparisons were made on six flight type pyrotechnic separation nut designs, two of which are standard designs in current use, and four of which were designed to produce low shock on actuation. Although the shock performances of the four low shock designs are considerably lower than the standard designs, some penalties may be incurred in increased volume, weight, or complexity. These nuts, and how they are installed, can significantly influence the pyrotechnic shock created in spacecraft structures. A high response monitoring system has been developed and demonstrated to provide accurate performance comparisons for pyrotechnic separation nuts.

  10. Stereotactically Standard Areas: Applied Mathematics in the Service of Brain Targeting in Deep Brain Stimulation.

    PubMed

    Mavridis, Ioannis N

    2017-12-11

    The concept of stereotactically standard areas (SSAs) within human brain nuclei belongs to the modern field of stereotactic brain microanatomy. These are areas that resist the individual variability of nuclear location in stereotactic space. This paper summarizes the current knowledge regarding SSAs. A mathematical formula for SSAs was recently developed, allowing for their robust, reproducible, and accurate application to laboratory studies and clinical practice. Thus, SSAs open new doors for the application of stereotactic microanatomy to highly accurate brain targeting, which is mainly useful for minimally invasive neurosurgical procedures such as deep brain stimulation.

  11. Exposure safety standards for nonionizing radiation (NIR) from collision-avoidance radar

    NASA Astrophysics Data System (ADS)

    Palmer-Fortune, Joyce; Brecher, Aviva; Spencer, Paul; Huguenin, Richard; Woods, Ken

    1997-02-01

    On-vehicle technology for collision avoidance using millimeter wave radar is currently under development and is expected to be in vehicles in coming years. Recently approved radar bands for collision avoidance applications include 47.5 - 47.8 GHz and 76 - 77 GHz. Widespread use of active radiation sources in the public domain would contribute to raised levels of human exposure to high frequency electromagnetic radiation, with potential for adverse health effects. In order to design collision avoidance systems that will pose an acceptably low radiation hazard, it is necessary to determine what levels of electromagnetic radiation at millimeter wave frequencies will be acceptable in the environment. This paper will summarize recent research on NIR (non-ionizing radiation) exposure safety standards for high frequency electromagnetic radiation. We have investigated both governmental and non-governmental professional organizations worldwide.

  12. Fundamental contradictions in cultural competence.

    PubMed

    Johnson, Yvonne M; Munch, Shari

    2009-07-01

    Cultural competence (CC) is considered highly relevant to social work practice with clients belonging to ethnic and racial minority groups, as the burgeoning literature and creation of practice standards on CC attest. However, examination of the conceptual underpinnings of CC reveals several major anomalies. The authors argue that several aspects of CC contradict central social work concepts or are at odds with current, standard social work practice. These contradictions extend to the epistemological foundations of CC and the rights and dignity of the individual. To further stress the conceptual tensions at the heart of CC, the authors incorporate recent philosophical work addressing collective identities and group rights. The question of whether culturally competent practice is achievable is also addressed. The authors urge academicians and practitioners to thoroughly examine the theoretical and ethical bases of CC because of their highly important ramifications for social work practice.

  13. High Accuracy Temperature Measurements Using RTDs with Current Loop Conditioning

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.

    1997-01-01

    To measure temperatures with a greater degree of accuracy than is possible with thermocouples, RTDs (Resistive Temperature Detectors) are typically used. Calibration standards use specialized high-precision RTD probes with accuracies approaching 0.001 F. These are extremely delicate devices, and far too costly to be used in test facility instrumentation. Less costly sensors, designed for aeronautical wind tunnel testing, are available and can be readily adapted to probes, rakes, and test rigs. With proper signal conditioning of the sensor, temperature accuracies of 0.1 F are obtainable. For reasons explored in this paper, the Anderson current loop is the preferred method of signal conditioning. This scheme has been used in NASA Lewis Research Center's 9 x 15 Low Speed Wind Tunnel and is detailed here.
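The record does not give the sensor equations; as hedged background, RTD signal conditioning ultimately maps a measured resistance back to temperature through the standard Callendar-Van Dusen relation, shown here with the IEC 60751 coefficients for a Pt100 and T ≥ 0 °C (illustrative background only, not the specific probes or Anderson-loop circuitry described in the record):

```python
# Callendar-Van Dusen relation for a Pt100 RTD above 0 degC, using the
# IEC 60751 coefficients; background sketch, not the record's hardware.
R0 = 100.0        # ohms at 0 degC (a "Pt100")
A = 3.9083e-3     # 1/degC
B = -5.775e-7     # 1/degC^2

def pt100_resistance(t_c: float) -> float:
    """Resistance in ohms at temperature t_c in degC (valid for t_c >= 0)."""
    return R0 * (1.0 + A * t_c + B * t_c * t_c)

r100 = pt100_resistance(100.0)   # roughly 138.5 ohms at 100 degC
```

In practice the conditioning circuit measures resistance and this relation is inverted to recover temperature.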

  14. Impedance of an intense plasma-cathode electron source for tokamak startup

    DOE PAGES

    Hinson, Edward Thomas; Barr, Jayson L.; Bongard, Michael W.; ...

    2016-05-31

    In this study, an impedance model is formulated and tested for the ~1 kV, ~1 kA/cm² arc-plasma cathode electron source used for local helicity injection tokamak startup. A double layer sheath is established between the high-density arc plasma (n_arc ≈ 10²¹ m⁻³) within the electron source and the less dense external tokamak edge plasma (n_edge ≈ 10¹⁸ m⁻³) into which current is injected at the applied injector voltage V_inj. Experiments on the Pegasus spherical tokamak show that the injected current I_inj increases with V_inj according to the standard double layer scaling I_inj ~ V_inj^(3/2) at low current and transitions to I_inj ~ V_inj^(1/2) at high currents. In this high-current regime, sheath expansion and/or space-charge neutralization impose limits on the beam density n_b ~ I_inj/V_inj^(1/2). For low tokamak edge density n_edge and high I_inj, the inferred beam density n_b is consistent with the requirement n_b ≤ n_edge imposed by space-charge neutralization of the beam in the tokamak edge plasma. At sufficient edge density, n_b ~ n_arc is observed, consistent with a limit to n_b imposed by expansion of the double layer sheath. These results suggest that n_arc is a viable control actuator for the source impedance.
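The two scaling regimes quoted in the abstract can be sketched as a continuous piecewise power law; the crossover voltage and prefactor below are arbitrary placeholders, not measured values from the experiment:

```python
# Piecewise power-law sketch of injected current vs. injector voltage:
# I ~ V^(3/2) at low current (double-layer scaling), I ~ V^(1/2) at high
# current. Crossover voltage and prefactor are hypothetical placeholders.
V_CROSS = 500.0       # V, hypothetical crossover voltage
K_LOW = 1.0e-4        # A/V^1.5, hypothetical low-current prefactor

def injected_current(v_inj: float) -> float:
    """Illustrative I_inj(V_inj) with a continuous change of exponent."""
    if v_inj <= V_CROSS:
        return K_LOW * v_inj ** 1.5
    # choose the high-current prefactor so the two branches meet at V_CROSS
    k_high = K_LOW * V_CROSS ** 1.5 / V_CROSS ** 0.5
    return k_high * v_inj ** 0.5

# below the crossover, doubling the voltage multiplies the current by 2**1.5;
# above it, quadrupling the voltage only doubles the current
ratio = injected_current(200.0) / injected_current(100.0)
```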

  15. Mooring Measurements of the Abyssal Circulations in the Western Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, F.

    2016-12-01

    A scientific observing network in the western tropical Pacific has initially been established by the Institute of Oceanology, Chinese Academy of Sciences (IOCAS). Using fifteen moorings that give unprecedented measurements in the intermediate and abyssal layers, we present multi-timescale variations of the deep ocean circulation prior to and during the 2015 El Niño event. The deep ocean velocities increase equatorward, with high standard deviation and nearly zero mean. The deep currents flow mainly in the meridional direction in the central Philippine Basin and are dominated by a series of alternating westward and eastward zonal jets in the Caroline Basin. The currents in the deep channel connecting the East and West Mariana Basins flow mainly southeastward. Seasonal variation is present only in the deep jets of the Caroline Basin, associated with a vertically propagating annual Rossby wave. The high-frequency flow bands are dominated by diurnal and semi-diurnal tidal currents and near-inertial currents. The rough topography has a strong influence on the abyssal circulation, including intensification of velocity and internal tidal energy and the formation of upwelling flow.

  16. 50 MHz-10 GHz low-power resistive feedback current-reuse mixer with inductive peaking for cognitive radio receiver.

    PubMed

    Vitee, Nandini; Ramiah, Harikrishnan; Chong, Wei-Keat; Tan, Gim-Heng; Kanesan, Jeevan; Reza, Ahmed Wasif

    2014-01-01

    A low-power wideband mixer is designed and implemented in 0.13 µm standard CMOS technology based on resistive feedback current-reuse (RFCR) configuration for the application of cognitive radio receiver. The proposed RFCR architecture incorporates an inductive peaking technique to compensate for gain roll-off at high frequency while enhancing the bandwidth. A complementary current-reuse technique is used between transconductance and IF stages to boost the conversion gain without additional power consumption by reusing the DC bias current of the LO stage. This downconversion double-balanced mixer exhibits a high and flat conversion gain (CG) of 14.9 ± 1.4 dB and a noise figure (NF) better than 12.8 dB. The maximum input 1-dB compression point (P1dB) and maximum input third-order intercept point (IIP3) are -13.6 dBm and -4.5 dBm, respectively, over the desired frequency ranging from 50 MHz to 10 GHz. The proposed circuit operates down to a supply headroom of 1 V with a low-power consumption of 3.5 mW.

  17. Continuum mechanics analysis of fracture progression in the vitrified cryoprotective agent DP6

    PubMed Central

    Steif, Paul S.; Palastro, Matthew C.; Rabin, Yoed

    2008-01-01

    As part of an ongoing effort to study the continuum mechanics effects associated with cryopreservation, the current report focuses on the prediction of fracture formation in cryoprotective agents. Fractures had been previously observed in 1 mℓ samples of the cryoprotective agent cocktail DP6, contained in a standard 15 mℓ glass vial, and subjected to various cooling rates. These experimental observations were obtained by means of a cryomacroscope, which has been recently presented by the current research team. High and low cooling rates were found to produce very distinct patterns of cracking. The current study seeks to explain the observed patterns on the basis of stresses predicted from finite element analysis, which relies on a simple viscoelastic constitutive model and on estimates of the critical stress for cracking. The current study demonstrates that the stress which results in instantaneous fracture at low cooling rates is consistent with the stress to initiate fracture at high cooling rate. This consistency supports the credibility of the proposed constitutive model and analysis, and the unified criterion for fracturing, that is, a critical stress threshold. PMID:18412493

  18. Informatics and Standards for Nanomedicine Technology

    PubMed Central

    Thomas, Dennis G.; Klaessig, Fred; Harper, Stacey L.; Fritts, Martin; Hoover, Mark D.; Gaheen, Sharon; Stokes, Todd H.; Reznik-Zellen, Rebecca; Freund, Elaine T.; Klemm, Juli D.; Paik, David S.; Baker, Nathan A.

    2011-01-01

    There are several issues to be addressed concerning the management and effective use of information (or data), generated from nanotechnology studies in biomedical research and medicine. These data are large in volume, diverse in content, and are beset with gaps and ambiguities in the description and characterization of nanomaterials. In this work, we have reviewed three areas of nanomedicine informatics: information resources; taxonomies, controlled vocabularies, and ontologies; and information standards. Informatics methods and standards in each of these areas are critical for enabling collaboration, data sharing, unambiguous representation and interpretation of data, semantic (meaningful) search and integration of data; and for ensuring data quality, reliability, and reproducibility. In particular, we have considered four types of information standards in this review, which are standard characterization protocols, common terminology standards, minimum information standards, and standard data communication (exchange) formats. Currently, due to gaps and ambiguities in the data, it is also difficult to apply computational methods and machine learning techniques to analyze, interpret and recognize patterns in data that are high dimensional in nature, and also to relate variations in nanomaterial properties to variations in their chemical composition, synthesis, characterization protocols, etc. Progress towards resolving the issues of information management in nanomedicine using informatics methods and standards discussed in this review will be essential to the rapidly growing field of nanomedicine informatics. PMID:21721140

  19. Revision of the NIST Standard for (223)Ra: New Measurements and Review of 2008 Data.

    PubMed

    Zimmerman, B E; Bergeron, D E; Cessna, J T; Fitzgerald, R; Pibida, L

    2015-01-01

    After discovering a discrepancy in the transfer standard currently being disseminated by the National Institute of Standards and Technology (NIST), we have performed a new primary standardization of the alpha-emitter (223)Ra using Live-timed Anticoincidence Counting (LTAC) and the Triple-to-Double Coincidence Ratio Method (TDCR). Additional confirmatory measurements were made with the CIEMAT-NIST efficiency tracing method (CNET) of liquid scintillation counting, integral γ-ray counting using a NaI(Tl) well counter, and several High Purity Germanium (HPGe) detectors in an attempt to understand the origin of the discrepancy and to provide a correction. The results indicate that a -9.5 % difference exists between activity values obtained using the former transfer standard relative to the new primary standardization. During one of the experiments, a 2 % difference in activity was observed between dilutions of the (223)Ra master solution prepared using the composition used in the original standardization and those prepared using 1 mol·L(-1) HCl. This effect appeared to be dependent on the number of dilutions or the total dilution factor to the master solution, but the magnitude was not reproducible. A new calibration factor ("K-value") has been determined for the NIST Secondary Standard Ionization Chamber (IC "A"), thereby correcting the discrepancy between the primary and secondary standards.

  20. Induction of auroral zone electric currents within the Alaska pipeline

    USGS Publications Warehouse

    Campbell, W.H.

    1978-01-01

    The Alaska pipeline is a highly conducting anomaly extending 800 miles (1300 km), from about 62° to 69° geomagnetic latitude, beneath the most active regions of the ionospheric electrojet current. The spectral behavior of the magnetic field from this current was analyzed using data from standard geomagnetic observatories to establish the predictable patterns of temporal and spatial change for field pulsation periods between 5 min and 4 hr. This behavior is presented in a series of tables, graphs and formulae. Using 2- and 3-layer models of the conducting earth, the induced electric fields associated with the geomagnetic changes were established. From the direct relationship of the current to the geomagnetic field variation patterns, one can infer counterpart temporal and spatial characteristics of the pipeline current. The relationship of the field amplitudes to the geomagnetic activity index Ap, and the established occurrence of various levels of Ap over several solar cycles, were employed to show that about half of the time the induced currents in the pipe would be under 1 A for the maximum-response oscillatory periods near 1 hr. Such currents should be of minimal consequence for corrosion effects, even for a section of the pipeline unprotected by sacrificial electrodes. Of greater interest was the result that extreme surges of current should reach over one hundred amperes in the pipeline during high activity. © 1978 Birkhäuser Verlag.

  1. Considerations for human exposure standards for fast-rise-time high-peak-power electromagnetic pulses.

    PubMed

    Merritt, J H; Kiel, J L; Hurt, W D

    1995-06-01

    Development of new emitter systems capable of producing high-peak-power electromagnetic pulses with very fast rise times and narrow pulse widths is continuing. Such directed energy weapons systems will be used in the future to defeat electronically vulnerable targets. Human exposures to these pulses can be expected during testing and operations. Development of these technologies for radar and communications purposes has the potential for wider environmental exposure as well. The current IEEE C95.1-1991 human exposure guidelines do not specifically address these types of pulses, though limits are stated for pulsed emissions. The process for developing standards includes an evaluation of the relevant bioeffects database. A recommendation has been made that human exposure to ultrashort electromagnetic pulses that engender electromagnetic transients, called precursor waves, should be avoided. Studies that purport to show the potential for tissue damage induced by such pulses are described. The studies cited in support of the recommendation were not relevant to the issue of tissue damage by propagated pulses. A number of investigations are cited in this review that directly address the biological effects of electromagnetic pulses. These studies have not shown evidence of tissue damage as a result of exposure to high-peak-power pulsed microwaves. It is our opinion that the current guidelines are sufficiently protective for human exposure to these pulses.

  2. Velocity profile, water-surface slope, and bed-material size for selected streams in Colorado

    USGS Publications Warehouse

    Marchand, J.P.; Jarrett, R.D.; Jones, L.L.

    1984-01-01

    Existing methods for determining the mean velocity in a vertical sampling section do not address the conditions present in high-gradient, shallow-depth streams common to mountainous regions such as Colorado. The report presents velocity-profile data that were collected for 11 streamflow-gaging stations in Colorado using both a standard Price type AA current meter and a prototype Price Model PAA current meter. Computational results are compiled that will enable mean velocities calculated from measurements by the two current meters to be compared with each other and with existing methods for determining mean velocity. Water-surface slope, bed-material size, and flow-characteristic data for the 11 sites studied also are presented. (USGS)

  3. Teaching physics using project-based engineering curriculum with a theme of alternative energy

    NASA Astrophysics Data System (ADS)

    Tasior, Bryan

    The Next Generation Science Standards (NGSS) provide a new set of science standards that, if adopted, shift the focus from content-knowledge-based to skill-based education. Students will be expected to use science to investigate the natural world and to solve problems using the engineering design process. The world is also facing an impending crisis related to climate, energy supply and use, and alternative energy development. Education has an opportunity to help provide the much-needed paradigm shift away from our current methods of meeting society's energy needs. The purpose of this research was to measure the effectiveness of a unit that accomplishes the following objectives: uses project-based learning to teach the engineering process and the standards of the NGSS, addresses the required content expectations on energy and electricity from the HSCEs, and provides students with the scientific evidence behind issues (both environmental and social/economic) relating to the energy crisis and our current dependence on fossil fuels as a primary energy source. The results of the research indicate that a physics unit can be designed to accomplish these objectives. The unit designed, implemented and reported here proved highly effective at improving students' science content knowledge and implementing the engineering design standards of the NGSS, while raising awareness, knowledge and motivation relating to climate and the energy crisis.

  4. Standardization of databases for AMDB taxi routing functions

    NASA Astrophysics Data System (ADS)

    Pschierer, C.; Sindlinger, A.; Schiefele, J.

    2010-04-01

    Input, management, and display of taxi routes on airport moving map displays (AMM) have been covered in various studies in the past. The demonstrated applications are typically based on Aerodrome Mapping Databases (AMDB). Taxi routing functions require specific enhancements, typically in the form of a graph network with nodes and edges modeling all connectivities within an airport, which are not supported by the current AMDB standards. Therefore, the data schemas and data content have been defined specifically for the purpose and test scenarios of these studies. Standardization of the data format for taxi routing information is a prerequisite for bringing taxi routing functions into production. The joint RTCA/EUROCAE special committee SC-217, responsible for updating and enhancing the AMDB standards DO-272 [1] and DO-291 [2], is currently studying different alternatives and defining suitable formats. Requirements for taxi routing data are driven primarily by depiction concepts for assigned and cleared taxi routes, but also by database size and economic feasibility. The studied concepts are similar to those described in the GDF (geographic data files) specification [3], which is used in most car navigation systems today. They include: a highly aggregated graph network of complex features; a modestly aggregated graph network of simple features; and a non-explicit topology of plain AMDB taxi guidance line elements. This paper introduces the different concepts and their advantages and disadvantages.

  5. Electric field characteristics of electroconvulsive therapy with individualized current amplitude: a preclinical study.

    PubMed

    Lee, Won Hee; Lisanby, Sarah H; Laine, Andrew F; Peterchev, Angel V

    2013-01-01

    This study examines the characteristics of the electric field induced in the brain by electroconvulsive therapy (ECT) with individualized current amplitude. The electric field induced by bilateral (BL), bifrontal (BF), right unilateral (RUL), and frontomedial (FM) ECT electrode configurations was computed in anatomically realistic finite element models of four nonhuman primates (NHPs). We generated maps of the electric field strength relative to an empirical neural activation threshold, and determined the stimulation strength and focality at fixed current amplitude and at individualized current amplitudes corresponding to seizure threshold (ST) measured in the anesthetized NHPs. The results show less variation in brain volume stimulated above threshold with individualized current amplitudes (16-36%) compared to fixed current amplitude (30-62%). Further, the stimulated brain volume at amplitude-titrated ST is substantially lower than that for ECT with conventional fixed current amplitudes. Thus individualizing the ECT stimulus current could compensate for individual anatomical variability and result in more focal and uniform electric field exposure across different subjects compared to the standard clinical practice of using high, fixed current for all patients.

  6. Improving sexuality education: the development of teacher-preparation standards.

    PubMed

    Barr, Elissa M; Goldfarb, Eva S; Russell, Susan; Seabert, Denise; Wallen, Michele; Wilson, Kelly L

    2014-06-01

    Teaching sexuality education to support young people's sexual development and overall sexual health is both needed and supported. Data continue to highlight the high rates of teen pregnancy, sexually transmitted disease, including human immunodeficiency virus (HIV) infections, among young people in the United States as well as the overwhelming public support for sexuality education instruction. In support of the implementation of the National Sexuality Education Standards, the current effort focuses on better preparing teachers to deliver sexuality education. An expert panel was convened by the Future of Sex Education Initiative to develop teacher-preparation standards for sexuality education. Their task was to develop standards and indicators that addressed the unique elements intrinsic to sexuality education instruction. Seven standards and associated indicators were developed that address professional disposition, diversity and equity, content knowledge, legal and professional ethics, planning, implementation, and assessment. The National Teacher-Preparation Standards for Sexuality Education represent an unprecedented unified effort to enable prospective health education teachers to become competent in teaching methodology, theory, practice of pedagogy, content, and skills, specific to sexuality education. Higher education will play a key role in ensuring the success of these standards. © 2014, American School Health Association.

  7. Standards and Students with Disabilities: Reality or Virtual Reality? Brief Report 8.

    ERIC Educational Resources Information Center

    Saint Cloud State Univ., MN.

    This Brief Report highlights current activities focused on setting standards in education, and examines whether students with disabilities are considered when standards are set. Types of standards are distinguished, including performance standards, delivery standards, and content standards. Information on organizations developing standards in…

  8. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, X., E-mail: xzm0005@auburn.edu; Maurer, D. A.; Knowlton, S. F.

    2015-12-15

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  9. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    NASA Astrophysics Data System (ADS)

    Ma, X.; Maurer, D. A.; Knowlton, S. F.; ArchMiller, M. C.; Cianciosa, M. R.; Ennis, D. A.; Hanson, J. D.; Hartwell, G. J.; Hebert, J. D.; Herfindal, J. L.; Pandya, M. D.; Roberds, N. A.; Traverso, P. J.

    2015-12-01

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. The inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  10. A high-precision voltage source for EIT

    PubMed Central

    Saulnier, Gary J; Liu, Ning; Ross, Alexander S

    2006-01-01

    Electrical impedance tomography (EIT) utilizes electrodes placed on the surface of a body to determine the complex conductivity distribution within the body. EIT can be performed by applying currents through the electrodes and measuring the electrode voltages or by applying electrode voltages and measuring the currents. Techniques have also been developed for applying the desired currents using voltage sources. This paper describes a voltage source for use in applied-voltage EIT that includes the capability of measuring both the applied voltage and applied current. A calibration circuit and calibration algorithm are described which enable all voltage sources in an EIT system to be calibrated to a common standard. The calibration minimizes the impact of stray shunt impedance, passive component variability and active component non-ideality. Simulation data obtained using PSpice are used to demonstrate the effectiveness of the circuits and calibration algorithm. PMID:16636413

  11. Non-axisymmetric equilibrium reconstruction of a current-carrying stellarator using external magnetic and soft x-ray inversion radius measurements

    DOE PAGES

    Ma, X.; Maurer, D. A.; Knowlton, Stephen F.; ...

    2015-12-22

    Non-axisymmetric free-boundary equilibrium reconstructions of stellarator plasmas are performed for discharges in which the magnetic configuration is strongly modified by ohmically driven plasma current. These studies were performed on the compact toroidal hybrid device using the V3FIT reconstruction code with a set of 50 magnetic diagnostics external to the plasma. With the assumption of closed magnetic flux surfaces, the reconstructions using external magnetic measurements allow accurate estimates of the net toroidal flux within the last closed flux surface, the edge safety factor, and the plasma shape of these highly non-axisymmetric plasmas. Lastly, the inversion radius of standard sawteeth is used to infer the current profile near the magnetic axis; with external magnetic diagnostics alone, the current density profile is imprecisely reconstructed.

  12. Autonomous Spacecraft Navigation Using Above-the-Constellation GPS Signals

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

    GPS-based spacecraft navigation offers many performance and cost benefits, and GPS receivers are now standard GNC components for LEO missions. Recently, more and more high-altitude missions are taking advantage of the benefits of GPS navigation as well. High-altitude applications pose challenges, however, because receivers operating above the GPS constellations are subject to reduced signal strength and availability, and uncertain signal quality. This presentation reviews the history and state of the art in high-altitude GPS spacecraft navigation, including early experiments, current missions and receivers, and efforts to characterize and protect signals available to high-altitude users. Recent results from the very-high-altitude MMS mission are also provided.

  13. High-temperature electronics

    NASA Technical Reports Server (NTRS)

    Seng, Gary T.

    1987-01-01

    In recent years, there has been a growing need for electronics capable of sustained high-temperature operation for aerospace propulsion system instrumentation, control and condition monitoring, and integrated sensors. The desired operating temperature in some applications exceeds 600 C, which is well beyond the capability of currently available semiconductor devices. Silicon carbide displays a number of properties which make it very attractive as a semiconductor material, one of which is the ability to retain its electronic integrity at temperatures well above 600 C. An IR-100 award was presented to NASA Lewis in 1983 for developing a chemical vapor deposition process to grow single crystals of this material on standard silicon wafers. Silicon carbide devices have been demonstrated above 400 C, but much work remains in the areas of crystal growth, characterization, and device fabrication before the full potential of silicon carbide can be realized. The presentation will conclude with current and future high-temperature electronics program plans. Although the development of silicon carbide falls into the category of high-risk research, the future looks promising, and the potential payoffs are tremendous.

  14. Highly flexible and all-solid-state paperlike polymer supercapacitors.

    PubMed

    Meng, Chuizhou; Liu, Changhong; Chen, Luzhuo; Hu, Chunhua; Fan, Shoushan

    2010-10-13

    In recent years, much effort has been dedicated to achieving thin, lightweight and even flexible energy-storage devices for wearable electronics. Here we demonstrate a novel kind of ultrathin all-solid-state supercapacitor configuration with an extremely simple process using two slightly separated polyaniline-based electrodes well solidified in the H(2)SO(4)-polyvinyl alcohol gel electrolyte. The thickness of the entire device is comparable to that of a piece of commercial standard A4 print paper. In its highly flexible (twisted) state, the integrated device shows a high specific capacitance of 350 F/g for the electrode materials, good cycle stability after 1000 cycles and a leakage current as small as 17.2 μA. Furthermore, due to its polymer-based component structure, it has a specific capacitance as high as 31.4 F/g for the entire device, which is more than 6 times that of current high-level commercial supercapacitor products. These highly flexible and all-solid-state paperlike polymer supercapacitors may bring new design opportunities for device configuration in the future wearable electronics area.

  15. Using polarized positrons to probe physics beyond the standard model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furletova, Yulia; Mantry, Sonny

    A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. Here, a comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C3u - C3d and would complement the proposed plan for a precision extraction of the combination 2C2u - Cd at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including Leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe to test the chiral structure of the electroweak interactions. A polarized positron can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.

  16. Using polarized positrons to probe physics beyond the standard model

    DOE PAGES

    Furletova, Yulia; Mantry, Sonny

    2018-05-25

    A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. Here, a comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C3u - C3d and would complement the proposed plan for a precision extraction of the combination 2C2u - Cd at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including Leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe to test the chiral structure of the electroweak interactions. A polarized positron can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.

  17. Tuberculosis in a South African prison – a transmission modelling analysis

    PubMed Central

    Johnstone-Robertson, Simon; Lawn, Stephen D; Welte, Alex; Bekker, Linda-Gail; Wood, Robin

    2015-01-01

    Background Prisons are recognised internationally as institutions with very high tuberculosis (TB) burdens where transmission is predominantly determined by contact between infectious and susceptible prisoners. A recent South African court case described the conditions under which prisoners awaiting trial were kept. With the use of these data, a mathematical model was developed to explore the interactions between incarceration conditions and TB control measures. Methods Cell dimensions, cell occupancy, lock-up time, TB incidence and treatment delays were derived from court evidence and judicial reports. Using the Wells-Riley equation and probability analyses of contact between prisoners, we estimated the current TB transmission probability within prison cells, and estimated transmission probabilities for improved levels of case finding in combination with implementation of national and international minimum standards for incarceration. Results Levels of overcrowding (230%) in communal cells and poor TB case finding result in annual TB transmission risks of 90%. Implementing current national or international cell occupancy recommendations would reduce TB transmission probabilities by 30% and 50%, respectively. Improved passive case finding, modest ventilation increase or decreased lock-up time would minimally impact on transmission if introduced individually. However, active case finding together with implementation of minimum national and international standards of incarceration could reduce transmission by 50% and 94%, respectively. Conclusions Current conditions of detention for awaiting-trial prisoners are highly conducive to the spread of drug-sensitive and drug-resistant TB. Combinations of simple well-established scientific control measures should be implemented urgently. PMID:22272961
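    The Wells-Riley equation cited in this abstract estimates the probability that a susceptible person is infected after sharing air with infectors. A minimal sketch follows; the parameter values are illustrative assumptions, not the study's actual cell data:

    ```python
    import math

    def wells_riley(n_infectors, quanta_rate, breathing_rate, hours, ventilation):
        """Wells-Riley infection probability: P = 1 - exp(-I*q*p*t/Q).

        n_infectors    -- I, infectious individuals in the shared air space
        quanta_rate    -- q, infectious quanta generated per infector per hour
        breathing_rate -- p, pulmonary ventilation of a susceptible (m^3/h)
        hours          -- t, exposure duration (h)
        ventilation    -- Q, clean-air ventilation rate of the room (m^3/h)
        """
        return 1.0 - math.exp(-n_infectors * quanta_rate * breathing_rate
                              * hours / ventilation)

    # Illustrative numbers only: one infector, q = 13 quanta/h (a commonly
    # cited TB estimate), p = 0.48 m^3/h, a 14 h nightly lock-up, and
    # 20 m^3/h of outside air in the cell.
    risk = wells_riley(1, 13, 0.48, 14, 20)
    ```

    With these assumed inputs the single-night risk already approaches 1, which is consistent with the abstract's point that overcrowding and long lock-up times dominate transmission.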

  18. Thromboxane Formation Assay to Identify High On-Treatment Platelet Reactivity to Aspirin.

    PubMed

    Mohring, Annemarie; Piayda, Kerstin; Dannenberg, Lisa; Zako, Saif; Schneider, Theresa; Bartkowski, Kirsten; Levkau, Bodo; Zeus, Tobias; Kelm, Malte; Hohlfeld, Thomas; Polzin, Amin

    2017-01-01

    Platelet inhibition by aspirin is indispensable in the secondary prevention of cardiovascular events. Nevertheless, impaired aspirin antiplatelet effects (high on-treatment platelet reactivity [HTPR]) are frequent. This is associated with an enhanced risk of cardiovascular events. The current gold standard to evaluate platelet hyper-reactivity despite aspirin intake is light-transmittance aggregometry (LTA). However, pharmacologically, the most specific test is the measurement of arachidonic acid (AA)-induced thromboxane (TX) B2 formation. Currently, the optimal cut-off to define HTPR to aspirin by inhibition of TX formation is not known. Therefore, in this pilot study, we aimed to calculate a TX formation cut-off value to detect HTPR defined by the current gold standard LTA. We measured platelet function in 2,507 samples. AA-induced TX formation by ELISA and AA-induced LTA were used to measure aspirin antiplatelet effects. TX formation correlated nonlinearly with the maximum of aggregation in the AA-induced LTA (Spearman's rho R = 0.7396; 95% CI 0.7208-0.7573, p < 0.0001). Receiver operating characteristic analysis and Youden's J statistic revealed 209.8 ng/mL as the optimal cut-off value to detect HTPR to aspirin with the TX ELISA (area under the curve: 0.92, p < 0.0001, sensitivity of 82.7%, specificity of 90.3%). In summary, TX formation ELISA is reliable in detecting HTPR to aspirin. The calculated cut-off level needs to be tested in trials with clinical end points. © 2017 S. Karger AG, Basel.
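    The cut-off derivation described in this abstract (ROC analysis maximizing Youden's J = sensitivity + specificity - 1) can be sketched as follows; the function name and toy data are illustrative, not the study's measurements:

    ```python
    def youden_optimal_cutoff(values, labels):
        """Scan each observed value as a candidate threshold and return the
        (cutoff, J) pair maximizing Youden's J = sensitivity + specificity - 1.

        values -- test measurements (e.g. TX formation, ng/mL)
        labels -- 1 if HTPR by the reference method (LTA), else 0
        A sample is classified 'positive' when its value >= cutoff.
        """
        positives = sum(labels)
        negatives = len(labels) - positives
        best_cutoff, best_j = None, -1.0
        for cut in sorted(set(values)):
            tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
            tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
            j = tp / positives + tn / negatives - 1  # sensitivity + specificity - 1
            if j > best_j:
                best_cutoff, best_j = cut, j
        return best_cutoff, best_j

    # Toy data: three responders with low TX values, three HTPR samples high.
    cutoff, j = youden_optimal_cutoff([100, 150, 200, 250, 300, 350],
                                      [0, 0, 0, 1, 1, 1])
    ```

    On real data the maximum J is well below 1 (the abstract reports 82.7% sensitivity and 90.3% specificity at 209.8 ng/mL); the toy example separates perfectly only because the classes do not overlap.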

  19. Using polarized positrons to probe physics beyond the standard model

    NASA Astrophysics Data System (ADS)

    Furletova, Yulia; Mantry, Sonny

    2018-05-01

    A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. A comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C3u - C3d and would complement the proposed plan for a precision extraction of the combination 2C2u - Cd at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including Leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe to test the chiral structure of the electroweak interactions. A polarized positron can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.

  20. Assessing the Genetics Content in the Next Generation Science Standards.

    PubMed

    Lontok, Katherine S; Zhang, Hubert; Dougherty, Michael J

    2015-01-01

    Science standards have a long history in the United States and currently form the backbone of efforts to improve primary and secondary education in science, technology, engineering, and math (STEM). Although there has been much political controversy over the influence of standards on teacher autonomy and student performance, little light has been shed on how well standards cover science content. We assessed the coverage of genetics content in the Next Generation Science Standards (NGSS) using a consensus list of American Society of Human Genetics (ASHG) core concepts. We also compared the NGSS against state science standards. Our goals were to assess the potential of the new standards to support genetic literacy and to determine if they improve the coverage of genetics concepts relative to state standards. We found that expert reviewers cannot identify ASHG core concepts within the new standards with high reliability, suggesting that the scope of content addressed by the standards may be inconsistently interpreted. Given results that indicate that the disciplinary core ideas (DCIs) included in the NGSS documents produced by Achieve, Inc. clarify the content covered by the standards statements themselves, we recommend that the NGSS standards statements always be viewed alongside their supporting disciplinary core ideas. In addition, gaps exist in the coverage of essential genetics concepts, most worryingly concepts dealing with patterns of inheritance, both Mendelian and complex. Finally, state standards vary widely in their coverage of genetics concepts when compared with the NGSS. On average, however, the NGSS support genetic literacy better than extant state standards.

  1. League Tables Must Go: There Are Better Ways of Ensuring a Quality Education for All Our Children

    ERIC Educational Resources Information Center

    James, Mary

    2015-01-01

    Despite claims made for them, many current education policies have perverse consequences. If all our children are to benefit from the good education they deserve, we need: forms of accountability that do not rely on school performance tables of test results; a focus on standards that embody high expectations for all; the urgent creation of a…

  2. Current management of penetrating torso trauma: nontherapeutic is not good enough anymore.

    PubMed

    Ball, Chad G

    2014-04-01

    A highly organized approach to the evaluation and treatment of penetrating torso injuries based on regional anatomy provides rapid diagnostic and therapeutic consistency. It also minimizes delays in diagnosis, missed injuries and nontherapeutic laparotomies. This review discusses an optimal sequence of structured rapid assessments that allow the clinician to rapidly proceed to gold standard therapies with a minimal risk of associated morbidity.

  3. Historical development and current status of emergency nursing in Turkey.

    PubMed

    Selimen, Deniz; Gürkan, Aysel

    2009-09-01

    As the demand for high quality Accident and Emergency Departments and nursing staff increases throughout Turkey, the need for more specialized emergency nurse training has also increased. Although there have been a number of positive developments regarding emergency nursing standards, the general quality of emergency nurse training needs to be improved and job definitions amended to better reflect the specialist duties of emergency nurses.

  4. The Evolving Landscape of HIV Drug Resistance Diagnostics for Expanding Testing in Resource-Limited Settings.

    PubMed

    Inzaule, Seth C; Hamers, Ralph L; Paredes, Roger; Yang, Chunfu; Schuurman, Rob; Rinke de Wit, Tobias F

    2017-01-01

    Global scale-up of antiretroviral treatment has dramatically changed the prospects of HIV/AIDS disease, rendering life-long chronic care and treatment a reality for millions of HIV-infected patients. Affordable technologies to monitor antiretroviral treatment are needed to ensure long-term durability of limited available drug regimens. HIV drug resistance tests can complement existing strategies in optimizing clinical decision-making for patients with treatment failure, in addition to facilitating population-based surveillance of HIV drug resistance. This review assesses the current landscape of HIV drug resistance technologies and discusses the strengths and limitations of existing assays available for expanding testing in resource-limited settings. These include sequencing-based assays (Sanger sequencing assays and next-generation sequencing), point mutation assays, and genotype-free data-based prediction systems. Sanger assays are currently considered the gold standard genotyping technology, though they are available only at a limited number of reference and regional laboratories in resource-limited settings, and high capital and test costs have limited their wider expansion. Point mutation assays present opportunities for simplified laboratory assays, but HIV genetic variability, extensive codon redundancy at or near the mutation target sites, and limited multiplexing capability have restricted their utility. Next-generation sequencing, despite high costs, may have potential to reduce the testing cost significantly through multiplexing in high-throughput facilities, although the level of bioinformatics expertise required for data analysis is currently still complex and expensive and lacks standardization. Web-based genotype-free prediction systems may provide enhanced antiretroviral treatment decision-making without the need for laboratory testing, but require further clinical field evaluation and implementation scientific research in resource-limited settings.

  5. High Energy Colliders and Hidden Sectors

    NASA Astrophysics Data System (ADS)

    Dror, Asaf Jeff

    This thesis explores two dominant frontiers of theoretical physics, high energy colliders and hidden sectors. The Large Hadron Collider (LHC) is just starting to reach its maximum operational capabilities. However, already with the current data, large classes of models are being put under significant pressure. It is crucial to understand whether the (thus far) null results are a consequence of a lack of solution to the hierarchy problem around the weak scale, or whether they require expanding the search strategy employed at the LHC. It is the duty of the current generation of physicists to design new searches to ensure that no stone is left unturned. To this end, we study the sensitivity of the LHC to the couplings in the Standard Model top sector. We find it can significantly improve the measurement of the Zt_Rt_R coupling through a novel search strategy, making use of an implied unitarity violation in such models. Analogously, we show that other couplings in the top sector can also be measured with the same technique. Furthermore, we critically analyze a set of anomalies in the LHC data and how they may appear from consistent UV completions. We also propose a technique to measure lifetimes of new colored particles with non-trivial spin. While the high energy frontier will continue to take data, it is likely the only collider of its kind for the next couple of decades. On the other hand, low-energy experiments have a promising future, with many new proposed experiments to probe the existence of particles well below the weak scale but with small couplings to the Standard Model. In this work we survey the different possibilities, focusing on the constraints as well as possible new hidden sector dynamics. In particular, we show that vector portals which couple to an anomalous current, e.g., baryon number, are significantly constrained by flavor changing meson decays and rare Z decays.
Furthermore, we present a new mechanism for dark matter freezeout which depletes the dark sector through an out-of-equilibrium decay into the Standard Model.

  6. Current Issues in the Design and Information Content of Instrument Approach Charts

    DOT National Transportation Integrated Search

    1995-03-01

    This report documents an analysis and interview effort conducted to identify common operational errors made using current Instrument Approach Plates (IAP), Standard Terminal Arrival Route (STAR) charts, Standard Instrument Departure (SID) charts,...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupich, Martin W.; Sathyamurthy, Srivatsan; Fleshler, Steven

    We demonstrate a twofold increase in the in-field critical current of AMSC's standard 2G coil wire by irradiation with 18-MeV Au ions. The optimum pinning enhancement is achieved with a dose of 6 × 10^11 Au ions/cm^2. Although the 77 K, self-field critical current is reduced by about 35%, the in-field critical current (H//c) shows a significant enhancement between 4 and 50 K in fields > 1 T. The process was used for the roll-to-roll irradiation of AMSC's standard 46-mm-wide production coated conductor strips, which were further processed into standard copper laminated coil wire. The long-length wires show the same enhancement as attained with short static irradiated samples. The roll-to-roll irradiation process can be incorporated in the standard 2G wire manufacturing, with no modifications to the current process. In conclusion, the enhanced performance of the wire will benefit rotating machine and magnet applications.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupich, Martin W.; Sathyamurthy, Srivatsan; Fleshler, Steven

    We demonstrate a twofold increase in the in-field critical current of AMSC's standard 2G coil wire by irradiation with 18-MeV Au ions. The optimum pinning enhancement is achieved with a dose of 6 × 10^11 Au ions/cm^2. Although the 77 K, self-field critical current is reduced by about 35%, the in-field critical current (H//c) shows a significant enhancement between 4 and 50 K in fields > 1 T. The process was used for the roll-to-roll irradiation of AMSC's standard 46-mm-wide production coated conductor strips, which were further processed into standard copper laminated coil wire. The long-length wires show the same enhancement as attained with short static irradiated samples. The roll-to-roll irradiation process can be incorporated in the standard 2G wire manufacturing, with no modifications to the current process. The enhanced performance of the wire will benefit rotating machine and magnet applications.

  9. Incretin response to a standard test meal in a rat model of sleeve gastrectomy with diet-induced obesity.

    PubMed

    Al-Sabah, Suleiman; Alasfar, Fahad; Al-Khaledi, Ghanim; Dinesh, Reshma; Al-Saleh, Mervat; Abul, Habib

    2014-01-01

    Currently, the most effective treatment for obesity is bariatric surgery. Gastroduodenal bypass surgery produces sustained weight loss and improves glycemic control and insulin sensitivity. Previous studies have shown that sleeve gastrectomy (SG) produces similar results and implicate changes in incretin hormone release in these effects. Male Sprague-Dawley rats were divided into four groups; lean control (lean), diet-induced obesity (DIO), DIO animals that had undergone SG (SG), and DIO animals that had undergone a sham operation (sham). After a 2-week recovery period, the incretin response to a standard test meal was measured. Blood sampling was performed in free-moving rats at various time points using chronic vascular access to the right jugular vein. There was a significant increase in the bodyweight of DIO animals fed a high-fat/high-sugar diet compared with the lean animals, which was reversed by SG. DIO caused an impairment of the GLP-1 response to a standard test meal, but not the GIP response. SG resulted in a dramatic increase in the GLP-1 response to a standard test meal but had no effect on the GIP response. A rapid rise in blood sugar, followed by reactive hypoglycemia, was observed in the SG group after a standard test meal. SG dramatically increases the GLP-1 response to a standard test meal but has no effect on GIP in a rat model of DIO.

  10. Do Formal Inspections Ensure that British Zoos Meet and Improve on Minimum Animal Welfare Standards?

    PubMed Central

    Draper, Chris; Browne, William; Harris, Stephen

    2013-01-01

    Simple Summary Key aims of the formal inspections of British zoos are to assess compliance with minimum standards of animal welfare and promote improvements in animal care and husbandry. We compared reports from two consecutive inspections of 136 British zoos to see whether these goals were being achieved. Most zoos did not meet all the minimum animal welfare standards and there was no clear evidence of improving levels of compliance with standards associated with the Zoo Licensing Act 1981. The current system of licensing and inspection does not ensure that British zoos meet and maintain, let alone exceed, the minimum animal welfare standards. Abstract We analysed two consecutive inspection reports for each of 136 British zoos made by government-appointed inspectors between 2005 and 2011 to assess how well British zoos were complying with minimum animal welfare standards; median interval between inspections was 1,107 days. There was no conclusive evidence for overall improvements in the levels of compliance by British zoos. Having the same zoo inspector at both inspections affected the outcome of an inspection; animal welfare criteria were more likely to be assessed as unchanged if the same inspector was present on both inspections. This, and erratic decisions as to whether a criterion applied to a particular zoo, suggest inconsistency in assessments between inspectors. Zoos that were members of a professional association (BIAZA) did not differ significantly from non-members in the overall number of criteria assessed as substandard at the second inspection but were more likely to meet the standards on both inspections and less likely to have criteria remaining substandard. 
Lack of consistency between inspectors, and the high proportion of zoos failing to meet minimum animal welfare standards nearly thirty years after the Zoo Licensing Act came into force, suggest that the current system of licensing and inspection is not meeting key objectives and requires revision. PMID:26479752

  11. Towards a standards-compliant genomic and metagenomic publication record

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenner, Marsha W; Garrity, George M.; Field, Dawn

    2008-04-01

    Increasingly we are aware as a community of the growing need to manage the avalanche of genomic and metagenomic data, in addition to related data types like ribosomal RNA and barcode sequences, in a way that tightly integrates contextual data with traditional literature in a machine-readable way. It is for this reason that the Genomic Standards Consortium (GSC) formed in 2005. Here we suggest that we move beyond the development of standards and tackle standards-compliance and improved data capture at the level of the scientific publication. We are supported in this goal by the fact that the scientific community is in the midst of a publishing revolution. This revolution is marked by a growing shift away from a traditional dichotomy between 'journal articles' and 'database entries' and an increasing adoption of hybrid models of collecting and disseminating scientific information. With respect to genomes and metagenomes and related data types, we feel the scientific community would be best served by the immediate launch of a central repository of short, highly structured 'Genome Notes' that must be standards-compliant. This could be done in the context of an existing journal, but we also suggest the more radical solution of launching a new journal. Such a journal could be designed to cater to a wide range of standards-related content types that are not currently centralized in the published literature. It could also support the demand for centralizing aspects of the 'gray literature' (documents developed by institutions or communities), such as the call by the GSC for a central repository of Standard Operating Procedures describing the genomic annotation pipelines of the major sequencing centers. We argue that such an 'eJournal', published under the Open Access paradigm by the GSC, could be an attractive publishing forum for a broader range of standardization initiatives within, and beyond, the GSC and thereby fill an unoccupied yet increasingly important niche within the current research landscape.

  12. Positive animal welfare states and reference standards for welfare assessment.

    PubMed

    Mellor, D J

    2015-01-01

    Developments in affective neuroscience and behavioural science during the last 10-15 years have together made it increasingly apparent that sentient animals are potentially much more sensitive to their environmental and social circumstances than was previously thought to be the case. It therefore seems likely that both the range and magnitude of welfare trade-offs that occur when animals are managed for human purposes have been underestimated, even when minimalistic but arguably well-intentioned attempts have been made to maintain high levels of welfare. In light of these neuroscience-supported, behaviour-based insights, the present review considers the extent to which the use of currently available reference standards might draw attention to these previously neglected areas of concern. It is concluded that the natural living orientation cannot provide an all-embracing or definitive welfare benchmark because of its primary focus on behavioural freedom. However, assessments of this type, supported by neuroscience insights into behavioural motivation, may now carry greater weight when used to identify management practices that should be avoided, discontinued or substantially modified. Using currently accepted baseline standards as welfare reference points may result in small changes being accorded greater significance than would be the case if they were compared with higher standards, and this could slow the progress towards better levels of welfare. On the other hand, using "what animals want" as a reference standard has the appeal of focusing on the specific resources or conditions the animals would choose themselves and can potentially improve their welfare more quickly than the approach of making small increments above baseline standards. It is concluded that the cautious use of these approaches in different combinations could lead to recommendations that would more effectively promote positive welfare states in hitherto neglected areas of concern.

  13. Integrated Science Assessment (ISA) for Oxides of Nitrogen ...

    EPA Pesticide Factsheets

    EPA has announced that the Second External Review Draft of the Integrated Science Assessment (ISA) for Oxides of Nitrogen and Sulfur - Environmental Criteria has been made available for independent peer review and public review. This draft ISA document represents a concise synthesis and evaluation of the most policy-relevant science and will ultimately provide the scientific bases for EPA's decision on retaining or revising the current secondary standards for NO2 and SO2. The current secondary NAAQS for SOX, set in 1973, is a 3-h average 0.5 ppm of SO2, not to be exceeded more than once per year. The secondary NOX NAAQS is identical to the primary standard set in 1971: 0.053 ppm NO2 as an annual average. These secondary standards are intended to protect against direct damage to vegetation by exposure to gas-phase NOX or SOX. Acute and chronic exposures to SO2 can have phytotoxic effects on vegetation, such as foliar injury, decreased photosynthesis and decreased growth. Similarly, exposure to sufficient concentrations of NO2, NO, PAN, and HNO3 can cause foliar injury, decreased photosynthesis and decreased growth. In addition, these gas-phase NOX may contribute to N saturation in some areas of the U.S. There is little new evidence overall for direct effects of exposure to gas-phase NOX or SOX on vegetation at current concentrations. However, there is some evidence that vegetation in regions with high concentrations of photochemical oxidants may be affected by HN

  14. Comparison of the Standard of Air Leakage in Current Metal Duct Systems in the World

    NASA Astrophysics Data System (ADS)

    Di, Yuhui; Wang, Jiqian; Feng, Lu; Li, Xingwu; Hu, Chunlin; Shi, Junshe; Xu, Qingsong; Qiao, Leilei

    2018-01-01

    Based on the air leakage requirements for metal ducts in Chinese design standards, technical measures and construction standards, this paper compares the development history, air-pressure-level classifications and air-tightness levels of current Chinese and international air leakage standards for metal ducts, summarizes the differences, identifies shortcomings through a survey of current design and construction practice and the available literature, and makes recommendations intended to assist engineering and technical personnel.

  15. Assessing Current State Science Teaching and Learning Standards for Ability to Achieve Climate Science Literacy

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2012-12-01

    Even though there exists a high degree of consensus among scientists about climate change, doubt has actually increased over the last five years within the general U.S. public. In 2006, 79% of those polled agreed that there is evidence for global warming, while only 59% agreed in 2010 (Pew Research Center, 2010). The source for this doubt can be partially attributed to lack of knowledge. Formal education is one mechanism that potentially can address inadequate public understanding as school is the primary place where students - and future citizens - learn about the climate. In a joint effort, several governmental agencies, non-governmental organizations, scientists and educators have created a framework called The Essential Principles of Climate Science Literacy, detailing seven concepts that are deemed vital for individuals and communities to understand Earth's climate system (USGCRP, 2009). Can students reach climate literacy - as defined by these 7 concepts - if they are taught using a curriculum based on the current state standards? To answer this question, the K-12 state science teaching and learning standards for Texas and California - two states that heavily influence nation-wide textbook creation - were compared against the Essential Principles. The data analysis consisted of two stages, looking for: 1) direct reference to "climate" and "climate change" and 2) indirect reference to the 7 Essential Principles through axial coding. The word "climate" appears in the California K-12 science standards 4 times and in the Texas standards 7 times. The word "climate change" appears in the California and Texas standards only 3 times each. Indirect references to the 7 Essential Principles of climate science literacy were more numerous. Broadly, California covered 6 of the principles while Texas covered all 7. 
In looking at the 7 principles, the second one "Climate is regulated by complex interactions among components of the Earth system" was the most substantively addressed. Least covered were number 6 "Human activities are impacting the climate system" and number 7 "Climate change will have consequences for the Earth system and human lives." Most references, either direct or indirect, occurred in the high school standards for earth science, a class not required for graduation in either state. This research points to the gaps between what the 7 Essential Principles of Climate Literacy defines as essential knowledge and what students may learn in their K-12 science classes. Thus, the formal system does not seem to offer an experience which can develop a more knowledgeable citizenry able to make wise personal and policy decisions about climate change, falling short of the ultimate goal of achieving widespread climate literacy. Especially troubling was the sparse attention to the principles addressing the human connection to the climate - principles number 6 and 7. If climate literate citizens are to make "wise personal and policy decisions" (USGCRP, 2009), these two principles especially are vital. This research, therefore, has been valuable for identifying current shortcomings in state standards.

  16. Current Sheet Properties and Dynamics During Sympathetic Breakout Eruptions

    NASA Astrophysics Data System (ADS)

    Lynch, B. J.; Edmondson, J. K.

    2013-12-01

    We present the continued analysis of the high-resolution 2.5D MHD simulations of sympathetic magnetic breakout eruptions from a pseudostreamer source region. We examine the generation of X- and O-type null points during the current sheet tearing and track the magnetic island formation and evolution during periods of reconnection. The magnetic breakout eruption scenario forms an overlying 'breakout' current sheet that evolves slowly and removes restraining flux from above the sheared field core that will eventually become the center of the erupting flux rope-like structure. The runaway expansion from the expansion-breakout reconnection positive feedback enables the formation of the second, vertical/radial current sheet underneath the rising sheared field core, as in the standard CSHKP eruptive flare scenario. We will examine the flux transfer rates through the breakout and flare current sheets and compare the properties of the field and plasma inflows into the current sheets and the reconnection jet outflows into the flare loops and flux rope ejecta.

  17. Neural interfaces for control of upper limb prostheses: the state of the art and future possibilities.

    PubMed

    Schultz, Aimee E; Kuiken, Todd A

    2011-01-01

    Current treatment of upper limb amputation restores some degree of functional ability, but this ability falls far below the standard set by the natural arm. Although acceptance rates can be high when patients are highly motivated and receive proper training and care, current prostheses often fail to meet the daily needs of amputees and frequently are abandoned. Recent advancements in science and technology have led to promising methods of accessing neural information for communication or control. Researchers have explored invasive and noninvasive methods of connecting with muscles, nerves, or the brain to provide increased functionality for patients experiencing disease or injury, including amputation. These techniques offer hope of more natural and intuitive prosthesis control, and therefore increased quality of life for amputees. In this review, we discuss the current state of the art of neural interfaces, particularly those that may find application within the prosthetics field. Copyright © 2011 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  18. Size-dependent electrocatalytic activity of gold nanoparticles on HOPG and highly boron-doped diamond surfaces.

    PubMed

    Brülle, Tine; Ju, Wenbo; Niedermayr, Philipp; Denisenko, Andrej; Paschos, Odysseas; Schneider, Oliver; Stimming, Ulrich

    2011-12-06

    Gold nanoparticles were prepared by electrochemical deposition on highly oriented pyrolytic graphite (HOPG) and boron-doped, epitaxial (100)-oriented diamond layers. Using a potentiostatic double-pulse technique, the average particle size was varied from 5 nm to 30 nm on HOPG and between < 1 nm and 15 nm on diamond, while keeping the particle density constant. The particle size distribution was very narrow, with standard deviations of around 20% on HOPG and around 30% on diamond. The electrocatalytic activity of these carbon-supported gold nanoparticles towards hydrogen evolution and oxygen reduction was investigated as a function of particle size using cyclic voltammetry. For oxygen reduction, the current density normalized to the gold surface area (specific current density) increased with decreasing particle size. In contrast, the specific current density of hydrogen evolution showed no dependence on particle size. For both reactions, no effect of the different carbon supports on electrocatalytic activity was observed.
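
    The size dependence reported here rests on normalizing the measured current to the total gold surface area rather than to the geometric electrode area. As a rough illustration only (the paper determines the real gold area electrochemically; the hemispherical-particle geometry and all numbers below are assumptions), the normalization can be sketched as:

    ```python
    import math

    def specific_current_density(current_a: float, n_particles: float,
                                 radius_m: float) -> float:
        """Current normalized to the total metal surface area, assuming
        (for illustration only) N identical hemispherical particles,
        each exposing a curved area of 2*pi*r^2."""
        area_m2 = n_particles * 2.0 * math.pi * radius_m**2
        return current_a / area_m2

    # Hypothetical numbers: 1 uA measured over 1e9 particles of 10 nm radius.
    j = specific_current_density(1e-6, 1e9, 10e-9)
    ```

    At a fixed particle count, smaller particles expose less total area, so an unchanged absolute current translates into a larger specific current density; the abstract's finding is that for oxygen reduction the specific current density rises even faster than this geometric effect alone would require.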

  19. Benefits of adjuvant chemotherapy in high-grade gliomas.

    PubMed

    DeAngelis, Lisa M

    2003-12-01

    The current standard of care for patients with high-grade glioma is resection followed by radiotherapy. Adjuvant chemotherapy is not widely accepted because of the low sensitivity of gliomas to traditional antineoplastic agents, the poor penetration of most drugs across the blood-brain barrier, and the significant systemic toxicity associated with current agents. However, nitrosoureas and, subsequently, temozolomide (Temodar [US], Temodal [international]; Schering-Plough Corporation, Kenilworth, NJ), a novel alkylating agent, cross the blood-brain barrier and have activity against gliomas. Nitrosoureas have been studied in phase III trials in the adjuvant setting. In individual trials, chemotherapy did not increase median survival but did increase the proportion of patients surviving ≥18 months by 15%. Only with large meta-analyses did the addition of chemotherapy achieve a statistically significant improvement in median survival. Currently there is no means of identifying which patients will benefit from adjuvant chemotherapy, but nitrosoureas and temozolomide are well tolerated in most patients, justifying the administration of adjuvant chemotherapy to all newly diagnosed patients with malignant glioma.

  20. Sweet sixteen: changing time preferences in the transition from middle school to high school, for different scenarios.

    PubMed

    Lahav, Eyal; Shavit, Tal; Benzion, Uri

    2015-01-01

    Teenagers earn, save and spend large amounts of money. Therefore, understanding teenagers' time preferences and how they affect their economic behavior is very important. The current study investigates the time preferences of high school and middle school students, and the effect of different intertemporal choice scenarios on teenagers' subjective discount rate. One scenario used a standard intertemporal choice question while the other was a wage scenario. We found higher future orientation (a lower subjective discount rate) among high school students than among middle school students in the standard scenario, but no difference between the groups in the wage scenario. For both groups, subjective discount rates increased when the teenagers were asked to delay receipt of wages they had earned by working (wage scenario). Other variables, such as participation in sports and an allowance given by parents, were also found to affect teenagers' time preferences. © Society for the Experimental Analysis of Behavior.
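
    The subjective discount rate such studies elicit can be illustrated with a simple calculation. This is only a sketch of the general idea under a continuous-compounding assumption; the study's actual elicitation procedure and discounting model may differ:

    ```python
    import math

    def subjective_discount_rate(present_amount: float,
                                 delayed_amount: float,
                                 delay_years: float) -> float:
        """Annualized subjective discount rate implied by indifference
        between receiving `present_amount` now and `delayed_amount`
        after `delay_years`, under continuous compounding:
        present * exp(r * t) = delayed  =>  r = ln(delayed/present)/t."""
        return math.log(delayed_amount / present_amount) / delay_years

    # A respondent indifferent between 100 now and 120 in one year
    # implies a rate of about 18% per year; demanding 150 instead
    # implies a higher rate, i.e. stronger present orientation.
    r_patient = subjective_discount_rate(100, 120, 1.0)
    r_impatient = subjective_discount_rate(100, 150, 1.0)
    ```

    A "higher future orientation" in the abstract corresponds to a lower implied r: the respondent needs less compensation to wait.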

  1. High-level intuitive features (HLIFs) for intuitive skin lesion description.

    PubMed

    Amelard, Robert; Glaister, Jeffrey; Wong, Alexander; Clausi, David A

    2015-03-01

    A set of high-level intuitive features (HLIFs) is proposed to quantitatively describe melanoma in standard camera images. Melanoma is the deadliest form of skin cancer. With rising incidence rates and subjectivity in current clinical detection methods, there is a need for melanoma decision support systems. Feature extraction is a critical step in melanoma decision support systems. Existing feature sets for analyzing standard camera images are comprised of low-level features, which exist in high-dimensional feature spaces and limit the system's ability to convey intuitive diagnostic rationale. The proposed HLIFs were designed to model the ABCD criteria commonly used by dermatologists such that each HLIF represents a human-observable characteristic. As such, intuitive diagnostic rationale can be conveyed to the user. Experimental results show that concatenating the proposed HLIFs with a full low-level feature set increased classification accuracy, and that HLIFs were able to separate the data better than low-level features with statistical significance. An example of a graphical interface for providing intuitive rationale is given.

  2. Inductive High Power Transfer Technologies for Electric Vehicles

    NASA Astrophysics Data System (ADS)

    Madzharov, Nikolay D.; Tonchev, Anton T.

    2014-03-01

    Problems associated with "how to charge the battery pack of the electric vehicle" become more important every passing day. The most logical solution currently is the non-contact method of charging, which possesses a number of advantages over standard contact methods. This article focuses on methods for inductive high-power contactless transfer of energy at relatively small distances, and their advantages and disadvantages. We describe a developed Inductive Power Transfer (IPT) system for fast charging of electric vehicles with a nominal power of 30 kW over a 7 to 9 cm air gap.
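
    The article does not give the link design equations, but the achievable efficiency of a two-coil resonant IPT link across an air gap is commonly characterized by the coupling coefficient k and the coil quality factors Q1 and Q2. A minimal sketch using the standard figure-of-merit formula, with purely illustrative values for k and Q (not taken from the article):

    ```python
    import math

    def max_link_efficiency(k: float, q1: float, q2: float) -> float:
        """Theoretical maximum coil-to-coil efficiency of a two-coil
        resonant inductive link: eta = x^2 / (1 + sqrt(1 + x^2))^2,
        where x = k * sqrt(Q1 * Q2) is the link figure of merit."""
        x = k * math.sqrt(q1 * q2)
        return x**2 / (1.0 + math.sqrt(1.0 + x**2))**2

    # Hypothetical values: loose coupling across a 7-9 cm air gap
    # might give k ~ 0.2 with coil quality factors around 100.
    eta = max_link_efficiency(0.2, 100, 100)
    ```

    The formula makes the design trade-off visible: as the air gap grows, k falls, and only high-Q coils keep the coil-to-coil efficiency in the 90%+ range needed for 30 kW charging.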

  3. Chaos-chaos transition of left hemisphere EEGs during standard tasks of Waterloo-Stanford Group Scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

    A recent study, "Recurrence quantification analysis of EEG signals during standard tasks of the Waterloo-Stanford Group Scale of hypnotic susceptibility", investigated recurrence quantifiers (RQs) of hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction, while subjects performed the standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility, in order to distinguish subjects of different hypnotizability levels. Following the same analysis, the current study determines the capability of different RQs to distinguish subjects of low, medium and high hypnotizability, and examines the influence of hypnotizability level on the underlying dynamics of the tasks. In addition, EEG channels were ranked according to the number of their RQs that differed significantly among subjects of different hypnotizability levels. Another valuable result was the identification of the major brain regions showing significant differences across the various task types (ideomotor, hallucination, challenge and memory).

  4. [Study on the reorganization of standards related to food contact ceramics and porcelains].

    PubMed

    Zhang, Jianbo; Zhu, Lei; Zhang, Hong; Liu, Shan; Wang, Zhutian

    2014-07-01

    To solve the problems of overlap, redundancy and conflict among current standards related to food-contact ceramics and porcelains, we collected all current standards in this area, reorganized them according to agreed principles and methods, and listed the standards that should be revoked, revised, incorporated, kept valid, or excluded from the food safety standard system. Nineteen standards were collected and reorganized in this study. The main food safety indexes in these standards were the limits for lead and cadmium released from food-contact ceramics and porcelains. Release limits for lead and cadmium appeared in 10 standards, comprising 4 horizontal standards and 6 commodity standards, and the provisions of these 10 standards were in conflict. Accordingly, we suggested that the 4 horizontal standards be incorporated and revised into a single food safety standard, and that the 6 commodity standards be revised to exclude the lead and cadmium provisions. Another 7 commodity standards only referenced the lead and cadmium limits of the horizontal standards; these 7 standards were suggested for exclusion from the food safety standard system. The remaining 2 of the 19 standards contained no food safety indexes, were considered unrelated to food safety, and did not need to be reorganized. Because the current standards conflict on the release limits for lead and cadmium, it is necessary to set up a new food safety standard for permissible released lead and cadmium that applies to all food-contact ceramics and porcelains. This food safety standard should be based on food safety risk assessment and on the actual manufacture and use of food-contact ceramics and porcelains; the provisions of international standards and relevant standards from other countries can also provide references for this standard.

  5. Data Sharing to Improve Close Approach Monitoring and Safety of Flight

    NASA Astrophysics Data System (ADS)

    Chan, Joseph; DalBello, Richard; Hope, Dean; Wauthier, Pascal; Douglas, Tim; Inghram, Travis

    2009-03-01

    Individual satellite operators have done a good job of developing the internal protocols and procedures to ensure the safe operation of their fleets. However, data sharing among operators for close approach monitoring is conducted in an ad-hoc manner during relocations, and there is currently no standardized agreement among operators on the content, format, and distribution protocol for data sharing. Crowding in geostationary orbit, participation by new commercial actors, government interest in satellite constellations, and highly maneuverable spacecraft all suggest that satellite operators will need to begin a dialogue on standard communication protocols and procedures to improve situational awareness. We will give an overview of the current best practices among different operators for close approach monitoring and discuss the concept of an active data center to improve data sharing, conjunction monitoring, and avoidance among satellite operators. We will also report on the progress and lessons learned from a Data Center prototype conducted by several operators over a one-year period.

  6. Atmospheric/Space Environment Support Lessons Learned Regarding Aerospace Vehicle Design and Operations

    NASA Technical Reports Server (NTRS)

    Vaughan, William W.; Anderson, B. Jeffrey

    2005-01-01

    In modern government and aerospace industry institutions, the necessity of controlling current-year costs often leads to high mobility in the technical workforce, "one-deep" technical capabilities, and minimal mentoring for young engineers. Thus, the formal recording, use, and teaching of lessons learned are especially important in the maintenance and improvement of current knowledge and the development of new technologies, regardless of the discipline area. Within the NASA Technical Standards Program Website http://standards.nasa.gov there is a menu item entitled "Lessons Learned/Best Practices". It contains links to a large number of data sets, spanning many engineering and technical disciplines, that hold a wealth of lessons-learned information based on past experience. This paper provides a small sample of lessons learned relative to the atmospheric and space environment. There are many more whose subsequent application has improved our knowledge of the atmosphere and space environment, and the application of this knowledge to engineering and operations for a variety of aerospace programs.

  7. Laparoscopic and Robotic Total Mesorectal Excision in the Treatment of Rectal Cancer. Brief Review and Personal Remarks

    PubMed Central

    Bianchi, Paolo Pietro; Petz, Wanda; Luca, Fabrizio; Biffi, Roberto; Spinoglio, Giuseppe; Montorsi, Marco

    2014-01-01

    The current standard treatment for rectal cancer is based on a multimodality approach with preoperative radiochemotherapy in advanced cases and complete surgical removal through total mesorectal excision (TME). The most frequent surgical approach is traditional open surgery, as laparoscopic TME requires high technical skill, a long learning curve, and is not widespread, still being confined to centers with great experience in minimally invasive techniques. Nevertheless, in several studies, the laparoscopic approach, when compared to open surgery, has shown some better short-term clinical outcomes and at least comparable oncologic results. Robotic surgery for the treatment of rectal cancer is an emerging technique, which could overcome some of the technical difficulties posed by standard laparoscopy, but evidence from the literature regarding its oncologic safety and clinical outcomes is still lacking. This brief review analyses the current status of minimally invasive surgery for rectal cancer therapy, focusing on oncologic safety and the new robotic approach. PMID:24834429

  8. Modeling for CO poisoning of a fuel cell anode

    NASA Technical Reports Server (NTRS)

    Dhar, H. P.; Kush, A. K.; Patel, D. N.; Christner, L. G.

    1986-01-01

    Poisoning losses in a half-cell in the 110-190 C temperature range have been measured in 100 wt pct H3PO4 for various mixtures of H2, CO, and CO2 gases in order to investigate the polarization loss due to poisoning by CO of a porous fuel cell Pt anode. At a fixed current density, the poisoning loss was found to vary linearly with the natural logarithm of the CO/H2 concentration ratio, although deviations from linearity were noted at lower temperatures and higher current densities for high CO/H2 concentration ratios. The surface coverages of CO were also found to vary linearly with the logarithm of the CO/H2 concentration ratio. A general adsorption relationship is derived. Standard free energies for CO adsorption were found to vary from -14.5 to -12.1 kcal/mol in the 130-190 C temperature range. The standard entropy for CO adsorption was found to be -39 cal/(mol·K).
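
    The reported thermodynamic quantities are mutually consistent through the relation dG = dH - T*dS. A small sketch checking this (the dH value of about -30 kcal/mol is back-calculated here for illustration and is not stated in the abstract):

    ```python
    def gibbs_free_energy(delta_h_kcal: float, delta_s_cal: float,
                          temp_k: float) -> float:
        """Standard Gibbs energy of adsorption from dG = dH - T*dS,
        with dH in kcal/mol and dS in cal/(mol K)."""
        return delta_h_kcal - temp_k * delta_s_cal / 1000.0

    # With the reported dS = -39 cal/(mol K), an assumed dH of about
    # -30.2 kcal/mol reproduces the reported dG range of -14.5 to
    # -12.1 kcal/mol across the 130-190 C (403-463 K) interval.
    dg_130 = gibbs_free_energy(-30.2, -39.0, 403.15)
    dg_190 = gibbs_free_energy(-30.2, -39.0, 463.15)
    ```

    The negative dS makes dG less negative as temperature rises, matching the reported weakening of CO adsorption (and hence reduced poisoning) at higher cell temperatures.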

  9. Mobile measurement setup according to IEC 62220-1-2 for DQE determination on digital mammography systems

    NASA Astrophysics Data System (ADS)

    Greiter, Matthias B.; Hoeschen, Christoph

    2010-04-01

    The international standard IEC 62220-1-2 defines the measurement procedure for determining the detective quantum efficiency (DQE) of digital x-ray imaging devices used in mammography. A mobile setup complying with this standard and adaptable to most current systems was constructed at the Helmholtz Zentrum München to allow an objective technical comparison of the current full-field digital mammography units employed in mammography screening in Germany. This article demonstrates the setup's capabilities with a focus on the measurement uncertainties of all quantities contributing to DQE measurements. The evaluation of uncertainties encompasses results from measurements on a Sectra Microdose Mammography system in clinical use, as well as on a prototype of a Fujifilm Amulet system, at various radiation qualities. Both systems have a high spatial resolution of 50 μm × 50 μm. The modulation transfer function (MTF), noise power spectrum (NPS) and DQE of the Sectra MDM are presented in comparison to results previously published by other authors.
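
    For readers unfamiliar with the quantity, the DQE is commonly computed from the MTF and the normalized noise power spectrum (NNPS). A minimal sketch of that relationship with purely hypothetical numbers; the IEC 62220-1-2 procedure itself prescribes much more (beam qualities, region-of-interest handling, linearization):

    ```python
    def dqe(mtf: float, nnps: float, fluence: float) -> float:
        """Detective quantum efficiency at one spatial frequency:
        DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident
        photon fluence per unit area and NNPS is the noise power
        spectrum normalized by the squared mean signal."""
        return mtf**2 / (fluence * nnps)

    # Illustrative only: with a hypothetical fluence q, an ideal
    # quantum-limited detector has NNPS = 1/q, so DQE(0) = 1, and
    # DQE falls with frequency as the MTF rolls off.
    q = 5000.0
    dqe_zero = dqe(1.0, 1.0 / q, q)  # zero frequency, MTF = 1
    dqe_mid = dqe(0.5, 1.0 / q, q)   # frequency where MTF = 0.5
    ```

    Because DQE combines MTF and NNPS multiplicatively, the uncertainties of both measurements propagate into the DQE, which is why the article's focus on per-quantity uncertainty budgets matters.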

  10. Death certification: an audit of practice entering the 21st century

    PubMed Central

    Swift, B; West, K

    2002-01-01

    Aims: Death certification, a legal duty of doctors, continues to be poorly performed despite Royal College recommendations and increased education at the undergraduate level. Therefore, the current performance of certifying doctors was audited within a large teaching hospital entering the new century. Methods: A total of 1000 completed certificate counterfoils were examined retrospectively for appropriateness of completion and the ability to construct a logical cause-of-death cascade. Results: Only 55% of certificates were completed to a minimally accepted standard, and many of these failed to provide the information needed for adequate ICD-10 coding. Nearly 10% were completed to a poor standard, being illogical or inappropriately completed. Conclusions: The results show no improvement in the state of certification. Possible interventions to improve outcomes are discussed; however, in light of a recent high-profile legal case, a current Home Office review of death certification may suggest the passing of statutory law to ensure accurate completion. PMID:11919211

  11. Do Heat Pump Clothes Dryers Make Sense for the U.S. Market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, Steve; Franco, Victor; Lekov, Alex

    Heat pump clothes dryers (HPCDs) can be as much as 50 percent more energy-efficient than conventional electric resistance clothes dryers, and therefore have the potential to save substantial amounts of electricity. While not currently available in the U.S., there are manufacturers in Europe and Japan that produce units for those markets. Drawing on analysis conducted for the U.S. Department of Energy's (DOE) current rulemaking on amended standards for clothes dryers, this paper evaluates the cost-effectiveness of HPCDs in American homes, as well as a national impact analysis for different market share scenarios. In order to get an accurate measurement of real energy savings potential, the paper offers a new energy use calculation methodology that takes into account the most current data on clothes washer cycles, clothes dryer usage frequency, remaining moisture content, and load weight per cycle, which differ substantially from current test procedure values. Using this methodology along with product cost estimates developed by DOE, the paper presents the results of a life-cycle cost analysis of the adoption of HPCDs in a representative sample of American homes. The results show that HPCDs have positive economic benefits only for households with high clothes dryer usage or for households with high electricity prices and moderately high utilization.
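
    The life-cycle cost comparison described above can be sketched as installed cost plus discounted operating cost. The dryer prices, energy use, electricity price, discount rate, and lifetime below are hypothetical placeholders, not DOE's values, and DOE's full analysis includes additional terms such as repair and maintenance costs:

    ```python
    def life_cycle_cost(installed_cost: float, annual_energy_kwh: float,
                        electricity_price: float, discount_rate: float,
                        lifetime_years: int) -> float:
        """Life-cycle cost = installed cost + present value of annual
        electricity costs over the product lifetime (simplified)."""
        annual_cost = annual_energy_kwh * electricity_price
        pv = sum(annual_cost / (1.0 + discount_rate)**t
                 for t in range(1, lifetime_years + 1))
        return installed_cost + pv

    # Hypothetical comparison: a heat pump dryer costing more upfront
    # but using half the energy of a resistance dryer.
    lcc_resistance = life_cycle_cost(500, 900, 0.13, 0.05, 16)
    lcc_heat_pump = life_cycle_cost(1100, 450, 0.13, 0.05, 16)
    ```

    With these placeholder inputs the heat pump unit edges out the resistance unit only narrowly, illustrating the paper's finding that the HPCD's economics hinge on usage frequency and electricity prices.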

  12. Research on resistance characteristics of YBCO tape under short-time DC large current impact

    NASA Astrophysics Data System (ADS)

    Zhang, Zhifeng; Yang, Jiabin; Qiu, Qingquan; Zhang, Guomin; Lin, Liangzhen

    2017-06-01

    Research on the resistance characteristics of YBCO tape under short-time DC large-current impact is the foundation for developing DC superconducting fault current limiters (SFCLs) for voltage source converter-based high-voltage direct current (VSC-HVDC) systems, one of the valid approaches to the problems of renewable energy integration. An SFCL can limit DC short-circuit currents and enhance the interrupting capability of DC circuit breakers. In this paper, the resistance of bare YBCO tape under short-time DC large-current impacts is studied to determine the resistance-temperature behavior and the maximum impact current. The influence of insulation on the resistance-temperature characteristics of YBCO tape is studied through comparison tests of bare and insulated tape at 77 K. The influence of operating temperature on the tape is also studied under subcooled liquid nitrogen conditions. For the current-impact security of YBCO tape, critical-current degradation and peak temperature are analyzed and used as judgment standards. The test results are helpful for developing SFCLs for VSC-HVDC.

  13. A Comprehensive Analysis of High School Genetics Standards: Are States Keeping Pace with Modern Genetics?

    PubMed Central

    Dougherty, M.J.; Pleasants, C.; Solow, L.; Wong, A.; Zhang, H.

    2011-01-01

    Science education in the United States will increasingly be driven by testing and accountability requirements, such as those mandated by the No Child Left Behind Act, which rely heavily on learning outcomes, or “standards,” that are currently developed on a state-by-state basis. Those standards, in turn, drive curriculum and instruction. Given the importance of standards to teaching and learning, we investigated the quality of life sciences/biology standards with respect to genetics for all 50 states and the District of Columbia, using core concepts developed by the American Society of Human Genetics as normative benchmarks. Our results indicate that the states’ genetics standards, in general, are poor, with more than 85% of the states receiving overall scores of Inadequate. In particular, the standards in virtually every state have failed to keep pace with changes in the discipline as it has become genomic in scope, omitting concepts related to genetic complexity, the importance of environment to phenotypic variation, differential gene expression, and the differences between inherited and somatic genetic disease. Clearer, more comprehensive genetics standards are likely to benefit genetics instruction and learning, help prepare future genetics researchers, and contribute to the genetic literacy of the U.S. citizenry. PMID:21885828

  14. Effect of the 2.0 mg/m3 coal mine dust standard on underground environmental dust levels.

    PubMed

    Parobeck

    1975-08-01

    The 1969 Federal Coal Mine Health and Safety Act established environmental dust standards for underground coal mines. The Act requires that the average concentration of respirable dust in the active workings of a mine be maintained at or below 3.0 mg/m3; and, that effective December 30, 1972, the 3.0 mg/m3 standard be reduced to 2.0 mg/m3. This paper discusses the current status of dust levels in our underground coal mines, the effect of the 2.0 mg/m3 standard on underground dust levels, and associates the current levels with specific operations and occupations. The comparison is made between current levels and those existing prior to December 30, 1972.

  15. An Overview of Skill Standards Systems in Education & Industry. Systems in the U.S. and Abroad. Volume I.

    ERIC Educational Resources Information Center

    Institute for Educational Leadership, Washington, DC.

    This first volume in a four-volume study of industry- and education-driven skill standards in the United States and other countries describes current practice. Chapter I is the executive summary. Chapter II is an overview of historical and current issues that will affect a voluntary network of industry-based skill standards, competencies, and…

  16. Preventing Epilepsy After Traumatic Brain Injury

    DTIC Science & Technology

    2009-02-01

    treatment of early seizures following TBI, and to compare the efficacy of topiramate to prevent early seizures to the standard of care (phenytoin). A...injury (TBI), to determine if topiramate could prevent early seizures better than the current standard of care (phenytoin) and to develop a...receive topiramate for three months, and the third, control, arm would receive phenytoin for seven days (current standard of care). EEGs were to

  17. Therapeutic efficacy of alternative primaquine regimens to standard treatment in preventing relapses by Plasmodium vivax: A systematic review and meta-analysis.

    PubMed

    Zuluaga-Idarraga, Lina Marcela; Tamayo Perez, María-Eulalia; Aguirre-Acevedo, Daniel Camilo

    2015-12-30

    To compare the efficacy and safety of primaquine regimens currently used to prevent relapses by P. vivax. A systematic review was carried out to identify clinical trials evaluating the efficacy and safety, in preventing malaria recurrences by P. vivax, of the primaquine regimen of 0.5 mg/kg/day for 7 or 14 days compared to the standard regimen of 0.25 mg/kg/day for 14 days. Efficacy of primaquine according to the cumulative incidence of recurrences after 28 days was determined. The overall relative risk was estimated with fixed-effects meta-analysis. For the regimen of 0.5 mg/kg/day for 7 days, 7 studies were identified, which showed an incidence of recurrence between 0% and 20% with follow-up of 60-210 days; only 4 studies compared it with the standard regimen of 0.25 mg/kg/day for 14 days, and no difference in recurrences between the two regimens was found (RR = 0.977, 95% CI = 0.670 to 1.423). Three clinical trials using the regimen of 0.5 mg/kg/day for 14 days, with an incidence of recurrences between 1.8% and 18.0% during 330-365 days, were identified; only one study compared it with the standard regimen (RR = 0.846, 95% CI = 0.484 to 1.477). High risk of bias and differences in the handling of included studies were found. Available evidence is insufficient to determine whether the primaquine regimens currently used as alternatives to the standard treatment have better efficacy and safety in preventing relapse of P. vivax. Clinical trials are required to guide changes in the treatment regimen of vivax malaria.
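    The pooled relative risks above come from fixed-effects meta-analysis, which is conventionally done by inverse-variance weighting of the per-study log relative risks. The sketch below shows that calculation; the study counts passed in at the end are hypothetical, not the review's data.

```python
# Minimal sketch of inverse-variance fixed-effects pooling of relative
# risks (RR) on the log scale, with a 95% confidence interval.
import math

def fixed_effects_rr(studies):
    """studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        # Large-sample variance of the log RR for binomial outcomes
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2
        weight = 1 / var
        num += weight * log_rr
        den += weight
    pooled = num / den
    se = math.sqrt(1 / den)
    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return rr, ci

# Hypothetical two-study example: (recurrences, N) per arm
rr, (ci_low, ci_high) = fixed_effects_rr([(5, 100, 6, 100), (8, 120, 9, 118)])
```

    A pooled RR whose CI straddles 1.0, as in both comparisons reported above, is read as no detectable difference between regimens.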

  18. Data Availability in Appliance Standards and Labeling Program Development and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romankiewicz, John; Khanna, Nina; Vine, Edward

    2013-05-01

    In this report, we describe the necessary data inputs for both standards development and program evaluation and perform an initial assessment of the availability and uncertainty of those data inputs in China. For standards development, we find that China and its standards and labeling program administrators currently have access to the basic market and technical data needed for conducting market and technology assessments and technological and economic analyses. Some data, such as shipments data, are readily available from the China Energy Label product registration database, while the availability of other data, including average unit energy consumption, prices, and design options, needs improvement. Unlike in some other countries such as the United States, most of the data necessary for standards development analyses are not publicly available or compiled in a consolidated data source. In addition, improved data on design and efficiency options as well as cost data (e.g., manufacturing costs, mark-ups, production and product use-phase costs), key inputs to several techno-economic analyses, are particularly needed given China's unconsolidated manufacturing industry. For program evaluation, we find that while China can conduct simple savings evaluations of its incentive programs with the data currently available from the Ministry of Finance, the program administrator, the savings estimates produced by such an evaluation will carry high uncertainty. As such, China could benefit from increased surveying and metering in the next one to three years to decrease the uncertainty surrounding key data points such as unit energy savings and free ridership.
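    The "simple savings evaluation" the report refers to reduces to a gross-to-net calculation: rebated units times unit energy savings, discounted for free ridership. The sketch below illustrates why uncertainty in those two inputs dominates the estimate; all values are hypothetical placeholders for the surveyed and metered quantities the report recommends collecting.

```python
# Minimal sketch of a gross-to-net program savings estimate for an
# appliance incentive program. Inputs are illustrative assumptions.

def program_net_savings(units_rebated, unit_kwh_savings, free_ridership):
    """Net annual savings: gross savings minus the share attributable to
    purchases that would have happened without the incentive (free riders)."""
    gross = units_rebated * unit_kwh_savings
    return gross * (1 - free_ridership)

# Hypothetical program: 1M rebated units, 150 kWh/yr saved per unit,
# 25% free ridership.
net_kwh = program_net_savings(1_000_000, 150, 0.25)
```

    Because net savings scale linearly with both `unit_kwh_savings` and `(1 - free_ridership)`, uncertainty in either input propagates one-for-one into the final estimate, which is why metering and surveying those two quantities matters most.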

  19. Vorapaxar: The Current Role and Future Directions of a Novel Protease-Activated Receptor Antagonist for Risk Reduction in Atherosclerotic Disease.

    PubMed

    Gryka, Rebecca J; Buckley, Leo F; Anderson, Sarah M

    2017-03-01

    Despite the current standard of care, patients with cardiovascular disease remain at a high risk for recurrent events. Inhibition of thrombin-mediated platelet activation through protease-activated receptor-1 antagonism may provide reductions in atherosclerotic disease beyond those achievable with the current standard of care. Our primary objective is to evaluate the clinical literature regarding the role of vorapaxar (Zontivity™) in the reduction of cardiovascular events in patients with a history of myocardial infarction and peripheral artery disease. In particular, we focus on the potential future directions for protease-activating receptor antagonists in the treatment of a broad range of atherosclerotic diseases. A literature search of PubMed and EBSCO was conducted to identify randomized clinical trials from August 2005 to June 2016 using the search terms: 'vorapaxar', 'SCH 530348', 'protease-activated receptor-1 antagonist', and 'Zontivity™'. Bibliographies were searched and additional resources were obtained. Vorapaxar is a first-in-class, protease-activated receptor-1 antagonist. The Thrombin Receptor Antagonist for Clinical Event Reduction (TRACER) trial did not demonstrate a significant reduction in a broad primary composite endpoint. However, the Thrombin-Receptor Antagonist in Secondary Prevention of Atherothrombotic Ischemic Events (TRA 2°P-TIMI 50) trial examined a more traditional composite endpoint and found a significant benefit with vorapaxar. Vorapaxar significantly increased bleeding compared with standard care. Ongoing trials will help define the role of vorapaxar in patients with peripheral arterial disease, patients with diabetes mellitus, and other important subgroups. The use of multivariate modeling may enable the identification of subgroups with maximal benefit and minimal harm from vorapaxar. Vorapaxar provides clinicians with a novel mechanism of action to further reduce the burden of ischemic heart disease. Identification of patients with a high ischemic risk and low bleeding risk would enable clinicians to maximize the utility of this unique agent.

  20. Is PCR the Next Reference Standard for the Diagnosis of Schistosoma in Stool? A Comparison with Microscopy in Senegal and Kenya.

    PubMed

    Meurs, Lynn; Brienen, Eric; Mbow, Moustapha; Ochola, Elizabeth A; Mboup, Souleymane; Karanja, Diana M S; Secor, W Evan; Polman, Katja; van Lieshout, Lisette

    2015-01-01

    The current reference test for the detection of S. mansoni in endemic areas is stool microscopy based on one or more Kato-Katz stool smears. However, stool microscopy has several shortcomings that greatly affect the efficacy of current schistosomiasis control programs. A highly specific multiplex real-time polymerase chain reaction (PCR) targeting the Schistosoma internal transcribed spacer 2 (ITS2) sequence was developed by our group a few years ago, but so far this PCR has been applied mostly to urine samples. Here, we performed a more in-depth evaluation of the ITS2 PCR as an alternative to standard microscopy for the detection and quantification of Schistosoma spp. in stool samples. Microscopy and PCR were performed in a Senegalese community (n = 197) in an area with high S. mansoni transmission and co-occurrence of S. haematobium, and in Kenyan schoolchildren (n = 760) from an area with comparatively low S. mansoni transmission. Despite the differences in Schistosoma endemicity, the PCR performed very similarly in both areas; 13-15% more infections were detected by PCR compared with microscopy of a single stool sample. Even when 2-3 stool samples were used for microscopy, PCR on one stool sample detected more infections, especially in people with light-intensity infections and in children from low-risk schools. The low prevalence of soil-transmitted helminthiasis in both populations was confirmed by an additional multiplex PCR. The ITS2-based PCR was more sensitive than standard microscopy in detecting Schistosoma spp. This would be particularly useful for S. mansoni detection in low-transmission areas and post-control settings, and as such could improve schistosomiasis control programs, epidemiological research, and quality control of microscopy. Moreover, it can be complemented with other (multiplex real-time) PCRs to detect a wider range of helminths and thus enhance the effectiveness of current integrated control and elimination strategies for neglected tropical diseases.
