Sample records for CTBT knowledge database

  1. Hydroacoustic propagation grids for the CTBT knowledge database: BBN technical memorandum W1303

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Angell

    1998-05-01

    The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction and travel time standard deviation grids, is planned to support source location, discrimination and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledge-base grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.
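
    The grid lookup the report describes (travel time, amplitude correction, and standard deviation stored as geographic grids) can be sketched as a bilinear interpolation; the class below is an illustrative stand-in, and its layout and names are assumptions rather than the actual HydroCAM/BBN grid format.

```python
# Illustrative sketch of a travel-time knowledge-base grid lookup with
# bilinear interpolation. Grid layout and field names are assumptions,
# not the actual HydroCAM/BBN grid format.

class TravelTimeGrid:
    def __init__(self, lat0, lon0, dlat, dlon, values):
        self.lat0, self.lon0 = lat0, lon0      # grid origin (degrees)
        self.dlat, self.dlon = dlat, dlon      # grid spacing (degrees)
        self.values = values                   # 2-D nested list, seconds

    def lookup(self, lat, lon):
        """Bilinearly interpolate the travel time at (lat, lon)."""
        i = (lat - self.lat0) / self.dlat
        j = (lon - self.lon0) / self.dlon
        i0, j0 = int(i), int(j)
        fi, fj = i - i0, j - j0
        v = self.values
        return ((1 - fi) * (1 - fj) * v[i0][j0]
                + fi * (1 - fj) * v[i0 + 1][j0]
                + (1 - fi) * fj * v[i0][j0 + 1]
                + fi * fj * v[i0 + 1][j0 + 1])
```

    For instance, a 2x2 one-degree grid with corner values 0, 1, 2 and 3 seconds interpolates to 1.5 s at its center.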

  2. Proposed Conceptual Requirements for the CTBT Knowledge Base,

    DTIC Science & Technology

    1995-08-14

    knowledge available to automated processing routines and human analysts are significant, and solving these problems is an essential step in ensuring...knowledge storage in a CTBT system. In addition to providing regional knowledge to automated processing routines, the knowledge base will also address

  3. Policy issues facing the Comprehensive Test Ban Treaty and prospects for the future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweeney, J.

    1999-04-01

    This report is divided into the following 5 sections: (1) Background; (2) Major Issues Facing Ratification of CTBT; (3) Current Status on CTBT Ratification; (4) Status of CTBT Signatories and Ratifiers; and (5) CTBT Activities Not Prohibited. The major issues facing ratification of CTBT discussed here are: impact on CTBT of START II and ABM ratification; impact of India and Pakistan nuclear tests; CTBT entry into force; and establishment of the Comprehensive Nuclear Test-Ban Treaty Organization.

  4. Stakeholder engagement for promoting the Comprehensive Nuclear-Test-Ban Treaty (CTBT): Malaysia’s experience

    NASA Astrophysics Data System (ADS)

    Rashid, F. I. A.; Zolkaffly, M. Z.; Jamal, N.

    2018-01-01

    In order to keep abreast of issues related to the CTBT in Malaysia, the Malaysian Nuclear Agency (Nuklear Malaysia), as the CTBT National Authority in Malaysia, has collaborated with local partners to implement various stakeholder engagement programmes. This paper highlights Malaysia's approach to promoting the CTBT through stakeholder engagement programmes targeted at multilevel stakeholders, both national and international. Such programmes include participation in international forums, inter-agency meetings, awareness seminars, training courses, technical visits to IMS stations, promotion of civil and scientific applications of International Monitoring System (IMS) data and International Data Centre (IDC) products using the virtual Data Exploitation Centre (vDEC), invitations to youth groups to participate in the CTBTO Youth Group, and publications on CTBT-related topics. This approach has successfully fortified Malaysia's commitments at the international level, enhanced national awareness of the global multilateral framework, increased stakeholders' awareness of their roles related to the CTBT, and built domestic capacity on CTBT matters. In conclusion, stakeholder engagement is crucial in promoting and enhancing stakeholders' understanding of the CTBT. Continuous engagement with relevant stakeholders will enable effective dissemination and smooth implementation of CTBT-related matters, eventually supporting the global universalization of the CTBT.

  5. The global radioxenon background and its impact on the detection capability of underground nuclear explosions (Invited)

    NASA Astrophysics Data System (ADS)

    Ringbom, A.

    2010-12-01

    A detailed knowledge of both the spatial and isotopic distribution of anthropogenic radioxenon is essential in investigations of the performance of the radioxenon part of the IMS, as well as in the development of techniques to discriminate radioxenon signatures of a nuclear explosion from other sources. Further, the production processes in the facilities causing the radioxenon background have to be understood and be compatible with simulations. In this work, several aspects of the observed atmospheric radioxenon background are investigated, including the global distribution as well as the current understanding of the observed isotopic ratios. Analyzed radioxenon data from the IMS, as well as from other measurement stations, are used to create an up-to-date description of the global radioxenon background, including all four CTBT-relevant xenon isotopes (133Xe, 131mXe, 133mXe, and 135Xe). In addition, measured isotopic ratios will be compared to simulations of neutron-induced fission of 235U, and the uncertainties will be discussed. Finally, the impact of the radioxenon background on the detection capability of the IMS will be investigated. This work is a continuation of studies [1,2] that were presented at the International Scientific Studies conference held in Vienna in 2009. [1] A. Ringbom, et al., "Characterization of the global distribution of atmospheric radioxenons", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009. [2] R. D'Amours and A. Ringbom, "A study on the global detection capability of IMS for all CTBT relevant xenon isotopes", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009.
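
    The four-isotope discrimination mentioned above is commonly framed as a ratio-ratio comparison (135Xe/133Xe against 133mXe/131mXe), with explosion-like sources falling above a discrimination line. The sketch below applies a placeholder line in log-log space; the slope, intercept, and activity keys are illustrative assumptions, not published screening values.

```python
import math

# Ratio-ratio screen for radioxenon source discrimination. The
# discrimination line (slope, intercept) is a placeholder, NOT the
# published screening line; the dictionary keys are hypothetical names.

def isotopic_ratios(act):
    """act: activity concentrations (e.g. mBq/m3) for the four isotopes."""
    return (act["Xe-135"] / act["Xe-133"],
            act["Xe-133m"] / act["Xe-131m"])

def explosion_like(act, slope=1.0, intercept=0.0):
    """Flag samples falling above the (illustrative) line in log-log space."""
    r1, r2 = isotopic_ratios(act)
    return math.log10(r1) > slope * math.log10(r2) + intercept
```

    Fresh fission debris is enriched in the short-lived isotopes, so both ratios are high shortly after an explosion, whereas steady reactor or medical-isotope emissions sit low on both axes.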

  6. A high-efficiency HPGe coincidence system for environmental analysis.

    PubMed

    Britton, R; Davies, A V; Burnett, J L; Jackson, M J

    2015-08-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is supported by a network of certified laboratories which must meet certain sensitivity requirements for CTBT-relevant radionuclides. At the UK CTBT Radionuclide Laboratory (GBL15), a high-efficiency, dual-detector gamma spectroscopy system has been developed to improve the sensitivity of measurements for treaty compliance, greatly reducing the time required for each sample. Utilising list-mode acquisition, each sample can be counted once and processed multiple times to further improve sensitivity. For the 8 key radionuclides considered, Minimum Detectable Activities (MDAs) were improved by up to 37% in standard mode (when compared to a typical CTBT detector system), with the acquisition time required to achieve the CTBT sensitivity requirements reduced from 6 days to only 3. When utilising the system in coincidence mode, the MDA for 60Co in a high-activity source was improved by a factor of 34 when compared to a standard CTBT detector, and a factor of 17 when compared to the dual-detector system operating in standard mode. These MDA improvements will allow the accurate and timely quantification of radionuclides that decay via both singular and cascade γ emission, greatly enhancing the effectiveness of CTBT laboratories.
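
    The MDA figures quoted above follow from the standard Currie formulation, in which the detection limit grows with the square root of the background under the peak. A minimal sketch, with detector parameters that are illustrative rather than those of the GBL15 system:

```python
import math

# Currie-style MDA estimate (95% confidence). The efficiency, gamma
# intensity, and count values in the example are illustrative, not
# measured GBL15 parameters.

def mda_bq(background_counts, efficiency, gamma_intensity, live_time_s):
    """Minimum detectable activity in Bq for one gamma line."""
    ld = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
    return ld / (efficiency * gamma_intensity * live_time_s)

# Halving the background under a peak improves the MDA by roughly 1/sqrt(2):
improvement = mda_bq(1000, 0.3, 0.99, 86400) / mda_bq(500, 0.3, 0.99, 86400)
```

    This square-root dependence is why background suppression (coincidence gating, shielding) translates directly into lower MDAs or shorter count times.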

  7. Summary report of the workshop on the U.S. use of surface waves for monitoring the CTBT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritzwoller, M; Walter, W R

    1998-09-01

    The workshop addressed the following general research goals of relevance to monitoring and verifying the Comprehensive Test Ban Treaty (CTBT): A) To apprise participants of current and planned research in order to facilitate information exchange, collaboration, and peer review. B) To compare and discuss techniques for data selection, measurement, error assessment, modeling methodologies, etc.; to compare results in regions where they overlap and understand the causes of observed differences. C) To hear about the U.S. research customers' (AFTAC and DOE Knowledge Base) current and anticipated interests in surface wave research. D) To discuss information flow and integration: how can research results be prepared for efficient use and integration into operational systems? E) To identify and discuss fruitful future directions for research.

  8. Integrated Data for Improved Asset Management

    DOT National Transportation Integrated Search

    2016-05-26

    The objective of this research is to demonstrate the potential benefits for agency-wide data integration for VDOT asset management. This objective is achieved through an example application that requires information distributed across multiple databa...

  9. Research notes : information at your fingertips!

    DOT National Transportation Integrated Search

    2000-03-01

    TRIS Online includes full-text reports or links to publishers or suppliers of the original documents. You will find titles, publication dates, authors, abstracts, and document sources. : Each year over 20,000 new records are added to TRIS. The databa...

  10. Freight transportation in Mississippi : selected data from federal sources

    DOT National Transportation Integrated Search

    1996-10-01

    Welcome to the State Freight Transportation Profile. This report presents information on freight transportation in Mississippi and is part of a series of reports covering all 50 States. The purpose of the report is to present the major Federal databa...

  11. Freight transportation in Washington : selected data from federal sources

    DOT National Transportation Integrated Search

    1996-10-01

    Welcome to the State Freight Transportation Profile. This report presents information on freight transportation in Washington and is part of a series of reports covering all 50 States. The purpose of the report is to present the major Federal databas...

  12. Freight transportation in Connecticut : selected data from federal sources

    DOT National Transportation Integrated Search

    1996-10-01

    Welcome to the State Freight Transportation Profile. This report presents information on freight transportation in Connecticut and is part of a series of reports covering all 50 States. The purpose of the report is to present the major Federal databa...

  13. Freight transportation in New Mexico : selected data from federal sources

    DOT National Transportation Integrated Search

    1996-10-01

    Welcome to the State Freight Transportation Profile. This report presents information on freight transportation in New Mexico and is part of a series of reports covering all 50 States. The purpose of the report is to present the major Federal databas...

  14. Freight transportation in New Jersey : selected data from federal sources

    DOT National Transportation Integrated Search

    1996-10-01

    Welcome to the State Freight Transportation Profile. This report presents information on freight transportation in New Jersey and is part of a series of reports covering all 50 States. The purpose of the report is to present the major Federal databas...

  15. Comprehensive Test Ban Treaty (CTBT): Current Status

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Ram

    2003-04-01

    After an effort of nearly half a century, the CTBT was approved by the U.N. on September 10, 1996. Out of 185 member nations (at the time), 158 voted in favor, 3 against, and the remaining either abstained or were diplomatically absent. In spite of such overwhelming support from the international community, the CTBT may well remain on paper, because one of the opposing nations, India, is considered a "threshold nuclear nation" and must approve the treaty for it to enter into force according to the rules of the Conference on Disarmament (CD). India's U.N. representative said that her country would "never sign this unequal treaty, not now, not later." The treaty is "unequal" because it does not provide a timetable for the elimination of existing nuclear weapons, the testing of weapons, etc., which favors the nuclear states. This paper will provide details of the above issues and the current status of the CTBT.

  16. Navigating spatial and temporal complexity in developing a long-term land use database for an agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    No comprehensive protocols exist for the collection, standardization, and storage of agronomic management information into a database that preserves privacy, maintains data uncertainty, and translates everyday decisions into quantitative values. This manuscript describes the development of a databas...

  17. IRIS Toxicological Review of Tert-Butyl Alcohol (Tert-Butanol) (Public Comment Draft)

    EPA Science Inventory

    EPA is developing an Integrated Risk Information System (IRIS) assessment of tert-butyl Alcohol (tert-butanol) and has released the public comment draft assessment for public comment and external peer review. When final, the assessment will appear on the IRIS databa...

  18. DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,

    DTIC Science & Technology

    1995-08-14

    processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated

  19. On-Site Inspection RadioIsotopic Spectroscopy (Osiris) System Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caffrey, Gus J.; Egger, Ann E.; Krebs, Kenneth M.

    2015-09-01

    We have designed and tested hardware and software for the acquisition and analysis of high-resolution gamma-ray spectra during on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The On-Site Inspection RadioIsotopic Spectroscopy (Osiris) software filters the spectral data to display only radioisotopic information relevant to CTBT on-site inspections, e.g., 132I. A set of over 100 fission-product spectra was employed for Osiris testing. These spectra were measured, where possible, or generated by modeling. The synthetic test spectral compositions include non-nuclear-explosion scenarios, e.g., a severe nuclear reactor accident, and nuclear-explosion scenarios such as a vented underground nuclear test. Comparing its computer-based analyses to expert visual analyses of the test spectra, Osiris correctly identifies CTBT-relevant fission-product isotopes at the 95% level or better. The Osiris gamma-ray spectrometer is a mechanically-cooled, battery-powered ORTEC Transpec-100, chosen to avoid the need for liquid nitrogen during on-site inspections. The spectrometer was used successfully during the recent 2014 CTBT Integrated Field Exercise in Jordan. The spectrometer is controlled, and the spectral data analyzed, by a Panasonic Toughbook notebook computer. To date, software development has been the main focus of the Osiris project. In FY2016-17, we plan to modify the Osiris hardware, integrate the Osiris software and hardware, and conduct rigorous field tests to ensure that the Osiris system will function correctly during CTBT on-site inspections. The planned development will raise Osiris to technology readiness level TRL-8, transfer the Osiris technology to a commercial manufacturer, and demonstrate Osiris to potential CTBT on-site inspectors.
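
    The isotope-identification step described above can be illustrated as matching measured peak energies against a line library of CTBT-relevant fission products. The energies below are standard line energies in keV, but the library contents and tolerance are only an illustrative fragment, not the Osiris library.

```python
# Match measured gamma-peak energies (keV) against a small library of
# CTBT-relevant fission products. Library contents and tolerance are
# illustrative; Osiris uses its own, much larger line library.

LIBRARY = {
    "I-131": [364.5],
    "I-132": [667.7, 772.6],
    "Cs-137": [661.7],
    "Ba-140": [537.3],
}

def identify(peaks_kev, tolerance=1.0):
    """Return isotopes for which every library line has a matching peak."""
    return [isotope for isotope, lines in LIBRARY.items()
            if all(any(abs(p - line) <= tolerance for p in peaks_kev)
                   for line in lines)]
```

    Requiring all lines of an isotope to match (rather than any one) is what suppresses false identifications when unrelated peaks fall near a single library energy.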

  20. Technology Innovation for the CTBT, the National Laboratory Contribution

    NASA Astrophysics Data System (ADS)

    Goldstein, W. H.

    2016-12-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.
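
    The event-location problem RSTT addresses can be illustrated with a toy epicenter grid search; the uniform-velocity travel-time model below is a deliberate simplification standing in for RSTT's path-dependent regional travel times, and the grid extent and velocity are arbitrary.

```python
import math

# Toy epicenter grid search: pick the grid node whose predicted travel
# times best fit the observed arrivals. A uniform velocity replaces
# RSTT's path-dependent regional model; all parameters are illustrative.

def locate(stations, arrivals, velocity_km_s=8.0):
    """stations: [(x_km, y_km)]; arrivals: arrival times in seconds."""
    best, best_misfit = None, float("inf")
    for gx in range(0, 101, 5):
        for gy in range(0, 101, 5):
            pred = [math.hypot(gx - sx, gy - sy) / velocity_km_s
                    for sx, sy in stations]
            # The unknown origin time shifts all residuals equally,
            # so demean them before computing the misfit.
            res = [o - p for o, p in zip(arrivals, pred)]
            mean = sum(res) / len(res)
            misfit = sum((r - mean) ** 2 for r in res)
            if misfit < best_misfit:
                best, best_misfit = (gx, gy), misfit
    return best
```

    Replacing the `math.hypot(...) / velocity` line with a regional travel-time lookup is, loosely, the role RSTT plays in an operational locator.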

  1. USDA National Nutrient Database for Standard Reference, Release 24

    USDA-ARS?s Scientific Manuscript database

    The USDA Nutrient Database for Standard Reference, Release 24 contains data for over 7,900 food items for up to 146 food components. It replaces the previous release, SR23, issued in September 2010. Data in SR24 supersede values in the printed Handbooks and previous electronic releases of the databa...

  2. Cosmic veto gamma-spectrometry for Comprehensive Nuclear-Test-Ban Treaty samples

    NASA Astrophysics Data System (ADS)

    Burnett, J. L.; Davies, A. V.

    2014-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is supported by a global network of monitoring stations that perform high-resolution gamma-spectrometry on air filter samples for the identification of 85 radionuclides. At the UK CTBT Radionuclide Laboratory (GBL15), a novel cosmic veto gamma-spectrometer has been developed to improve the sensitivity of station measurements, providing a mean background reduction of 80.8% with mean MDA improvements of 45.6%. The CTBT laboratory requirement for a 140Ba MDA is achievable after 1.5 days of counting compared to 5-7 days using conventional systems. The system consists of plastic scintillation plates that detect coincident cosmic-ray interactions within an HPGe gamma-spectrometer using the Canberra Lynx™ multi-channel analyser. The detector is remotely configurable using a TCP/IP interface and requires no dedicated coincidence electronics. It would be especially useful in preventing false-positives at remote station locations (e.g. Halley, Antarctica) where sample transfer to certified laboratories is logistically difficult. The improved sensitivity has been demonstrated for a CTBT air filter sample collected after the Fukushima incident.
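
    The shortened count time reported above follows from the background scaling of the MDA: in the background-dominated regime the MDA varies roughly as sqrt(b/t), so removing a fraction of the background cuts the time needed to reach a fixed MDA by about the same fraction. A one-line sketch (the regime assumption is mine, not the paper's):

```python
# Background-dominated approximation: MDA ~ sqrt(b/t), so reaching the
# same MDA requires t2/t1 = b2/b1. The 80.8% reduction is the abstract's
# figure; the 7-day baseline is one end of its quoted 5-7 day range.

def time_for_same_mda(t_base_days, background_reduction):
    """Count time needed after removing `background_reduction` of background."""
    return t_base_days * (1.0 - background_reduction)

vetoed_days = time_for_same_mda(7.0, 0.808)  # roughly 1.3 days
```

    This back-of-envelope value is consistent with the 1.5-day figure quoted in the abstract, the small difference being the non-background terms the approximation drops.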

  3. Atmospheric Transport Modelling and Radionuclide Analysis for the NPE 2015 scenario

    NASA Astrophysics Data System (ADS)

    Ross, J. Ole; Bollhöfer, Andreas; Heidmann, Verena; Krais, Roman; Schlosser, Clemens; Gestermann, Nicolai; Ceranna, Lars

    2017-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all nuclear explosions. The International Monitoring System (IMS) is in place and about 90% complete to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydroacoustic and infrasonic signals for the detection, localization, and characterization of explosions. To practice CTBT verification procedures and the interplay between the International Data Centre (IDC) and National Data Centres (NDCs), preparedness exercises (NPEs) are regularly performed with selected events representing fictitious CTBT violations. The German NDC's expertise for radionuclide analysis and operation of station RN33 is provided by the Federal Office for Radiation Protection (BfS), while Atmospheric Transport Modelling (ATM) for CTBT purposes is performed at the Federal Institute for Geosciences and Natural Resources (BGR), which combines the radionuclide findings with waveform evidence. The radionuclide part of the NPE 2015 scenario is tackled in a joint effort by BfS and BGR. First, the NPE 2015 spectra are analysed, fission products are identified, and the respective activity concentrations are derived. Special focus is on isotopic ratios, which allow for source characterization and event timing. For atmospheric backtracking, the binary coincidence method is applied both to the SRS fields from the IDC and WMO-RSMC and to in-house backward simulations at higher resolution for the first affected samples. Results are compared with the WebGrape PSR, and the spatio-temporal domain with high atmospheric release probability is determined. The ATM results, together with the radionuclide fingerprint, are used for the identification of waveform candidate events. Comparative forward simulations of atmospheric dispersion for candidate events are performed. Finally, the overall consistency of the various source scenarios is assessed and a fictitious government briefing on the findings is given.
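
    The binary coincidence step mentioned above can be sketched as a cell-wise intersection of source-receptor sensitivity (SRS) fields: a grid cell stays in the possible source region only if it could have fed every detecting sample and none of the non-detecting ones. The representation below (nested lists, zero threshold) is an illustrative simplification of the operational SRS products.

```python
# Cell-wise binary coincidence over SRS fields. Each field is a 2-D
# nested list (one per air sample); shapes and the zero threshold are
# illustrative simplifications of the operational SRS products.

def candidate_region(srs_detect, srs_nondetect, threshold=0.0):
    """Boolean mask of grid cells consistent with all samples."""
    rows, cols = len(srs_detect[0]), len(srs_detect[0][0])
    mask = [[True] * cols for _ in range(rows)]
    for field in srs_detect:          # must reach every detecting sampler
        for i in range(rows):
            for j in range(cols):
                mask[i][j] = mask[i][j] and field[i][j] > threshold
    for field in srs_nondetect:       # must miss every non-detecting one
        for i in range(rows):
            for j in range(cols):
                mask[i][j] = mask[i][j] and not (field[i][j] > threshold)
    return mask
```

    Each additional sample, detecting or not, can only shrink the candidate region, which is why combining many samples narrows the spatio-temporal source domain.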

  4. CTBT on-site inspections

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.

    2014-05-01

    On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), a globally distributed network of monitoring stations. The IMS, along with technical monitoring data from CTBT member countries as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly so as to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in Jordan late in 2014. The build-up of the OSI regime has proceeded steadily since the CTBT was signed in 1996 and is on track to become a deterrent to anyone considering conducting a nuclear explosion in violation of the Treaty.

  5. Seismic field measurements in Kylylahti, Finland, in support of the further development of geophysical seismic techniques for CTBT On-site Inspections

    NASA Astrophysics Data System (ADS)

    Labak, Peter; Lindblom, Pasi; Malich, Gregor

    2017-04-01

    The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) during which the operational and technical capabilities of a Comprehensive Nuclear-Test-Ban Treaty (CTBT) on-site inspection (OSI) were tested in an integrated manner. Many of the inspection techniques permitted by the CTBT were applied during IFE14, including a range of geophysical techniques; however, one of the techniques foreseen by the CTBT but not yet developed is resonance seismometry. During August and September 2016, seismic field measurements were conducted in the region of Kylylahti, Finland, in support of the further development of geophysical seismic techniques for OSIs. Forty-five seismic stations were used to continuously acquire seismic signals. During that period, data from local, regional and teleseismic natural and man-made events were acquired, including a devastating earthquake in Italy and the nuclear explosion announced by the Democratic People's Republic of Korea on 9 September 2016. Data were also acquired following the small-scale use of man-made chemical explosives in the area and the use of vibratory sources. This presentation will show examples from the data set and discuss its use for the development of resonance seismometry for OSIs.

  6. The Impact of Changes in State Identity on Alliance Cohesion in Northeast Asia

    DTIC Science & Technology

    2009-12-01

    [OCR-garbled search excerpt; legible fragments cite Taeho Kim, SIPRI Arms Transfers Database records from 1950 onward on DPRK arms imports, Beijing's arms-transfer policy toward the DPRK, and McVadon on China's Korea strategy.]

  7. Nuclear Weapons: Comprehensive Test Ban Treaty

    DTIC Science & Technology

    2007-11-30

    itself, which has been done. Critics raised concerns about the implications of these policies for testing and new weapons. At present, Congress...CTBT in lieu of the current treaty.1 On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in H.R. 1585, the FY2008...to do so.’”6 Critics expressed concern about the implications of these policies for testing and new weapons. A statement by Physicians for Social

  8. Characterization of the Infrasound Field in the Central Pacific

    DTIC Science & Technology

    2006-06-01

    Treaty (CTBT) in December 2001. The array site is in a tropical rainforest on the slopes of Hualalai Volcano, Hawaii Island, Hawaii. Per IMS...general direction of Kilauea Volcano. These signals are tentatively assigned to the "iv" phase. To date the majority of these events have featured

  9. Technical Issues Related to the Comprehensive Nuclear Test Ban Treaty

    NASA Astrophysics Data System (ADS)

    Garwin, Richard L.

    2003-04-01

    The National Academy of Sciences recently published a detailed study of technical factors related to the Comprehensive Nuclear Test Ban Treaty (CTBT), with emphasis on those issues that arose when the Senate declined to ratify the Treaty in 1999. The study considered (1) the capacity of the United States to maintain confidence in the safety and reliability of its nuclear weapons without nuclear testing; (2) the capabilities of the international nuclear-test monitoring system; and (3) the advances in nuclear weapons capabilities that other countries might make through low-yield testing that might escape detection. Excluding political factors, the committee considered three possible future worlds: (1) a world without a CTBT; (2) a world in which the signatories comply with a CTBT; and (3) a world in which the signatories evade its strictures within the limits set by the detection system. The talk and ensuing discussion will elaborate on the study. The principal conclusion of the report, based solely on technical reasons, is that the national security of the United States is better served with a CTBT in force than without it, whether or not other signatories conduct low level but undetected tests in violation of the treaty. Moreover, the study finds that nuclear testing would not add substantially to the US Stockpile Stewardship Program in allowing the United States to maintain confidence in the assessment of its existing nuclear weapons.

  10. Technical Issues Related to the Comprehensive Nuclear Test Ban Treaty

    NASA Astrophysics Data System (ADS)

    2003-03-01

    The National Academy of Sciences recently completed a detailed study of the technical factors related to the Comprehensive Nuclear Test Ban Treaty (CTBT), with emphasis on those issues that arose when the Senate declined to ratify the Treaty in 1999. The study considered (1) the capacity of the United States to maintain confidence in the safety and reliability of its nuclear weapons without nuclear testing; (2) the capabilities of the international nuclear-test monitoring system; and (3) the advances in nuclear weapons capabilities that other countries might make through low-yield testing that might escape detection. While political factors were excluded, the committee considered three possible future worlds: (1) a world without a CTBT; (2) a world in which the signatories comply with a CTBT; and (3) a world in which the signatories evade its strictures within the limits set by the detection system. The talk will elaborate on the study. The primary conclusion, based solely on technical reasons, is that the national security of the United States is better served with a CTBT in force than without it, whether or not other signatories conduct low level but undetected tests in violation of the treaty. Moreover, the study finds that nuclear testing would not add substantially to the US Stockpile Stewardship Program in allowing the United States to maintain confidence in the assessment of its existing nuclear weapons.

  11. Administrative and Technical Support for the U.S. Army Medical Research and Development Command Joint Working Group on Medical Chemical Defense

    DTIC Science & Technology

    1989-08-01

    microprocessor databasing systems for monitoring project and contract reports and program technology transfers, coordinating and providing administrative ...The JWGD3 annual planning process generally included: - Program review by the JWGD3 membership at quarterly meetings, which consisted of the review...Office developed the program planning and budget documents associated with the planning process outlined above. Program project databases and

  12. NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios

    NASA Astrophysics Data System (ADS)

    Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.

    2012-04-01

    To verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) that give technical advice concerning CTBT verification to their governments. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of NPE 2010 was on the radionuclide component and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections, calculated secretly beforehand by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze promising candidate events with respect to their waveform signals. The study shows one possible solution of NPE 2010 as it was performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields provided by the IDC were evaluated in a logical approach in order to define probable source regions for the several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable.
Of the seismic events considered in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event, at Black Thunder Mine, Wyoming, on 23 Oct at 21:15 UTC, showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. An infrasonic single-station localization algorithm led to an event localization comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, which is additional confirmation that the event was correctly identified. The event analysis of NPE 2010 thus serves as a successful example of data fusion between the radionuclide detection technology supported by ATM, seismological methodology, and infrasound signal processing.
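
    The logical SRS evaluation described above can be sketched as a simple grid intersection: a cell is a plausible source if a release there could explain every sample that contained the tracer, without contradicting the samples that did not. The following is a toy illustration only; the grid, threshold, and function names are invented, not the IDC's implementation:

```python
import numpy as np

def possible_source_region(srs_detecting, srs_nondetecting, threshold=1e-3):
    """Toy possible-source-region logic for ATM backtracking.

    srs_detecting:    SRS grids for samples in which the tracer WAS found.
    srs_nondetecting: SRS grids for samples in which it was NOT found.
    A grid cell remains a candidate source only if every detecting sample
    is sensitive to it and no non-detecting sample is.
    """
    psr = np.ones_like(srs_detecting[0], dtype=bool)
    for srs in srs_detecting:
        psr &= srs > threshold   # a release here could explain this detection
    for srs in srs_nondetecting:
        psr &= srs <= threshold  # a release here would contradict this non-detection
    return psr
```

Intersecting fields from several days of samples narrows the candidate region in the same spirit as the logical approach used in the exercise.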

  13. Military Interoperable Digital Hospital Testbed (MIDHT)

    DTIC Science & Technology

    2011-10-01


  14. Getting to Zero Yield: The Evolution of the U.S. Position on the CTBT

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    1998-03-01

    In 1994 the United States favored a Comprehensive Test Ban Treaty (CTBT) which permitted tiny "hydronuclear" experiments with a nuclear energy release of four pounds or less. Other nuclear powers supported yield limits as high as large fractions of a kiloton, while most non-nuclear nations participating in the discussions at the United Nations Conference on Disarmament wanted to prohibit all nuclear explosions -- some even favoring an end to computer simulations. On the other hand, China wished an exception to permit high yield "peaceful" nuclear explosions. For the United States to adopt a new position favoring a "true zero" several pieces had to fall into place: 1) The President had to be assured that the U.S. could preserve the safety and reliability of the enduring stockpile without yield testing; 2) the U.S. needed to be sure that the marginal utility of zero-yield experiments was at least as great for this country as for any other; 3) that tests with any nuclear yield might have more marginal utility for nuclear proliferators than for the United States, thus marginally eroding this country's position; 4) the United States required a treaty which would permit maintenance of the capacity to return to testing should a national emergency requiring a nuclear test arise; and 5) all of the five nuclear weapons states had to realize that only a true-zero CTBT would have the desired political effects. This paper will outline the physics near zero yield and show why President Clinton was persuaded by arguments from many viewpoints to endorse a true test ban in August, 1996 and to sign the CTBT in September, 1997.

  15. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by current model predictions; however, an accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, which generate infrasound partial reflections and modifications of the infrasound waveguide, (ii) convection from thunderstorms and mountain waves, which generates gravity waves, (iii) stratospheric warming events, which yield wind inversions in the stratosphere, and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  16. Travel-time source-specific station correction improves location accuracy

    NASA Astrophysics Data System (ADS)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may shift the computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events are particularly critical in the context of CTBT verification, where they bear on the decision to trigger a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km2, and its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
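
    The core of a source-specific station correction can be sketched as a robust average of travel-time residuals from well-located reference events; the corrected prediction for a new event is then the model time plus the correction. This is a minimal sketch of the general idea; the estimator and names are assumptions, not the authors' exact method:

```python
import numpy as np

def station_correction(observed_tt, predicted_tt):
    """Source-specific station correction: the median travel-time residual
    (observed minus model-predicted, e.g. from IASPEI91) over a set of
    well-located reference events. The median is robust to picking outliers."""
    residuals = np.asarray(observed_tt) - np.asarray(predicted_tt)
    return float(np.median(residuals))

def corrected_prediction(predicted_tt, correction):
    """Apply the station correction to a new event's model-predicted time."""
    return predicted_tt + correction
```

In practice one correction is estimated per station (and per source region), then applied when relocating events from that region.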

  17. Reviews of the Comprehensive Nuclear-Test-Ban Treaty and U.S. security

    NASA Astrophysics Data System (ADS)

    Jeanloz, Raymond

    2017-11-01

    Reviews of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) by the National Academy of Sciences concluded that the United States has the technical expertise and physical means to i) maintain a safe, secure and reliable nuclear-weapons stockpile without nuclear-explosion testing, and ii) effectively monitor global compliance once the Treaty enters into force. Moreover, the CTBT is judged to help constrain proliferation of nuclear-weapons technology, so it is considered favorable to U.S. security. A review of developments since the studies were published, in 2002 and 2012, shows that the study conclusions remain valid and that technical capabilities are better than anticipated.

  18. Geologic constraints on clandestine nuclear testing in South Asia

    PubMed Central

    Davis, Dan M.; Sykes, Lynn R.

    1999-01-01

    Cavity decoupling in salt is the most plausible means by which a nation could conduct clandestine testing of militarily significant nuclear weapons. The conditions under which solution-mined salt can be used for this purpose are quite restrictive. The salt must be thick and reasonably pure. Containment of explosions sets a shallow limit on depth, and cavity stability sets a deep limit. These constraints are met in considerably <1% of the total land area of India and Pakistan. Most of that area is too dry for cavity construction by solution mining; disposal of brine in rivers can be detected easily. Salt domes, the most favorable structures for constructing large cavities, are not present in India and Pakistan. Confidence that India and Pakistan are adhering to the Comprehensive Test Ban Treaty (CTBT) is enhanced by their geological conditions, which are quite favorable to verification, not evasion. Thus, their participation in the CTBT is constrained overwhelmingly by political, not scientific, issues. Confidence in the verification of the CTBT could be enhanced if India and Pakistan permitted stations of the various monitoring technologies that are now widely deployed elsewhere to be operated on their territories. PMID:10500134

  19. Seismological investigation of the National Data Centre Preparedness Exercise 2013

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Hartmann, Gernot; Ross, J. Ole; Ceranna, Lars

    2015-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions conducted on Earth - underground, underwater or in the atmosphere. The verification regime of the CTBT is designed to detect any treaty violation. While the data of the International Monitoring System (IMS) are collected, processed and technically analyzed at the International Data Centre (IDC) of the CTBT-Organization, National Data Centres (NDC) of the member states provide interpretation and advice to their governments concerning suspicious detections. The NDC Preparedness Exercises (NPE) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of the CTBT verification technologies. These exercises should help to evaluate, for example, the effectiveness of the analysis procedures applied at NDCs and the quality, completeness and usefulness of IDC products. The exercise trigger of NPE2013 was a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) was given, and the region was communicated in a map showing the fictitious state of "Frisia" on the coast of the North Sea in Central Europe. The potential connection between the waveform and radionuclide evidence remained unclear to the exercise participants. The verification task was to identify the waveform event and to investigate potential sources of the radionuclide findings. The final question was whether the findings are CTBT-relevant and justify a request for an On-Site Inspection in "Frisia". The seismic event was not included in the Reviewed Event Bulletin (REB) of the IDC. The available detections from the closest seismic IMS stations led to an epicenter accuracy of about 24 km, which is not sufficient to specify the 1000 km2 inspection area in case of an OSI.
With the use of data from local stations and adjusted velocity models, the epicenter accuracy could be improved to better than 2 km, which demonstrates the crucial role of national technical means for verification tasks. The seismic NPE2013 event could be identified as induced by natural gas production in the source region: similar waveforms and spectral characteristics comparable to a set of events in the same region are clear indications. The scenario of a possible treaty violation at the location of the seismic NPE2013 event could thus be disproved.
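
    The waveform-similarity argument used to link the event to other gas-production-induced events in the region can be sketched with a normalized cross-correlation; values near 1.0 indicate a common source mechanism and path. A minimal sketch only (real pipelines band-pass filter and window the traces first; nothing here reproduces the exercise's actual processing):

```python
import numpy as np

def max_normalized_xcorr(a, b):
    """Maximum normalized cross-correlation between two equal-length
    waveforms. For identical traces the value is 1.0; high values for two
    events suggest a common (e.g. induced) origin."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))
```

Comparing a candidate event against a template set of known induced events, and thresholding the maximum correlation, is a standard way to turn "similar waveforms" into a quantitative indication.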

  20. Geophysics, Remote Sensing, and the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Integrated Field Exercise 2014

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.

    2015-12-01

    The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked from a number of different directions, such as the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) and those participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, suites of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force (STF) that the IT could detect by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain the IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 meet those goals.

  1. Technology Advancement and the CTBT: Taking One Step Back from the Nuclear Brink

    NASA Astrophysics Data System (ADS)

    Perry, W. J.

    2016-12-01

    Technology plays a pivotal role in international nuclear security, and technological advancement continues to support a path toward stability. One near-term and readily obtainable step back from the nuclear brink is the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The technology to independently verify adherence to the CTBT has matured in the 20 years since the Treaty was opened for signature. Technology has also improved the safety and reliability of the US nuclear stockpile in the absence of testing. Due to these advances over the past two decades, neither verification nor stockpile effectiveness should be an impediment to the Treaty's entry into force. Other technical and geo-political evolution in this same period has changed the perceived benefit of nuclear weapons as instruments of security. Recognizing the change technology has brought to deliberations on nuclear security, nations are encouraged to take this one step away from instability. This presentation will reflect on the history and assumptions that have been used to justify the build-up and configuration of nuclear stockpiles, the changes in technology and conditions that alter the basis of these original assumptions, and the re-analysis of security using current and future assumptions that point to the need for revised nuclear policies. The author has a unique and well-informed perspective as both the most senior US Defense Official and a technologist.

  2. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
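
    The source-receptor matrix idea at the heart of the proposed system can be sketched in a few lines: the matrix maps a hypothesized release vector onto predicted concentrations at the samplers, and "compatible" source scenarios are those that reproduce the measurements. A toy illustration; the units, tolerance, and names are invented:

```python
import numpy as np

def predicted_concentrations(srm, source):
    """Forward relation c = M s: the source-receptor matrix M (one row per
    sample, one column per candidate source cell/time) maps a release
    vector s [Bq] to predicted activity concentrations c [Bq/m^3]."""
    return srm @ source

def compatible(srm, source, measured, tol=0.5):
    """A source scenario is 'compatible' if its predictions match the
    measurements within an (arbitrary, illustrative) relative tolerance."""
    c = predicted_concentrations(srm, source)
    return bool(np.all(np.abs(c - measured) <= tol * np.maximum(measured, 1e-30)))
```

The rows of M are exactly the adjoint-model SRS fields computed per sample, which is why backward runs make a global, real-time screening of candidate scenarios feasible.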

  3. Characterization of Xe-133 global atmospheric background: Implications for the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Generoso, Sylvia; Morin, Mireille; Gross, Philippe; Le Petit, Gilbert; Moulin, Christophe

    2016-05-01

    Monitoring atmospheric concentrations of radioxenons is relevant to provide evidence of atmospheric or underground nuclear weapon tests. However, when the design of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was set up, the impact of industrial releases was not perceived. It is now well known that the industrial radioxenon signature can interfere with that of nuclear tests. Therefore, there is a crucial need to characterize the atmospheric distributions of radioxenons from industrial sources—the so-called atmospheric background—in the frame of the CTBT. Two years of Xe-133 atmospheric background have been simulated using 2013 and 2014 meteorological data together with the most comprehensive emission inventory of radiopharmaceutical facilities and nuclear power plants to date. Annual average simulated activity concentrations vary from 0.01 mBq/m3 up to above 5 mBq/m3 near major sources. Average measured and simulated concentrations agree at most of the IMS stations, which indicates that the main sources during the time frame are properly captured. The Xe-133 atmospheric background simulated at IMS stations turns out to be a complex combination of sources. The stations most impacted are in Europe and North America and can potentially detect Xe-133 every day. Predicted occurrences of detections of atmospheric Xe-133 show seasonal variations, more accentuated in the Northern Hemisphere, where the maximum occurs in winter. To our knowledge, this study presents the first global maps of Xe-133 atmospheric background from industrial sources based on two years of simulation and is a first attempt to analyze its composition in terms of origin at IMS stations.

  4. Fifty Years of Seismic Monitoring in Davao,Philippines

    NASA Astrophysics Data System (ADS)

    McNamara, D. J.

    2016-12-01

    The Manila Observatory was 150 years old as of 2015. Fifty years ago it began operating a seismic monitoring station on the island of Mindanao, outside the city of Davao, at approximately 7 deg. N and 121 deg. E. This station was chosen not only for its position on the Ring of Fire but also for the fact that the dip angle of the earth's magnetic field is zero at that location. When the CTBT was established and the Republic of the Philippines (RP) became a signatory, the Davao station, by agreement with the RP, began to send its seismic data to the CTBT database in Vienna. This has continued to the present day with support from the CTBTO for updates in equipment and maintenance. We discuss whether such a private+government model is the way forward for more comprehensive monitoring in the future.

  5. Merging Infrasound and Electromagnetic Signals as a Means for Nuclear Explosion Detection

    NASA Astrophysics Data System (ADS)

    Ashkenazy, Joseph; Lipshtat, Azi; Kesar, Amit S.; Pistinner, Shlomo; Ben Horin, Yochai

    2016-04-01

    The infrasound monitoring network of the CTBT consists of 60 stations. These stations are capable of detecting atmospheric events and may provide an approximate location within a time scale of a few hours. However, the nature of these events cannot be deduced from the infrasound signal. More than two decades ago it was proposed to use the electromagnetic pulse (EMP) as a means of discriminating nuclear explosions from other atmospheric events. An EMP is a unique signature of a nuclear explosion and is not produced by chemical ones. Nevertheless, it was decided to exclude the EMP technology from the official CTBT verification regime, mainly because of the risk of a high false alarm rate due to lightning electromagnetic pulses [1]. Here we present a method of integrating the information retrieved from the infrasound system with the EMP signal, which enables us to discriminate between lightning discharges and nuclear explosions. Furthermore, we show how spectral and other characteristics of the electromagnetic signal emitted by a nuclear explosion are distinguished from those of a lightning discharge. We estimate the false alarm probability of detecting a lightning discharge from the given area of an infrasound event and identifying it as the signature of a nuclear explosion. We show that this probability is very low and conclude that the combination of infrasound monitoring and EMP spectral analysis may produce a reliable method for identifying nuclear explosions. [1] R. Johnson, Unfinished Business: The Negotiation of the CTBT and the End of Nuclear Testing, United Nations Institute for Disarmament Research, 2009.

  6. Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted Ms) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
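
    The shape of such a screening test can be sketched as a standardized Ms - mb statistic whose standard error includes an extra variance term for correction-model inadequacy, compared against a one-sided critical value. All numeric values below are invented for illustration and are not the paper's or the IDC's:

```python
import math

def ms_mb_screen(ms, mb, offset=1.25, sigma_meas=0.2, sigma_model=0.1, z_crit=1.645):
    """One-sided test of H0: explosion characteristics using the Ms:mb
    discriminant. Earthquakes tend to have larger Ms - mb than explosions,
    so large positive statistics screen the event out as earthquake-like.
    sigma_model is the extra standard-error term for physical-correction-
    model inadequacy; offset, sigmas, and z_crit are illustrative only."""
    stat = (ms - mb + offset) / math.sqrt(sigma_meas**2 + sigma_model**2)
    return stat, stat > z_crit  # True -> reject H0 (screened out as earthquake)
```

Omitting sigma_model shrinks the standard error and inflates the statistic, which is exactly the kind of overconfident screening decision the augmented formulation guards against.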

  7. Mapping and Imaging Methodologies within the Comprehensive Test Ban Treaty's On-Site Inspection Framework

    NASA Astrophysics Data System (ADS)

    Hawkins, W.; Sussman, A. J.; Kelley, R. E.; Wohletz, K. H.; Schultz-Fellenz, E. S.

    2013-12-01

    On-site inspection (OSI) is the final verification measure of the Comprehensive Nuclear Test Ban Treaty (CTBT). OSIs rely heavily on geologic and geophysical investigations. The objective is to apply methods that are effective, efficient and minimally intrusive. We present a general overview of the OSI as provided for in the CTBT, specifying the allowed techniques and the timeline for their application. A CTBT OSI relies on many geological, geophysical and radiological methods. The search area for an OSI is mostly defined by uncertainty in the location of a suspect event detected by the International Monitoring System (IMS) and reported through the International Data Center, and can be as large as 1000 km2. Thus, OSI methods are fundamentally divided into general survey methods that narrow the search area and more focused, detailed survey methods that look for evidence of a potential underground explosion and try to find its location within an area of several km2. The purpose and goal of a CTBT OSI, as specified in Article IV of the Treaty, is 'to clarify whether a nuclear explosion has been carried out in violation of the Treaty' and to 'gather any facts which might assist in identifying any possible violator.' Through the use of visual, geophysical, and radiological techniques, OSIs can detect and characterize anomalies and artifacts related to the event that triggered the inspection. In the context of an OSI, an 'observable' is a physical property that is important to recognize and document because of its relevance to the purpose of the inspection. Potential observables include: (1) visual observables such as ground/environmental disturbances and manmade features, (2) geophysical techniques that provide measurements of altered and damaged ground and buried artifacts, and (3) radiological measurements on samples.
Information provided in this presentation comes from observations associated with historical testing activities that were not intended to go undetected. Every CTBT OSI will be different, and the observables present and detectable within an Inspection Area (IA) will depend on many factors, such as location, geology, emplacement configuration, climate, and the time elapsed after the event before the deployment of the Inspection Team (IT). A successful OSI is contingent on familiarity with potential observables, the suitability of the equipment to detect and characterize relevant observables, and the team's ability to document and integrate all the information into comprehensive, logical, and factual reports. In preparation for an OSI, a variety of types, scales, and generations of open-source digital imagery can be compared using geographic information systems (GIS) to focus on areas of interest. Simple image comparison from various open sources within GIS afford the opportunity to view anthropogenic and natural changes to locations of interest over time, thus remotely elucidating information about a site's use and level of activity.

  8. The Ascension Island hydroacoustic experiment: purpose, data set features and plans for future analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Rock, D; Rodgers, A J

    1999-07-23

    Calibration of hydroacoustic and T-phase stations for Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring will be an important element in establishing new operational stations and upgrading existing stations. Calibration of hydroacoustic stations is herein defined as precision location of the hydrophones and determination of the amplitude response from a known source energy. T-phase station calibration is herein defined as a determination of station site attenuation as a function of frequency, bearing, and distance for known impulsive energy sources in the ocean. To understand how best to conduct calibration experiments for both hydroacoustic and T-phase stations, an experiment was conducted in May 1999 at Ascension Island in the South Atlantic Ocean. The experiment made use of a British oceanographic research vessel and collected data that will be used for CTBT issues and for fundamental understanding of the Ascension Island volcanic edifice.

  9. National data centre preparedness exercise 2015 (NPE2015): MY-NDC progress result and experience

    NASA Astrophysics Data System (ADS)

    Rashid, Faisal Izwan Abdul; Zolkaffly, Muhammed Zulfakar

    2017-01-01

    Malaysia established the National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked with performing Comprehensive Nuclear-Test-Ban Treaty (CTBT) data management as well as providing relevant information on Treaty-related events to the Malaysian Nuclear Agency (Nuclear Malaysia), the CTBT National Authority. In late 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015), which aims to assess the level of readiness at MY-NDC. This paper presents the progress results of NPE 2015 and highlights MY-NDC's experience in NPE 2015 compared to its previous participation in NPE 2013. MY-NDC utilised the available resources for NPE 2015, performing five types of analyses compared with only two in NPE 2013. Participation in NPE 2015 has enabled MY-NDC to assess its capability and identify room for improvement.

  10. Three years of operational experience from Schauinsland CTBT monitoring station.

    PubMed

    Zähringer, M; Bieringer, J; Schlosser, C

    2008-04-01

    Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system had a high data availability of 90% within the reporting period. A daily screening process yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events associated with a plausible source. The remaining 64 cases can consistently be explained by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and good agreement since certification of the system.
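
    A daily screening decision of this kind is typically governed by a statistical decision threshold; a common choice is Currie's critical limit for net counts above background, which fixes the false-positive rate per decision and so explains why a small number of purely statistical "tentative identifications" is expected over years of operation. A sketch assuming a simple paired-background counting measurement (not necessarily the RASA's actual algorithm):

```python
import math

def currie_critical_limit(background_counts, k=1.645):
    """Currie decision threshold L_C = k * sqrt(2B) for net counts above a
    paired background B; k = 1.645 gives ~5% false positives per decision."""
    return k * math.sqrt(2.0 * background_counts)

def tentative_identification(gross_counts, background_counts, k=1.645):
    """Flag a sample when the net count exceeds the decision threshold."""
    net = gross_counts - background_counts
    return net > currie_critical_limit(background_counts, k)
```

With thousands of nuclide decisions per year, even a well-tuned threshold produces occasional statistical flags, consistent with the 64 background-explained cases reported.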

  11. Listening to sounds from an exploding meteor and oceanic waves

    NASA Astrophysics Data System (ADS)

    Evers, L. G.; Haak, H. W.

    Low frequency sound (infrasound) measurements have been selected within the Comprehensive Nuclear-Test-Ban Treaty (CTBT) as a technique to detect and identify possible nuclear explosions. The Seismology Division of the Royal Netherlands Meteorological Institute (KNMI) has operated an experimental infrasound array of 16 micro-barometers since 1999. Here we show the rare detection and identification of an exploding meteor above Northern Germany on November 8th, 1999, with data from the Deelen Infrasound Array (DIA). At the same time, sound was radiated from the Atlantic Ocean, south of Iceland, due to the atmospheric coupling of standing ocean waves, called microbaroms. Although the dominant frequencies of the two sources differed by only 0.04 Hz, DIA proved able to discriminate between these physically different sources of infrasound through its unique layout and instruments. The explosive power of the meteor, 1.5 kT TNT equivalent, is in the range of nuclear explosions and therefore relevant to the CTBT.
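
    Separating two signals whose dominant frequencies differ by only 0.04 Hz is at heart a question of spectral resolution: an FFT resolves frequencies to 1/T, so the record must be longer than about 25 s before the two peaks can even be distinguished. A minimal sketch of dominant-frequency estimation on synthetic data (not the KNMI processing):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Dominant frequency of a record via the FFT magnitude peak.
    Frequency resolution is fs / len(signal) = 1/T, so resolving a
    0.04 Hz separation requires a record longer than ~25 s."""
    spec = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(freqs[np.argmax(spec)])
```

On a 100 s record sampled at 10 Hz the resolution is 0.01 Hz, comfortably finer than the 0.04 Hz separation reported here.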

  12. The LANL/LLNL/AFTAC Black Thunder Coal Mine regional mine monitoring experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pearson, D.C.; Stump, B.W.; Baker, D.F.

    Cast blasting operations associated with near-surface coal recovery provide relatively large explosive sources that generate regional seismograms of interest in monitoring a Comprehensive Test Ban Treaty (CTBT). This paper describes preliminary results of a series of experiments currently being conducted at the Black Thunder Coal Mine in northeast Wyoming as part of the DOE CTBT Research and Development Program. These experiments are intended to provide an integrated set of near-source and regional seismic data for the purpose of quantifying the coupling and source characterization of the explosions. The focus of this paper is on the types of data being recovered, with some preliminary implications. The Black Thunder experiments are designed to assess three major questions: (1) how many mining explosions produce seismograms at regional distances that will have to be detected, located and ultimately identified by the National Data Center, and what are the waveform characteristics of these particular mining explosions; (2) can discrimination techniques based on empirical studies be placed on a firm physical basis so that they can be applied to other regions where there is little monitoring experience; (3) can large-scale chemical explosions (possibly mining explosions) be used to calibrate source and propagation path effects to regional stations, and can source depth of burial and decoupling effects be studied in such a controlled environment? With these key questions in mind, and given the cooperation of the Black Thunder Mine, a suite of experiments has been and is currently being conducted. This paper describes the experiments and their relevance to CTBT issues.

  13. Innovative concept for a major breakthrough in atmospheric radioactive xenon detection for nuclear explosion monitoring.

    PubMed

    Le Petit, G; Cagniant, A; Morelle, M; Gross, P; Achim, P; Douysset, G; Taffary, T; Moulin, C

    The verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) is based on a network of three different waveform technologies together with global monitoring of aerosols and noble gases in order to detect, locate and identify a nuclear weapon explosion down to 1 kt TNT equivalent. In the case of a low-intensity underground or underwater nuclear explosion, it appears that only radioactive gases, especially the noble gases, which are difficult to contain, will allow identification of weak-yield nuclear tests. Four radioactive xenon isotopes, 131mXe, 133mXe, 133Xe and 135Xe, are produced in fission reactions in sufficient quantity and exhibit suitable half-lives and radiation emissions to be detected in the atmosphere at low levels far from the release site. Four different CTBT monitoring systems, ARIX, ARSA, SAUNA, and SPALAX™, have been developed to sample and measure them with high sensitivity. The last of these, developed by the French Atomic Energy Commission (CEA), is likely to be drastically improved in detection sensitivity (especially for the metastable isotopes) through a higher sampling rate, when equipped with a new conversion electron (CE)/X-ray coincidence spectrometer. This new spectrometer is based on two combined detectors, both exhibiting very low radioactive background: a well-type NaI(Tl) detector for photon detection surrounding a gas cell equipped with two large passivated implanted planar silicon chips for electron detection. It is characterized by a low electron energy threshold and a much better energy resolution for the CE than those usually measured with existing CTBT equipment. Furthermore, the compact geometry of the spectrometer provides high efficiency for X-rays and for the CE associated with the decay modes of the four relevant radioxenons.
The paper focuses on the design of this new spectrometer and presents the spectroscopic performance of a prototype, based on recent results from both radioactive xenon standards and air sample measurements. Major improvements in detection sensitivity have been reached and quantified, especially for the metastable isotopes 131mXe and 133mXe, with gains in minimum detectable activity (about 2 × 10^-3 Bq) relative to the current CTBT SPALAX™ system (air sampling frequency normalized to 8 h) of about 70 and 30, respectively.

  14. Enhanced global Radionuclide Source Attribution for the Nuclear-Test-Ban Verification by means of the Adjoint Ensemble Dispersion Modeling Technique applied at the IDC/CTBTO.

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; de Geer, L.

    2006-05-01

    The Provisional Technical Secretariat (PTS) of the CTBTO Preparatory Commission maintains and permanently updates a source-receptor matrix (SRM) describing the global monitoring capability of a highly sensitive 80-station radionuclide (RN) network, in order to verify States Signatories' compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). This is done by means of receptor-oriented Lagrangian particle dispersion modeling (LPDM) to help determine the region from which suspicious radionuclides may originate. The LPDM FLEXPART 5.1 is integrated backward in time on global analysis wind fields, yielding global source-receptor sensitivity (SRS) fields stored at three-hour frequency and 1° horizontal resolution. A database of these SRS fields substantially improves the interpretation of RN sample measurements and categorizations, because it enables source hypotheses to be tested later in a pure post-processing (SRM inversion) step, feasible at any place (decentralized) on hardware comparable to current PCs or notebooks, provided access to the SRS fields is granted. Within the CTBT environment it is important to quickly achieve decision-makers' confidence in the SRM-based backtracking products issued by the PTS when treaty-relevant radionuclides occur. The PTS has therefore set up a highly automated response system together with the Regional Specialized Meteorological Centers of the World Meteorological Organization in the field of dispersion modeling, which committed themselves to provide the PTS with the same standard SRS fields calculated by their own systems for CTBT-relevant cases.
This system was utilized twice in 2005 to perform adjoint ensemble dispersion modeling (EDM) and demonstrated the potential of EDM-based backtracking to improve the accuracy of the source location for singular nuclear events, thus serving as the backward analogue to the forward ensemble dispersion modeling efforts of Galmarini et al., 2004 (Atmos. Env. 38, 4607-4617). As the scope of the adjoint EDM methodology is not limited to CTBT verification but can be applied to any kind of nuclear event monitoring and location, it has the potential to improve the design of manifold emergency response systems toward the preparedness concepts needed for disaster mitigation (as after Chernobyl) and pre-emptive estimation of pollution hazards.
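    The post-processing step described above can be sketched as a matrix-vector product: each row of an SRS matrix maps a gridded source term to one sample's predicted concentration, so a candidate source hypothesis can be tested without running a new dispersion simulation. All sizes, values and the misfit measure below are illustrative assumptions, not CTBTO data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_cells = 6, 100              # assumed sizes, not IMS values
srs = rng.random((n_samples, n_cells))   # precomputed SRS fields

true_source = np.zeros(n_cells)
true_source[42] = 2.0                    # fictitious release in cell 42
observed = srs @ true_source             # "measured" concentrations

def misfit(cell, strength):
    """RMS misfit between observed and predicted concentrations."""
    hypo = np.zeros(n_cells)
    hypo[cell] = strength
    return np.sqrt(np.mean((observed - srs @ hypo) ** 2))

# Scanning candidate cells, the true release location minimizes the misfit.
best = min(range(n_cells), key=lambda c: misfit(c, 2.0))
print(best)  # 42
```

    Because the SRS fields are precomputed, each hypothesis test is a cheap matrix product, which is what makes decentralized post-processing on ordinary hardware feasible.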

  15. Precipitation Nowcast using Deep Recurrent Neural Network

    NASA Astrophysics Data System (ADS)

    Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.

    2016-12-01

    An accurate precipitation nowcast (0-6 hours) with fine temporal and spatial resolution has always been an important prerequisite for flood warning, streamflow prediction and risk management. Most popular approaches to forecasting precipitation fall into two groups: one relies on numerical modeling of the physical dynamics of the atmosphere, and the other on empirical and statistical regression models derived by local hydrologists or meteorologists. Given the recent advances in artificial intelligence, in this study a powerful deep recurrent neural network, the Long Short-Term Memory (LSTM) model, is used to extract the patterns and forecast the spatial and temporal variability of Cloud Top Brightness Temperature (CTBT) observed from the GOES satellite. A 0-6 hour precipitation nowcast is then produced using the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) algorithm, in which the CTBT nowcast is used as the algorithm's raw input. Two case studies over the continental U.S. demonstrate the improvement of the proposed approach compared to a classical feed-forward neural network and a couple of simple regression models. The advantages and disadvantages of the proposed method are summarized with regard to its capability of pattern recognition through time, handling of vanishing gradients during model learning, and working with sparse data. The studies show that the LSTM model performs better than the other methods and is able to learn the temporal evolution of precipitation events over more than 1000 time lags. The uniqueness of the PERSIANN algorithm enables an alternative precipitation nowcast approach, as demonstrated in this study, in which the CTBT prediction is produced and used as input for generating the precipitation nowcast.
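    The LSTM recurrence the abstract relies on can be sketched in a few lines; this toy NumPy cell (randomly initialized, with hypothetical sizes, not the authors' trained model) shows the gated additive cell state through which long-lag information is carried:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM cell step. W: (4*H, X+H), b: (4*H,). The additive cell
    state c is what lets the model carry information over long lags."""
    H = h.size
    z = W @ np.concatenate([x, h]) + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:])        # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Run the cell over a toy feature sequence (placeholder for CTBT inputs).
rng = np.random.default_rng(1)
X, H = 3, 5
W = 0.1 * rng.standard_normal((4 * H, X + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((20, X)):
    h, c = lstm_step(x, h, c, W, b)
print(h.shape)  # (5,)
```

    The forget gate multiplies the cell state rather than squashing it through an activation at every step, which is the mechanism usually credited for mitigating the vanishing-gradient problem mentioned in the abstract.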

  16. The National Data Center Preparedness Exercise 2009 - First Results

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Bönnemann, Christian; Ceranna, Lars; Wotawa, Gerhard

    2010-05-01

    The NDC preparedness initiative was initiated by eight signatory states. It now has a history of more than two years, with two successful exercises and subsequent fruitful discussions during the NDC Evaluation Workshops of the CTBTO. The first exercise was carried out in 2007 (NPE07). The objectives of and the idea behind this exercise are described in working paper CTBT/WGB-28/DE-IT/1 of the CTBTO. The exercise simulates a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. The exercise helps to evaluate the effectiveness of analysis procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the NPE is a measure of the readiness of NDCs to fulfil their duties with regard to CTBT verification: treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPE09 started on 1 October 2009, 00:00 UTC. In addition to the previous exercises, three technologies (seismology, infrasound, and radionuclide) were taken into account, leading to tentative mock events generated by strong explosions in open-pit mines. The first event fulfilling all previously defined criteria was close to the Kara-Zhyra mine in Eastern Kazakhstan and occurred on 28 November 2009 at 07:20:31 UTC. It generated seismic as well as infrasound signals at the closest IMS stations. Forward atmospheric transport modelling indicated that a sufficient number of radionuclide stations were also affected to enable the application of a negative-testing scenario. First results of the seismo-acoustic analysis of the NPE09 event are presented along with details on the event selection process.

  17. Alternatives for Laboratory Measurement of Aerosol Samples from the International Monitoring System of the CTBT

    NASA Astrophysics Data System (ADS)

    Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.

    2013-12-01

    The aerosol samples taken from CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) of a detection has led to a desire for multi-station or multi-time-period detections. These would be connected through the concept of 'event formation', analogous to event formation in seismic analysis. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis by a more sensitive laboratory to gain a substantially lower MDC and potentially find radionuclide concentrations undetected by the station. The authors present recent laboratory work with air filters showing various cost-effective means of enhancing laboratory sensitivity.
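    The station/laboratory MDC contrast above can be illustrated with the standard Currie detection limit; the numbers below are placeholders, and certified IMS MDC formulas additionally include decay and collection corrections:

```python
import math

def mdc(background_counts, efficiency, gamma_yield, live_time_s, volume_m3):
    """Minimum detectable concentration (Bq/m^3) from the Currie
    detection limit L_D = 2.71 + 4.65*sqrt(B). Illustrative only."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * gamma_yield * live_time_s * volume_m3)

# The same sample re-counted in a lower-background laboratory detector
# (placeholder numbers) yields a substantially lower MDC.
field = mdc(400.0, efficiency=0.05, gamma_yield=0.25,
            live_time_s=86400.0, volume_m3=15000.0)
lab = mdc(25.0, efficiency=0.05, gamma_yield=0.25,
          live_time_s=86400.0, volume_m3=15000.0)
print(lab < field)  # True
```

    Because L_D grows roughly with the square root of the background, shielding and background reduction in the laboratory translate directly into the lower MDC that event formation across neighboring stations would require.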

  18. CTBT infrasound network performance to detect the 2013 Russian fireball event

    DOE PAGES

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; ...

    2015-03-18

    The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty International Monitoring System (CTBT-IMS), detected globally by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as a global reference event. The parameters investigated for their influence on detection capability are the directivity of the line-source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. Furthermore, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.

  19. UK National Data Centre archive of seismic recordings of (presumed) underground nuclear tests 1964-1996

    NASA Astrophysics Data System (ADS)

    Young, John; Peacock, Sheila

    2016-04-01

    The year 1996 has particular significance for forensic seismologists. This was the year when the Comprehensive Test Ban Treaty (CTBT) was signed in September at the United Nations, setting an international norm against nuclear testing. Blacknest, as a long-standing seismic centre for research into detecting and identifying underground explosions using seismology, provided significant technical advice during the CTBT negotiations. Since 1962, seismic recordings of both presumed nuclear explosions and earthquakes from the four seismometer arrays Eskdalemuir, Scotland (EKA), Yellowknife, Canada (YKA), Gauribidanur, India (GBA), and Warramunga, Australia (WRA) have been copied, digitised, and saved. There was a possibility this archive would be lost, so it was decided to process the records and catalogue them for distribution to other groups and institutions. This work continues at Blacknest, but the archive is no longer under threat. In addition, much of the archive of analogue tape recordings has been re-digitised with modern equipment, allowing sampling rates of 100 Hz rather than 20 Hz.

  20. Scenario design and basic analysis of the National Data Centre Preparedness Exercise 2013

    NASA Astrophysics Data System (ADS)

    Ross, Ole; Ceranna, Lars; Hartmann, Gernot; Gestermann, Nicolai; Bönneman, Christian

    2014-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While IMS data are collected, processed and technically analyzed in the International Data Centre (IDC) of the CTBT Organization, National Data Centres (NDCs) provide interpretation and advice to their governments concerning suspicious detections in IMS data. NDC Preparedness Exercises (NPEs) dealing with fictitious treaty violations are performed regularly to practice the combined analysis of the CTBT verification technologies and the mutual exchange of information between NDCs and with the IDC. The NPE2010 and NPE2012 trigger scenarios were based on selected seismic events from the Reviewed Event Bulletin (REB) serving as starting points for fictitious radionuclide dispersion; the main task was the identification of the original REB event and the discrimination between earthquakes and explosions as the source. The scenario design of NPE2013 differs from those of previous NPEs. The waveform event selection is not constrained to events in the REB. The exercise trigger is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward atmospheric transport modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) is given, and the region is communicated in a map showing the fictitious state of "Frisia" on the North Sea coast in Central Europe. The synthetic radionuclide detections start in Vienna (8 Sept, I-131) and Schauinsland (11 Sept, Xe-133) with rather low activity concentrations and are most prominent in Stockholm and Spitsbergen in mid-September 2013. Smaller concentrations in Asia follow later.
The potential connection between the waveform and radionuclide evidence remains unclear. The verification task is to identify the waveform event and to investigate potential sources of the radionuclide findings. Finally, the potential connection between the sources and the CTBT relevance of the whole picture has to be evaluated. The overall question is whether requesting an On-Site Inspection in "Frisia" would be justified. The poster presents the NPE2013 scenario and gives a basic analysis of the initial situation concerning both waveform detections and atmospheric dispersion conditions in Central Europe in early September 2013. The full NPE2013 scenario will be presented at the NDC Workshop in mid-May 2014.

  1. OSI Passive Seismic Experiment at the Former Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweeney, J J; Harben, P

    On-site inspection (OSI) is one of the four verification provisions of the Comprehensive Nuclear Test Ban Treaty (CTBT). Under the provisions of the CTBT, once the Treaty has entered into force, any signatory party can request an on-site inspection, which can then be carried out after approval (by majority vote) of the Executive Council. Once an OSI is approved, a team of 40 inspectors will be assembled to carry out an inspection to ''clarify whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of Article I''. One challenging aspect of carrying out an OSI in the case of a purported underground nuclear explosion is detecting and locating the underground effects of an explosion, which may include an explosion cavity, a zone of damaged rock, and/or a rubble zone associated with an underground collapsed cavity. The CTBT (Protocol, Section II part D, paragraph 69) prescribes several types of geophysical investigations that can be carried out for this purpose. One of the methods allowed by the CTBT is referred to in the Treaty Protocol as ''resonance seismometry''. This method, which was proposed and strongly promoted by Russia during the Treaty negotiations, is not described in the Treaty. Some clarification about the nature of the resonance method can be gained from OSI workshop presentations by Russian experts in the late 1990s. Our understanding is that resonance seismometry is a passive method that relies on seismic reverberations set up in an underground cavity by the passage of waves from regional and teleseismic sources. Only a few examples of the use of this method for detection of underground cavities have been presented, and those were done in cases where the existence and precise location of an underground cavity was known.
As is the case with many of the geophysical methods allowed during an OSI under the Treaty, how resonance seismometry really works and its effectiveness for OSI purposes have yet to be determined. For this experiment, we took a broad approach to the definition of ''resonance seismometry'', stretching it to include any means that employs passive seismic methods to infer the character of underground materials. In recent years there have been a number of advances in the use of correlation and noise analysis methods in seismology to obtain information about the subsurface. Our objective in this experiment was to use noise analysis and correlation analysis to evaluate these techniques for detecting and characterizing the underground damage zone from a nuclear explosion. The site chosen for the experiment was the Mackerel test in Area 4 of the former Nevada Test Site (now named the Nevada National Security Site, or NNSS). Mackerel was an underground nuclear test of less than 20 kT conducted in February of 1964 (DOENV-209-REV 15). We chose this site because there was a known apical cavity at about 50 m depth above a rubble zone, and because the site had been investigated by the US Geological Survey with active seismic methods in 1965 (Watkins et al., 1967). Note that the time delay between detonation of the explosion (1964) and the present survey (2010) is nearly 46 years; this would not be typical of an expected OSI under the CTBT.
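    The noise-correlation idea the experiment builds on can be sketched as follows: a common noise wavefield recorded at two sensors with a relative delay produces a cross-correlation peak at that lag, from which travel times (and hence subsurface structure) are inferred. This toy example uses synthetic white noise and an assumed 37-sample delay, not NNSS data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, delay = 4096, 37                 # record length and assumed lag (samples)
noise = rng.standard_normal(n + delay)
s1 = noise[delay:delay + n]         # sensor 1 records the wavefield first
s2 = noise[:n]                      # sensor 2 records it `delay` samples later

# Cross-correlating the two noise records recovers the inter-sensor delay.
xcorr = np.correlate(s2, s1, mode="full")
lag = np.argmax(xcorr) - (n - 1)
print(lag)  # recovers the assumed 37-sample delay
```

    In field practice, long records are correlated and stacked so that this peak emerges from uncorrelated noise; changes in the recovered travel times between station pairs are what would hint at a damage zone.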

  2. WOSMIP II- Workshop on Signatures of Medical and Industrial Isotope Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Murray; Achim, Pascal; Auer, M.

    2011-11-01

    Medical and industrial radioisotopes are fundamental tools used in science, medicine and industry, with ever-expanding usage in medical practice where their availability is vital. Very sensitive environmental radionuclide monitoring networks have been developed for nuclear-security-related monitoring [particularly Comprehensive Test-Ban Treaty (CTBT) compliance verification] and are now operational.

  3. Incorporating atmospheric uncertainties into estimates of the detection capability of the IMS infrasound network

    NASA Astrophysics Data System (ADS)

    Le Pichon, Alexis; Ceranna, Lars; Taillepied, Doriane

    2015-04-01

    To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a dedicated network is being deployed. Multi-year observations recorded by the International Monitoring System (IMS) infrasound network confirm that its detection capability is highly variable in space and time. Today, numerical modeling techniques provide a basis for better understanding the role of the different source and atmosphere factors that influence propagation predictions. Previous studies estimated the radiated source energy from remote observations using a frequency-dependent attenuation relation and state-of-the-art specifications of the stratospheric wind. To account for a realistic description of the dynamic structure of the atmosphere, model predictions are further enhanced by wind and temperature error distributions as measured in the framework of the ARISE project (http://arise-project.eu/). In the context of the future verification of the CTBT, these predictions quantify uncertainties in the spatial and temporal variability of IMS infrasound network performance at higher resolution, and will be helpful for designing and prioritizing maintenance of any infrasound monitoring network.
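    A station-level detection-capability estimate of the kind described above can be sketched with a toy propagation model; the spreading and absorption coefficients below are placeholders, not the published frequency-dependent IMS attenuation relation, and the uncertainty perturbation stands in for the measured wind/temperature error distributions:

```python
import math

def received_amplitude(source_pa, distance_km, atten_db_per_1000km):
    """Toy model: geometric spreading plus absorption linear in dB with
    range; coefficients are illustrative placeholders only."""
    spreading = source_pa / math.sqrt(max(distance_km, 1.0))
    absorption_db = atten_db_per_1000km * distance_km / 1000.0
    return spreading * 10.0 ** (-absorption_db / 20.0)

def detects(source_pa, distance_km, noise_pa, atten_db):
    return received_amplitude(source_pa, distance_km, atten_db) >= noise_pa

# Perturbing the attenuation (a stand-in for atmospheric uncertainty)
# shifts each station's maximum detection range.
ranges = {}
for atten in (6.0, 12.0):   # assumed dB per 1000 km
    ranges[atten] = max(d for d in range(100, 20001, 100)
                        if detects(1000.0, d, 0.05, atten))
print(ranges)
```

    Repeating such a calculation over sampled atmospheric states, rather than two fixed attenuation values, is what turns a single coverage map into a map with quantified uncertainty.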

  4. Incorporating atmospheric uncertainties into estimates of the detection capability of the IMS infrasound network

    NASA Astrophysics Data System (ADS)

    Le Pichon, Alexis; Blanc, Elisabeth; Rüfenacht, Rolf; Kämpfer, Niklaus; Keckhut, Philippe; Hauchecorne, Alain; Ceranna, Lars; Pilger, Christoph; Ross, Ole

    2014-05-01

    To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a dedicated network is being deployed. Multi-year observations recorded by the International Monitoring System (IMS) infrasound network confirm that its detection capability is highly variable in space and time. Today, numerical modeling techniques provide a basis for better understanding the role of the different source and atmosphere factors that influence propagation predictions. Previous studies estimated the radiated source energy from remote observations using a frequency-dependent attenuation relation and state-of-the-art specifications of the stratospheric wind. To account for a realistic description of the dynamic structure of the atmosphere, model predictions are further enhanced by wind and temperature error distributions as measured in the framework of the ARISE project (http://arise-project.eu/). In the context of the future verification of the CTBT, these predictions quantify uncertainties in the spatial and temporal variability of IMS infrasound network performance at higher resolution, and will be helpful for designing and prioritizing maintenance of any infrasound monitoring network.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, J.R.; Marshall, M.E.; Barker, B.W.

    In situations where cavity decoupling of underground nuclear explosions is a plausible evasion scenario, comprehensive seismic monitoring of any eventual CTBT will require the routine identification of many small seismic events with magnitudes in the range 2.0 < m_b < 3.5. However, since such events are not expected to be detected teleseismically, their magnitudes will have to be estimated from regional recordings using seismic phases and frequency bands different from those employed in the teleseismic m_b scale generally used to specify monitoring capability. Therefore, it is necessary to establish the m_b equivalences of any selected regional magnitude measures in order to estimate the expected detection statistics and thresholds of proposed CTBT seismic monitoring networks. In the investigations summarized in this report, this has been accomplished through analyses of synthetic data obtained by theoretically scaling observed regional seismic data recorded in Scandinavia and Central Asia from various tamped nuclear tests, to obtain estimates of the corresponding seismic signals to be expected from small cavity-decoupled nuclear tests at those same source locations.
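    The magnitude-equivalence calibration described above amounts, in its simplest form, to a regression between paired regional and teleseismic magnitudes; the pairs below are fictitious, and real studies must also account for measurement error in both variables:

```python
import numpy as np

# Fictitious calibration pairs: regional magnitude vs. teleseismic m_b
# for events large enough to be measured on both scales.
m_regional = np.array([3.9, 4.3, 4.8, 5.2, 5.6])
m_b        = np.array([4.0, 4.5, 4.9, 5.4, 5.7])

slope, intercept = np.polyfit(m_regional, m_b, 1)

def mb_equivalent(m_reg):
    """Map a regional magnitude to its estimated teleseismic m_b."""
    return slope * m_reg + intercept

# Equivalent m_b for a small event observed only regionally:
print(round(mb_equivalent(2.5), 2))
```

    The extrapolation below the calibration range is exactly the step the report's synthetic-scaling approach is designed to support, since no teleseismic observations exist there.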

  6. The Nuclear Non-Proliferation Treaty and the Comprehensive Nuclear-Test-Ban Treaty, the relationship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, Thomas Jr.

    The Nuclear Non-Proliferation Treaty (NPT) is the most important international security arrangement we have protecting the world community, and this has been true for many years. But it did not happen by accident: it is a strategic bargain in which 184 states gave up the right forever to acquire the most powerful weapon ever created, in exchange for a commitment from the five states allowed to keep nuclear weapons under the NPT (U.S., U.K., Russia, France and China) to share peaceful nuclear technology and to engage in disarmament negotiations aimed at the ultimate elimination of their nuclear stockpiles. The most important part of this is the comprehensive nuclear test ban (CTBT); the thinking of the 184 NPT non-nuclear-weapon states was, and is, that while the elimination of nuclear weapon stockpiles is a long way off, at least the NPT nuclear-weapon states could stop testing the weapons. The CTBT has been ratified by 161 states, but by its terms it can only come into force if 44 nuclear-potential states ratify; 36 of the 44 have ratified it, and the remaining eight include the United States and seven others, most of whom are in effect waiting for the United States. No state has tested a nuclear weapon in 15 years, except for complete outlier North Korea. There appears to be no chance that the U.S. Senate will approve the CTBT for ratification in the foreseeable future, but the NPT may not survive without it. Perhaps it is time to consider an interim measure: for the UN Security Council to declare that any future nuclear weapon test, any time, anywhere, is a 'threat to peace and security', in effect a violation of international law, which in today's world it clearly would be.

  7. Use of Geophysical and Remote Sensing Techniques During the Comprehensive Test Ban Treaty Organization's Integrated Field Exercise 2014

    NASA Astrophysics Data System (ADS)

    Labak, Peter; Sussman, Aviva; Rowlands, Aled; Chiappini, Massimo; Malich, Gregor; MacLeod, Gordon; Sankey, Peter; Sweeney, Jerry; Tuckwell, George

    2016-04-01

    The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI). During an OSI, up to 40 inspectors search a 1000 km2 inspection area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of an OSI to date) and worked in a number of different capacities, from the Exercise Management and Control Teams, which executed the scenario in which the exercise was played, to the participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, a number of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force Group (STF) for the IT to detect by applying the geophysical and remote sensing inspection technologies, as well as other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection by other geophysical methods. In addition, airborne technologies, including multi-spectral imaging, were deployed so that the IT could identify freshly exposed surfaces, imported materials and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain IFE14 in terms of goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 met those goals.

  8. Improvements in Calibration and Analysis of the CTBT-relevant Radioxenon Isotopes with High Resolution SiPIN-based Electron Detectors

    NASA Astrophysics Data System (ADS)

    Khrustalev, K.

    2016-12-01

    The current process for calibrating the beta-gamma detectors used for radioxenon isotope measurements for CTBT purposes is laborious and time consuming. It uses a combination of point sources and gaseous sources, resulting in differences between energy and resolution calibrations. The emergence of high-resolution SiPIN-based electron detectors allows improvements in the calibration and analysis process. Thanks to the high electron resolution of SiPIN detectors (~8-9 keV at 129 keV) compared to plastic scintillators (~35 keV at 129 keV), many more conversion-electron (CE) peaks (from radioxenon and radon progenies) can be resolved and used for energy and resolution calibration in the energy range of the CTBT-relevant radioxenon isotopes. The long-term stability of the SiPIN energy calibration allows one to significantly reduce the time spent on the QC measurements needed to check the stability of the energy/resolution (E/R) calibration. The second-order polynomials currently used for fitting the E/R calibration are unphysical and should be replaced by a linear energy calibration for NaI and SiPIN, owing to the high linearity and dynamic range of modern digital DAQ systems, and the resolution calibration functions should be modified to reflect the underlying physical processes. Alternatively, one can abandon fitting functions entirely and use only point values of E/R (similar to the efficiency calibration currently used) at the energies relevant for the isotopes of interest (ROIs, regions of interest). The current analysis treats the detector as a set of single-channel analysers, with an established set of coefficients relating the positions of the ROIs to the positions of the QC peaks. The analysis of the spectra can be made more robust using peak and background fitting in the ROIs, with a single free parameter (peak area) for the potential peaks of the known isotopes and a fixed set of E/R calibration values.
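    The single-free-parameter ROI fit proposed above reduces to linear least squares once the peak shape is fixed by the E/R calibration; a minimal sketch with an assumed centroid, width, and synthetic counts (not real station spectra):

```python
import numpy as np

channels = np.arange(100, 160)
centroid, sigma = 129.0, 3.8        # fixed by the E/R calibration (assumed)
shape = np.exp(-0.5 * ((channels - centroid) / sigma) ** 2)
shape /= shape.sum()                # unit-area peak template

# Synthetic spectrum: a peak of known shape on a flat background.
rng = np.random.default_rng(3)
true_area, background = 5000.0, 12.0
counts = rng.poisson(true_area * shape + background)

# With the template fixed, fitting [area, background] is linear least squares.
design = np.column_stack([shape, np.ones_like(channels, dtype=float)])
(area, bkg), *_ = np.linalg.lstsq(design, counts, rcond=None)
print(round(area), round(bkg))
```

    Because the fit is linear in its free parameters, it cannot wander off to unphysical centroids or widths, which is the robustness argument made in the abstract.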

  9. Broadband seismology and the detection and verification of underground nuclear explosions

    NASA Astrophysics Data System (ADS)

    Tinker, Mark Andrew

    1997-10-01

    On September 24, 1996, President Clinton signed the Comprehensive Test Ban Treaty (CTBT), which bans the testing of all nuclear weapons, thereby limiting their future development. Seismology is the primary tool used for the detection and identification of underground explosions and thus will play a key role in monitoring a CTBT. The detection and identification of low-yield explosions requires seismic stations at regional distances (<1500 km). However, because the regional wavefield propagates within the extremely heterogeneous crustal waveguide, the seismic waveforms are very complicated. It is therefore necessary to have a solid understanding of how the phases used in regional discriminants develop within different tectonic regimes. To this end, the development of the seismic phases Pn and Lg, which compose the seismic discriminant Pn/Lg, is evaluated within the western U.S. using data from the Non-Proliferation Experiment. The most fundamental discriminant is event location, as 90% of all seismic sources occur too deep within the earth to be unnatural. France resumed its nuclear testing program after a four-year moratorium and conducted six tests during a five-month period starting in September 1995. Using teleseismic data, a joint hypocenter determination algorithm was used to determine the hypocenters of these six explosions. One of the most important problems in monitoring a CTBT is the detection and location of small seismic events. Although seismic arrays have become the central tool for event detection, in the context of a global monitoring treaty there will be some dependence on sparse regional networks of three-component broadband seismic stations to detect low-yield explosions. However, the full power of these data has not been utilized, namely using phases other than P and S. Therefore, the information in the surface wavetrain is used to improve the locations of small seismic events recorded on a sparse network in Bolivia. Finally, as a discrimination example in a complex region, P to S ratios are used to determine source parameters of the Mw 8.3 deep Bolivia earthquake.
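
    The Pn/Lg discriminant mentioned above compares amplitudes of the two regional phases, commonly measured in group-velocity windows. A minimal sketch of the idea follows; the velocity windows, the RMS amplitude measure, and the dB convention are illustrative assumptions, not this thesis's exact processing.

```python
import numpy as np

def pn_lg_ratio(trace, dt, dist_km, pn_vel=(7.6, 8.2), lg_vel=(3.0, 3.6)):
    """Compute a Pn/Lg RMS amplitude ratio (in dB) from group-velocity windows.
    The velocity pairs (km/s) bracket the Pn and Lg arrivals; values are
    typical textbook choices, not calibrated ones."""
    t = np.arange(len(trace)) * dt

    def window_rms(vmin, vmax):
        t0, t1 = dist_km / vmax, dist_km / vmin   # faster velocity arrives first
        seg = trace[(t >= t0) & (t <= t1)]
        return np.sqrt(np.mean(seg ** 2))

    return 20 * np.log10(window_rms(*pn_vel) / window_rms(*lg_vel))
```

    At regional distances, explosions tend toward larger Pn/Lg values than earthquakes, which is what makes the ratio useful as a discriminant.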

  10. The Nuclear Non-Proliferation Treaty and the Comprehensive Nuclear-Test-Ban Treaty, the relationship

    NASA Astrophysics Data System (ADS)

    Graham, Thomas, Jr.

    2014-05-01

    The Nuclear Non-Proliferation Treaty (NPT) is the most important international security arrangement protecting the world community, and has been for many years. But it did not happen by accident; it is a strategic bargain in which 184 states gave up the right forever to acquire the most powerful weapon ever created, in exchange for a commitment from the five states allowed to keep nuclear weapons under the NPT (U.S., U.K., Russia, France and China) to share peaceful nuclear technology and to engage in disarmament negotiations aimed at the ultimate elimination of their nuclear stockpiles. The most important part of this is the comprehensive nuclear test ban (CTBT); the thinking of the 184 NPT non-nuclear-weapon states was, and is, that the elimination of nuclear weapon stockpiles is a long way off, but at least the NPT nuclear-weapon states could stop testing the weapons. The CTBT has been ratified by 161 states, but by its terms it can only come into force if 44 designated nuclear-capable states ratify it; 36 of the 44 have done so, and the remaining eight include the United States and seven others, most of whom are in effect waiting for the United States. No state has tested a nuclear weapon in 15 years, except for the complete outlier North Korea. There appears to be no chance that the U.S. Senate will approve the CTBT for ratification in the foreseeable future, but the NPT may not survive without it. Perhaps it is time to consider an interim measure: for the UN Security Council to declare that any future nuclear weapon test, any time, anywhere, is a "threat to peace and security", in effect a violation of international law, which in today's world it clearly would be.

  11. Comprehensive Nuclear-Test-Ban Treaty: Background and Current Developments

    DTIC Science & Technology

    2012-08-03

    Academy of Sciences Study and Its Critics … 39; Chronology… criticisms of that report. On February 13, the Administration rolled out its FY2013 budget request, which included funds for the CTBT Organization… "So he has not ruled out testing in the future, but there are no plans to do so."4 Critics expressed concern about the implications of these policies

  12. Machine Learning and Data Mining for Comprehensive Test Ban Treaty Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, S; Vaidya, S

    2009-07-30

    The Comprehensive Test Ban Treaty (CTBT) is gaining renewed attention in light of growing worldwide interest in mitigating risks of nuclear weapons proliferation and testing. Since the International Monitoring System (IMS) installed the first suite of sensors in the late 1990's, the IMS network has steadily progressed, providing valuable support for event diagnostics. This progress was highlighted at the recent International Scientific Studies (ISS) Conference in Vienna in June 2009, where scientists and domain experts met with policy makers to assess the current status of the CTBT Verification System. A strategic theme within the ISS Conference centered on exploring opportunities for further enhancing the detection and localization accuracy of low magnitude events by drawing upon modern tools and techniques for machine learning and large-scale data analysis. Several promising approaches for data exploitation were presented at the Conference. These are summarized in a companion report. In this paper, we introduce essential concepts in machine learning and assess techniques which could provide both incremental and comprehensive value for event discrimination by increasing the accuracy of the final data product, refining On-Site-Inspection (OSI) conclusions, and potentially reducing the cost of future network operations.

  13. 13C-tryptophan breath test detects increased catabolic turnover of tryptophan along the kynurenine pathway in patients with major depressive disorder

    PubMed Central

    Teraishi, Toshiya; Hori, Hiroaki; Sasayama, Daimei; Matsuo, Junko; Ogawa, Shintaro; Ota, Miho; Hattori, Kotaro; Kajiwara, Masahiro; Higuchi, Teruhiko; Kunugi, Hiroshi

    2015-01-01

    Altered tryptophan–kynurenine (KYN) metabolism has been implicated in major depressive disorder (MDD). The l-[1-13C]tryptophan breath test (13C-TBT) is a noninvasive, stable-isotope tracer method in which exhaled 13CO2 is attributable to tryptophan catabolism via the KYN pathway. We included 18 patients with MDD (DSM-IV) and 24 age- and sex-matched controls. 13C-tryptophan (150 mg) was orally administered and the 13CO2/12CO2 ratio in the breath was monitored for 180 min. The cumulative recovery rate during the 180-min test (CRR0–180; %), area under the Δ13CO2-time curve (AUC; %*min), and the maximal Δ13CO2 (Cmax; %) were significantly higher in patients with MDD than in the controls (p = 0.004, p = 0.008, and p = 0.002, respectively). Plasma tryptophan concentrations correlated negatively with Cmax in both the patients and controls (p = 0.020 and p = 0.034, respectively). Our results suggest that the 13C-TBT could be a novel biomarker for detecting a subgroup of MDD with increased tryptophan–KYN metabolism. PMID:26524975
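
    The three summary statistics reported above (cumulative recovery rate, AUC of the Δ13CO2-time curve, and Cmax) are standard breath-curve measures. The sketch below shows how the latter two can be computed from sampled data; the function name and the toy curve are illustrative, and the cumulative recovery rate is omitted because it additionally requires the administered dose and an assumed CO2 production rate.

```python
import numpy as np

def breath_curve_metrics(t_min, delta13co2):
    """AUC (%*min, trapezoidal rule) and Cmax (%) of a Delta13CO2-time curve
    sampled at times t_min (minutes)."""
    t = np.asarray(t_min, dtype=float)
    y = np.asarray(delta13co2, dtype=float)
    # Trapezoidal area under the curve over the sampled interval
    auc = float(np.sum((y[1:] + y[:-1]) * np.diff(t) / 2.0))
    cmax = float(np.max(y))
    return auc, cmax

t = np.arange(0, 181, 15)                  # samples every 15 min for 180 min
curve = 0.02 * t * np.exp(-t / 60.0)       # toy single-compartment shape
auc, cmax = breath_curve_metrics(t, curve)
```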

  14. On the usability of frequency distributions and source attribution of Cs-137 detections encountered in the IMS radio-nuclide network for radionuclide event screening and climate change monitoring

    NASA Astrophysics Data System (ADS)

    Becker, A.; Wotawa, G.; Zähringer, M.

    2009-04-01

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), airborne radioactivity is measured by means of high-purity germanium gamma-ray detectors deployed in a global monitoring network. Almost 60 of the scheduled 80 stations had been put into provisional operation by the end of 2008. Each station sends the spectroscopic data of its daily 24-hour samples to the Vienna-based Provisional Technical Secretariat (PTS) of the CTBT Organization (CTBTO) for review for treaty-relevant nuclides. Cs-137 is one of these relevant isotopes. Its typical minimum detectable concentration is in the order of a few Bq/m3. However, this isotope is also known to occur in atmospheric trace concentrations due to known non-CTBT-relevant processes and sources related to, for example, the re-suspension of cesium from historic nuclear tests and/or the Chernobyl reactor disaster, temporarily enhanced by biomass burning (Wotawa et al., 2006). Properly attributed cesium detections can be used as a proxy to detect aeolian dust events (Igarashi et al., 2001) that potentially carry cesium from all the aforementioned sources, but such events are also known to play an important role in the radiative forcing of the atmosphere (shadow effect), at the surface (albedo), and in the carbon dioxide cycle when interacting with oceanic phytoplankton (Mikami and Shi, 2005). In this context, this paper provides a systematic attribution of recent Cs-137 detections in the PTS monitoring network in order to (i) characterize those stations which are regularly affected by Cs-137, (ii) provide input for procedures that distinguish CTBT-relevant detections from other sources (event screening), and (iii) explore the capability of certain stations to use their Cs-137 detections as a proxy for aeolian dust events and to flag the corresponding filters as relevant for further investigations in this field (-> EGU-2009 Session CL16/AS4.6/GM10.1: Aeolian dust: initiator, player, and recorder of environmental change). References: Igarashi, Y., M. Aoyama, K. Hirose, M. Takashi and S. Yabuki, 2001: Is It Possible to Use 90Sr and 137Cs As Tracers for the Aeolian Dust Transport? Water, Air, & Soil Pollution 130, 349-354. Mikami, M. and G. Shi, 2005: Preliminary summary of aeolian dust experiment on climate impact - Japan-Sino joint project ADEC. Geophysical Research Abstracts, 7, 05985. Wotawa, G., L.-E. De Geer, A. Becker, R. D'Amours, M. Jean, R. Servranckx and K. Ungar, 2006: Inter- and intra-continental transport of radioactive cesium released by boreal forest fires, Geophys. Res. Lett. 33, L12806, doi:10.1029/2006GL026206. Disclaimer: The views expressed in this publication are those of the authors and do not necessarily reflect the views of the CTBTO Preparatory Commission.

  15. On the impact of RN network coverage on event selection and data fusion during the 2009 National Data Centres Preparedness Exercise

    NASA Astrophysics Data System (ADS)

    Becker, Andreas; Krysta, Monika; Auer, Matthias; Brachet, Nicolas; Ceranna, Lars; Gestermann, Nicolai; Nikkinen, Mika; Zähringer, Matthias

    2010-05-01

    The National Data Centres (NDCs) of the Provisional Technical Secretariat (PTS) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization are in charge of providing the final judgement on the CTBT relevance of explosion events encountered in the PTS International Monitoring System (IMS). The IMS is a 321-station network set up by the PTS (completion level to date: 80%) to monitor globally for the occurrence of CTBT-relevant seismo-acoustic and radionuclide signals. NDCs learn about any seismo-acoustic or radionuclide event by active retrieval of, or subscription to, the corresponding event lists and products provided by the International Data Centre (IDC) of the PTS. To prepare for their instrumental role in case of a CTBT-relevant event, the NDCs jointly conduct annual NDC Preparedness Exercises. In 2009, NDC Germany was in charge of leading the exercise and of choosing a seismo-acoustic event from the list of events provided by the PTS (Gestermann et al., EGU2010-13067). The novelty in this procedure was that the infrasound readings and the monitoring coverage of existing (certified) radionuclide stations in the area of consideration were also taken into account during the event selection process (Coyne et al., EGU2010-12660). The event finally chosen and examined took place near the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20:31 UTC (Event-ID 5727516). NDC Austria performed forward atmospheric transport modelling to predict the RN measurements that would have occurred in the radionuclide IMS in the fictitious case of a release of radionuclides at the same location (Wotawa and Schraik, 2010; EGU2010-4907), with a strength typical for a non-contained nuclear explosion.
    The stations indicated should then be analysed for their actual radionuclide readings in order to confirm the non-nuclear character of the event (negative testing scenario). Obviously, only stations already set up and certified as capable of full operations could be recruited for this, and here an issue was encountered with regard to the availability of RN data at certified RN stations. In addition to supporting event selection, the PTS also supplied so-called data fusion bulletins that apply a method to collocate the RN and seismo-acoustic source location results (Krysta and Becker, EGU2010-10218). In this paper we demonstrate the impact on source location capacities of the gaps in network coverage that arise from the aforementioned reduced RN data availability. Network coverage assessments for the set of certified stations and for the reduced set of stations actually sending data are therefore discussed. Furthermore, the capabilities and constraints of the data fusion method in making up for the RN source location accuracy losses related to reduced RN data availability at certified stations are presented.

  16. Construction of 3-D Earth Models for Station Specific Path Corrections by Dynamic Ray Tracing

    DTIC Science & Technology

    2001-10-01

    the numerical eikonal solution method of Vidale (1988) being used by the MIT-led consortium. The model construction described in this report relies...assembled. REFERENCES Barazangi, M., Fielding, E., Isacks, B. & Seber, D., (1996), Geophysical And Geological Databases And CTBT...(preprint download6). Fielding, E., Isacks, B.L., and Barazangi, M. (1992), A Network Accessible Geological and Geophysical Database for

  17. An Improved Method for Seismic Event Depth and Moment Tensor Determination: CTBT Related Application

    NASA Astrophysics Data System (ADS)

    Stachnik, J.; Rozhkov, M.; Baker, B.

    2016-12-01

    According to the Protocol to the CTBT, the International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and to assist States Parties in identifying the source of a specific event. Determination of a seismic event's source mechanism and depth is part of these tasks. It is typically done through a linearized inversion of the waveforms for a complete or partial set of source parameters, or through a similarly defined grid search over precomputed Green's functions created for particular source models. We show preliminary results using the latter approach, from an improved software design applied on a moderately powered computer. In this development we tried to be compliant with the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body- and surface-wave recordings, be fast enough to satisfy both on-demand studies and automatic processing, properly incorporate observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. The implemented HDF5-based Green's function pre-packaging allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will allow the rapid use of Instaseis/AxiSEM full-waveform synthetics added to a pre-computed GF archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for the DPRK 2009, 2013 and 2016 events and for shallow earthquakes, using a new implementation of waveform fitting of teleseismic P waves. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions.
    A recent method by Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective is used. Moment tensors for the DPRK events show isotropic percentages greater than 50%. Depth estimates for the DPRK events range from 1.0-1.4 km. Probabilistic uncertainty estimates on the moment tensor parameters establish the robustness of the solution.
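
    The core of a grid search over precomputed Green's functions can be sketched in a few lines: each candidate (mechanism, depth) pair maps to a synthetic waveform, the scalar moment is solved per candidate by a one-parameter least-squares fit, and the candidate with the highest variance reduction wins. This is a generic illustration, not the authors' software; the function name and the dictionary-of-synthetics interface are assumptions, and the real system searches the full moment tensor space discretized after Tape & Tape (2012).

```python
import numpy as np

def grid_search_source(observed, synthetics):
    """Grid search over candidate (mechanism, depth) source models.
    `synthetics` maps a (mechanism_id, depth_km) key to a synthetic waveform
    built from precomputed Green's functions. Returns the best key, the
    best-fitting scalar amplitude (moment), and its variance reduction."""
    best = None
    for key, syn in synthetics.items():
        scale = np.dot(syn, observed) / np.dot(syn, syn)      # optimal moment
        resid = observed - scale * syn
        vr = 1.0 - np.sum(resid ** 2) / np.sum(observed ** 2)  # variance reduction
        if best is None or vr > best[2]:
            best = (key, scale, vr)
    return best
```

    Because the amplitude enters linearly, each candidate costs only two dot products, which is what makes an exhaustive search over the moment tensor space feasible on a modest computer.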

  18. Monitoring Research in the Context of CTBT Negotiations and Networks,

    DTIC Science & Technology

    1995-08-14

    1995) estimates, using infrasound and satellite data, that these sources generate explosion-like signals worldwide at a rate of approximately 1/yr at...coupling and the waveform appearance of atmospheric explosions. In infrasound there is the development of new array designs and of new automatic detection ...sensors. The principal daily use of the hydroacoustic network is for purposes of simple discrimination of those oceanic earthquakes detected by the seismic

  19. Xenon monitoring and the Comprehensive Nuclear-Test-Ban Treaty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowyer, Theodore W.

    How do you monitor (verify) a CTBT? It is a difficult challenge to monitor the entire world for nuclear tests, regardless of size. Nuclear tests 'normally' occur underground, above ground or underwater. Setting aside very small tests (let's limit our thinking to 1 kiloton or more), nuclear tests shake the ground, emit large amounts of radioactivity, and make loud noises if in the atmosphere (or hydroacoustic waves if underwater).

  20. Comprehensive test ban negotiations

    NASA Astrophysics Data System (ADS)

    Grab, G. Allen; Heckrotte, Warren

    1983-10-01

    Although a comprehensive test ban has been a stated policy goal of American and Soviet leaders since 1958 (with the exception of Ronald Reagan), the world today is still without a Comprehensive Test Ban Treaty. Throughout their history, test ban negotiations have been plagued by a number of persistent problems. Chief among these are East-West differences on the verification question, with the United States concerned about the problem of possible Soviet cheating and the USSR concerned about the protection of its national sovereignty. In addition, internal bureaucratic politics have played a major role in preventing the successful conclusion of an agreement. Despite these problems, the superpowers have concluded several significant partial measures: a brief (1958-1961) total moratorium on nuclear weapons tests; the Limited Test Ban Treaty of 1963, banning tests in the air, water and outer space; the Threshold Test Ban Treaty of 1974 (150 kT limit on underground explosions); and the Peaceful Nuclear Explosions Treaty of 1976 (150 kT limit on individual PNEs). Today, the main U.S. objections to a CTBT center in the nuclear weapons laboratories, the Department of Energy, and the Pentagon, which all stress the issues of stockpile reliability and verification. Those who remain committed to a CTBT emphasize the potential political leverage it offers in checking both horizontal and vertical proliferation.

  1. Post-installation activities in the Comprehensive Nuclear Test Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Hoffmann, T. L.; Campus, P.; Bell, M.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.; Wu, Sean F.

    2002-11-01

    The provisional operation and maintenance of IMS infrasound stations after installation and subsequent certification is intended to prepare the infrasound network for entry into force of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The goal is to maintain and fine-tune the technical capabilities of the network, to repair faulty equipment, and to ensure that stations continue to meet the minimum specifications through evaluation of data quality and station recalibration. Owing to the globally dispersed nature of the network, this program constitutes a significant undertaking that requires careful consideration of possible logistic approaches and their financial implications. Currently, 11 of the 60 IMS infrasound stations are transmitting data in the post-installation Testing & Evaluation mode. Another 5 stations are under provisional operation and are maintained in post-certification mode. It is expected that 20% of the infrasound network will be certified by the end of 2002. This presentation will focus on the different phases of post-installation activities of the IMS infrasound program and on the logistical challenges to be tackled to ensure cost-efficient management of the network. Specific topics will include Testing & Evaluation and Certification of infrasound stations, as well as Configuration Management and Network Sustainment.

  2. Towards a new daily in-situ precipitation data set supporting parameterization of wet-deposition of CTBT relevant radionuclides

    NASA Astrophysics Data System (ADS)

    Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Lehner, K.; Rudolf, B.

    2012-04-01

    As a contribution to the World Climate Research Programme (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Schneider et al., 2011; Becker et al., 2012; Ziese et al., EGU2012-5442) is available two months after the fact, based on the data gathered by listening to the GTS to fetch the SYNOP and CLIMAT messages. This product also serves as the reference data to calibrate satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffman et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available already 3-5 days after the month regarded. Both the GPCC and the GPCP products can serve as the data base for the computationally lightweight post-processing of the wet-deposition impact on the radionuclide (RN) monitoring capability of the CTBT network (Wotawa et al., 2009) on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, wet-deposition recognition is a prerequisite whenever ratios of particulate and noble gas measurements come into play. This is so far a largely unexplored field of investigation, but it would alleviate the clearance as bogus of several apparently CTBT-relevant detections encountered in the past, and provide an assessment of the hitherto overestimated RN detection capability of the CTBT network.
    Besides the climatological kind of wet-deposition assessment for threshold monitoring purposes, there are also singular release events, like the Fukushima accident, that need to be classified as bogus by a properly working RN verification regime. For these kinds of events a higher temporal resolution of the precipitation data sets is needed. In the course of the research project 'Global DAily Precipitation Analysis for the validation of medium-range CLImate Predictions' (DAPACLIP), within the framework research programme MiKlip (Mittelfristige Klimaprognose) funded by the German ministry for research (BMBF), a new quality-controlled, globally gridded daily precipitation data set is being built up, to which GPCC will contribute the land-surface compartment. The data set is primarily constructed to study the decadal behaviour of the essential climate variable precipitation, but as a collateral benefit it will also serve the needs of the RN verification regime. The Fukushima accident has also provided impetus to construct even hourly in-situ precipitation data sets, as will be presented in the same session by Yatagai (2012). A comprehensive overview of available precipitation data sets based on in-situ (rain gauge) measurements, satellite measurements, or a combination of both is available from the International Precipitation Working Group (IPWG) web pages (http://www.isac.cnr.it/~ipwg/data/datasets.html).

  3. Nuclear Weapons: Comprehensive Test Ban Treaty

    DTIC Science & Technology

    2007-10-29

    which has been done. Critics raised concerns about the implications of these policies for testing and new weapons. At present, Congress addresses...Comprehensive Test Ban Treaty Most Recent Developments On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in...future, but there are no plans to do so.’”5 Critics expressed concern about the implications of these policies for testing and new weapons. A statement

  4. Comprehensive Nuclear-Test-Ban Treaty: Updated ’Safeguards’ and Net Assessments

    DTIC Science & Technology

    2009-06-03

    measures that this nation can take unilaterally within the treaty to protect its nuclear security. To compensate for “disadvantages and risk” they...and strategic forces, and could be augmented with implementation measures. While Safeguards may be part of a future CTBT debate, both supporters and...A second path involves efforts to alter the net assessment through measures intended to mitigate perceived risks of the treaty. This path has been

  5. China’s Foreign Policy Toward North Korea: The Nuclear Issue

    DTIC Science & Technology

    2012-12-01

    Comprehensive National Power CTBT Comprehensive Test Ban Treaty CVID Complete, Verifiable, and Irreversible Dismantlement EAS East Asia Summit ETIM...realized that it had to take some measures to stop North Korea’s nuclear testing.5 Hong-seo Park analyzes China’s policy change from a perspective of...community have failed to find a consensus, and North Korea conducted a nuclear test in 2006. China had shown different responses between the first and

  6. North Korea’s 2009 Nuclear Test: Containment, Monitoring, Implications

    DTIC Science & Technology

    2010-04-02

    inspections as prima facie evidence of a violation. One generally-accepted means of evading detection of nuclear tests, especially low-yield tests...In an attempt to extend these bans to cover all nuclear tests, negotiations on the CTBT were completed in 1996. The treaty’s basic obligation is to...Verification refers to determining whether a nation is in compliance with its treaty obligations , which in this case means determining whether a suspicious

  7. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

    Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, the Expert Technical Analysis (ETA) conducted in the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered here within the ETA framework is the unambiguous separation of signals with close arrival times. We examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with the wavetrains of a strong earthquake. The importance of resolving case 1 is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can to some extent be approached with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), a blind source separation method which assumes non-Gaussianity of the underlying processes in the signal mixture, is what we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, with seismic data from the DPRK explosions and from mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes).
    The data were recorded by seismic arrays of the International Monitoring System of the CTBTO and by the small-aperture seismic array Mikhnevo (MHVAR) operated by the Institute of Geosphere Dynamics, Russian Academy of Sciences. Our approach demonstrated a good ability to separate seismic sources with very close origin times and locations (hundreds of meters) and/or close arrival times (fractions of seconds), and to recover their waveforms from the mixture. Perspectives and limitations of the method are discussed.
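
    The FastICA idea referred to above (whiten the mixture, then iterate a fixed-point update that maximizes non-Gaussianity) can be sketched in a few dozen lines of numpy. This is a toy illustration with the tanh nonlinearity and symmetric decorrelation, not the authors' production code, and it ignores the slowness/azimuth subtleties of real array data.

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) separating a mixture
    X (n_signals x n_samples) into statistically independent sources."""
    # Center and whiten the observations
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    K = E @ np.diag(d ** -0.5) @ E.T          # whitening matrix
    Z = K @ X
    n, N = Z.shape
    W = np.random.default_rng(seed).standard_normal((n, n))
    for _ in range(n_iter):
        WZ = W @ Z
        G, Gp = np.tanh(WZ), 1 - np.tanh(WZ) ** 2
        # Fixed-point update: E[g(Wz) z^T] - diag(E[g'(Wz)]) W
        W = (G @ Z.T) / N - np.diag(Gp.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W, via SVD
        u, _, vt = np.linalg.svd(W)
        W = u @ vt
    return W @ Z                               # recovered sources
```

    Recovery is only up to permutation, sign, and scale, which is why downstream yield estimation still needs amplitude information from elsewhere.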

  8. WWW.NMDB.EU: The real-time Neutron Monitor database

    NASA Astrophysics Data System (ADS)

    Klein, Karl-Ludwig; Steigies, Christian T.; Wimmer-Schweingruber, Robert F.; Kudela, Karel; Strharsky, Igor; Langer, Ronald; Usoskin, Ilya; Ibragimov, Askar; Flückiger, Erwin O.; Bütikofer, Rolf; Eroshenko, Eugenia; Belov, Anatoly; Yanke, Victor; Fuller, Nicolas; Mavromichalaki, Helen; Papaioannou, Athanasios; Sarlanis, Christos; Souvatzoglou, George; Plainaki, Christina; Gerontidou, Maria; Papailiou, Maria-Christina; Mariatos, George; Chilingaryan, Ashot; Hovsepyan, G.; Reymers, Artur; Parisi, Mario; Kryakunova, Olga; Tsepakina, Irina; Nikolayevskiy, Nikolay; Dorman, Lev; Pustil'nik, Lev; García-Población, Oscar

    The real-time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. Besides real-time and historical data over several decades in a unified format, it offers data products such as galactic cosmic-ray spectra and applications including solar energetic particle alerts and the calculation of ionisation rates in the atmosphere and of effective radiation dose rates at aircraft altitudes. Furthermore, the web site comprises public outreach pages in several languages and offers training material on cosmic rays for university students and for researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the services provided and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.

  9. SUPER-FOCUS: A tool for agile functional analysis of shotgun metagenomic data

    DOE PAGES

    Silva, Genivaldo Gueiros Z.; Green, Kevin T.; Dutilh, Bas E.; ...

    2015-10-09

    Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question of what they can do. Currently available tools do not scale well with increasing data volumes, which is important because both the number and lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. We tested SUPER-FOCUS with over 70 real metagenomes; the results show that it accurately predicts the subsystems present in the profiled microbial communities and is up to 1,000 times faster than other tools.

  10. An Agile Functional Analysis of Metagenomic Data Using SUPER-FOCUS.

    PubMed

    Silva, Genivaldo Gueiros Z; Lopes, Fabyano A C; Edwards, Robert A

    2017-01-01

    One of the main goals in metagenomics is to identify the functional profile of a microbial community from unannotated shotgun sequencing reads. Functional annotation is important in biological research because it enables researchers to identify the abundance of functional genes of the organisms present in the sample, answering the question, "What can the organisms in the sample do?" Most currently available approaches do not scale with increasing data volumes, which is important because both the number and lengths of the reads provided by sequencing platforms keep increasing. Here, we present SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with real metagenomes, and the results show that it accurately predicts the subsystems present in the profiled microbial communities, is computationally efficient, and up to 1000 times faster than other tools. SUPER-FOCUS is freely available at http://edwards.sdsu.edu/SUPERFOCUS .
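
The core idea described above, assigning reads to subsystems via best hits against a reduced reference database and reporting relative abundances, can be sketched in a few lines. The hit table and gene-to-subsystem map below are hypothetical stand-ins, not SUPER-FOCUS's actual data structures:

```python
from collections import Counter

# Hypothetical best-hit table (read id -> matched reference gene) and a
# gene -> subsystem map, standing in for the reduced reference database.
best_hits = {
    "read1": "geneA", "read2": "geneA", "read3": "geneB",
    "read4": "geneC", "read5": "geneB", "read6": "geneB",
}
gene_to_subsystem = {
    "geneA": "Carbohydrate metabolism",
    "geneB": "Amino acid biosynthesis",
    "geneC": "Carbohydrate metabolism",
}

def subsystem_profile(hits, mapping):
    """Count reads per subsystem and normalize to relative abundance."""
    counts = Counter(mapping[g] for g in hits.values())
    total = sum(counts.values())
    return {s: n / total for s, n in counts.items()}

profile = subsystem_profile(best_hits, gene_to_subsystem)
print(profile)  # both subsystems receive 3 of 6 reads, i.e. 0.5 each
```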

  11. Comprehensive Nuclear-Test-Ban Treaty: Background and Current Developments

    DTIC Science & Technology

    2008-04-30

    resumed testing, and has no plans to test. It has reduced the time needed to conduct a nuclear test. Critics raised concerns about the implications of...lieu of the current treaty.1 On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in H.R. 1585, the FY2008...2007. Critics expressed concern about the implications of these policies for testing and new weapons. A statement by Physicians for Social

  12. The Crustal Structure And CTBT Monitoring Of India: New Insights From Deep Seismic Profiling

    DTIC Science & Technology

    2000-09-01

    transitional type crust as a major source of Deccan trap flows. The Narmada-Son lineament is the most conspicuous linear geological feature in the... Deccan proto-continents) buckling of the upper and middle crustal layers of the proto-continents took place, resulting in the western block’s lower...crustal column subducting below the Deccan proto-continents. Thus, the collision process was of such severe magnitude that the impact was seen in both

  13. SPALAX new generation: New process design for a more efficient xenon production system for the CTBT noble gas network.

    PubMed

    Topin, Sylvain; Greau, Claire; Deliere, Ludovic; Hovesepian, Alexandre; Taffary, Thomas; Le Petit, Gilbert; Douysset, Guilhem; Moulin, Christophe

    2015-11-01

    The SPALAX (Système de Prélèvement Automatique en Ligne avec l'Analyse du Xénon) is one of the systems used in the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) to detect radioactive xenon releases following a nuclear explosion. Approximately 10 years after the industrialization of the first system, the CEA has developed the SPALAX New Generation, SPALAX-NG, with the aim of increasing the global sensitivity and reducing the overall size of the system. A major breakthrough has been obtained by improving the sampling stage and the purification/concentration stage. The sampling stage evolution consists of increasing the sampling capacity and improving the gas treatment efficiency across new permeation membranes, leading to an increase in the xenon production capacity by a factor of 2-3. The purification/concentration stage evolution consists of using a new adsorbent Ag@ZSM-5 (or Ag-PZ2-25) with a much larger xenon retention capacity than activated charcoal, enabling a significant reduction in the overall size of this stage. The energy consumption of the system is similar to that of the current SPALAX system. The SPALAX-NG process is able to produce samples of almost 7 cm³ of xenon every 12 h, making it the most productive xenon process among the IMS systems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Worldwide measurements of radioxenon background near isotope production facilities, a nuclear power plant and at remote sites: the "EU/JA-II" Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saey, P. R.J.; Ringbom, Anders; Bowyer, Ted W.

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) specifies that radioxenon measurements should be performed at 40 or more stations worldwide within the International Monitoring System (IMS). Measuring radioxenon is one of the principal techniques used to detect underground nuclear explosions. Specifically, the presence and ratios of different radioxenon isotopes allow one to determine whether a detection event under consideration originated from a nuclear explosion or a civilian source. However, radioxenon monitoring on a global scale is a novel technology, and the global civil background must be characterized sufficiently. This paper lays out a study, based on several unique measurement campaigns, of the worldwide concentrations and sources of verification-relevant xenon isotopes. It complements the experience already gathered with radioxenon measurements within the CTBT IMS programme and focuses on locations in Belgium, Germany, Kuwait, Thailand and South Africa, where very little information was available on ambient xenon levels or where interesting sites offered opportunities to learn more about emissions from known sources. The findings corroborate the hypothesis that a few major radioxenon sources contribute in great part to the global radioxenon background. Additionally, the existence of independent sources of 131mXe (the daughter of 131I) has been demonstrated, which has some potential to bias the isotopic signature of signals from nuclear explosions.
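
Isotopic-ratio screening of the kind mentioned above is commonly illustrated by comparing the 133mXe/131mXe activity ratio against the 135Xe/133Xe ratio on a log-log plane, with explosion-like signatures falling on one side of a screening line. The sketch below uses that idea, but the slope and intercept are hypothetical placeholders, not the operational IMS criterion:

```python
import math

def ratio_discriminant(a131m, a133m, a133, a135, slope=1.0, intercept=0.0):
    """Return log10 isotope ratios and a rough explosion-like flag.

    Activities in mBq/m^3. The screening line (slope, intercept) is a
    hypothetical placeholder; operational screening criteria differ.
    """
    x = math.log10(a133m / a131m)   # metastable-pair ratio
    y = math.log10(a135 / a133)     # short-lived pair ratio
    return x, y, y > slope * x + intercept

# Hypothetical sample: 1, 5, 1 and 50 mBq/m^3 for 131mXe, 133mXe, 133Xe, 135Xe
x, y, flag = ratio_discriminant(1.0, 5.0, 1.0, 50.0)
print(round(x, 3), round(y, 3), flag)  # → 0.699 1.699 True
```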

  15. Short-range quantitative precipitation forecasting using Deep Learning approaches

    NASA Astrophysics Data System (ADS)

    Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.

    2017-12-01

    Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve precipitation forecasting skill using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of the Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of the merged LSTM with PERSIANN are compared to those of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and to the Final Analysis of the Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during three storm events, each located over one of the study regions. The results indicate that the merged LSTM forecasts outperform the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient, especially in convective systems. The proposed method thus shows superior capabilities in short-term forecasting over the compared methods.
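
As background on the LSTM technique itself (this is not the authors' PERSIANN-coupled model), the gating equations of a single LSTM cell can be written out directly; the dimensions, random weights and "image frame" inputs below are arbitrary illustrations:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates computed from input x and previous state (h, c).

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) biases,
    stacked in the order [input gate, forget gate, cell candidate, output gate].
    """
    H = h.size
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:H]))       # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))    # forget gate
    g = np.tanh(z[2*H:3*H])            # candidate cell state
    o = 1 / (1 + np.exp(-z[3*H:]))     # output gate
    c_new = f * c + i * g              # keep old memory, add new content
    h_new = o * np.tanh(c_new)         # exposed hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 8, 4                            # e.g. D pixels of an image patch
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for frame in rng.normal(size=(5, D)):  # a short sequence of "frames"
    h, c = lstm_step(frame, h, c, W, U, b)
print(h.shape)  # → (4,)
```

The forget/input gates are what let the cell carry cloud-evolution information across many frames, the property the abstract relies on.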

  16. DTRA's Nuclear Explosion Monitoring Research and Development Program

    NASA Astrophysics Data System (ADS)

    Nichols, J.; Dainty, A.; Phillips, J.

    2001-05-01

    The Defense Threat Reduction Agency (DTRA) has a Program in Basic Research and Development for Nuclear Explosion Technology within the Nuclear Treaties Branch of the Arms Control Technology Division. While the funding justification is Arms Control Treaties (i.e., the Comprehensive Nuclear-Test-Ban Treaty, CTBT), the results are made available for any user. Funding for the Program has recently averaged around $10 million per year. By Congressional mandate, the program has disbursed money through competitive, peer-reviewed Program Research and Development Announcements (PRDAs); there is usually (but not always) a PRDA each year. Typical awards have been for about three years at ~$100,000 per year; currently there are over 60 contracts in place. In addition to the "typical" awards, there was an initiative in 2000 to fund seismic location calibration of the International Monitoring System (IMS) of the CTBT; there are three three-year contracts of ~$1,000,000 per year to perform such calibration for Eurasia, North Africa and the Middle East. Scientifically, four technological areas have been funded, corresponding to the four technologies in the IMS: seismic, infrasound, hydroacoustic, and radionuclide, with the lion's share of the funding going to the seismic area. The scientific focus of the Program for all four technologies is detection of signals, locating their origin, and trying to determine whether they are unambiguously natural in origin ("event screening"). Location has been a particular and continuing focus within the Program.

  17. Spalax™ new generation: A sensitive and selective noble gas system for nuclear explosion monitoring.

    PubMed

    Le Petit, G; Cagniant, A; Gross, P; Douysset, G; Topin, S; Fontaine, J P; Taffary, T; Moulin, C

    2015-09-01

    In the context of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), CEA is developing a new generation (NG) of the SPALAX™ system for atmospheric radioxenon monitoring. These systems are able to extract more than 6 cm³ of pure xenon from air samples every 12 h and to measure the four relevant xenon radioactive isotopes using a high-resolution detection system operating in electron-photon coincidence mode. This paper presents the performance of the SPALAX™ NG prototype in operation at the Bruyères-le-Châtel CEA centre, integrating the most recent CEA developments. It especially focuses on an innovative detection system made up of a gas cell equipped with two face-to-face silicon detectors associated with one or two germanium detectors. Minimum Detectable Activity Concentrations (MDCs) of environmental samples were calculated to be approximately 0.1 mBq/m³ for the isotopes 131mXe, 133mXe and 133Xe, and 0.4 mBq/m³ for 135Xe (single-germanium configuration). The detection system might be used to simultaneously measure particulate and noble gas samples from the CTBT International Monitoring System (IMS). That possibility could lead to new capabilities for particulate measurements by allowing electron-photon coincidence detection of certain fission products. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Completing and sustaining IMS network for the CTBT Verification Regime

    NASA Astrophysics Data System (ADS)

    Meral Ozel, N.

    2015-12-01

    The CTBT International Monitoring System is to comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable and state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing the investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for on-site inspection purposes, optimization of beta-gamma detectors for xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.

  19. Proceedings of the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, James W., LTC

    2000-09-15

    These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States' capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.

  20. Prevalence and Clinical Relevance of Exon 2 Deletion of COMMD1 in Bedlington Terriers in Korea.

    PubMed

    Kim, Y G; Kim, S Y; Kim, J H; Lee, K K; Yun, Y M

    2016-11-01

    Deletion of exon 2 of copper metabolism domain containing 1 (COMMD1) results in copper toxicosis in Bedlington terriers (CT-BT). This study was conducted to identify the prevalence and clinical relevance of the COMMD1 mutation in Bedlington terriers in Korea. A total of 105 purebred Bedlington terriers (50 males, 55 females) from the kennels and pet dog clubs in Korea were examined during the period 2008-2013. A multiplex PCR was carried out to detect exon 2 deletion of COMMD1. Clinical analysis was performed on each genetic group, and clinical status of the dogs was followed up to estimate survival probability. Of the 105 samples, 52 (49%) were wild-type homozygote, 47 (45%) were heterozygote, and 6 (6%) were mutant-type homozygote. Plasma alanine aminotransferase (ALT) activity was increased in the mutant-type homozygous group >2 years of age (P < .0001). The survival probability of 6 mutant-type homozygotes surviving 2.5 years was 0.67, and 4 years was 0.5. Results show the prevalence and clinical relevance of exon 2 deletion of COMMD1 and could help establish a structured selective breeding program to prevent CT-BT in Korea. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
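
The reported survival probabilities (0.67 at 2.5 years and 0.5 at 4 years for the 6 mutant-type homozygotes) are consistent with a Kaplan-Meier estimate assuming three deaths and no censoring before 4 years; the death times below are hypothetical, chosen only to reproduce those figures:

```python
def kaplan_meier(event_times, n_at_risk):
    """Kaplan-Meier survival estimate for a list of death times.

    Assumes one death per time point and no censoring; at each death the
    survival probability is multiplied by (n_at_risk - 1) / n_at_risk.
    """
    s, curve = 1.0, []
    for t in sorted(event_times):
        s *= (n_at_risk - 1) / n_at_risk
        n_at_risk -= 1
        curve.append((t, s))
    return curve

# Hypothetical death times (years) for the 6 mutant-type homozygotes
curve = kaplan_meier([2.0, 2.5, 4.0], n_at_risk=6)
print(curve)  # survival drops to ~0.67 by 2.5 years and 0.5 by 4 years
```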

  1. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, R.L.; Gross, D.; Pearson, D.C.

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international, worldwide Comprehensive Test Ban Treaty (CTBT, which bans nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments that produce results of mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in the radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship between local and regional seismic amplitude and explosive yield for overburden cast, coal bulking and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  2. Obtaining Unique, Comprehensive Deep Seismic Sounding Data Sets for CTBT Monitoring and Broad Seismological Studies

    DTIC Science & Technology

    2007-07-02

    TYPE Final Report 3. DATES COVERED (From - To) 26-Sep-01 to 26-Jun-07 4. TITLE AND SUBTITLE OBTAINING UNIQUE, COMPREHENSIVE DEEP SEISMIC ... seismic records from 12 major Deep Seismic Sounding (DSS) projects acquired in 1970-1980’s in the former Soviet Union. The data include 3-component...records from 22 Peaceful Nuclear Explosions (PNEs) and over 500 chemical explosions recorded by a grid of linear, reversed seismic profiles covering a

  3. Seismological Investigations of the National Data Centre Preparedness Exercise 2015 (NPE2015)

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Hartmann, Gernot; Ross, Jens-Ole

    2017-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data are collected, processed and technically analyzed in the International Data Centre (IDC) of the CTBT Organization, National Data Centres (NDCs) provide interpretation and advice to their governments concerning suspicious detections occurring in IMS data. The National Data Centre Preparedness Exercises (NPEs) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies and national technical means. These exercises should help to evaluate the effectiveness of analysis procedures applied at NDCs and the quality, completeness and usefulness of IDC products. The NPE2015 is a combined radionuclide-waveform scenario. Fictitious particulate radionuclide and radioxenon measurements at stations of the IMS of the CTBTO were reported to the international community. The reported isotopes and concentrations could arise from an underground nuclear explosion (UNE). The task of the exercise is to identify the scenario behind the provided data. The source region and time domain of a possible treaty-violating activity were determined from atmospheric transport modelling (ATM) in backtracking mode, with the fictitious measurements as input data. A time slot in October and a region around the mining area of Lubin could be identified as the possible source area of the fictitious measurements. The seismicity of the determined source region was investigated in detail to identify events within the relevant time interval that cannot be classified as natural or induced.
A comparison of spectral characteristics and a cluster analysis were applied to search for a non-characteristic event among a number of known induced events in the area. The results reveal that none of the candidate events had an explosion-like characteristic. All candidate events are part of an event cluster of at least seven events with comparable signatures. The probability of a treaty violation would be very low in a real scenario. If the nature of a suspicious event cannot be clarified with data of the IMS or national technical means, an on-site inspection (OSI) can be requested by the member states. Taking into account the results of the seismological investigations, it could be decided that an OSI was not necessary for the possible source region to exclude the possibility of a fictitious clandestine underground nuclear explosion.
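
A generic version of such a correlation-based screen (the exercise's actual spectral features and clustering method are not specified here) flags an event whose best match within the cluster of known induced events falls below a threshold; all spectra below are synthetic:

```python
import numpy as np

def max_similarity_to_cluster(spectra):
    """For each event spectrum, the highest correlation with any other event.

    An event whose best match stays below a chosen threshold would be
    flagged as non-characteristic among the known induced events.
    """
    n = len(spectra)
    best = np.full(n, -1.0)
    for i in range(n):
        for j in range(n):
            if i != j:
                r = np.corrcoef(spectra[i], spectra[j])[0, 1]
                best[i] = max(best[i], r)
    return best

# Seven synthetic "induced event" spectra sharing one shape, plus one
# dissimilar candidate with a spectral bump instead of a smooth decay.
rng = np.random.default_rng(1)
freqs = np.linspace(0.5, 10, 50)
template = np.exp(-0.3 * freqs)
cluster = [template * (1 + 0.05 * rng.normal(size=50)) for _ in range(7)]
outlier = np.exp(-((freqs - 6.0) ** 2))
best = max_similarity_to_cluster(cluster + [outlier])
print(best.round(2))  # cluster members match each other; the last one does not
```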

  4. On the exploitation of seismic resonances for cavity detection

    NASA Astrophysics Data System (ADS)

    Schneider, Felix M.; Esterhazy, Sofi; Perugia, Ilaria; Bokelmann, Götz

    2017-04-01

    We study the interaction of a seismic wave-field with a spherical acoustic gas- or fluid-filled cavity. The intention of this study is to clarify whether seismic resonances can be expected, a characteristic feature that may help detect cavities in the subsurface. This is important for many applications, in particular the detection of underground nuclear explosions, which are prohibited by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). On-Site Inspections (OSIs) are intended to ensure that a possible violation of the CTBT can be confirmed after the International Monitoring System (IMS) has detected a suspicious event indicative of a nuclear explosion. One primary structural target for the field team during an OSI is the detection of cavities created by underground nuclear explosions. The use of the cavity's seismic resonances for its detection is anticipated in the CTBT, which mentions "resonance seismometry" as a possible technique during OSIs. In order to calculate the full seismic wave-field from an incident plane wave that interacts with the cavity, we considered an analytic formulation of the problem. The wave-field interaction consists of elastic scattering and the coupling between the acoustic and elastic media. Acoustic resonant modes, caused by internal reflections in the acoustic cavity, show up as spectral peaks in the frequency domain. The resonant peaks correspond closely to the eigenfrequencies of the undamped system described by the particular acoustic medium bounded in a sphere with stiff walls. The filling of the cavity could thus be determined by observing the spectral peaks from acoustic resonances. Through energy transmission from the internal oscillations back into the elastic domain and through intrinsic attenuation, the oscillations are damped, resulting in a frequency shift and a limitation of the resonance amplitudes.
In the case of a gas-filled cavity, the impedance contrast is high, resulting in very narrow, high-amplitude resonances. In synthetic seismograms calculated in the surrounding elastic domain, the acoustic resonances of gas-filled cavities show up as persisting oscillations. However, due to the weak acoustic-elastic coupling in this case, the amplitudes of the oscillations are very low. Owing to its lower impedance contrast, a fluid-filled cavity has a stronger acoustic-elastic coupling, which results in wider spectral peaks of lower amplitude. In the synthetic seismograms derived in the medium surrounding fluid-filled cavities, acoustic resonances show up as strong but fast-decaying reverberations. Based on the analytical modeling, methods for exploiting these resonance features are developed and discussed.
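
The eigenfrequencies of the undamped reference system, an acoustic sphere bounded by stiff (rigid) walls, satisfy j_l'(ka) = 0, where j_l is the spherical Bessel function, k = 2*pi*f/c, and a is the cavity radius. A sketch for the lowest l = 0 mode, with an assumed air filling (c = 343 m/s) and an assumed 10 m radius:

```python
import math

def j0_prime_sign(x):
    """Sign-carrying numerator of d/dx [sin x / x] = (x cos x - sin x) / x^2."""
    return x * math.cos(x) - math.sin(x)

def first_l0_root(lo=0.1, hi=2 * math.pi):
    """First nonzero root of j0'(x) = 0, i.e. the solution of tan x = x."""
    # Bracket the sign change, then bisect.
    while j0_prime_sign(lo) * j0_prime_sign(hi) > 0:
        lo += 0.1
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if j0_prime_sign(lo) * j0_prime_sign(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x = first_l0_root()            # dimensionless root ka ~ 4.4934
c, a = 343.0, 10.0             # assumed sound speed (m/s) and radius (m)
f = c * x / (2 * math.pi * a)  # eigenfrequency in Hz
print(round(x, 4), round(f, 1))  # → 4.4934 24.5
```

For a gas filling the resonance sits in the tens of hertz for metre-to-decametre cavities, which is why such peaks are plausible targets for field seismometry.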

  5. Analysis, comparison, and modeling of radar interferometry data of surface deformation signals associated with underground explosions, mine collapses and earthquakes. Phase I: underground explosions, Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foxall, W; Vincent, P; Walter, W

    1999-07-23

    We have previously presented simple elastic deformation modeling results for three classes of seismic events of concern in monitoring the CTBT: underground explosions, mine collapses and earthquakes. Those results explored the theoretical detectability of each event type using synthetic aperture radar interferometry (InSAR) based on commercially available satellite data. In those studies we identified and compared the characteristics of synthetic interferograms that distinguish each event type, as well as the ability of the interferograms to constrain source parameters. These idealized modeling results, together with preliminary analysis of InSAR data for the 1995 mb 5.2 Solvay mine collapse in southwestern Wyoming, suggested that InSAR data used in conjunction with regional seismic monitoring holds great potential for CTBT discrimination and seismic source analysis, as well as for providing accurate ground-truth parameters for regional calibration events. In this paper we further examine the detectability and "discriminating" power of InSAR by presenting results from InSAR data processing, analysis and modeling of the surface deformation signals associated with underground explosions. Specifically, we present results of a detailed study of coseismic and postseismic surface deformation signals associated with underground nuclear and chemical explosion tests at the Nevada Test Site (NTS). Several interferograms were formed from raw ERS-1/2 radar data covering different time spans and epochs, beginning just prior to the last U.S. nuclear tests in 1992 and ending in 1996. These interferograms have yielded information about the nature and duration of the source processes that produced the surface deformations associated with these events. A critical result of this study is that significant post-event surface deformation associated with underground nuclear explosions detonated at depths in excess of 600 meters can be detected using differential radar interferometry.
An immediate implication of this finding is that underground nuclear explosions may not need to be captured coseismically by radar images acquired before and after an event in order to be detectable. This has obvious advantages in CTBT monitoring, since suspect seismic events (which usually can be located within a 100 km by 100 km area of an ERS-1/2 satellite frame by established seismic methods) can be imaged after the event has been identified and located by existing regional seismic networks. Key words: InSAR, SLC images, interferogram, synthetic interferogram, ERS-1/2 frame, phase unwrapping, DEM, coseismic, postseismic, source parameters.
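
For orientation, the basic InSAR relation behind such interferograms converts unwrapped phase to line-of-sight displacement via d = phi * lambda / (4*pi), the factor 4*pi accounting for the two-way radar path; sign conventions vary between processors. The ERS-1/2 C-band wavelength is used here:

```python
import numpy as np

WAVELENGTH = 0.0566  # ERS-1/2 C-band radar wavelength in metres

def phase_to_los(unwrapped_phase_rad):
    """Convert unwrapped interferometric phase to line-of-sight displacement.

    One full 2*pi fringe corresponds to lambda/2 of range change because
    the radar signal travels the path twice. Sign conventions vary; here
    positive phase maps to positive range change (e.g. subsidence over a
    collapsed explosion cavity, for a descending look geometry).
    """
    return unwrapped_phase_rad * WAVELENGTH / (4 * np.pi)

# One fringe of phase corresponds to lambda/2, i.e. about 2.83 cm of
# line-of-sight change for ERS-1/2.
print(phase_to_los(2 * np.pi) * 100)
```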

  6. The European Infrasound Bulletin

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Vergoz, Julien; Le Pichon, Alexis; Brachet, Nicolas; Blanc, Elisabeth; Kero, Johan; Liszka, Ludwik; Gibbons, Steven; Kvaerna, Tormod; Näsholm, Sven Peter; Marchetti, Emanuele; Ripepe, Maurizio; Smets, Pieter; Evers, Laslo; Ghica, Daniela; Ionescu, Constantin; Sindelarova, Tereza; Ben Horin, Yochai; Mialle, Pierrick

    2018-05-01

    The European Infrasound Bulletin highlights infrasound activity produced mostly by anthropogenic sources, recorded all over Europe and collected in the course of the ARISE and ARISE2 projects (Atmospheric dynamics Research InfraStructure in Europe). The data include high-frequency (> 0.7 Hz) infrasound detections at 24 European infrasound arrays from nine different national institutions, complemented with infrasound stations of the International Monitoring System for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Data were acquired during 16 years of operation (from 2000 to 2015) and processed to identify and locate ~48,000 infrasound events within Europe. The source locations of these events were derived by combining at least two corresponding station detections per event. Comparisons with ground-truth sources, e.g., Scandinavian mining activity, are provided, as well as comparisons with the CTBT Late Event Bulletin (LEB). Relocation is performed using ray-tracing methods to estimate celerity and back-azimuth corrections for source location, based on meteorological wind and temperature values for each event derived from European Centre for Medium-Range Weather Forecasts (ECMWF) data. This study focuses on the analysis of repeating, man-made infrasound events (e.g., mining blasts and supersonic flights) and on the seasonal, weekly and diurnal variation of infrasonic source activity in Europe. Comparison with previous studies shows that improvements in detection, association and location are achieved in this study by increasing the station density and thus the number of events and determined source regions. This improves the capability of the infrasound station network in Europe to estimate the activity of anthropogenic infrasound sources more comprehensively.
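
Deriving a source location "by combining at least two corresponding station detections per event" can be illustrated with a flat-earth cross-bearing intersection; this is a simplified sketch with made-up coordinates, not the bulletin's geodetic procedure:

```python
import math

def cross_bearing(p1, az1, p2, az2):
    """Intersect two bearings on a local flat (x = east, y = north, km) plane.

    p1, p2: station coordinates; az1, az2: back-azimuths toward the source
    in degrees clockwise from north. A flat-earth approximation, adequate
    only for short ranges; operational bulletins use geodetic methods.
    """
    d1 = (math.sin(math.radians(az1)), math.cos(math.radians(az1)))
    d2 = (math.sin(math.radians(az2)), math.cos(math.radians(az2)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via a 2x2 determinant.
    det = d2[0] * d1[1] - d1[0] * d2[1]
    t1 = ((p2[0] - p1[0]) * (-d2[1]) - (p2[1] - p1[1]) * (-d2[0])) / det
    return p1[0] + t1 * d1[0], p1[1] + t1 * d1[1]

# A source at (50, 50) km seen from stations at the origin and at (100, 0)
x, y = cross_bearing((0.0, 0.0), 45.0, (100.0, 0.0), 315.0)
print(round(x, 6), round(y, 6))  # → 50.0 50.0
```

In practice the measured back-azimuths would first receive the wind-dependent corrections the abstract describes before intersection.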

  7. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach, rather than a simple dilution field-of-regard approach, to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
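
The Bayesian combination of detections and non-detections on a source-location grid can be sketched as follows; the "sensitivity" fields and probabilities below are toy values standing in for ATM backtrack output, not the paper's method or data:

```python
import numpy as np

def posterior_source_map(prior, det_prob_fields, detections, floor=0.1):
    """Combine detections and non-detections on a source-location grid.

    prior: (ny, nx) prior probability of the source being in each cell.
    det_prob_fields: list of (ny, nx) arrays, one per station sample, giving
        the modeled probability that a source in that cell produces a
        detection at the station (a stand-in for ATM-derived sensitivity).
    detections: list of bools, True if the station sample was a detection.
    floor: clipping bound keeping likelihoods away from 0 and 1.
    """
    post = prior.astype(float).copy()
    for p_det, seen in zip(det_prob_fields, detections):
        like = np.clip(p_det, floor, 1 - floor)
        post *= like if seen else (1 - like)  # Bayes rule, cell by cell
    return post / post.sum()

# Toy 1x3 grid: station A only "sees" cells 0-1, station B only cells 1-2.
prior = np.ones((1, 3)) / 3
fields = [np.array([[0.9, 0.9, 0.0]]), np.array([[0.0, 0.9, 0.9]])]
post = posterior_source_map(prior, fields, detections=[True, True])
print(post.round(3))  # mass concentrates on the middle cell
```

Two detections that each rule out a different edge cell leave most of the posterior mass on the one cell both stations can see, which is the location-uncertainty reduction the abstract refers to.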

  8. Silicon PIN diode based electron-gamma coincidence detector system for Noble Gases monitoring.

    PubMed

    Khrustalev, K; Popov, V Yu; Popov, Yu S

    2017-08-01

    We present a new second-generation SiPIN-based electron-photon coincidence detector system developed by Lares Ltd. for use in the noble gas measurement systems of the International Monitoring System and the On-Site Inspection verification regimes of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The SiPIN diodes provide superior energy resolution for electrons. Our work describes the improvements made in the second-generation detector cells and the potential use of such detector systems for other applications, such as in-situ Kr-85 measurements for non-proliferation purposes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Use of IMS data and its potential for research through global noble gases concentration maps

    NASA Astrophysics Data System (ADS)

    Terzi, Lucrezia; Kalinowski, Martin; Gueibe, Christophe; Camps, Johan; Gheddou, Abdelhakim; Kusmierczyk-Michulec, Jolanta; Schoeppner, Michael

    2017-04-01

    For verification purposes, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) established a global monitoring system for atmospheric radioisotopes and noble gas radioactivity. Daily activity concentrations have been collected worldwide for over 15 years, providing unique data sets with long-term time series that can be used for analysis of atmospheric circulation dynamics. In this study, we emphasize the value of worldwide noble gas data by reconstructing global xenon concentration maps and comparing these observations with ATM simulations. A residual plot of observations against simulations improves our understanding of the source estimation level for each region.
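    The residual comparison alluded to above can be sketched as a per-station log ratio of observed to simulated activity concentration; station labels and values below are hypothetical placeholders, not IMS data:

```python
import math

def concentration_residuals(observed, simulated):
    """Log10 residuals of observed vs. ATM-simulated activity
    concentration (e.g., Xe-133 in mBq/m^3), keyed by station.
    Values near zero mean the simulation reproduces the measurement;
    positive values mean the model under-predicts the observation."""
    return {stn: math.log10(obs / simulated[stn])
            for stn, obs in observed.items()
            if stn in simulated and simulated[stn] > 0 and obs > 0}
```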

  10. Ongoing research experiments at the former Soviet nuclear test site in eastern Kazakhstan

    USGS Publications Warehouse

    Leith, William S.; Kluchko, Luke J.; Konovalov, Vladimir; Vouille, Gerard

    2002-01-01

    Degelen mountain, located in Eastern Kazakhstan near the city of Semipalatinsk, was once the Soviets' most active underground nuclear test site. Two hundred fifteen nuclear tests were conducted in 181 tunnels driven horizontally into its many ridges--almost twice the number of tests as at any other Soviet underground nuclear test site. It was also the site of the first Soviet underground nuclear test--a 1-kiloton device detonated on October 11, 1961. Until recently, the details of testing at Degelen were kept secret and had been the subject of considerable speculation. However, in 1991, the Semipalatinsk test site became part of the newly independent Republic of Kazakhstan; and in 1995, the Kazakhstani government concluded an agreement with the U.S. Department of Defense to eliminate the nuclear testing infrastructure in Kazakhstan. This agreement, which calls for the "demilitarization of the infrastructure directly associated with the nuclear weapons test tunnels," has been implemented as the "Degelen Mountain Tunnel Closure Program." The U.S. Defense Threat Reduction Agency, in partnership with the Department of Energy, has permitted the use of the tunnel closure project at the former nuclear test site as a foundation on which to support cost-effective, research-and-development-funded experiments. These experiments are principally designed to improve U.S. capabilities to monitor and verify the Comprehensive Test Ban Treaty (CTBT), but have provided a new source of information on the effects of nuclear and chemical explosions on hard, fractured rock environments. These new data extend and confirm the results of recent Russian publications on the rock environment at the site and the mechanical effects of large-scale chemical and nuclear testing. In 1998, a large-scale tunnel closure experiment, Omega-1, was conducted in Tunnel 214 at Degelen mountain.
In this experiment, a 100-ton chemical explosive blast was used to test technologies for monitoring the Comprehensive Nuclear Test Ban Treaty, and to calibrate a portion of the CTBT's International Monitoring System. This experiment has also provided important benchmark data on the mechanical behavior of hard, dense, fractured rock, and has demonstrated the feasibility of fielding large-scale calibration explosions, which are specified as a "confidence-building measure" in the CTBT Protocol. Two other large-scale explosion experiments, Omega-2 and Omega-3, are planned for the summer of 1999 and 2000. Like the Tunnel 214 test, the 1999 experiment will include close-in monitoring of near-source effects, as well as contributing to the calibration of key seismic stations for the Comprehensive Test Ban Treaty. The Omega-3 test will examine the effect of multiple blasts on the fractured rock environment.

  11. Nuclear Explosion Monitoring History and Research and Development

    NASA Astrophysics Data System (ADS)

    Hawkins, W. L.; Zucca, J. J.

    2008-12-01

    Within a year after the nuclear detonations over Hiroshima and Nagasaki, the Baruch Plan was presented to the newly formed United Nations Atomic Energy Commission (June 14, 1946) to establish nuclear disarmament and international control over all nuclear activities. These controls would allow only the peaceful use of atomic energy. The plan was rejected through a Security Council veto, primarily because of the resistance to unlimited inspections. Since that time there have been many multilateral and bilateral agreements, and unilateral declarations, to limit or eliminate nuclear detonations. Almost all of these agreements (i.e., treaties) call for some type of monitoring. We will review a timeline showing the history of nuclear testing and the more important treaties. We will also describe testing operations, containment, phenomenology, and observations. The Comprehensive Nuclear Test Ban Treaty (CTBT), which has been signed by 179 countries (ratified by 144), established the International Monitoring System global verification regime, which employs seismic, infrasound, hydroacoustic and radionuclide monitoring techniques. The CTBT also includes on-site inspection to clarify whether a nuclear explosion has been carried out in violation of the Treaty. The US Department of Energy (DOE), through its National Nuclear Security Administration's Ground-Based Nuclear Explosion Monitoring R&D Program, supports research by US national laboratories, universities and industry internationally to detect, locate, and identify nuclear detonations. This research program builds on the broad base of monitoring expertise developed over several decades. Annually, the DOE and the US Department of Defense jointly solicit monitoring research proposals.
Areas of research include: seismic regional characterization and wave propagation, seismic event detection and location, seismic identification and source characterization, hydroacoustic monitoring, radionuclide monitoring, infrasound monitoring, and data processing and analysis. Reports from the selected research projects are published in the proceedings of the annual Monitoring Research Review conference.

  12. Detection capability of the IMS seismic network based on ambient seismic noise measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, Peter J.; Ceranna, Lars

    2016-04-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations, as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold is observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary seismic IMS network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability. In particular, this improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and percentage of operational IMS stations.
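    The threshold-monitoring logic can be sketched as follows: each station's noise level plus a body-wave distance correction Q gives the smallest mb detectable at that station, and requiring detections at several stations yields the network threshold for a grid point. The SNR value, the three-station rule and all numbers are illustrative assumptions, not the study's calibrated values:

```python
import math

def station_threshold_mb(noise_amp_nm, period_s, q_correction, snr=3.0):
    """Smallest body-wave magnitude detectable at one station:
    mb = log10(A/T) + Q(distance, depth), where A is the smallest
    amplitude exceeding the ambient noise by the required SNR."""
    return math.log10(snr * noise_amp_nm / period_s) + q_correction

def network_threshold_mb(station_thresholds, n_required=3):
    """Treat an event as detectable when at least n_required stations
    see it; the network threshold at a grid point is therefore the
    n_required-th smallest station threshold."""
    return sorted(station_thresholds)[n_required - 1]
```

Mapping this over a global grid, with station thresholds recomputed per epoch from measured noise, yields the spatial and temporal threshold variations the abstract describes.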

  13. Atmospheric Transport Modelling confining potential source location of East-Asian radionuclide detections in May 2010

    NASA Astrophysics Data System (ADS)

    Ross, J. Ole; Ceranna, Lars

    2016-04-01

    The radionuclide component of the International Monitoring System (IMS) to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) is in place to detect tiny traces of fission products from nuclear explosions in the atmosphere. The challenge in interpreting IMS radionuclide data is to discriminate radionuclide sources of CTBT relevance from emissions from nuclear facilities. Remarkable activity concentrations of Ba/La-140 occurred at the IMS radionuclide stations RN 37 (Okinawa) and RN 58 (Ussurysk) in mid-May 2010. During the same period, an elevated Xe-133 level was also measured at RN 38 (Takasaki). Additional regional measurements of radioxenon were reported in the press and further analyzed in various publications. The radionuclide analysis gives evidence for the presence of a nuclear fission source between 10 and 12 May 2010. Backward Atmospheric Transport Modelling (ATM) with HYSPLIT, driven by 0.2° ECMWF meteorological data for the IMS samples, indicates that, assuming a single source, a wide range of source regions is possible, including the Korean Peninsula, the Sea of Japan (East Sea), and parts of China and Russia. Further confinement of the possible source location can be provided by atmospheric backtracking for the assumed sampling periods of the reported regional xenon measurements. New studies indicate a very weak seismic event at the DPRK test site early on 12 May 2010. Forward ATM for a pulse release caused by this event shows fairly good agreement with the observed radionuclide signature. Nevertheless, the underlying nuclear fission scenario remains unclear and speculative, even if a connection between the waveform and radionuclide events is assumed.

  14. SUPER-FOCUS: a tool for agile functional analysis of shotgun metagenomic data

    PubMed Central

    Green, Kevin T.; Dutilh, Bas E.; Edwards, Robert A.

    2016-01-01

    Summary: Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question of what they can do. Currently available tools do not scale well with increasing data volumes, which is important because both the number and the lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS (SUbsystems Profile by databasE Reduction using FOCUS), an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with over 70 real metagenomes; the results show that it accurately predicts the subsystems present in the profiled microbial communities and is up to 1000 times faster than other tools. Availability and implementation: SUPER-FOCUS was implemented in Python, and its source code and the tool website are freely available at https://edwards.sdsu.edu/SUPERFOCUS. Contact: redwards@mail.sdsu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26454280

  15. SUPER-FOCUS: a tool for agile functional analysis of shotgun metagenomic data.

    PubMed

    Silva, Genivaldo Gueiros Z; Green, Kevin T; Dutilh, Bas E; Edwards, Robert A

    2016-02-01

    Analyzing the functional profile of a microbial community from unannotated shotgun sequencing reads is one of the important goals in metagenomics. Functional profiling has valuable applications in biological research because it identifies the abundances of the functional genes of the organisms present in the original sample, answering the question of what they can do. Currently available tools do not scale well with increasing data volumes, which is important because both the number and the lengths of the reads produced by sequencing platforms keep increasing. Here, we introduce SUPER-FOCUS (SUbsystems Profile by databasE Reduction using FOCUS), an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with over 70 real metagenomes; the results show that it accurately predicts the subsystems present in the profiled microbial communities and is up to 1000 times faster than other tools. SUPER-FOCUS was implemented in Python, and its source code and the tool website are freely available at https://edwards.sdsu.edu/SUPERFOCUS. redwards@mail.sdsu.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  16. Special event discrimination analysis: The TEXAR blind test and identification of the August 16, 1997 Kara Sea event. Final report, 13 September 1995--31 January 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgardt, D.

    1998-03-31

    The International Monitoring System (IMS) for the Comprehensive Test Ban Treaty (CTBT) faces the serious challenge of being able to accurately and reliably identify seismic events in any region of the world. Extensive research has been performed in recent years on developing discrimination techniques which appear to classify seismic events into broad categories of source types, such as nuclear explosion, earthquake, and mine blast. This report examines in detail the problem of effectiveness of regional discrimination procedures in the application of waveform discriminants to Special Event identification and the issue of discriminant transportability.

  17. Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)

    NASA Astrophysics Data System (ADS)

    Zerbo, L.

    2013-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data, along with IDC processed and reviewed products, are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques are refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and were easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests of 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network.
On-site inspections of these tests were not conducted, as the Treaty has not entered into force. In order to achieve a credible and trustworthy verification system, increased focus is being put on the development of OSI operational capabilities while operating and sustaining the existing monitoring system, increasing data availability and quality, and completing the remaining facilities of the IMS. Furthermore, as mandated by the Treaty, the CTBTO also seeks to continuously improve its technologies and methods through interaction with the scientific community. Workshops and scientific conferences such as the CTBT Science and Technology Conference series provide venues for exchanging ideas, and mechanisms have been developed for sharing IMS data with researchers who are developing and testing new and innovative methods pertinent to the verification regime. While progress is steady on building up the verification regime, there is also progress toward entry into force of the Treaty, which requires the signatures and ratifications of the DPRK, India and Pakistan, as well as the ratifications of China, Egypt, Iran, Israel and the United States. The 36 other States whose signatures and ratifications are needed for entry into force have already provided them.

  18. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design and numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S.
Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities.
Assessment of the present and future stockpile with increased confidence in its safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies, as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).

  19. Make the World Safer from Nuclear Weapons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowyer, Ted

    Senior Nuclear Scientist Ted Bowyer knows firsthand the challenges associated with protecting our nation. Ted and his colleagues help detect the proliferation of nuclear weapons. They developed award-winning technologies that give international treaty verification authorities “eyes and ears” around the globe. The instruments, located in 80 countries, help ensure compliance with the Comprehensive Nuclear Test-Ban Treaty, or CTBT. They are completely automated radionuclide monitoring systems that would detect airborne radioactive particles if a nuclear detonation occurred in the air, underground or at sea. Some samples collected through these technologies are sent to PNNL's Shallow Underground Laboratory, the only certified U.S. radionuclide laboratory for the CTBT's International Monitoring System Organization.

  20. Magnitude Based Discrimination of Manmade Seismic Events From Naturally Occurring Earthquakes in Utah, USA

    NASA Astrophysics Data System (ADS)

    Koper, K. D.; Pechmann, J. C.; Burlacu, R.; Pankow, K. L.; Stein, J. R.; Hale, J. M.; Roberson, P.; McCarter, M. K.

    2016-12-01

    We investigate the feasibility of using the difference between local (ML) and coda duration (MC) magnitude as a means of discriminating manmade seismic events from naturally occurring tectonic earthquakes in and around Utah. Using a dataset of nearly 7,000 well-located earthquakes in the Utah region, we find that ML-MC is on average 0.44 magnitude units smaller for mining-induced seismicity (MIS) than for tectonic seismicity (TS). MIS occurs within near-surface low-velocity layers that act as a waveguide and preferentially increase coda duration relative to peak amplitude, while the vast majority of TS occurs beneath the near-surface waveguide. A second dataset of more than 3,700 probable explosions in the Utah region also has significantly lower ML-MC values than TS, likely for the same reason as the MIS. These observations suggest that ML-MC, or related measures of peak amplitude versus signal duration, may be useful for discriminating small explosions from earthquakes at local-to-regional distances. ML and MC can be determined for small events with relatively few observations; hence an ML-MC discriminant can be effective in cases where moment tensor inversion is not possible because of low data quality or poorly known Green's functions. Furthermore, an ML-MC discriminant does not rely on the existence of the fast-attenuating Rg phase at regional distances. ML-MC may provide a local-to-regional distance extension of the mb-MS discriminant that has traditionally been effective at identifying large nuclear explosions with teleseismic data. This topic is of growing interest in forensic seismology, in part because the Comprehensive Nuclear Test Ban Treaty (CTBT) is a zero-tolerance treaty that prohibits all nuclear explosions, no matter how small. If the CTBT were to come into force, source discrimination at local distances would be required to verify compliance.
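    The proposed discriminant reduces to a one-dimensional decision on ML - MC. A minimal sketch follows; the cutoff value is a hypothetical midpoint chosen only for illustration (the abstract reports a 0.44-unit average separation but does not prescribe a decision threshold):

```python
def ml_mc_discriminant(ml, mc, cutoff=-0.22):
    """Flag an event as shallow/man-made when the local-minus-coda
    magnitude difference falls below the (hypothetical) cutoff;
    otherwise treat it as a tectonic earthquake."""
    return "shallow/man-made" if (ml - mc) < cutoff else "tectonic"
```

In practice the cutoff would be calibrated on the regional catalogs, trading off miss rate against false alarms.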

  1. Implications from Meteoric and Volcanic Infrasound Measured in the Netherlands

    NASA Astrophysics Data System (ADS)

    Evers, L.

    2003-12-01

    Infrasound observations started in the Netherlands in 1986. Since then, several array configurations and instruments have been developed, tested and made operational. Currently, three infrasound arrays are continuously measuring infrasound with in-house developed microbarometers. The array apertures vary from 30 to 1500 meters and the number of instruments from 6 to 16 microbarometers. The inter-array distance ranges from 50 up to 150 km. This dense network of infrasound arrays is used to distinguish between earthquakes and sources in the atmosphere. Sonic booms, for example, can be experienced in the same manner as small (gas-induced) earthquakes. Furthermore, Comprehensive Nuclear-Test-Ban Treaty (CTBT) related research is done. Meteors are among the few natural impulsive sources generating energy in the kT TNT equivalent range. Therefore, the study of meteors is essential to the CTBT, where infrasound is applied as a monitoring technique. Studies of meteors in the Netherlands have shown the capability of infrasound to trace a meteor through the stratosphere. The propagation of infrasound is to first order dependent on the wind and temperature structure of the atmosphere. The meteor's path could be reconstructed by using ECMWF atmospheric models for wind and temperature. The results were compared to visual observations, confirming the location, direction and reported origin time. The accuracy of the localization mainly depends on the applied atmospheric model and the array resolution. Successfully applying infrasound depends on an array configuration based on the frequency-dependent spatial coherence of the signals of interest. The array aperture and inter-element distance play a decisive role in detecting signals with low signal-to-noise ratios. This is shown by results from studies on volcanic infrasound from Mt. Etna (Italy) detected in the Netherlands.
Sub-array processing on the 16-element array revealed increased detectability of infrasound for small-aperture (800 m) arrays compared to large-aperture (1500 m) arrays.

  2. The Global Detection Capability of the IMS Seismic Network in 2013 Inferred from Ambient Seismic Noise Measurements

    NASA Astrophysics Data System (ADS)

    Gaebler, P. J.; Ceranna, L.

    2016-12-01

    All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations, as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold is observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary seismic IMS network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability. In particular, this improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and percentage of operational IMS stations.

  3. Verification System: First System-Wide Performance Test

    NASA Astrophysics Data System (ADS)

    Chernobay, I.; Zerbo, L.

    2006-05-01

    System-wide performance tests are essential for the development, testing and evaluation of individual components of the verification system. In addition to evaluating global readiness, such a test helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004, a performance testing phase in April-June 2005, and an evaluation phase in the last half of 2005. The preparatory phase was developmental in nature. The main objectives for the performance testing phase included establishment of a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examination of established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used for implementation of 21 test case scenarios to examine either particular operational procedures or the response of the system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase - about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communication Infrastructure (GCI) communication links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. Performance of all system elements was documented and analysed. IDC products were compared with results of data processing at the NDCs.
On the basis of statistics and information collected during the SPT1 a system-wide performance baseline under current guidelines for provisional Operation and Maintenance was established. The test provided feedback for further development of the draft IMS and IDC Operational Manuals and identified priority areas for further system development.

  4. The workshop on signatures of medical and industrial isotope production - WOSMIP; Strassoldo, Italy, 1-3 July 2009.

    PubMed

    Matthews, K M; Bowyer, T W; Saey, P R J; Payne, R F

    2012-08-01

    Radiopharmaceuticals make contributions of inestimable value to medical practice. With growing demand, new technologies are being developed and applied worldwide. Most diagnostic procedures rely on (99m)Tc, and the use of uranium targets in reactors is currently the favored production method, with 95% of the necessary (99)Mo parent currently produced by four major global suppliers. Coincidentally, there are growing concerns over nuclear security and proliferation. New disarmament treaties such as the Comprehensive Nuclear-Test-Ban Treaty (CTBT) are coming into effect, and treaty compliance-verification monitoring is gaining momentum. Radioxenon emissions (isotopes Xe-131m, Xe-133, Xe-133m and Xe-135) from radiopharmaceutical production facilities are of concern in this context because radioxenon is a highly sensitive tracer for detecting nuclear explosions. There exists, therefore, a potential for confused source attribution, with emissions from radiopharmaceutical-production facilities regularly being detected in treaty compliance-verification networks. The CTBT radioxenon network currently under installation is highly sensitive, with detection limits approaching 0.1 mBq/m³, and, depending on transport conditions and background, is able to detect industrial release signatures from sites thousands of kilometers away. The method currently employed to distinguish between industrial and military radioxenon sources involves plots of the isotope ratios (133m)Xe/(131m)Xe versus (135)Xe/(133)Xe, but source attribution can be ambiguous. Through the WOSMIP workshop the environmental monitoring community is gaining a better understanding of the complexities of the processes at production facilities, and the production community is recognizing the impact their operations have on monitoring systems and the goal of nuclear non-proliferation. Further collaboration and discussion are needed, together with advances in Xe trapping technology and monitoring systems.
Such initiatives will help address the dichotomy between expanding production and improving monitoring sensitivity, with the ultimate aim of enabling unambiguous distinction between different nuclide signatures. Copyright © 2012 Elsevier Ltd. All rights reserved.
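The ratio-based screening described above can be illustrated with a short sketch. The function names, sample activities and the placeholder discrimination line below are hypothetical; real screening lines are derived empirically from facility and explosion source terms, not from this toy rule.

```python
import math

def xenon_ratios(a131m, a133, a133m, a135):
    """Return the two activity ratios used in the discrimination plots
    described above (activities in mBq/m^3): 133mXe/131mXe and 135Xe/133Xe."""
    return a133m / a131m, a135 / a133

# Hypothetical screening rule: the real discrimination line is determined
# empirically; the slope and intercept here are placeholders only.
def looks_explosion_like(r_meta, r_135_133, slope=1.0, intercept=0.0):
    # Work in log space, as the ratio plots usually do.
    return math.log10(r_135_133) > slope * math.log10(r_meta) + intercept

r_meta, r_135 = xenon_ratios(a131m=0.2, a133=5.0, a133m=0.1, a135=2.0)
print(r_meta, r_135)  # -> 0.5 0.4
```

The ambiguity noted in the abstract corresponds to samples falling close to whatever line is adopted, where measurement uncertainty spans both classes.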

  5. Using the DOE Knowledge Base for Special Event Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA) and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, two basic types of data must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations.
Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g. crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.
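As an illustration of the spatial proximity search mentioned above, the sketch below scans a toy catalog of known sources for entries near an event location. The catalog entries and the 50 km radius are made-up values; the Knowledge Base itself stores far richer, curated data sets.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearby_sources(event, sources, radius_km=50.0):
    """Return known sources within radius_km of the event (lat, lon)."""
    return [s for s in sources
            if haversine_km(event[0], event[1], s["lat"], s["lon"]) <= radius_km]

# Illustrative entries only; a real knowledge base holds curated catalogs.
known = [{"name": "mine A", "lat": 41.30, "lon": 129.00},
         {"name": "volcano B", "lat": 45.00, "lon": 135.00}]
print(nearby_sources((41.29, 129.05), known))
```

A hit in such a search is exactly the kind of trigger for specialized processing that the abstract describes.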

  6. Measurement of 37Ar to support technology for On-site Inspection under the Comprehensive Nuclear-Test-Ban Treaty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aalseth, Craig E.; Day, Anthony R.; Haas, Derek A.

    On-Site Inspection (OSI) is a key component of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Measurements of radionuclide isotopes created by an underground nuclear explosion are a valuable signature of a Treaty violation. Argon-37 is produced from neutron interaction with calcium in soil, 40Ca(n,α)37Ar. For OSI, the 35-day half-life of 37Ar provides both high specific activity and sufficient time for completion of an inspection before decay limits sensitivity. This paper presents a low-background internal-source gas proportional counter with an 37Ar measurement sensitivity level equivalent to 45.1 mBq/SCM in whole air.
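The 35-day half-life trade-off can be made concrete with a one-line exponential-decay calculation. This is a sketch, not part of the instrument described; the starting activity simply reuses the sensitivity figure quoted above.

```python
import math

HALF_LIFE_AR37_DAYS = 35.0  # approximate value quoted in the text

def remaining_activity(a0_mbq, days):
    """Activity remaining after `days` of decay (simple exponential law)."""
    return a0_mbq * math.exp(-math.log(2) * days / HALF_LIFE_AR37_DAYS)

# A sample at the quoted sensitivity level, counted one half-life later:
print(remaining_activity(45.1, 35.0))  # half the starting activity
```

The point of the abstract is visible in the numbers: after a typical inspection timeline of weeks, a usable fraction of the original 37Ar activity is still present.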

  7. Characterization of infrasound from lightning

    NASA Astrophysics Data System (ADS)

    Assink, J. D.; Evers, L. G.; Holleman, I.; Paulssen, H.

    2008-08-01

    During thunderstorm activity in the Netherlands, electromagnetic and infrasonic signals are emitted by the processes of lightning and thunder. It is shown that correlating infrasound detections with results from an electromagnetic lightning detection network is successful up to distances of 50 km from the infrasound array. Infrasound recordings clearly show blast-wave characteristics which can be related to cloud-to-ground (CG) discharges, with a dominant frequency between 1 and 5 Hz. Amplitude measurements of CG discharges can partly be explained by the beam pattern of a line source with a dominant frequency of 3.9 Hz, up to a distance of 20 km. The ability to measure lightning activity with infrasound arrays has both positive and negative implications for CTBT verification purposes. As a scientific application, lightning studies can benefit from the worldwide infrasound verification system.
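The correlation of infrasound detections with lightning-network strokes can be sketched as a simple travel-time consistency check: a stroke "explains" an arrival if the acoustic delay matches. The sound speed, tolerance and matching scheme below are illustrative assumptions, not the authors' algorithm.

```python
SOUND_SPEED_M_S = 340.0  # assumed effective sound speed

def match_arrivals(strokes, arrivals, tol_s=5.0):
    """strokes: list of (origin_time_s, distance_m) from the lightning network;
    arrivals: list of infrasound arrival times (s).
    Returns (stroke_index, arrival_time) pairs that are consistent."""
    pairs = []
    for i, (t0, dist) in enumerate(strokes):
        predicted = t0 + dist / SOUND_SPEED_M_S
        for t in arrivals:
            if abs(t - predicted) <= tol_s:
                pairs.append((i, t))
    return pairs

# A stroke 17 km from the array should arrive ~50 s after the flash.
print(match_arrivals([(0.0, 17000.0)], [50.0, 200.0]))  # -> [(0, 50.0)]
```

The 50 km success range reported above corresponds to acoustic delays of roughly two and a half minutes under this kind of assumption.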

  8. Challenges in Regional CTBT Monitoring: The Experience So Far From Vienna

    NASA Astrophysics Data System (ADS)

    Bratt, S. R.

    2001-05-01

    The verification system being established to monitor the CTBT will include an International Monitoring System (IMS) network of 321 seismic, hydroacoustic, infrasound and radionuclide stations, transmitting digital data to the International Data Centre (IDC) in Vienna, Austria over a Global Communications Infrastructure (GCI). In February 2000 the IDC started to disseminate a wide range of products based on automatic processing and interactive analysis of data from about 90 stations across the four IMS technologies. The number of events in the seismo-acoustic Reviewed Event Bulletins (REB) was 18,218 for the year 2000, with the daily number ranging from 30 to 360. Over 300 users from almost 50 Member States are now receiving an average of 18,000 data and product deliveries per month from the IDC. As the IMS network expands (40-60 new stations are scheduled to start transmitting data this year) and as GCI communication links bring increasing volumes of new data into Vienna (70 new GCI sites are currently in preparation), the monitoring capability of the IMS and IDC has the potential to improve significantly. To realize this potential, the IDC must continue to improve its capacity to exploit regional seismic data from events defined by few stations with large azimuthal gaps. During 2000, 25% of the events in the REB were defined by five or fewer stations; 48% were defined by at least one regional phase, and 24% by at least three; 34% had gaps in azimuthal coverage of more than 180 degrees. The fraction of regional, sparsely detected events will only increase as new, sensitive stations come on-line and the detection threshold drops. This will be offset, to some extent, because stations within the denser network that detect near-threshold events will be at closer distances, on average.
Thus, to address the challenges of regional monitoring, the IDC must integrate "tuned" station and network processing parameters for new stations; enhanced and/or new methods for estimating location, depth and uncertainty bounds; and validated, regionally calibrated travel times, event characterization parameters and screening criteria. A new IDC program to fund research to calibrate regional seismic travel paths seeks to address one item on this list, in cooperation with other national efforts. More effective use of the full waveform data and cross-technology synergies must also be explored. All of this work must be integrated into modular software systems that can be maintained and improved over time. To illustrate these regional monitoring challenges and possible improvements, the experience from the IDC will be presented via a series of illustrative sample events. Challenges in the technical and policy arenas must be addressed as well. IMS data must first be available at the IDC before they can be analyzed. The encouraging experience to date is that the availability of data arriving via the GCI is significantly higher (~95%) than the availability (~70%) from the same stations prior to GCI installation, when they were transmitting data via other routes. Within the IDC, trade-offs must be considered between the desired levels of product quality and timeliness and the investment in personnel and system development needed to support them. Another high-priority objective is to develop a policy for providing data and products to scientific and disaster-alert organizations. It is clear that broader exploitation of these rich and unique assets could be of great mutual benefit and is, perhaps, a necessity for the CTBT verification system to achieve its potential.
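The azimuthal-gap statistic quoted above (events with gaps of more than 180 degrees) is straightforward to compute from event-to-station azimuths; a minimal sketch:

```python
def azimuthal_gap(azimuths_deg):
    """Largest gap (degrees) between consecutive station azimuths as seen
    from the event; a gap > 180 deg indicates one-sided station coverage."""
    az = sorted(a % 360.0 for a in azimuths_deg)
    gaps = [az[i + 1] - az[i] for i in range(len(az) - 1)]
    gaps.append(360.0 - az[-1] + az[0])  # wrap-around gap through north
    return max(gaps)

print(azimuthal_gap([10, 40, 355]))      # -> 315.0 (poor coverage)
print(azimuthal_gap([0, 90, 180, 270]))  # -> 90.0 (well surrounded)
```

Large gaps of this kind are what degrade location and depth estimates for the sparsely recorded regional events discussed in the abstract.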

  9. Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network

    NASA Astrophysics Data System (ADS)

    Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey

    2010-05-01

    The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications and is typically composed of 4 to 9 sensors with an aperture of 1 to 3 km. At the end of 2000, only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed, and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors of an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. This categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false-alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system.
Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications such as monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with the seismic and hydroacoustic technologies. The arrival phases detected by the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies only recently (in early 2010) became part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed a better understanding of infrasound signals and the identification of a growing data set of ground-truth sources. These infrasound-generating sources may be natural or man-made. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain-associated waves. Only a small fraction of events meet the event-definition criteria relevant to the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes, and explosive volcanic eruptions.
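PMCC is considerably more sophisticated than any short sketch, but its core idea, correlating sensor pairs and keeping only mutually consistent delays, can be caricatured as follows. The toy signals, brute-force lag search and closure tolerance are all illustrative assumptions, not the PMCC implementation.

```python
def best_lag(x, y, max_lag):
    """Lag (in samples) of y relative to x that maximizes a dot-product
    correlation, found by brute force over -max_lag..max_lag."""
    best, best_score = 0, float("-inf")
    n = len(x)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(x[i] * y[i + lag]
                    for i in range(max(0, -lag), min(n, n - lag)))
        if score > best_score:
            best, best_score = lag, score
    return best

def consistent(s1, s2, s3, max_lag=10, tol=1):
    """Closure check over a three-sensor loop: for a real plane wave the
    pairwise delays must sum to ~0; random noise generally fails this."""
    d12 = best_lag(s1, s2, max_lag)
    d23 = best_lag(s2, s3, max_lag)
    d31 = best_lag(s3, s1, max_lag)
    return abs(d12 + d23 + d31) <= tol

pulse = [0.0] * 40
pulse[20] = 1.0
shift = lambda s, k: s[-k:] + s[:-k] if k else s[:]  # circular shift for demo
s1, s2, s3 = pulse, shift(pulse, 2), shift(pulse, 5)
print(consistent(s1, s2, s3))  # -> True
```

In the operational system this consistency test is what separates coherent infrasound arrivals from the 80-90% of detections categorized as noise.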

  10. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University's Kandilli Observatory and Earthquake Research Institute (KOERI) has been acting as the Turkish National Data Center (NDC), responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station PS-43 under the Belbasi Nuclear Tests Monitoring Center for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): a medium-period array with a ~40 km radius located in Ankara and a short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of its circular geometry. The short-period instruments are installed at a depth of 30 meters, while the medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People’s Republic of Korea (DPRK) claimed that it had conducted a nuclear test.
The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N, 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 (GMT) at the BRTR SP array. The result of the f-k analysis performed in the Geotool software, which was installed at the NDC facilities in 2008 and is currently in full use, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC-REB, however, we noticed that our estimates, and therefore the corresponding residuals (calculated with reference to the REB residuals), are considerably better than those of the REB. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT plays an important role in the implementation of the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study, we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.

  11. Seismic resonances of acoustic cavities

    NASA Astrophysics Data System (ADS)

    Schneider, F. M.; Esterhazy, S.; Perugia, I.; Bokelmann, G.

    2016-12-01

    The goal of an On-Site Inspection (OSI) is to clarify, at a possible test site, whether a member state of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) has violated its rules by conducting an underground nuclear test. Compared to atmospheric and underwater tests, underground nuclear explosions are the most difficult to detect. One primary structural target for the field team during an OSI is the detection of an underground cavity created by an underground nuclear explosion. The application of seismic resonances of the cavity for its detection has been proposed in the CTBT by mentioning "resonance seismometry" as a possible technique during OSIs. We modeled the interaction of a seismic wave field with an underground cavity as a sphere filled with an acoustic medium surrounded by an elastic full space. For this setting the solution of the seismic wave field can be computed analytically. Using this approach the appearance of acoustic resonances can be predicted in the theoretical calculations. Resonance peaks appear in the spectrum derived for the elastic domain surrounding the acoustic cavity, and they scale in width with the density of the acoustic medium. For low densities in the acoustic medium, as for a gas-filled cavity, the spectral peaks become very narrow and therefore hard to resolve. The resonance frequencies, however, can be correlated to the discrete set of eigenmodes of the acoustic cavity and can thus be predicted if the dimension of the cavity is known. The origin of the resonance peaks is internal reverberation of waves coupling into the acoustic domain and producing an echoing signal that couples back out to the elastic domain. In the gas-filled case the amplitudes in the time domain are very low. Besides these theoretical considerations, we seek real-data examples from similar settings. As an example we analyze a 3D active seismic data set from Felsőpetény, Hungary, acquired between 2012 and 2014 on behalf of the CTBTO.
In the subsurface of this area lies a former clay mine, which is connected to a karst cave of 30 m diameter at 70 m depth. Our aim is to investigate whether resonances predicted from theoretical models can also be observed in data from such real experiments. Observation of spectral resonant peaks could serve as the foundation of a cavity-detection method that could be utilized for nuclear verification.
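The claim that resonance frequencies follow from the cavity's eigenmodes can be illustrated under a strong simplifying assumption: a rigid cavity wall, which is only a crude stand-in for the elastic full-space coupling analyzed here. For a rigid-walled sphere the lowest non-trivial acoustic mode satisfies j1'(x) = 0, with x = 2*pi*f*a/c:

```python
import math

def j1_prime(x):
    """Derivative of the spherical Bessel function j1 (closed form)."""
    return 2 * math.cos(x) / x**2 - 2 * math.sin(x) / x**3 + math.sin(x) / x

def bisect(f, a, b, tol=1e-12):
    """Simple bisection root finder; f(a) and f(b) must bracket a root."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# Lowest non-trivial mode (l = 1) of a rigid-walled sphere: j1'(x) = 0.
x1 = bisect(j1_prime, 1.0, 3.0)  # ~2.0816

def lowest_eigenfrequency_hz(radius_m, sound_speed_m_s):
    return x1 * sound_speed_m_s / (2 * math.pi * radius_m)

# 30 m diameter air-filled cavity (c ~ 343 m/s), as in the Felsőpetény case:
print(round(lowest_eigenfrequency_hz(15.0, 343.0), 2))  # ~7.58 Hz
```

This puts the lowest mode of such a cavity in the single-digit-Hz band, within reach of active seismic surveys, although the narrow peak widths noted above make detection difficult in practice.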

  12. Seismic wave interaction with underground cavities

    NASA Astrophysics Data System (ADS)

    Schneider, Felix M.; Esterhazy, Sofi; Perugia, Ilaria; Bokelmann, Götz

    2016-04-01

    Realization of the future Comprehensive Nuclear-Test-Ban Treaty (CTBT) will require ensuring its compliance, making the CTBT a prime example of forensic seismology. Following indications of a nuclear explosion obtained on the basis of the International Monitoring System (IMS) network, further evidence needs to be sought at the location of the suspicious event. For such an On-Site Inspection (OSI) at a possible nuclear test site, the treaty lists several techniques that can be carried out by the inspection team, including aftershock monitoring and the conduct of active seismic surveys. While those techniques are already well established, a third group of methods labeled "resonance seismometry" is less well defined and needs further elaboration. A prime structural target that is expected to be present as a remnant of an underground nuclear explosion is a cavity at the location and depth at which the device was detonated. Originally, "resonance seismometry" referred to resonant seismic emission of the cavity within the medium, which could be stimulated by an incident seismic wave of the right frequency and observed as peaks in the spectra of seismic stations in the vicinity of the cavity. However, it is not yet clear under which conditions resonant emissions of the cavity could be observed. In order to define the distance, frequency and amplitude ranges at which resonant emissions could be observed, we study the interaction of seismic waves with underground cavities. As a generic model for possible resonances we use a spherical acoustic cavity in an elastic full space. To solve the forward problem for the full elastic wave field around acoustic spherical inclusions, we implemented an analytical solution (Korneev, 1993). This yields the possibility of generating scattering cross-sections, amplitude spectra and synthetic seismograms for plane incident waves. Here, we focus on the question of whether or not we can expect resonant responses in the wave field scattered from the cavity.
We show results for varying input parameters such as dimensions, densities, and seismic velocities in and around the cavity, in order to discuss the applicability of such observations during an OSI.

  13. Monitoring and Reporting Tools of the International Data Centre and International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lastowka, L.; Anichenko, A.; Galindo, M.; Villagran Herrera, M.; Mori, S.; Malakhova, M.; Daly, T.; Otsuka, R.; Stangel, H.

    2007-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), which prohibits all nuclear explosions, was opened for signature in 1996. Since then, the Preparatory Commission for the CTBT Organization has been working towards the establishment of a global verification regime to monitor compliance with the ban on nuclear testing. The International Monitoring System (IMS) comprises facilities for seismic, hydroacoustic, infrasound and radionuclide monitoring, and the means of communication. This system is supported by the International Data Centre (IDC), which provides the objective products and services necessary for effective global monitoring. Upon completion of the IMS, 321 stations will contribute to both near-real-time and reviewed data products. Currently there are 194 facilities in IDC operations. This number is expected to increase by about 40% over the next few years, necessitating methods and tools to handle the expansion effectively. The requirements of high data availability and operational transparency are fundamental principles of IMS network operations; therefore, a suite of tools for monitoring and reporting has been developed. These include applications for monitoring Global Communication Infrastructure (GCI) links, detecting outages in continuous and segmented data, monitoring the status of data processing and forwarding to member states, and systematic electronic communication and problem ticketing. The operation of the IMS network requires the help of local specialists whose cooperation is in some cases ensured by contracts or other agreements. The PTS (Provisional Technical Secretariat) strives to make the monitoring of the IMS as standardized and efficient as possible, and has therefore created the Operations Centre, in which the use of most of the tools is centralized. Recently the tasks of operations across all technologies, including the GCI, have been centralized within a single section of the organization.
To harmonize operations, an ongoing State of Health monitoring project will provide an integrated view of network, station and GCI performance and will provide system metrics. Comprehensive procedures will be developed to utilize this tool. However, as the IMS network expands, easier access to more information will create additional challenges, mainly with human resources, to analyze and manage these metrics.
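The abstract does not describe the outage-detection tools in detail; as a generic sketch of the idea, a gap detector over a stream of packet timestamps might look like this. The nominal interval and threshold factor are assumptions, not the PTS configuration.

```python
def find_gaps(packet_times_s, nominal_interval_s=10.0, factor=2.0):
    """Flag gaps in a stream of packet timestamps: any spacing larger than
    `factor` times the nominal interval is reported as an outage window."""
    gaps = []
    for prev, cur in zip(packet_times_s, packet_times_s[1:]):
        if cur - prev > factor * nominal_interval_s:
            gaps.append((prev, cur))
    return gaps

# Packets expected every ~10 s; the 20 -> 55 jump is a reportable outage.
print(find_gaps([0, 10, 20, 55, 65]))  # -> [(20, 55)]
```

Aggregating such gap windows per station is one simple way to produce the data-availability metrics that network operations reporting relies on.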

  14. Comparison of Radionuclide Ratios in Atmospheric Nuclear Explosions and Nuclear Releases from Chernobyl and Fukushima seen in Gamma-Ray Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friese, Judah I.; Kephart, Rosara F.; Lucas, Dawn D.

    2013-05-01

    The Comprehensive Nuclear Test Ban Treaty (CTBT) provides for remote radionuclide monitoring followed by an On-Site Inspection (OSI) to clarify the nature of a suspect event. An important aspect of radionuclide measurements on site is discrimination against other potential sources of similar radionuclides, such as reactor accidents or medical isotope production. The Chernobyl and Fukushima nuclear reactor disasters offer two different reactor source-term environmental inputs that can be compared against historical measurements of nuclear explosions. A comparison of whole-sample gamma spectrometry measurements from these three events and an analysis of their similarities and differences are presented. This analysis is a step toward confirming what is needed for measurements during an OSI under the auspices of the Comprehensive Test Ban Treaty.

  15. A Discussion of Procedures and Equipment for the Comprehensive Test Ban Treaty On-Site Inspection Environmental Sampling and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wogman, Ned A.; Milbrath, Brian D.; Payne, Rosara F.

    This paper is intended to serve as a scientific basis for discussions of the available environmental sampling techniques and equipment, used in the past, that could be considered for use within the context of Comprehensive Nuclear-Test-Ban Treaty (CTBT) on-site inspections (OSI). This work contains information on the techniques, equipment, costs, and some operational procedures associated with environmental sampling that have actually been used in the past by the United States for the detection of nuclear explosions. The paper also includes a discussion of issues, recommendations, and questions needing further study within the context of the sampling and analysis of aquatic materials, atmospheric gases, atmospheric particulates, vegetation, sediments and soils, fauna, and drill-back materials.

  16. Definition of Exclusion Zones Using Seismic Data

    NASA Astrophysics Data System (ADS)

    Bartal, Y.; Villagran, M.; Ben Horin, Y.; Leonard, G.; Joswig, M.

    In verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), there is a motivation to be effective, efficient and economical, and to prevent abuse of the right to conduct an On-Site Inspection (OSI) in the territory of a challenged State Party. In particular, it is in the interest of a State Party to avoid irrelevant searches of specific areas. In this study we propose several techniques to determine 'exclusion zones', defined as areas where an event could not possibly have occurred. All the techniques are based on simple arrival-time differences between seismic stations and are thus less prone to modeling errors than standard event-location methods. The techniques proposed are: angular-sector exclusion based on a tripartite micro-array, half-space exclusion based on a station pair, and closed-area exclusion based on circumferential networks.
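The half-space exclusion based on a station pair can be sketched directly: under a homogeneous-velocity approximation, if a phase arrives first at station A, the event cannot lie in the half-space of points closer to station B. The flat-earth coordinates below are a toy setup, not the authors' implementation:

```python
import math

def excluded_by_pair(point, sta_early, sta_late, margin_km=0.0):
    """Half-space exclusion from a station pair (homogeneous-velocity
    approximation): the phase arrived first at `sta_early`, so any point
    closer to `sta_late` (by more than margin_km) is excluded."""
    d = lambda p, q: math.dist(p, q)
    return d(point, sta_late) + margin_km < d(point, sta_early)

A, B = (0.0, 0.0), (100.0, 0.0)  # km, flat-earth toy coordinates
print(excluded_by_pair((90.0, 0.0), A, B))  # closer to B -> excluded: True
print(excluded_by_pair((10.0, 0.0), A, B))  # closer to A -> kept: False
```

The margin parameter is one way to fold timing and velocity uncertainty into the boundary, keeping the exclusion conservative.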

  17. Trends in Nuclear Explosion Monitoring Research & Development - A Physics Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maceira, Monica; Blom, Philip Stephen; MacCarthy, Jonathan K.

    This document, entitled “Trends in Nuclear Explosion Monitoring Research and Development – A Physics Perspective”, reviews the accessible literature, as it relates to nuclear explosion monitoring and the Comprehensive Nuclear-Test-Ban Treaty (CTBT, 1996), for four research areas: source physics (understanding signal generation), signal propagation (accounting for changes through physical media), sensors (recording the signals), and signal analysis (processing the signal). Over 40 trends are addressed, such as moving from 1D to 3D earth models, from pick-based seismic event processing to full waveform processing, and from separate treatment of mechanical waves in different media to combined analyses. Highlighted in the document for each trend are the value and benefit to the monitoring mission, key papers that advanced the science, and promising research and development for the future.

  18. Crustal Seismic Attenuation in Germany Measured with Acoustic Radiative Transfer Theory

    NASA Astrophysics Data System (ADS)

    Gaebler, Peter J.; Eulenfeld, Tom; Wegler, Ulrich

    2017-04-01

    This work is carried out in the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was introduced to detect, locate and characterize nuclear explosion tests. Seismology can provide essential information in the form of broadband waveform recordings for the identification and verification of these critical events. A profound knowledge of the Earth's subsurface between source and receiver is required for a detailed description of the seismic wave field. In addition to underground parameters such as seismic velocity or anisotropy, information about the seismic attenuation of the medium is required. The goal of this study is the creation of a comprehensive model of crustal seismic attenuation in Germany and adjacent areas. Over 20 years of earthquake data from the German Central Seismological Observatory data archive are used to estimate the spatially dependent distribution of intrinsic and scattering attenuation of S-waves for frequencies between 0.5 and 20 Hz. The attenuation models are estimated by fitting synthetic seismogram envelopes, calculated with acoustic radiative transfer theory, to observed seismogram envelopes. This theory describes the propagation of seismic S-energy under the assumption of multiple isotropic scattering; the crustal structure of the scattering medium is represented by a half-space model. We present preliminary results on the spatial distribution of intrinsic attenuation, represented by the absorption path length, as well as of scattering attenuation in terms of the mean free path, and compare the outcomes to results from previous studies. Furthermore, catalog magnitudes are compared to moment magnitudes estimated during the inversion process. Site amplification factors of the stations are also presented.
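The absorption path length and mean free path reported in such studies relate to the corresponding quality factors through the standard relation l = v*Q/(2*pi*f). The S-wave velocity below is an assumed crustal value and the Q figure is illustrative, not a result of this study:

```python
import math

def q_to_path_length_km(q, frequency_hz, velocity_km_s=3.5):
    """Convert a quality factor Q to the corresponding attenuation length
    (absorption path length for intrinsic Q, mean free path for scattering Q),
    using l = v * Q / (2 * pi * f). Velocity is an assumed crustal S-wave value."""
    return velocity_km_s * q / (2 * math.pi * frequency_hz)

# e.g. Q = 800 at 3 Hz with v = 3.5 km/s
print(round(q_to_path_length_km(800, 3.0), 1))  # -> 148.5 km
```

The same relation read backwards converts the path lengths that the envelope inversion yields into frequency-dependent Q values for comparison with earlier studies.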

  19. An IMS Station life cycle from a sustainment point of view

    NASA Astrophysics Data System (ADS)

    Brely, Natalie; Gautier, Jean-Pierre; Foster, Daniel

    2014-05-01

The International Monitoring System (IMS) is to consist of 321 monitoring facilities, composed of four different technologies with a variety of designs and equipment types, deployed in a range of environments around the globe. The IMS is conceived to operate in perpetuity through maintenance, replacement and recapitalization of IMS facilities' infrastructure and equipment when the end of service life is reached [CTBT/PTS/INF.1163]. Life-cycle techniques and modelling are used by the PTS to plan and forecast the life-cycle sustainment requirements of IMS facilities. Through historical data analysis, engineering inputs and feedback from experienced station operators, the PTS is currently working to increase the level of confidence in these forecasts and in sustainment requirements planning. Continued validation, feedback and improvement of source data from the scientific community and experienced users is sought, and is essential to ensure a limited effect on data availability and optimal (human and financial) costs.

  20. New evaluated radioxenon decay data and its implications in nuclear explosion monitoring.

    PubMed

    Galan, Monica; Kalinowski, Martin; Gheddou, Abdelhakim; Yamba, Kassoum

    2018-03-07

This work presents the latest evaluations of the nuclear and decay data of the four radioxenon isotopes of interest for the Comprehensive Nuclear-Test-Ban Treaty (CTBT): Xe-131m, Xe-133, Xe-133m and Xe-135. This includes the most recent measured values of the half-lives, gamma emission probabilities (Pγ) and internal conversion coefficients (ICC). The evaluation was carried out within the Decay Data Evaluation Project (DDEP) framework, using the latest available versions of nuclear and atomic data evaluation software tools and compilations. The consistency of the evaluations was confirmed by the close agreement between the total available energy calculated from the present evaluated data and the tabulated Q-value. The article also analyzes how activity ratio calculations from radioxenon monitoring facilities vary depending on the nuclear database of reference. Copyright © 2018. Published by Elsevier Ltd.
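To illustrate why the choice of decay data matters, the sketch below decay-corrects a measured activity ratio back to a reference time and shows how a small change in a half-life shifts the result. The half-life values are rounded illustrative numbers, not the evaluated figures of this work, and the measured ratio is invented:

```python
import math

# Approximate half-lives in days (rounded, illustrative; the exact
# figures depend on the evaluated nuclear database used).
HALF_LIFE_D = {"Xe-133": 5.25, "Xe-135": 0.381, "Xe-131m": 11.84, "Xe-133m": 2.20}

def decay_correct(activity, isotope, days, half_life=None):
    """Decay-correct an activity back to a reference time `days` earlier
    (i.e., undo the decay that occurred since then)."""
    t_half = half_life if half_life is not None else HALF_LIFE_D[isotope]
    return activity * math.exp(math.log(2.0) * days / t_half)

# An invented measured Xe-133m/Xe-133 ratio, corrected 10 days back:
measured_ratio = 0.05
corrected = (decay_correct(measured_ratio, "Xe-133m", 10.0)
             / decay_correct(1.0, "Xe-133", 10.0))

# Sensitivity: a 1% change in the Xe-133m half-life shifts the result.
corrected_alt = (decay_correct(measured_ratio, "Xe-133m", 10.0, half_life=2.20 * 1.01)
                 / decay_correct(1.0, "Xe-133", 10.0))
```

Because the correction factor is exponential in time over the half-life, even percent-level half-life differences between databases become visible for samples collected long after the release.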

  1. Atmospheric Transport Modelling assessing radionuclide detection chances after the nuclear test announced by the DPRK in January 2016

    NASA Astrophysics Data System (ADS)

    Ross, J. Ole; Ceranna, Lars

    2016-04-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all nuclear explosions. The International Monitoring System (IMS) is in place and about 90% complete to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydroacoustic and infrasonic signals for the detection, localization and characterization of explosions. The seismic signals of the DPRK event on 6 January 2016 were detected by many seismic stations around the globe and allow for localization of the event and its identification as an explosion (see poster by G. Hartmann et al.). However, direct evidence for a nuclear explosion is only possible through the detection of nuclear fission products, which may be released. For this purpose, 80 radionuclide (RN) stations are part of the designed IMS, of which about 60 are already operational. All RN stations use large-volume air samplers and are highly sensitive to tiny traces of particulate radionuclides. Forty of the RN stations are designated to be equipped with noble gas systems detecting traces of radioactive xenon isotopes, which are more likely than particulates to escape from an underground test cavity. Thirty of the noble gas systems are already operational. Atmospheric transport modelling supports the interpretation of radionuclide detections (and, as appropriate, non-detections) by connecting the activity concentration measurements with potential source locations and release times. In our study, forecasts with the Lagrangian particle dispersion model HYSPLIT (NOAA) and GFS (NCEP) meteorological data are used to assess the plume propagation patterns for hypothetical releases at the known DPRK nuclear test site. The results show a considerable sensitivity of the IMS station RN 38 Takasaki (Japan) to a potential radionuclide release at the test site in the days and weeks following the explosion in January 2016.
In addition, backtracking simulations with ECMWF analysis data at 0.2° horizontal resolution are performed for selected samples to obtain a complementary estimate of the sensitivities and the corresponding thresholds for detectable releases. The meteorological situation is compared to the aftermath of the nuclear explosion of 12 February 2013, after which an unusual 131mXe signature at RN 38 eight weeks after the test could very likely be attributed to a late release from the DPRK event.

  2. Entering the New Millennium: Dilemmas in Arms Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BROWN,JAMES

The end of the Cold War finds the international community no longer divided into two opposing blocs. The concerns the community now faces are becoming more fluid, less focused and, in many ways, much less predictable. Issues of religion, ethnicity, and nationalism; the possible proliferation of weapons of mass destruction; and the diffusion of technology and information processing throughout the world community have greatly changed the international security landscape in the last decade. Although the challenges appear formidable, the United Nations, States Parties, nongovernmental organizations, and the arms control community are moving to address and lessen these concerns through both formal and informal efforts. Many of the multilateral agreements (e.g., NPT, BWC, CWC, CTBT, MTCR), as well as the bilateral efforts taking place between Washington and Moscow, employ confidence-building and transparency measures. These measures, along with on-site inspection and other verification procedures, lessen suspicion and distrust and reduce uncertainty, thus enhancing stability, confidence, and cooperation.

  3. Isotopic signature of atmospheric xenon released from light water reactors.

    PubMed

    Kalinowski, Martin B; Pistner, Christoph

    2006-01-01

A global monitoring system for atmospheric xenon radioactivity is being established as part of the International Monitoring System to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The isotopic activity ratios of (135)Xe, (133m)Xe, (133)Xe and (131m)Xe are of interest for distinguishing nuclear explosion sources from civilian releases. Simulations of light water reactor (LWR) fuel burn-up through three operational reactor power cycles are conducted to explore the possible xenon isotopic signature of nuclear reactor releases under different operational conditions. The study examines how ratio changes are related to various parameters, including the neutron flux, uranium enrichment and fuel burn-up. Further, the impact of diffusion and mixing on the variability of the isotopic activity ratios is explored. The simulations are validated against reported reactor emissions. In addition, activity ratios are calculated for xenon isotopes released from nuclear explosions and compared to the reactor ratios in order to determine whether the discrimination of explosion releases from reactor effluents is possible on the basis of isotopic activity ratios.
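A toy version of ratio-based screening might look like the following: samples are plotted in a log-log plane of two activity ratios and compared against a separation line. The line coefficients and sample ratios here are purely illustrative placeholders, not the fitted discrimination line such studies derive:

```python
import math

def is_explosion_like(r_135_133, r_133m_131m, slope=1.0, intercept=0.0):
    """Screen a sample in the log-log plane of the (135)Xe/(133)Xe and
    (133m)Xe/(131m)Xe activity ratios. Points above the separation line
    are flagged as explosion-like. The line (slope, intercept) is an
    illustrative placeholder; in practice it is fitted to simulated
    reactor and explosion source terms."""
    x = math.log10(r_135_133)
    y = math.log10(r_133m_131m)
    return y > slope * x + intercept

# Fresh fission debris is relatively rich in the short-lived and
# metastable isotopes (illustrative numbers only):
explosion_sample = is_explosion_like(1.0, 10.0)
reactor_sample = is_explosion_like(0.1, 0.001)
```

The point of the two-ratio plane is that reactor effluents and explosion debris populate different regions even when any single ratio overlaps.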

  4. Working Towards Deep-Ocean Temperature Monitoring by Studying the Acoustic Ambient Noise Field in the South Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Sambell, K.; Evers, L. G.; Snellen, M.

    2017-12-01

Deriving the deep-ocean temperature is a challenge: in-situ and satellite observations are hardly applicable. However, knowledge about changes in deep-ocean temperature is important in relation to climate change. The oceans are filled with low-frequency sound waves created by sources such as underwater volcanoes, earthquakes and seismic surveys. The propagation of these sound waves is temperature dependent and therefore carries valuable information that can be used for temperature monitoring. This phenomenon is investigated by applying interferometry to hydroacoustic data measured in the South Pacific Ocean. The data are recorded at hydrophone station H03, which is part of the International Monitoring System (IMS). This network consists of several stations around the world and is in place for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The station consists of two arrays located north and south of Robinson Crusoe Island, separated by 50 km. Each array consists of three hydrophones with an inter-sensor distance of 2 km, located at a depth of 1200 m. This depth lies within the SOFAR channel. Hydroacoustic data measured at the south station are cross-correlated for the period 2014-2017. The results are improved by applying one-bit normalization as a preprocessing step. Furthermore, beamforming is applied to the hydroacoustic data in order to characterize ambient noise sources around the array. This shows the presence of a continuous source at a backazimuth between 180 and 200 degrees throughout the whole period, in agreement with the results obtained by cross-correlation. Studies of the source strength show a seasonal dependence, an indication that the sound is related to acoustic activity in Antarctica. These results are supported by acoustic propagation modeling: the normal mode technique is used to study the sound propagation from possible source locations towards station H03.
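The cross-correlation with one-bit normalization described above can be sketched in a few lines. The delay, trace length and noise model below are synthetic stand-ins for real hydrophone data; recovering the imposed delay is what makes travel-time (and hence temperature) monitoring possible:

```python
import random

def one_bit(trace):
    """One-bit normalization: keep only the sign of each sample, which
    suppresses dominant transient sources before cross-correlation."""
    return [1 if x > 0 else (-1 if x < 0 else 0) for x in trace]

def cross_correlate(a, b, max_lag):
    """Cross-correlation of two equal-length traces for lags in
    [-max_lag, max_lag]; returns the lag of the peak and all values.
    The peak lag approximates the inter-sensor travel-time offset."""
    corr = {}
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        s = 0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[j] * b[i]
        corr[lag] = s
    best = max(corr, key=corr.get)
    return best, corr

# Synthetic test: the same noise sequence recorded at two hydrophones,
# the second delayed by 5 samples.
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(500)]
h1 = noise[:-5]
h2 = noise[5:]          # h2[i] == h1[i + 5]

lag, _ = cross_correlate(one_bit(h1), one_bit(h2), max_lag=20)
```

In production, stacking many days of correlations (as in the 2014-2017 study period) is what raises the coherent travel-time peak above the ambient noise floor.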

  5. Bayesian Monitoring Systems for the CTBT: Historical Development and New Results

    NASA Astrophysics Data System (ADS)

    Russell, S.; Arora, N. S.; Moore, D.

    2016-12-01

A project at Berkeley, begun in 2009 in collaboration with CTBTO and more recently with LLNL, has reformulated the global seismic monitoring problem in a Bayesian framework. A first-generation system, NETVISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as a Monte Carlo inference algorithm. The probabilistic model allows for seamless integration of various disparate sources of information, including negative information (the absence of detections). Working from arrivals extracted by traditional station processing from International Monitoring System (IMS) data, NETVISA achieves a reduction of around 60% in the number of missed events compared with the currently deployed network processing system. It also finds many events that are missed by the human analysts who postprocess the IMS output. Recent improvements include the integration of models for infrasound and hydroacoustic detections and a global depth model for natural seismicity trained from ISC data. NETVISA is now fully compatible with the CTBTO operating environment. A second-generation model called SIGVISA extends NETVISA's generative model all the way from events to raw signal data, avoiding the error-prone bottom-up detection phase of station processing. SIGVISA's model automatically captures the phenomena underlying existing detection and location techniques such as multilateration, waveform correlation matching, and double-differencing, and integrates them into a global inference process that also (like NETVISA) handles de novo events. Initial results for the Western US in early 2008 (when the transportable US Array was operating) show that SIGVISA finds, from IMS data only, more than twice the number of events recorded in the CTBTO Late Event Bulletin (LEB). For mb 1.0-2.5, the ratio is more than 10; put another way, for this data set, SIGVISA lowers the detection threshold by roughly one magnitude compared to the LEB.
The broader message of this work is that probabilistic inference based on a vertically integrated generative model that directly expresses geophysical knowledge can be a much more effective approach for interpreting scientific data than the traditional bottom-up processing pipeline.
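The value of negative information in such a Bayesian framework can be illustrated with a toy posterior computation. The station detection probabilities, false-alarm rate and prior below are invented for illustration; NETVISA's actual generative model is far richer:

```python
def event_posterior(prior, det_prob, observed, false_alarm=0.01):
    """Posterior probability that an event occurred, given per-station
    detection flags. Toy version of the generative-model idea: a
    non-detection at a sensitive station is evidence *against* an event.

    det_prob[i]  -- P(detection at station i | event)   (illustrative)
    false_alarm  -- P(detection at station i | no event) (illustrative)
    """
    like_event = prior
    like_none = 1.0 - prior
    for p, obs in zip(det_prob, observed):
        like_event *= p if obs else (1.0 - p)
        like_none *= false_alarm if obs else (1.0 - false_alarm)
    return like_event / (like_event + like_none)

# Two stations detect; a third, highly sensitive station does not.
post = event_posterior(prior=0.001,
                       det_prob=[0.9, 0.8, 0.95],
                       observed=[True, True, False])

# Same detections, but the silent station is insensitive (P_det = 0.2):
post_insensitive = event_posterior(0.001, [0.9, 0.8, 0.2],
                                   [True, True, False])
```

The missing detection at the sensitive station sharply lowers the posterior, which is exactly the "negative information" that a bottom-up pipeline discards.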

  6. A fast and robust method for moment tensor and depth determination of shallow seismic events in CTBT related studies.

    NASA Astrophysics Data System (ADS)

    Baker, Ben; Stachnik, Joshua; Rozhkov, Mikhail

    2017-04-01

The International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and assist States Parties in identifying the source of specific events, according to the Protocol to the Comprehensive Nuclear-Test-Ban Treaty. Determination of a seismic event's source mechanism and depth is closely related to these tasks. It is typically done through a linearized inversion of the waveforms for a complete or partial set of source parameters, or through a grid search over precomputed Green's functions created for particular source models. In this presentation we demonstrate preliminary results obtained with the latter approach from an improved software design. In this development we aimed to be compliant with the different modes of the CTBT monitoring regime: cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body- and surface-wave recordings, be fast enough for both on-demand studies and automatic processing, properly incorporate observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. Posterior distributions of moment tensor parameters show narrow peaks where a significant number of reliable surface wave observations are available. For earthquake examples, posterior distributions of the fault orientation (strike, dip and rake) also provide results consistent with published catalogues. Inclusion of observations on horizontal components will provide further constraints. In addition, the calculation of teleseismic P-wave Green's functions is improved through prior analysis to determine an appropriate attenuation parameter for each source-receiver path. The implemented HDF5-based pre-packaging of Green's functions allows much greater flexibility in utilizing different software packages and computation methods.
Further additions will enable rapid use of Instaseis/AXISEM full-waveform synthetics added to a pre-computed Green's function archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for the DPRK events and for shallow earthquakes using a new implementation of teleseismic P-wave waveform fitting. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions, employing the method of Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective. Probabilistic uncertainty estimates on the moment tensor parameters lend robustness to the solution.
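The grid-search-over-precomputed-Green's-functions strategy can be sketched as a toy least-squares search: synthetics are linear combinations of elementary Green's functions weighted by the candidate source, and the candidate minimizing the waveform misfit wins. The random "Green's functions" and three candidates are stand-ins; the real implementation searches the full Tape & Tape discretization with probabilistic misfits:

```python
import random

def synthetic(greens, m):
    """Combine elementary Green's functions linearly with source weights m."""
    return [sum(g[k] * m[k] for k in range(len(m))) for g in greens]

def misfit(observed, greens, m):
    """L2 waveform misfit between the data and the synthetic for weights m."""
    syn = synthetic(greens, m)
    return sum((o - s) ** 2 for o, s in zip(observed, syn))

def grid_search(observed, greens, candidates):
    """Return the candidate source minimizing the misfit, mirroring the
    precomputed-Green's-function grid-search strategy."""
    errors = [misfit(observed, greens, m) for m in candidates]
    best = min(range(len(errors)), key=errors.__getitem__)
    return candidates[best], errors

random.seed(1)
# 200 time samples of 3 toy elementary Green's functions.
greens = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(200)]
true_m = [1.0, -0.5, 0.2]
observed = synthetic(greens, true_m)

candidates = [[1.0, -0.5, 0.2], [0.0, 1.0, 0.0], [-1.0, 0.5, -0.2]]
best_m, errors = grid_search(observed, greens, candidates)
```

Because the synthetics are linear in the source weights, precomputing the Green's functions (e.g., in an HDF5 archive, as described above) moves all the expensive wave-propagation work out of the search loop.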

  7. The Seismic Aftershock Monitoring System (SAMS) for OSI - Experiences from IFE14

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Sick, Benjamin; Häge, Martin; Blake, Thomas; Labak, Peter; Joswig, Manfred

    2016-04-01

An on-site inspection (OSI) is the third of four elements of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The sole purpose of an OSI is to confirm whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of the treaty, and to gather any facts which might assist in identifying a possible violator. It thus constitutes the final verification measure under the CTBT if all other available measures are unable to confirm the nature of a suspicious event. The Provisional Technical Secretariat (PTS) carried out the Integrated Field Exercise 2014 (IFE14) in the Dead Sea area of Jordan from 3 November to 9 December 2014. It was a fictitious OSI whose aim was to test the inspection capabilities in an integrated manner. The technologies allowed during an OSI are listed in the treaty. The aim of the Seismic Aftershock Monitoring System (SAMS) is to detect and localize low-magnitude aftershocks of the triggering event, or collapses of underground cavities. The locations of these events are expected in the vicinity of a possible previous explosion and help to narrow down the search area within an inspection area (IA) of an OSI. The success of SAMS depends on its main elements: hardware, software, deployment strategy, the search logic and, not least, the effective use of personnel. All elements of SAMS were tested and improved during the Build-Up Exercises (BUE), which took place in Austria and Hungary. IFE14 provided more realistic climatic and hazardous terrain conditions with limited resources. Significant variations in the topography of the IFE14 IA in the mountainous Dead Sea area of Jordan led to considerable challenges that were not anticipated from the experience gained during the BUE. SAMS uses mini-arrays with an aperture of about 100 meters and a total of four elements. The station network deployed during IFE14 and results of the data analysis will be presented.
Possible aftershocks of the triggering event are expected in a very low magnitude range. Therefore, the detection threshold of the network is one of the key parameters of SAMS and crucial for the success of the monitoring. One of the objectives was to record magnitudes down to -2.0 ML. The threshold values have been compared with the historical seismicity in the region and with that monitored during IFE14. Results of the detection threshold estimation and experiences from the exercise will be presented.

  8. NG09 And CTBT On-Site Inspection Noble Gas Sampling and Analysis Requirements

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Tanaka, Junichi

    2010-05-01

A provision of the Comprehensive Test Ban Treaty (CTBT) allows on-site inspections (OSIs) of suspect nuclear sites to determine whether a detected event was nuclear in origin. For an underground nuclear explosion (UNE), the potential success of an OSI depends significantly on the containment scenario of the alleged event, as well as on the application of air and soil-gas radionuclide sampling techniques in a manner that takes into account both the suspect site geology and the gas transport physics. UNE scenarios may be broadly divided into categories by level of containment. The simplest to detect is a UNE that vents a significant portion of its radionuclide inventory and is readily detectable at distance by the International Monitoring System (IMS). The most well-contained subsurface events will only be detectable during an OSI. In such cases, 37Ar and radioactive xenon cavity gases may reach the surface through either "micro-seepage" or barometric pumping, and only the careful siting of sampling locations, timing of sampling and application of the most site-appropriate atmospheric and soil-gas capturing methods will result in a confirmatory signal. The OSI noble gas field test NG09 was recently held in Stupava, Slovakia to consider, in addition to other field sampling and analysis techniques, drilling and subsurface noble gas extraction methods that might be applied during an OSI. One of the experiments focused on challenges to soil-gas sampling near the soil-atmosphere interface. During withdrawal of soil gas from shallow subsurface sample points, atmospheric dilution of the sample and the potential for introduction of unwanted atmospheric gases were considered. Tests were designed to evaluate surface infiltration and the ability of inflatable well-packers to seal out atmospheric gases during sample acquisition.
We discuss these tests along with some model-based predictions regarding infiltration under different near-surface hydrologic conditions. We also consider how naturally occurring as well as introduced (e.g., SF6) soil-gas tracers might be used to guard against the possibility of atmospheric contamination of soil gases while sampling during an actual OSI. The views expressed here do not necessarily reflect the opinion of the United States Government, the United States Department of Energy, or Lawrence Livermore National Laboratory. This work has been performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-418791

  9. Stockpile stewardship past, present, and future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Marvin L., E-mail: mladams@tamu.edu

    2014-05-09

The U.S. National Academies released a report in 2012 on technical issues related to the Comprehensive Test Ban Treaty. One important question addressed therein is whether the U.S. could maintain a safe, secure, and reliable nuclear-weapons stockpile in the absence of nuclear-explosion testing. Here we discuss two main conclusions from the 2012 Academies report, which we paraphrase as follows: 1) Provided that sufficient resources and a national commitment to stockpile stewardship are in place, the U.S. has the technical capabilities to maintain a safe, secure, and reliable stockpile of nuclear weapons into the foreseeable future without nuclear-explosion testing. 2) Doing this would require: a) a strong weapons science and engineering program that addresses gaps in understanding; b) an outstanding workforce that applies deep and broad weapons expertise to deliver solutions to stockpile problems; c) a vigorous, stable surveillance program that delivers the requisite data; d) production facilities that meet stewardship needs. We emphasize that these conclusions are independent of CTBT ratification; they apply provided only that the U.S. continues its nuclear-explosion moratorium.

  10. Characterization of a Commercial Silicon Beta Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foxe, Michael P.; Hayes, James C.; Mayer, Michael F.

Silicon detectors are of interest for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) due to their enhanced energy resolution compared to plastic-scintillator beta cells. Previous work developing a figure of merit (FOM) for the comparison of beta cells suggests that the minimum detectable activity (MDA) could be reduced by a factor of two to three with the use of silicon detectors. Silicon beta cells have been developed by CEA (France) and Lares Ltd. (Russia), with the PIPSBox developed by CEA being commercially available from Canberra for approximately $35k, but there is still uncertainty about the reproducibility of these capabilities in the field. PNNL is developing a high-resolution beta-gamma detector system in its shallow underground laboratory, which will utilize and characterize the operation of the PIPSBox detector. Throughout this report, we examine the capabilities of the PIPSBox as developed by CEA. The lessons learned through the testing and use of the PIPSBox will allow PNNL to strategically develop a silicon detector optimized to better suit the community's needs in the future.

  11. Development of a new aerosol monitoring system and its application in Fukushima nuclear accident related aerosol radioactivity measurement at the CTBT radionuclide station in Sidney, Canada.

    PubMed

    Zhang, Weihua; Bean, Marc; Benotto, Mike; Cheung, Jeff; Ungar, Kurt; Ahier, Brian

    2011-12-01

A high-volume aerosol sampler ("Grey Owl") has been designed and developed at the Radiation Protection Bureau, Health Canada. Its design was guided by the need for a reliable, low-operational-cost sampler to provide daily aerosol monitoring samples that can be used as reference samples for radiological studies. It has been developed to provide a constant air flow rate at low pressure drops (∼3 kPa for a day's sampling) with variations of less than ±1% of the full-scale flow rate. Its energy consumption is only about 1.5 kW for a filter sampling over 22,000 standard cubic meters of air. This Fukushima nuclear accident related aerosol radioactivity monitoring study at the Sidney station, B.C., demonstrated that the sampler is robust and reliable. The results provided by the new monitoring system were used to support decision-making in Canada during an emergency response. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of the measured data with calculated isotopic ratios, as well as analysis using atmospheric transport modeling, indicates that the xenon measured was likely created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source term for each release was calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory for a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material used in the test. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
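The 7-8 week delay matters because (131m)Xe decays in the meantime, so only a small fraction of the inventory produced at detonation time remains available for a late release. A quick estimate, assuming the approximate 11.84-day half-life (an illustrative rounded value):

```python
import math

HALF_LIFE_XE131M_D = 11.84  # approximate (131m)Xe half-life in days

def remaining_fraction(days, half_life=HALF_LIFE_XE131M_D):
    """Fraction of an initial (131m)Xe inventory still undecayed after
    `days`, from simple exponential decay."""
    return math.exp(-math.log(2.0) * days / half_life)

# Roughly 4-6% of the at-detonation inventory survives to 7-8 weeks:
frac_7w = remaining_fraction(49.0)
frac_8w = remaining_fraction(56.0)
```

This is why a delayed release of only a small absolute activity can still correspond to a few percent of the original explosion inventory.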

  13. New Horizons and New Strategies in Arms Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J. editor

In the ten years since the break-up of the Soviet Union, remarkable progress in arms control and disarmament has occurred. The Nuclear Non-Proliferation Treaty (NPT), the completion of the Comprehensive Test Ban Treaty (CTBT), and the Chemical Weapons Convention (CWC) are indicative of the great strides made in the non-proliferation arena. Simultaneously, the Intermediate-Range Nuclear Forces Treaty (INF), the Conventional Forces in Europe Treaty (CFE), and the Strategic Arms Reduction Treaties (START), all associated with US-Soviet (now Russian) relations, have assisted in redefining European relations and the security landscape. Finally, it now appears that progress is in the offing in developing enhanced compliance measures for the Biological and Toxin Weapons Convention (BTWC). In sum, all of these achievements have set the stage for the next round of arms control activities, which may lead to a much broader, and perhaps more diffuse, multilateral agenda. In this new and somewhat unpredictable international setting, arms control and disarmament issues will require solutions that are more creative and innovative than heretofore.

  14. Acoustic-Seismic Coupling of Broadband Signals - Analysis of Potential Disturbances during CTBT On-Site Inspection Measurements

    NASA Astrophysics Data System (ADS)

    Liebsch, Mattes; Altmann, Jürgen

    2015-04-01

For the verification of the Comprehensive Nuclear Test Ban Treaty (CTBT), the precise localisation of possible underground nuclear explosion sites is important. During an on-site inspection (OSI), sensitive seismic measurements of aftershocks can be performed, which, however, can be disturbed by other signals. To improve the quality and effectiveness of these measurements it is essential to understand those disturbances so that they can be reduced or prevented. In our work we focus on disturbing signals caused by airborne sources: when the sound of aircraft (as often used by the inspectors themselves) hits the ground, it propagates through pores in the soil. Its energy is transferred to the ground and soil vibrations are created which can mask weak aftershock signals. The understanding of the coupling of acoustic waves to the ground is still incomplete. However, it is necessary for improving the performance of an OSI, e.g. to address potential consequences for sensor placement, helicopter trajectories, etc. We present our recent advances in this field. We performed several measurements to record the sound pressure and soil velocity produced by various sources, e.g. broadband excitation by jet aircraft passing overhead and signals artificially produced by a speaker. For our experimental set-up, microphones were placed close to the ground and geophones were buried at different depths in the soil. Several sensors were shielded from the directly incident acoustic signals by a box coated with acoustic damping material. While the sound pressure under the box was strongly reduced, the soil velocity measured under the box was only slightly smaller than outside of it. These soil vibrations were thus mostly created outside the box and travelled through the soil to the sensors. This information is used to estimate characteristic propagation lengths of the acoustically induced signals in the soil.
In the seismic data we observed interference patterns which are likely caused by the superposition of acoustically induced seismic waves with reflections at a layer boundary. The frequencies of increased or decreased amplitude depend on the angle of incidence of the acoustic signal, so these patterns can be used to estimate the path(s) of propagation of acoustically induced soil vibrations. The frequency-dependent phase offset between different sensors is used to estimate the propagation velocity in the soil. The research aims to deliver a better understanding of the interaction of acoustic waves with the ground at the surface, the transfer of energy from sound waves into the soil and the possible excitation of seismic surface waves. The goal is to develop recommendations for sensitive seismic measurements during CTBTO on-site inspections to reduce disturbing vibrations caused by airborne sources.

  15. Monitoring of reported sudden emission rate changes of major radioxenon emitters in the northern and southern hemispheres in 2008 to assess their contribution to the respective radioxenon backgrounds

    NASA Astrophysics Data System (ADS)

    Saey, P. R. J.; Auer, M.; Becker, A.; Colmanet, S.; Hoffmann, E.; Nikkinen, M.; Schlosser, C.; Sonck, M.

    2009-04-01

    Atmospheric radioxenon monitoring is a key component of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Radiopharmaceutical production facilities (RPF) have recently been identified of emitting the major part of the environmental radioxenon measured at globally distributed monitoring sites deployed to strengthen the radionuclide part of the CTBT verification regime. Efforts to raise a global radioxenon emission inventory revealed that the global total emission from RPF's is 2-3 orders of magnitude higher than the respective emissions related to maintenance of all nuclear power plants (NPP). Given that situation we have seen in 2008 two peculiar hemisphere-specific situations: 1) In the northern hemisphere, a joint shutdown of the global largest four radiopharmaceutical facilities revealed the contribution of the normally 'masked' NPP related emissions. Due to an incident, the Molybdenum production at the "Institut des Radioéléments" (IRE) in Fleurus, Belgium, was shut down between Monday 25 August and 2 December 2008. IRE is the third largest global producer of medical isotopes. In the same period, but for different reasons, the other three worldwide largest producers (CRL in Canada, HFR in The Netherlands and NTP in South Africa) also had scheduled and unscheduled shutdowns. The activity concentrations of 133Xe measured at the Schauinsland Mountain station near Freiburg in Germany (situated 380 km SW of Fleurus) which have a mean of 4.8 mBq/m3 for the period February 2004 - August 2008, went down to 0.87 mBq/m3 for the period September - November 2008. 2) In the southern hemisphere, after a long break, the only radiopharmaceutical facility in Australia started up test production in late November 2008. In the period before the start-up, the background of radioxenon in Australia (Melbourne and Darwin) was below measurable quantities. 
During six test runs of the renewed RPF at ANSTO in Lucas Heights, up to 6 mBq/m3 of 133Xe were measured at the station in Melbourne, 700 km SW of the facility. This paper first confirms the hypothesis that radiopharmaceutical production facilities are the major emitters of radioxenon. Moreover, it demonstrates how the temporary shutdown of these facilities reveals the scale of their contribution to the European radioxenon background, which decreased six-fold. Finally, we have studied the contribution of the start-up of a renewed RPF to the build-up of a radioxenon background across Australia and the southern hemisphere. Disclaimer: The views expressed in this publication are those of the authors and do not necessarily reflect the views of the CTBTO Preparatory Commission or any of the participating institutions.

  16. Fusion of waveform events and radionuclide detections with the help of atmospheric transport modelling

    NASA Astrophysics Data System (ADS)

    Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry

    2016-04-01

    Possibilities of associating information from the four pillars constituting the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) for a long time. Based on a concept of overlaying waveform events on the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype of a Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate the various time scales, a dynamic product is envisioned that evolves as the results of the different technologies are processed and compiled. The product would be available through the Secure Web Portal (SWP). In this presentation we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.

  17. Status report on the establishment of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Campus, P.; Bell, M.; Hoffmann, T. L.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.

    2002-11-01

    The infrasound component of the International Monitoring System (IMS) for Comprehensive Nuclear-Test-Ban Treaty verification aims for global detection and localization of low-frequency sound waves originating from atmospheric nuclear explosions. The infrasound network will consist of 60 array stations, distributed as evenly as possible over the globe to assure at least two-station detection capability for 1-kton explosions at any point on earth. This network will be larger and more sensitive than any other previously operated infrasound network. As of today, 85% of the site surveys for IMS infrasound stations have been completed, 25% of the stations have been installed, and 8% of the installations have been certified and are transmitting high-quality continuous data to the International Data Center in Vienna. By the end of 2002, 20% of the infrasound network is expected to be certified and operating in post-certification mode. This presentation will discuss the current status and progress made in the site survey, installation, and certification programs for IMS infrasound stations. A review will be presented of the challenges and difficulties encountered in these programs, together with practical solutions to these problems.

  18. High-pressure swing system for measurements of radioactive fission gases in air samples

    NASA Astrophysics Data System (ADS)

    Schell, W. R.; Vives-Battle, J.; Yoon, S. R.; Tobin, M. J.

    1999-01-01

    Radionuclides emitted from nuclear reactors, fuel reprocessing facilities and nuclear weapons tests are distributed widely in the atmosphere but have very low concentrations. As part of the Comprehensive Test Ban Treaty (CTBT), identification and verification of the emission of radionuclides from such sources are fundamental in maintaining nuclear security. To detect underground and underwater nuclear weapons tests, only the gaseous components need to be analyzed. Equipment has now been developed that can be used to collect large volumes of air, separate and concentrate the radioactive gas constituents, such as xenon and krypton, and measure them quantitatively. By measuring xenon isotopes with different half-lives, the time since the fission event can be determined. Developments in high-pressure (3500 kPa) swing chromatography using molecular sieve adsorbents have provided the means to collect and purify trace quantities of the gases from large volumes of air automatically. New scintillation detectors, together with timing and pulse shaping electronics, have provided the low-background levels essential in identifying the gamma ray, X-ray, and electron energy spectra of specific radionuclides. System miniaturization and portability with remote control could be designed for a field-deployable production model.
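The dating idea mentioned above (isotopes with different half-lives fix the time since fission) can be sketched numerically. This is a minimal illustration, not the equipment's actual analysis chain: the half-lives are standard nuclear-data values, while the initial activity ratio `r0` is a hypothetical normalization.

```python
import math

# Half-lives in hours (standard nuclear-data values).
T_HALF_XE133 = 5.243 * 24.0   # 133Xe
T_HALF_XE135 = 9.14           # 135Xe

def activity_ratio(t_hours, r0=1.0):
    """Activity ratio A(135Xe)/A(133Xe) after t_hours, starting from r0.
    135Xe decays faster, so the ratio falls monotonically with time."""
    lam133 = math.log(2) / T_HALF_XE133
    lam135 = math.log(2) / T_HALF_XE135
    return r0 * math.exp(-(lam135 - lam133) * t_hours)

def time_since_event(measured_ratio, r0=1.0):
    """Invert the ratio for the elapsed time in hours."""
    lam133 = math.log(2) / T_HALF_XE133
    lam135 = math.log(2) / T_HALF_XE135
    return math.log(r0 / measured_ratio) / (lam135 - lam133)

# A measured ratio at 10% of its initial value implies roughly 1.4 days
# since the fission event under these assumptions.
elapsed = time_since_event(0.1)
```

In practice the initial ratio is fixed by the fission yields of the source term, which is exactly why discriminating reactor releases from explosions requires the multi-isotope measurement the abstract describes.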

  19. Infrasonic waves from volcanic eruptions on the Kamchatka peninsula

    NASA Astrophysics Data System (ADS)

    Gordeev, E. I.; Firstov, P. P.; Kulichkov, S. N.; Makhmudov, E. R.

    2013-07-01

    The IS44 station operates at the observation point of Nachiki on the Kamchatka peninsula, which is part of the International Monitoring System (IMS), and it helps verify compliance with the Comprehensive Nuclear Test-Ban Treaty (CTBT). The Kamchatka Branch, Geophysical Service, Russian Academy of Sciences (KB GS RAS), has a station operating in the village of Paratunka. Both of these stations allow one to monitor strong explosive eruptions of andesitic volcanoes. Both kinematic and dynamic parameters of acoustic signals accompanying the eruptions of the Bezymyannyi volcano (at a distance of 361 km from Nachiki) in 2009-2010 and the Kizimen volcano (at a distance of 275 km) on December 31, 2011, are considered. A low-frequency rarefaction phase 60 s in length has been revealed in the initial portion of the record of acoustic signals accompanying such strong eruptions. It is shown that the rarefaction phase occurs due to the rapid condensation of superheated juvenile vapor that enters the atmosphere during such explosions. The amount of volcanic ash emitted into the atmosphere has been estimated at (3.2-7.3) × 10⁶ m³ on the basis of acoustic signals recorded during the eruptions under consideration.

  20. Radionuclide observables for the Platte underground nuclear explosive test on 14 April 1962

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, Jonathan L.; Milbrath, Brian D.

    2016-11-01

    Past nuclear weapons tests provide invaluable information for understanding the radionuclide observables and data quality objectives expected during an On-site Inspection (OSI) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These radioactive signatures are complex and subject to spatial and temporal variability. The Platte Underground Nuclear Test on 14 April 1962 provides extensive environmental monitoring data that can be modelled and used to assess an OSI. The 1.6 kT test is especially useful as it released the highest amounts of recorded activity during Operation Nougat at the Nevada Test Site – now known as the Nevada National Security Site (NNSS). It has been estimated that 0.36% of the activity was released, and dispersed in a northerly direction. The deposition ranged from 1 × 10⁻¹¹ to 1 × 10⁻⁹ of the atmospheric release (per m²), and has been used to evaluate a hypothetical OSI at 1 week to 2 years post-detonation. Radioactive decay reduces the activity of the 17 OSI relevant radionuclides by 99.7%, such that detection throughout the inspection is only achievable close to the explosion where deposition was highest.
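The decay bookkeeping behind a statement like "decay reduces the activity by 99.7%" can be sketched as a sum of exponentials over the nuclide inventory. The isotope list and the equal-initial-activity assumption below are hypothetical illustrations, not the paper's actual 17-nuclide OSI inventory.

```python
import math

# Hypothetical fission-product subset with half-lives in days.
HALF_LIVES_DAYS = {
    "131I": 8.02,
    "140Ba": 12.75,
    "95Zr": 64.0,
    "137Cs": 30.1 * 365.25,
}

def remaining_fraction(t_days, half_lives=HALF_LIVES_DAYS):
    """Fraction of the summed initial activity left after t_days,
    assuming (for illustration) equal initial activities per nuclide."""
    n = len(half_lives)
    return sum(0.5 ** (t_days / t12) for t12 in half_lives.values()) / n
```

Because the short-lived nuclides dominate the early signal, the mixture's total activity drops steeply over the first months and then flattens toward the long-lived residue, which is why late inspections only succeed where deposition was highest.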

  1. LLNL's Regional Model Calibration and Body-Wave Discrimination Research in the Former Soviet Union using Peaceful Nuclear Explosions (PNEs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharyya, J.; Rodgers, A.; Swenson, J.

    2000-07-14

    Long-range seismic profiles from Peaceful Nuclear Explosions (PNE) in the Former Soviet Union (FSU) provide a unique data set to investigate several important issues in regional Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring. The recording station spacing (~15 km) allows for extremely dense sampling of the propagation from the source to ~3300 km. This allows us to analyze the waveforms at local, near- and far-regional and teleseismic distances. These data are used to: (1) study the evolution of regional phases and phase amplitude ratios along the profile; (2) infer one-dimensional velocity structure along the profile; and (3) evaluate the spatial correlation of regional and teleseismic travel times and regional phase amplitude ratios. We analyzed waveform data from four PNEs (mb = 5.1-5.6) recorded along profile KRATON, which is an east-west trending profile located in northern Siberia. Short-period regional discriminants, such as P/S amplitude ratios, will be essential for seismic monitoring of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) at small magnitudes (mb < 4.0). However, P/S amplitude ratios in the short-period band, 0.5-5.0 Hz, show some scatter. This scatter is primarily due to propagation and site effects, which arise from variability in the elastic and anelastic structure of the crustal waveguide. Preliminary results show that Pg and Lg propagate efficiently in north Siberia at regional distances. The amplitude ratios show some variability between adjacent stations that are modeled by simple distance trends. The effect of topography, sediment and crustal thickness, and upper mantle discontinuities on these ratios, after removal of the distance trends, will be investigated. The travel times of the body wave phases recorded on KRATON have been used to compute the one-dimensional structure of the crust and upper mantle in this region.
The path-averaged one-dimensional velocity model was computed by minimizing the first arriving P-phase travel-time residuals for all distances (Δ = 300-2300 km). A grid search approach was used in the minimization. The most significant features of this model are the negative lid-gradient and a low-velocity zone in the upper mantle between the depths of 100-200 km; the precise location of the LVZ is poorly constrained by the travel time data. We will extend our investigation to additional PNE lines to further investigate the amplitude and travel-time variations in eastern and central Eurasia. Finally, the dense station spacing of the PNE profiles allows us to model the spatial correlation of travel times and amplitude ratios through variogram modeling. The statistical analysis suggests that the correlation lengths of the travel-time and amplitude measurements are 12° and 10°, respectively.
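The grid-search inversion described above can be sketched with a toy two-layer model: first-arriving P times are the minimum of the direct crustal phase and the mantle head wave, and the parameter grid is scanned for the smallest summed residual. The velocities, depths, and synthetic "observations" here are illustrative, not the KRATON data or the authors' parameterization.

```python
import math

def first_arrival(x_km, v_crust, v_mantle, moho_km):
    """First-arriving P time (s): direct Pg vs Pn head wave in a
    single-layer-over-halfspace model."""
    t_direct = x_km / v_crust
    t_refract = (x_km / v_mantle
                 + 2.0 * moho_km * math.sqrt(1.0 / v_crust**2
                                             - 1.0 / v_mantle**2))
    return min(t_direct, t_refract)

def grid_search(distances, observed):
    """Return (v_crust, moho_km) minimizing the summed |residual|
    over a coarse grid, with the mantle velocity held fixed."""
    best, best_cost = None, float("inf")
    for v_crust in [round(5.8 + 0.05 * i, 2) for i in range(9)]:
        for moho in range(30, 51, 2):
            cost = sum(abs(first_arrival(x, v_crust, 8.1, moho) - t)
                       for x, t in zip(distances, observed))
            if cost < best_cost:
                best, best_cost = (v_crust, moho), cost
    return best

# Synthetic check: generate data from a known model and recover it.
xs = [50, 150, 300, 600, 1000, 1500]
obs = [first_arrival(x, 6.0, 8.1, 40) for x in xs]
recovered = grid_search(xs, obs)
```

The same brute-force pattern extends to more layers and to residual-weighting schemes; its appeal for path-averaged models is that it needs no derivatives and maps the cost surface directly.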

  2. Design and optimization of a noise reduction system for infrasonic measurements using elements with low acoustic impedance.

    PubMed

    Alcoverro, Benoit; Le Pichon, Alexis

    2005-04-01

    The implementation of the infrasound network of the International Monitoring System (IMS) for the enforcement of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) has intensified efforts to design suitable noise-reduction systems. In this paper we present a new design consisting of low impedance elements. The dimensioning and the optimization of this discrete mechanical system are based on numerical simulations, including a complete electroacoustical modeling and a realistic wind-noise model. The frequency response and the noise reduction obtained for a given wind speed are compared to statistical noise measurements in the [0.02-4] Hz frequency band. The effects of the design parameters (pipe length, inner diameters, summing volume, and number of air inlets) are investigated through a parametric study. The studied system consists of 32 air inlets distributed along an overall diameter of 16 m. Its frequency response is flat up to 4 Hz. For a 2 m/s wind speed, the maximal noise reduction obtained is 15 dB between 0.5 and 4 Hz. At lower frequencies, the noise reduction is improved by the use of a system of larger diameter. The main drawback is the high-frequency limitation introduced by acoustical resonances inside the pipes.

  3. The influence of periodic wind turbine noise on infrasound array measurements

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars

    2017-02-01

    Aerodynamic noise emissions from the continuously growing number of wind turbines in Germany are creating increasing problems for infrasound recording systems. These systems are equipped with highly sensitive micro pressure sensors accurately measuring acoustic signals in a frequency range inaudible to the human ear. Ten years of data (2006-2015) from the infrasound array IGADE in Northern Germany are analysed to quantify the influence of wind turbine noise on infrasound recordings. Furthermore, a theoretical model is derived and validated by a field experiment with mobile micro-barometer stations. Fieldwork was carried out in 2004 to measure the infrasonic pressure level of a single horizontal-axis wind turbine and to extrapolate the sound effect for a larger number of nearby wind turbines. The model estimates the generated sound pressure level of wind turbines and thus makes it possible to specify the minimum allowable distance between wind turbines and infrasound stations for undisturbed recording. This aspect is particularly important to guarantee the monitoring performance of the German infrasound stations I26DE in the Bavarian Forest and I27DE in Antarctica. These stations are part of the International Monitoring System (IMS) verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), and thus have to meet stringent specifications with respect to infrasonic background noise.

  4. Detection, Location, and Characterization of Hydroacoustic Signals Using Seafloor Cable Networks Offshore Japan

    NASA Astrophysics Data System (ADS)

    Suyehiro, K.; Sugioka, H.; Watanabe, T.

    2008-12-01

    The hydroacoustic monitoring by the International Monitoring System for CTBT (Comprehensive Nuclear-Test-Ban Treaty) verification utilizes six hydrophone stations and five seismic stations (called T-phase stations) for worldwide detection. Conspicuous signals of natural origin include those from earthquakes, volcanic eruptions, or whale calls. Among artificial sources are non-nuclear explosions and airgun shots. It is important for the IMS system to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As a number of seafloor cable networks are operated offshore of the Japanese islands, mostly facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure and seismic sensors) may be utilized to increase the capability of the IMS. We use these data to compare some selected event parameters with those from the IMS. In particular, there have been several unconventional acoustic signals in the western Pacific, which were also captured by IMS hydrophones across the Pacific from 2007 to the present. These anomalous examples, along with dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for detection, location and characterization of anomalous signals.

  5. Data mining on long-term barometric data within the ARISE2 project

    NASA Astrophysics Data System (ADS)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph

    2016-04-01

    The Comprehensive nuclear-Test-Ban Treaty (CTBT) led to the implementation of an international infrasound array network. The International Monitoring System (IMS) network includes 48 certified stations, each providing data for up to 15 years. As part of work package 3 of the ARISE2 project (Atmospheric dynamics Research InfraStructure in Europe, phase 2), the data sets will be statistically evaluated with regard to atmospheric dynamics. The current study focuses on fluctuations of absolute air pressure. Time series have been analysed for 17 monitoring stations located around the world, from Greenland to Antarctica, representing different climate zones and characteristic atmospheric conditions. This enables quantitative comparisons between those regions. Analyses are shown including wavelet power spectra, multi-annual time series of average variances with regard to long-wave scales, and spectral densities to derive characteristics and special events. Evaluations reveal periodicities in average variances on 2- to 20-day scales, with a maximum in the winter months and a minimum in summer of the respective hemisphere. This basically applies to time series of IMS stations beyond the tropics, where the dominance of cyclones and anticyclones changes with the seasons. Furthermore, spectral density analyses illustrate striking signals for several dynamic activities within one day, e.g., the semidiurnal tide.

  6. Natural ³⁷Ar concentrations in soil air: implications for monitoring underground nuclear explosions.

    PubMed

    Riedmann, Robin A; Purtschert, Roland

    2011-10-15

    For on-site inspections (OSI) under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), measurement of the noble gas ³⁷Ar is considered an important technique. ³⁷Ar is produced underground by neutron activation of calcium via the reaction ⁴⁰Ca(n,α)³⁷Ar. The natural equilibrium ³⁷Ar concentration in soil air is set by the balance of a production rate from cosmic ray neutrons that decreases exponentially with increasing soil depth, diffusive transport in the soil air, and radioactive decay (T½ = 35 days). In this paper, measurements of natural ³⁷Ar activities in soil air are presented for the first time. The highest activities of ~100 mBq m⁻³ air are 2 orders of magnitude larger than in the atmosphere and are found at 1.5-2.5 m depth. At depths > 8 m, ³⁷Ar activities are < 20 mBq m⁻³ air. After identifying the main ³⁷Ar production and gas transport factors, the expected global range of ³⁷Ar activities in shallow subsoil (0.7 m below the surface) was estimated. In high-altitude soils with high calcium content and low gas permeability, ³⁷Ar activities may reach values up to 1 Bq m⁻³.
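The steady-state balance described above (exponentially decreasing production, diffusion, decay) has a simple closed-form depth profile that reproduces the qualitative shape reported: near-zero at the surface, a maximum at a metre or two of depth, and low values deeper down. The parameter values below are rough assumptions for illustration, not the paper's calibrated soil model; the sketch solves D·C''(z) − k·C(z) + P0·exp(−z/L) = 0 with C(0) = 0 and C bounded at depth.

```python
import math

K_DECAY = math.log(2) / 35.0   # 37Ar decay constant, 1/day
D_SOIL = 0.17                  # effective soil-air diffusivity, m^2/day (assumed)
L_PROD = 0.8                   # neutron attenuation length, m (assumed)
P0 = 1.0                       # surface production rate (arbitrary units)

def ar37_profile(z_m):
    """Steady-state 37Ar concentration at depth z (arbitrary units):
    C(z) = B*(exp(-z/L) - exp(-mu*z)), mu = sqrt(k/D), B = P0/(k - D/L^2)."""
    mu = math.sqrt(K_DECAY / D_SOIL)
    b = P0 / (K_DECAY - D_SOIL / L_PROD**2)
    return b * (math.exp(-z_m / L_PROD) - math.exp(-mu * z_m))

# Depth of the concentration maximum (where the derivative vanishes):
mu = math.sqrt(K_DECAY / D_SOIL)
z_peak = math.log(mu * L_PROD) / (mu - 1.0 / L_PROD)
```

With these assumed values the maximum falls near 1.4 m, in the same depth range as the measured 1.5-2.5 m peak, while the concentration well below the peak is an order of magnitude smaller.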

  7. Detection and interpretation of seismoacoustic events at German infrasound stations

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellors, R J

    The Comprehensive Nuclear Test Ban Treaty (CTBT) includes provisions for an on-site inspection (OSI), which allows the use of specific techniques to detect underground anomalies including cavities and rubble zones. One permitted technique is active seismic surveys such as seismic refraction or reflection. The purpose of this report is to conduct some simple modeling to evaluate the potential use of seismic reflection in detecting cavities and to test the use of open-source software in modeling possible scenarios. It should be noted that OSI inspections are conducted under specific constraints regarding duration and logistics. These constraints are likely to significantly impact active seismic surveying, as a seismic survey typically requires considerable equipment, effort, and expertise. For the purposes of this study, which is a first-order feasibility study, these issues will not be considered. This report provides a brief description of the seismic reflection method along with some commonly used software packages. This is followed by an outline of a simple processing stream based on a synthetic model, along with results from a set of models representing underground cavities. A set of scripts used to generate the models are presented in an appendix. We do not consider detection of underground facilities in this work, and the geologic setting used in these tests is an extremely simple one.

  9. Preliminary report on the Black Thunder, Wyoming CTBT R and D experiment quicklook report: LLNL input from regional stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P.E.; Glenn, L.A.

    This report presents a preliminary summary of the data recorded at three regional seismic stations from surface blasting at the Black Thunder Coal Mine in northeast Wyoming. The regional stations are part of a larger effort that includes many more seismic stations in the immediate vicinity of the mine. The overall purpose of this effort is to characterize the source function and propagation characteristics of large typical surface mine blasts. A detailed study of source and propagation features of conventional surface blasts is a prerequisite to attempts at discriminating this type of blasting activity from other sources of seismic events. The Black Thunder seismic experiment is a joint verification effort to determine seismic source and path effects that result from very large, but routine, ripple-fired surface mining blasts. Studies of the data collected will be for the purpose of understanding how the near-field and regional seismic waveforms from these surface mining blasts are similar to, and different from, point shot explosions and explosions at greater depth. The Black Hills Station is a Designated Seismic Station that was constructed for temporary occupancy by Former Soviet Union seismic verification scientists in accordance with the Threshold Test Ban Treaty protocol.

  10. Value of Spaceborne Remotely Sensed Data Products in the Context of the Launch Phase of an On-Site Inspection

    NASA Astrophysics Data System (ADS)

    Labak, P.; Rowlands, A.; Malich, G.; Charlton, A.; Schultz-Fellenz, E. S.; Craven, J.

    2016-12-01

    The availability of data and the ability to effectively interpret those data in the context of an alleged Treaty violation are critical to operations during the launch phase of an inspection. The launch phase encompasses the time when the initial inspection plan is being developed and finalised; this document will set the scene for the inspection and will propose mission activities for the critical first three days of an inspection. While authenticated data products from the CTBT International Data Centre form the basis of the initial inspection plan, other data types, provided as national technical means, can also be used to inform the development of the initial inspection plan. In this context, remotely sensed data and derived products acquired from sensors on satellites feature prominently. Given the environmental setting, optical and/or radar sensors have the potential to provide valuable information to guide mission activities. Such data could provide more than mere backdrops to mapping products. While recognising time constraints and the difficulties associated with integrating data from disparate optical and radar sensors, this abstract uses case studies to illustrate the types of derived data products from spaceborne sensors that have the potential to inform inspectors during the preparation of the initial inspection plan.

  11. Improvements of low-level radioxenon detection sensitivity by a state-of-the art coincidence setup.

    PubMed

    Cagniant, A; Le Petit, G; Gross, P; Douysset, G; Richard-Bressand, H; Fontaine, J-P

    2014-05-01

    The ability to quantify isotopic ratios of the radioxenon isotopes (135)Xe, (133m)Xe, (133)Xe and (131m)Xe is essential for the verification of the Comprehensive Nuclear-Test Ban Treaty (CTBT). In order to improve detection limits, CEA has developed a new on-site setup using photon/electron coincidence (Le Petit et al., 2013, J. Radioanal. Nucl. Chem., DOI: 10.1007/s10697-013-2525-8). Alternatively, the electron detection cell equipped with large silicon chips (PIPS) can be used with an HPGe detector for laboratory analysis purposes. This setup allows the measurement of β/γ coincidences for the detection of (133)Xe and (135)Xe, and of K-shell Conversion Electron (K-CE)/X-ray coincidences for the detection of (131m)Xe, (133m)Xe and (133)Xe as well. A good energy resolution of 11 keV at 130 keV and a low energy threshold of 29 keV for the electron detection were obtained. This provides direct discrimination between K-CE from (133)Xe, (133m)Xe and (131m)Xe. The estimated Minimum Detectable Activity (MDA) for (131m)Xe is on the order of 1 mBq over a 4-day measurement. An analysis of an environmental radioxenon sample using this method is shown. © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. A method for limiting data acquisition in a high-resolution gamma-ray spectrometer during On-Site Inspection activities under the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Aviv, O.; Lipshtat, A.

    2018-05-01

    On-Site Inspection (OSI) activities under the Comprehensive Nuclear-Test-Ban Treaty (CTBT) place limitations on measurement equipment. Thus, certain detectors require modifications to be operated in a restricted mode. The accuracy and reliability of results obtained by a restricted device may be impaired. We present here a method for limiting data acquisition during an OSI. Limitations are applied to a high-resolution high-purity germanium detector system, where the vast majority of the acquired data that is not relevant to the inspection is filtered out. The limited spectrum is displayed to the user and allows analysis using standard gamma spectrometry procedures. The proposed method can be incorporated into commercial gamma-ray spectrometers, including both stationary and mobile-based systems. By applying this procedure to more than 1000 spectra, representing various scenarios, we show that partial data are sufficient for reaching reliable conclusions. A comprehensive survey of potential false-positive identifications of various radionuclides is presented as well. It is evident from the results that the analysis of a limited spectrum is practically identical to that of a standard spectrum in terms of detection and quantification of OSI-relevant radionuclides. A future limited system can be developed using the principles outlined in the suggested method.

  13. Using the IMS infrasound network for the identification of mountain-associated waves and gravity waves hotspots

    NASA Astrophysics Data System (ADS)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph; Le Pichon, Alexis

    2017-04-01

    The infrasound network of the International Monitoring System (IMS) has been established for monitoring the atmosphere to detect violations of the Comprehensive nuclear-Test-Ban Treaty (CTBT). The IMS comprises 49 certified infrasound stations which are globally distributed. Each station provides data for up to 16 years. Due to the uniform distribution of the stations, the IMS infrasound network can be used to derive global information on features of atmospheric dynamics. This study focuses on mountain-associated waves (MAWs), i.e. acoustic waves in the frequency range between approximately 0.01 Hz and 0.05 Hz. MAWs can be detected in infrasound data by applying the Progressive Multi-Channel Correlation (PMCC) algorithm. As a result of triangulation, global hotspots of MAWs can be identified. Previous studies on gravity waves indicate that global hotspots of gravity waves are similar to those found for MAWs using the PMCC algorithm. The objective of our study is an enhanced understanding of the excitation sources and of possible interactions between MAWs and gravity waves. Therefore, spatial and temporal correlation analyses will be performed. As a preceding step, we will present (seasonal) hotspots of MAWs as well as hotspots of gravity waves derived from the IMS infrasound network.
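The triangulation step mentioned above (locating a hotspot from array detections) reduces, in its simplest form, to intersecting the back-azimuth bearings reported by two arrays. The sketch below uses a flat-earth local-Cartesian approximation in km for illustration; an operational implementation would work on the sphere, and the station coordinates and azimuths are made-up example values, not IMS station data.

```python
import math

def triangulate(s1, az1_deg, s2, az2_deg):
    """Intersect two bearing rays. Stations s1, s2 are (x, y) in km;
    azimuths are degrees clockwise from north. Returns the source
    position (x, y), or None for (near-)parallel bearings."""
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    # Solve s1 + t1*d1 = s2 + t2*d2 for t1 (2x2 linear system, Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        return None
    rx, ry = s2[0] - s1[0], s2[1] - s1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (s1[0] + t1 * d1[0], s1[1] + t1 * d1[1])

# Two arrays bearing 45° and 315° toward a common source:
source = triangulate((0.0, 0.0), 45.0, (200.0, 0.0), 315.0)
```

With more than two detecting arrays the rays rarely meet in a single point, so hotspot maps are typically built from a least-squares or gridded intersection of many such bearings accumulated over time.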

  14. Influence of atmospheric transport patterns on xenon detections at the CTBTO radionuclide network

    NASA Astrophysics Data System (ADS)

    Krysta, Monika; Kusmierczyk-Michulec, Jolanta

    2016-04-01

    In order to fulfil its task of monitoring for signals emanating from nuclear explosions, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) operates the global International Monitoring System (IMS), comprising seismic, infrasound, hydroacoustic and radionuclide measurement networks. At present, 24 of the 80 radionuclide stations foreseen by the Comprehensive Nuclear-Test-Ban Treaty (CTBT) are equipped with certified noble gas measurement systems. Over the past couple of years these systems have collected a rich set of measurements of radioactive isotopes of xenon. Atmospheric transport modelling simulations are crucial to an assessment of the origin of xenon detected at the IMS stations. Numerous studies undertaken in the past enabled linking these detections to non-Treaty-relevant activities and identifying the main contributors. The presence and quantity of xenon isotopes at the stations are hence the result of an interplay of emission patterns and atmospheric circulation. In this presentation we analyse the presence or absence of radioactive xenon at selected stations from the angle of this interplay. We attempt to classify the stations according to the similarity of their detection patterns, examine seasonality in those patterns, and link them to large-scale or local meteorological phenomena. The studies are undertaken using crude hypotheses on emission patterns from known sources and atmospheric transport modelling simulations prepared with the FLEXPART model.

  15. An Overview of the Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS)

    NASA Astrophysics Data System (ADS)

    Snelson, C. M.; Barker, D. L.; White, R. L.; Emmitt, R. F.; Townsend, M. J.; Graves, T. E.; Becker, S. A.; Teel, M. G.; Lee, P.; Antoun, T. H.; Rodgers, A.; Walter, W. R.; Mellors, R. J.; Brunish, W. M.; Bradley, C. R.; Patton, H. J.; Hawkins, W. L.; Corbell, B. H.; Abbott, R. E.; SPE Working Group

    2011-12-01

    Modeling of explosion phenomenology has been primarily empirical when looking at seismic, infrasound, and acoustic signals. In order to detect low-yield nuclear explosions under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), we must be able to understand and model the explosive source in settings beyond those for which we have empirical data. The Source Physics Experiments (SPE) at the Nevada National Security Site are the first step in this endeavor to link empirically based with physics-based modeling and develop this predictive capability. The current series of tests is being conducted in a granite body called the Climax Stock. This location was chosen for several reasons, including the site's expected "simple geology"-the granite is a fairly homogeneous body. In addition, data are available from underground nuclear tests conducted in the same rock body, and the nature of the geology has been well documented. Among the project goals for the SPE is to provide fully coupled seismic energy to the seismic and acoustic arrays so that the transition between near- and far-field data can be modeled and our scientists can begin to understand how non-linear effects and anisotropy control seismic energy transmission and partitioning. The first shot for the SPE, a calibration shot (SPE1), was conducted in May 2011 with 220 lb (100 kg) of chemical explosives set at a depth of 180 ft (55 m). An array of sensors and diagnostics recorded the shot data, including accelerometers, geophones, rotational sensors, short-period and broadband seismic sensors, Continuous Reflectometry for Radius vs. Time Experiment (CORRTEX), Time of Arrival (TOA) and Velocity of Detonation (VOD) measurements, as well as infrasound sensors. The three-component accelerometer packages were set at depths of 180 ft (55 m), 150 ft (46 m), and 50 ft (15 m) in two rings around ground zero (GZ); the inner ring was 10 m and the outer ring 20 m from GZ. Six sets of surface accelerometers (100 and 500 g) were placed every 10 m along an azimuth SW of GZ. Seven infrasound sensors were placed in an array around GZ, extending from tens of meters to kilometers. Over 100 seismic stations were positioned, most of them along five radial lines from GZ out to 2 km. Over 400 data channels were recorded for SPE1, and data recovery was about 95% with high signal-to-noise ratio. Future tests will be conducted in the same shot hole as SPE1. The SPE2 experiment will consist of 2200 lb (1000 kg) of chemical explosives shot at 150 ft (46 m) depth utilizing the above-described instrumentation. Subsequent SPE shots will be the same size, within the same shot hole, and within the damage zone. The ultimate goal of the SPE Project is to develop predictive capability for using seismic energy as a tool for CTBT issues. This work was done by National Security Technologies, LLC, under Contract No. DE AC52 06NA25946 with the U.S. Department of Energy.

  16. ASASSN-17fp rebrightening event and ongoing monitoring

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-05-01

    ASASSN-17fp, discovered on 2017 April 28 and classified as a helium dwarf nova, was observed to be in outburst again on May 16 after fading 2.5 magnitudes from its original outburst. Dr. Tom Marsh (University of Warwick) and Dr. Elme Breedt (University of Cambridge) requested immediate time-series coverage. Dr. Breedt wrote: "The transient was identified as a helium dwarf nova (also known as an AM CVn star) from a spectrum taken by the PESSTO survey and reported in ATel #10334. Since then, we have been observing the target using the New Technology Telescope on La Silla in Chile. We measured a photometric period of 51 minutes in the first few nights during which the object was bright at g=16.03 (Marsh et al., ATel #10354), and then it faded to about g ≈ 18. However, last night [May 16] it brightened back to g ≈ 16 again, apparently starting a second outburst. Time series observations during this bright state would be very valuable to determine whether the 51 min period we saw in earlier data returns, and whether it is the orbital period of the binary or related to the distortion of the accretion disc in outburst (superhumps). If the 51 min signal is the orbital period or close to it, this would be the helium dwarf nova with the longest orbital period known. Multiple successive outbursts are not uncommon in binaries like this..." Observers should continue to monitor ASASSN-17fp with nightly snapshots for two weeks after it fades, in case it rebrightens again. It appears to have faded, according to an observation in the AAVSO International Database by F.-J. Hambsch (HMB, Mol, Belgium), who observed it remotely from Chile on 2017 May 24.2252 UT at magnitude 19.944 CV ± 0.595. Continue nightly snapshots through June 6 at least, and if it brightens again, resume time series. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database.
See full Alert Notice for more details.

  17. Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty.

    PubMed

    Sloan, Jamison; Sun, Yunwei; Carrigan, Charles

    2016-05-01

    Enforcement of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) will involve monitoring for radiologic indicators of underground nuclear explosions (UNEs). A UNE produces a variety of radioisotopes which then decay through connected radionuclide chains. A particular species of interest is xenon, namely the four isotopes (131m)Xe, (133m)Xe, (133)Xe, and (135)Xe. Due to their half-lives, some of these isotopes can persist in the subsurface for more than 100 days. This convenient timescale, combined with modern detection capabilities, makes the xenon family a desirable candidate for UNE detection. Ratios of these isotopes as a function of time have been studied in the past for distinguishing nuclear explosions from civilian nuclear applications. However, the initial yields from UNEs have been treated as fixed values. In reality, these independent yields are uncertain to a large degree. This study quantifies the uncertainty in xenon ratios resulting from these uncertain initial conditions to better bound the values that xenon ratios can assume. We have successfully used a combination of analytical and sampling-based statistical methods to reliably bound xenon isotopic ratios. We have also conducted a sensitivity analysis and found that xenon isotopic ratios are primarily sensitive to only a few of many uncertain initial conditions. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
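The sampling side of such an analysis can be sketched briefly: treat the initial independent yields as uncertain, propagate each draw through radioactive decay, and report empirical bounds on an isotopic ratio. The sketch below uses simple independent exponential decay of 135Xe and 133Xe (the full analysis uses coupled decay chains) and an invented ±20% yield uncertainty; only the half-lives are nuclear data.

```python
import math, random

HALF_LIFE_H = {"Xe-133": 5.243 * 24.0, "Xe-135": 9.14}  # half-lives in hours

def activity_ratio(n135_0, n133_0, t_h):
    """135Xe/133Xe activity ratio at time t_h (hours), independent decay only."""
    lam133 = math.log(2) / HALF_LIFE_H["Xe-133"]
    lam135 = math.log(2) / HALF_LIFE_H["Xe-135"]
    return (lam135 * n135_0 * math.exp(-lam135 * t_h)) / \
           (lam133 * n133_0 * math.exp(-lam133 * t_h))

random.seed(1)
t = 48.0  # hours after detonation (illustrative choice)
draws = sorted(activity_ratio(random.uniform(0.8, 1.2),   # uncertain 135Xe yield
                              random.uniform(0.8, 1.2),   # uncertain 133Xe yield
                              t)
               for _ in range(10000))
lo, hi = draws[250], draws[-251]  # empirical ~95% bounds on the ratio
```

Because 135Xe decays much faster than 133Xe, the ratio itself falls steeply with time, and the yield uncertainty simply widens the band around that trend.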

  18. Radionuclide observables for the Platte underground nuclear explosive test on 14 April 1962.

    PubMed

    Burnett, Jonathan L; Milbrath, Brian D

    2016-11-01

    Past nuclear weapon explosive tests provide invaluable information for understanding the radionuclide observables expected during an On-site Inspection (OSI) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These radioactive signatures are complex and subject to spatial and temporal variability. The Platte underground nuclear explosive test on 14 April 1962 provides extensive environmental monitoring data that can be modelled and used to calculate the maximum time available for detection of the OSI-relevant radionuclides. The 1.6 kT test is especially useful as it released the highest amounts of recorded activity during Operation Nougat at the Nevada Test Site, now known as the Nevada National Security Site (NNSS). It has been estimated that 0.36% of the activity was released and dispersed in a northerly direction. The deposition ranged from 1 × 10⁻¹¹ to 1 × 10⁻⁹ of the atmospheric release (per m²), and has been used in this paper to evaluate an OSI and the OSI-relevant radionuclides at 1 week to 2 years post-detonation. Radioactive decay reduces the activity of the OSI-relevant radionuclides by 99.7% within 2 years of detonation, such that detection throughout the hypothesized inspection is only achievable close to the explosion where deposition was highest. Copyright © 2016 Elsevier Ltd. All rights reserved.
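The quoted 99.7% reduction over two years pins down an effective half-life for the radionuclide mix, since (1/2)^n = 0.003 gives n = log₂(1/0.003) ≈ 8.4 half-lives. The few lines below check that arithmetic; the resulting ~87-day figure is an inference from the abstract's numbers, not a value stated in the paper.

```python
import math

def fraction_remaining(t_days, half_life_days):
    """Fraction of activity left after t_days of exponential decay."""
    return 0.5 ** (t_days / half_life_days)

n_half_lives = math.log2(1 / 0.003)   # half-lives needed for a 99.7% reduction
eff_half_life = 730.0 / n_half_lives  # implied effective half-life over 2 years
```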

  19. Detection of Noble Gas Radionuclides from an Underground Nuclear Explosion During a CTBT On-Site Inspection

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Sun, Yunwei

    2014-03-01

    The development of a technically sound approach to detecting the subsurface release of noble gas radionuclides is a critical component of the on-site inspection (OSI) protocol under the Comprehensive Nuclear-Test-Ban Treaty. In this context, we are investigating a variety of technical challenges that have a significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments, a lesson we learned previously from the non-proliferation experiment (NPE). Most of our evaluations of a sampling or transport issue necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied by field experiments, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated and conclude with a description of a preliminary design for subsurface sampling that addresses some of the sampling challenges discussed here.

  20. Calibration of an Ultra-Low-Background Proportional Counter for Measuring 37Ar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifert, Allen; Aalseth, Craig E.; Bonicalzi, Ricco

    An ultra-low-background proportional counter (ULBPC) design has been developed at Pacific Northwest National Laboratory (PNNL) using clean materials, primarily electrochemically purified copper. This detector, along with an ultra-low-background counting system (ULBCS), was developed to complement a new shallow underground laboratory (30 meters water-equivalent) constructed at PNNL. The ULBCS design includes passive neutron and gamma shielding, along with an active cosmic-veto system. This system provides a capability for making ultra-sensitive measurements to support applications like age-dating soil hydrocarbons with 14C/3H, age-dating of groundwater with 39Ar, and soil-gas assay for 37Ar to support On-Site Inspection (OSI). On-Site Inspection is a key component of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Measurements of radionuclides created by an underground nuclear explosion are valuable signatures of a Treaty violation. For OSI, the 35-day half-life of 37Ar, produced from neutron interactions with calcium in soil, provides both high specific activity and sufficient time for inspection before decay limits sensitivity. This work describes the calibration techniques and analysis methods developed to enable quantitative measurements of 37Ar samples over a broad range of pressures. These efforts, along with parallel work in progress on gas chemistry separation, are expected to provide a significant new capability for 37Ar soil gas background studies.

  1. Low Noise Results From IMS Site Surveys: A Preliminary New High-Frequency Low Noise Model

    NASA Astrophysics Data System (ADS)

    Ebeling, C.; Astiz, L.; Starovoit, Y.; Tavener, N.; Perez, G.; Given, H. K.; Barrientos, S.; Yamamoto, M.; Hfaiedh, M.; Stewart, R.; Estabrook, C.

    2002-12-01

    Since the establishment of the Provisional Technical Secretariat (PTS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization, a vigorous seismic site survey program has been carried out to identify locations as necessary for International Monitoring System (IMS) primary and auxiliary seismic stations listed in Annex 1 to the Protocol to the CTBT. The IMS Seismic Section maintains for this purpose a small pool of seismic equipment comprised of Guralp CMG-3T and CMG-3ESP and Streckeisen STS-2 broadband seismometers, and Reftek and Guralp acquisition systems. Seismic site surveys are carried out by conducting continuous measurements of ground motion at temporary installations for approximately five to seven days. Seismometer installation methods, which depend on instrument type and on local conditions, range from placement within small cement-floored subsurface vaults to near-surface burial. Data are sampled at 40 Hz. Seismic noise levels are evaluated through the analysis of power spectral density distributions. Eleven 10.5-minute-long representative de-trended and mean-removed segments each of daytime and night-time data are chosen randomly, but reviewed to avoid event contamination. Fast Fourier Transforms are calculated for the five windows in each of these segments generated using a 50% overlap for Hanning-tapered sections ~200 s long. Instrument responses are removed. To date, 20 site surveys for primary and auxiliary stations have been carried out by the IMS. The sites surveyed represent a variety of physical and geological environments on most continents. The lowest high frequency (>1.4 Hz) noise levels at five sites with igneous or metamorphic geologies were as much as 6 dB below the USGS New Low Noise Model (NLNM) developed by Peterson (1993). 
These sites were in Oman (local geology consisting of Ordovician metasediments), Egypt (Precambrian granite), Niger (early Proterozoic tonalite and granodiorite), Saudi Arabia (Precambrian metasediments), and Zimbabwe (Archaean granite). Based on a composite of the results from these five surveys, we propose a preliminary IMS Low-Noise Model (pIMS-LNM) consisting of a revision downward of Peterson's NLNM in the passband from 0.1 to about 0.7 s and an extension of Peterson's NLNM above 0.1 to 0.07 s. As these low noise results are derived from data recorded at temporary installations, improved resolution of this model will be possible when data from final installations become available. Preliminary International Monitoring System Low Noise Model (pIMS-LNM) for periods from 0.07 to 0.70 s. Decibels are relative to ground acceleration ((m/s²)²/Hz). Values presented in (Period, dB) format. The figure in bold is from Peterson's NLNM. [(0.07,-167.0), (0.08,-168.0), (0.09,-169.0), (0.10,-169.5), (0.11,-170.5), (0.13,-171.0), (0.14,-171.5), (0.17,-172.0), (0.20,-172.5), (0.25,-173.0), (0.30,-173.5), (0.40,-173.0), (0.50,-172.0), (0.60,-171.0), (0.70,-170.0), (0.80,-169.2)] Reference: Peterson, J., 1993. Observations and Modeling of Seismic Background Noise, U.S. Geological Survey Open-File Report 93-322, 47 p.
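The segmenting recipe described above (mean-removed segments, 50%-overlapping Hann-tapered windows, averaged spectra) is essentially Welch's method of power spectral density estimation. A compact sketch follows, using a deliberately small synthetic series and a slow direct DFT so it stays self-contained; the survey analysis itself works on 40 Hz field data with ~200 s windows.

```python
import cmath, math

def hann(n):
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

def psd_welch(x, fs, nwin):
    """Average periodograms over 50%-overlapping, mean-removed, Hann-tapered windows."""
    w = hann(nwin)
    wnorm = sum(v * v for v in w)
    step = nwin // 2
    segs = [x[i:i + nwin] for i in range(0, len(x) - nwin + 1, step)]
    nfreq = nwin // 2 + 1
    psd = [0.0] * nfreq
    for seg in segs:
        m = sum(seg) / nwin
        tap = [(s - m) * w[i] for i, s in enumerate(seg)]
        for k in range(nfreq):  # direct DFT: O(n^2), fine for a demo
            X = sum(tap[i] * cmath.exp(-2j * math.pi * k * i / nwin)
                    for i in range(nwin))
            psd[k] += abs(X) ** 2 / (fs * wnorm)
    return [p / len(segs) for p in psd]

# A 1 Hz sinusoid sampled at 8 Hz should peak in bin 1.0 / (fs / nwin) = 4
fs, nwin = 8.0, 32
x = [math.sin(2 * math.pi * 1.0 * i / fs) for i in range(128)]
spec = psd_welch(x, fs, nwin)
peak_bin = max(range(len(spec)), key=lambda k: spec[k])
```

In practice one would use an FFT and the one-sided scaling convention of a library routine; the structure of the averaging loop is the point here.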

  2. Mapping crustal heterogeneity using Lg propagation efficiency throughout the Middle East, Mediterranean, Southern Europe and Northern Africa

    USGS Publications Warehouse

    McNamara, D.E.; Walter, W.R.

    2001-01-01

    In this paper we describe a technique for mapping the lateral variation of Lg characteristics such as Lg blockage, efficient Lg propagation, and regions of very high attenuation in the Middle East, North Africa, Europe and the Mediterranean regions. Lg is used in a variety of seismological applications from magnitude estimation to identification of nuclear explosions for monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These applications can give significantly biased results if the Lg phase is reduced or blocked by discontinuous structure or thin crust. Mapping these structures using quantitative techniques for determining Lg amplitude attenuation can break down when the phase is below background noise. In such cases Lg blockage and inefficient propagation zones are often mapped out by hand. With our approach, we attempt to visually simplify this information by imaging crustal structure anomalies that significantly diminish the amplitude of Lg. The visualization of such anomalies is achieved by defining a grid of cells that covers the entire region of interest. We trace Lg rays for each event/station pair, which is simply the great circle path, and attribute to each cell a value equal to the maximum value of the Lg/P-coda amplitude ratio for all paths traversing that particular cell. The resulting map, from this empirical approach, is easily interpreted in terms of crustal structure and can successfully image small blockage features often missed by analysis of raypaths alone. This map can then be used to screen out events with blocked Lg prior to performing Q tomography, and to avoid using Lg-based methods of event identification for the CTBT in regions where they cannot work. For this study we applied our technique to one of the most tectonically complex regions on Earth.
Nearly 9000 earthquake/station raypaths, traversing the vast region comprising the Middle East, Mediterranean, Southern Europe and Northern Africa, have been analyzed. We measured the amplitude of Lg relative to the P-coda and mapped the lateral variation of Lg propagation efficiency. With the relatively dense coverage provided by the numerous crossing paths we are able to map out the pattern of crustal heterogeneity that gives rise to the observed character of Lg propagation. We observe that the propagation characteristics of Lg within the region of interest are very complicated but are readily correlated with the different tectonic environments within the region. For example, clear strong Lg arrivals are observed for paths crossing the stable continental interiors of Northern Africa and the Arabian Shield. In contrast, weakened to absent Lg is observed for paths crossing much of the Middle East, and Lg is absent for paths traversing the Mediterranean. Regions that block Lg transmission within the Middle East are very localized and include the Caspian Sea, the Iranian Plateau and the Red Sea. Resolution is variable throughout the region and depends strongly on the distribution of seismicity and recording stations. Lg propagation is best resolved within the Middle East, where regions of crustal heterogeneity on the order of 100 km are imaged (e.g., the South Caspian Sea and Red Sea). Crustal heterogeneity remains resolvable in seismically quiescent Northern Africa, but resolution there is poorest.
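The cell-max imaging step lends itself to a short sketch: trace each event-station path, and assign each traversed cell the maximum Lg/P-coda ratio over all paths crossing it. The version below samples a straight path on a flat toy grid rather than a great circle on the sphere; the geometry and ratio values are invented for illustration.

```python
def cells_on_path(x0, y0, x1, y1, nsteps=100):
    """Integer grid cells visited by densely sampling a straight path
    (a flat-grid stand-in for great-circle ray tracing)."""
    cells = set()
    for i in range(nsteps + 1):
        f = i / nsteps
        cells.add((int(x0 + f * (x1 - x0)), int(y0 + f * (y1 - y0))))
    return cells

def grid_max_ratio(paths):
    """paths: list of ((x0, y0), (x1, y1), lg_pcoda_ratio).
    Returns {cell: max ratio over all paths traversing that cell}."""
    grid = {}
    for p0, p1, ratio in paths:
        for c in cells_on_path(*p0, *p1):
            grid[c] = max(grid.get(c, 0.0), ratio)
    return grid

# One efficient path (ratio 1.0) crossing one blocked path (ratio 0.2):
g = grid_max_ratio([((0.5, 0.5), (3.5, 0.5), 1.0),
                    ((0.5, 0.5), (0.5, 3.5), 0.2)])
```

Taking the cell-wise maximum means a cell is flagged as blocked only if *every* path through it is weak, which is what lets the method isolate small blockage features.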

  3. Vertically Integrated Seismological Analysis II : Inference

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.; Sudderth, E.

    2009-12-01

    Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′)/π(x)q(x′ | x)). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events.
Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or human analysts.
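For a symmetric proposal the acceptance rule above reduces to α = min(1, π(x′)/π(x)). A minimal random-walk Metropolis sketch on a 1-D stand-in target follows; the chains in the paper move over far richer event-hypothesis worlds, and the target, step size, and chain length here are invented for illustration.

```python
import math, random

def metropolis_hastings(log_pi, x0, n, step=1.0, seed=0):
    """Random-walk Metropolis: Gaussian proposal is symmetric, so q cancels in alpha."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        # accept with probability min(1, pi(x')/pi(x)), computed in log space
        if math.log(rng.random()) < log_pi(xp) - log_pi(x):
            x = xp
        samples.append(x)
    return samples

# Unnormalized standard-normal target: MCMC never needs the normalizing constant
log_pi = lambda x: -0.5 * x * x
chain = metropolis_hastings(log_pi, x0=0.0, n=20000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

The same accept/reject skeleton applies unchanged to birth, death, split, merge, and swap moves; only the proposal q and the evaluation of π change.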

  4. Infrasound data inversion for atmospheric sounding

    NASA Astrophysics Data System (ADS)

    Lalande, J.-M.; Sèbe, O.; Landès, M.; Blanc-Benon, Ph.; Matoza, R. S.; Le Pichon, A.; Blanc, E.

    2012-07-01

    The International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) continuously records acoustic waves in the 0.01-10 Hz frequency band, known as infrasound. These waves propagate through the layered structure of the atmosphere. Coherent infrasonic waves are produced by a variety of anthropogenic and natural sources, and their propagation is controlled by spatiotemporal variations of temperature and wind velocity. Natural stratification of atmospheric properties (e.g. temperature, density and winds) forms waveguides, allowing long-range propagation of infrasound waves. However, atmospheric specifications used in infrasound propagation modelling suffer from the scarcity and sparseness of available data above an altitude of 50 km. As infrasound can propagate in the upper atmosphere up to 120 km, we assume that infrasonic data can be used for sounding the atmosphere, analogous to the use of seismic data to infer solid Earth structure and the use of hydroacoustic data to infer oceanic structure. We therefore develop an inversion scheme for vertical atmospheric wind profiles in the framework of an iterative linear inversion. The forward problem is treated in the high-frequency approximation using a Hamiltonian formulation, and complete first-order ray perturbation theory is developed to construct the Fréchet derivative matrix. We introduce a specific parametrization for the unknown model parameters based on Principal Component Analysis. Finally, our algorithm is tested on synthetic data cases spanning different seasonal periods and network configurations. The results show that our approach is suitable for infrasound atmospheric sounding on a regional scale.
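The iterative linear inversion framework can be reduced to its simplest form: linearize the forward model about the current model, solve the least-squares update from the Fréchet derivatives, and repeat. The scalar toy below (forward model, data, and starting model all invented for the example) illustrates the loop; the paper's implementation works with ray-theoretic Fréchet derivatives and a PCA-reduced wind parametrization.

```python
def gauss_newton(g, dg, data, m0, iters=10):
    """Iterative linearized inversion for a scalar model parameter m:
    m <- m + (G^T G)^(-1) G^T (d - g(m)), with G the Frechet derivative."""
    m = m0
    for _ in range(iters):
        resid = [d - g(m, x) for x, d in data]
        G = [dg(m, x) for x, _ in data]
        m += sum(Gi * ri for Gi, ri in zip(G, resid)) / sum(Gi * Gi for Gi in G)
    return m

# Toy forward model: travel time t = x / m for ranges x, with "true" m = 2.0
g = lambda m, x: x / m
dg = lambda m, x: -x / m ** 2
data = [(x, x / 2.0) for x in (1.0, 2.0, 3.0, 4.0, 5.0)]
m_est = gauss_newton(g, dg, data, m0=1.0)
```

In the full problem m is a vector of PCA coefficients and the scalar division becomes a regularized linear solve, but the iterate-linearize-update structure is the same.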

  5. Radionuclide data analysis in connection of DPRK event in May 2009

    NASA Astrophysics Data System (ADS)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, no traces linked to the DPRK event were found. After three weeks of high alert the PTS returned to its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds for the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for their sites owing to legitimate activities unrelated to nuclear testing. Therefore, a set of hypotheses was used to test whether a detection is consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation. It was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections for verifying the nuclear traces of possible tests.

  6. Probing the Atmosphere in Antarctica using continuous microbarom recordings

    NASA Astrophysics Data System (ADS)

    Ceranna, L.; Le Pichon, A.; Blanc, E.

    2009-12-01

    Germany operates one of the four Antarctic infrasound stations in support of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). I27DE is a nine-element array that has been in continuous operation since its deployment in January 2003. Using the PMCC detection algorithm, coherent signals are observed in the frequency range from 0.0002 to 4.0 Hz, covering a large variety of infrasound sources from low-frequency mountain-associated waves to high-frequency ice-quakes. The most prominent signals are related to microbaroms (mb) generated by the strong peri-Antarctic ocean swells. These continuous signals, with a dominant period of 5 s, show a clear trend in detection direction that is well correlated with the prevailing stratospheric winds. For mb signals, a strong increase in trace velocity along with a decrease in the number of detections was observed during the Austral summer 2006, indicating strong variations in the stratospheric duct. However, wind speed profiles at the station give no evidence for such an anomaly. Nevertheless, strong sudden stratospheric warming (SSW) events at the latitudes of the peri-Antarctic belt during Austral winter 2006, together with cooling in the upper stratosphere caused by the eruption of the Manam volcano in Indonesia, provide a potential explanation for the abnormal ducting conditions. This will be demonstrated by computing 2-D numerical simulations of sound propagation from the ocean swell to I27DE using appropriate horizontal wind speed and temperature profiles.

  7. Auroral Infrasound Observed at I53US at Fairbanks, Alaska

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Olson, J. V.

    2003-12-01

    In this presentation we will describe two different types of auroral infrasound recently observed at Fairbanks, Alaska in the passband from 0.015 to 0.10 Hz. Infrasound signals associated with auroral activity (AIW) have been observed in Fairbanks over the past 30 years with infrasonic microphone arrays. The installation of the new CTBT/IMS infrasonic array, I53US, at Fairbanks has greatly increased the quality of the infrasonic data with which to study natural sources of infrasound. In the historical data at Fairbanks, all the auroral infrasonic waves (AIW) detected were found to be the result of bow waves generated by supersonic motion of auroral arcs that contain strong electrojet currents. This infrasound is highly anisotropic, moving in the same direction as the auroral arc. AIW bow waves observed in 2003 at I53US will be described. Recently at I53US we have observed many events of very high trace velocity that consist of continuous, highly coherent wave trains. These waves occur in the morning hours at times of strong auroral activity. This new type of very-high-trace-velocity AIW appears to be associated with pulsating auroral displays. Pulsating auroras occur predominantly after magnetic midnight (10:00 UT at Fairbanks). They are a usual part of the recovery phase of auroral substorms and are produced by energetic electrons precipitating into the atmosphere. Given suitably dark, cloudless sky conditions during the AIW events, bright pulsating auroral forms were sometimes visible overhead.

  8. Infrasound workshop for CTBT monitoring: Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christie, D.; Whitaker, R.

    1998-11-01

    It is expected that the establishment of new infrasound stations in the global IMS network by the Provisional Technical Secretariat of the CTBTO in Vienna will commence in the middle of 1998. Thus, decisions on the final operational design for IMS infrasound stations will have to be made within the next 12 months. Though many of the basic design problems have been resolved, it is clear that further work needs to be carried out during the coming year to ensure that IMS infrasound stations will operate with maximum capability in accord with the specifications determined during the May 1997 PrepCom Meeting. Some of the papers presented at the Workshop suggest that it may be difficult to design a four-element infrasound array station that will reliably detect and locate infrasound signals at all frequencies in the specified range from 0.02 to 4.0 Hz in all noise environments. Hence, if the basic design of an infrasound array is restricted to four array elements, the final optimized design may be suited only to the detection and location of signals in a more limited pass-band. Several participants have also noted that the reliable discrimination of infrasound signals could be quite difficult if the detection system leads to signal distortion. Thus, it has been emphasized that the detection system should not, if possible, compromise signal fidelity. This report contains the workshop agenda, a list of participants, and abstracts and viewgraphs from each presentation.

  9. Influence of deep sedimentary basins, crustal thinning, attenuation, and topography on regional phases: selected examples from the Eastern Mediterranean and the Caspian Sea regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, P.; Schultz, C.; Larsen, S.

    1997-07-15

    Monitoring of a CTBT will require transportable seismic identification techniques, especially in regions where there is limited data. Unfortunately, most existing techniques are empirical and cannot be used reliably in new regions. Our goal is to help develop transportable regional identification techniques by improving our ability to predict the behavior of regional phases and discriminants in diverse geologic regions and in regions with little or no data. Our approach is to use numerical modeling to understand the physical basis for regional wave propagation phenomena and to use this understanding to help explain the observed behavior of regional phases and discriminants. In this paper, we focus on results from simulations of data in selected regions and investigate the sensitivity of these regional simulations to various features of the crustal structure. Our initial models use teleseismically estimated source locations, mechanisms, and durations, and seismological structures that have been determined by others. We model the Mb 5.9 October 1992 Cairo, Egypt, earthquake recorded at station ANTO in Ankara, Turkey, using a two-dimensional crustal model consisting of a water layer over a deep sedimentary basin with a thinning crust beneath the basin. Despite the complex tectonics of the Eastern Mediterranean region, we find surprisingly good agreement between the observed data and synthetics based on this relatively smooth two-dimensional model.

  10. Static Corrections to Improve Seismic Monitoring of the North Korean Nuclear Test Site with Regional Arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, N.; Wookey, J. M.; Selby, N. D.

    2017-12-01

    Seismology is an important part of the International Monitoring System (IMS) installed to detect, identify, and locate nuclear detonations in breach of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) prior to and after its entry into force. Seismic arrays in particular provide not only a means of detecting and locating underground nuclear explosions, but also of discriminating them from naturally occurring earthquakes of similar magnitude. One potential discriminant is the amplitude ratio of high-frequency (> 2 Hz) P waves to S waves (P/S) measured at regional distances (3-17°). Accurate measurement of such discriminants, and the ability to detect low-magnitude seismicity from a suspicious event, rely on high signal-to-noise ratio (SNR) data. A correction to the slowness vector of the incident seismic wavefield, together with static corrections applied to the waveforms recorded at each receiver within the array, can be shown to improve the SNR. We apply codes we have developed to calculate slowness-azimuth station corrections (SASCs) and static corrections to the arrival time and amplitude of the seismic waveform to seismic arrays regional to the DPRK nuclear test site at Punggye-ri, North Korea. We use the F-statistic to demonstrate the SNR improvement on data from the nuclear tests and other seismic events in the vicinity of the test site. We also make new measurements of P/S with the corrected waveforms and compare these with existing measurements.
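
As a rough illustration of why static corrections help, the sketch below (Python, with entirely synthetic traces and hypothetical per-station time residuals; it is not the authors' code) compares a plain delay-and-sum beam with one formed after removing known statics. Aligned traces stack coherently while noise averages down.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 40.0                                   # sampling rate in Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic 4 Hz wavelet arriving at t = 5 s
wavelet = np.exp(-((t - 5.0) ** 2) / 0.01) * np.sin(2.0 * np.pi * 4.0 * t)

# Hypothetical per-station residuals (seconds): the "statics" to remove
statics = np.array([0.00, 0.05, -0.03, 0.08, -0.06])
traces = np.array([
    np.interp(t, t + dt, wavelet) + 0.1 * rng.standard_normal(t.size)
    for dt in statics
])

def beam(traces, shifts):
    """Delay-and-sum: time-shift each trace by its static, then stack."""
    aligned = [np.interp(t, t - s, tr) for tr, s in zip(traces, shifts)]
    return np.mean(aligned, axis=0)

def snr(x):
    """Peak amplitude near the arrival over pre-arrival noise level."""
    peak = np.abs(x[(t > 4.5) & (t < 5.5)]).max()
    noise = x[t < 4.0].std()
    return peak / noise

raw_beam = beam(traces, np.zeros(len(statics)))   # no corrections
corrected_beam = beam(traces, statics)            # statics removed
print(snr(raw_beam), snr(corrected_beam))
```

With misalignments comparable to the wavelet period, the uncorrected stack partially cancels the signal, so the corrected beam shows the higher SNR.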

  11. Phase II: Field Detector Development For Undeclared/Declared Nuclear Testing For Treaty Verification Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriz, M.; Hunter, D.; Riley, T.

    2015-10-02

    Radioactive xenon isotopes are a critical part of the Comprehensive Nuclear Test Ban Treaty (CTBT) for the detection or confirmation of nuclear weapons tests as well as for on-site treaty verification monitoring. On-site monitoring is not currently conducted because there are no commercially available small, robust field detector devices to measure the radioactive xenon isotopes. Xenon isotopes are ideal signatures for detecting clandestine nuclear events since they are difficult to contain and can diffuse and migrate through soils due to their inert nature. There are four key radioxenon isotopes used in monitoring: 135Xe (9 hour half-life), 133mXe (2 day half-life), 133Xe (5 day half-life) and 131mXe (12 day half-life), which decay through beta emission and gamma emission. Savannah River National Laboratory (SRNL) is a leader in the field of gas collection and has developed highly selective molecular sieves that allow for the collection of xenon gas directly from air. Phase I assessed the development of a small, robust beta-gamma coincidence counting system that combines collection and in situ detection methodologies. Phase II of the project began development of the custom electronics enabling 2D beta-gamma coincidence analysis in a field-portable system. This will be a significant advancement for field detection/quantification of short-lived xenon isotopes that would not survive the transport time for laboratory analysis.
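
The half-lives quoted above are what make in situ detection attractive. A simple decay calculation (a sketch using the approximate half-lives from the abstract and an assumed 72-hour shipment delay to an off-site laboratory) shows how little of the short-lived activity would survive transport:

```python
import math

# Half-lives in hours, as quoted in the abstract (approximate)
half_life_h = {
    "135Xe": 9.0,          # 9 hours
    "133mXe": 2.0 * 24,    # 2 days
    "133Xe": 5.0 * 24,     # 5 days
    "131mXe": 12.0 * 24,   # 12 days
}

def surviving_fraction(t_hours, half_life_hours):
    """Fraction of initial activity remaining after t_hours of pure decay."""
    return math.exp(-math.log(2) * t_hours / half_life_hours)

transport_h = 72.0  # assumed 3-day shipment delay (illustrative)
for iso, hl in half_life_h.items():
    print(iso, round(surviving_fraction(transport_h, hl), 3))
```

Under this assumption only a fraction of a percent of the 135Xe activity survives the shipment, while most of the 131mXe does, which is the quantitative case for field-portable counting.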

  12. A General Investigation of Optimized Atmospheric Sample Duration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Miley, Harry S.

    2012-11-28

    The International Monitoring System (IMS) consists of up to 80 aerosol and xenon monitoring systems spaced around the world that have collection systems sensitive enough to detect nuclear releases from underground nuclear tests at great distances (CTBT 1996; CTBTO 2011). Although a few of the IMS radionuclide stations are closer together than 1,000 km (such as the stations in Kuwait and Iran), many of them are 2,000 km or more apart. In the absence of a scientific basis for optimizing the duration of atmospheric sampling, scientists historically used integration times from 24 hours to 14 days for radionuclides (Thomas et al. 1977). This was entirely adequate in the past because the sources of signals were far away and large, meaning that they were smeared over many days by the time they had travelled 10,000 km. The Fukushima event pointed out the unacceptable delay time (72 hours) between the start of sample acquisition and final data being shipped. A scientific basis for selecting a sample duration time is needed. This report considers plume migration of a nondecaying tracer using archived atmospheric data for 2011 in the HYSPLIT (Draxler and Hess 1998; HYSPLIT 2011) transport model. We present two related results: the temporal duration of the majority of the plume as a function of distance, and the behavior of the maximum plume concentration as a function of sample collection duration and distance. The modeled plume behavior can then be combined with external information about sampler design to optimize sample durations in a sampling network.
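
The trade-off between sample duration and measurable peak concentration can be illustrated with a toy calculation: window-averaging an assumed Gaussian-in-time plume passage shows how longer collection intervals dilute the maximum sampled concentration. This is only a sketch; the study itself used HYSPLIT transport modelling, not this idealization.

```python
import numpy as np

t = np.arange(0.0, 96.0, 0.25)                    # hours, 15-min resolution
# Assumed plume passage: Gaussian in time, ~6 h wide, centred at 48 h
plume = np.exp(-0.5 * ((t - 48.0) / 3.0) ** 2)

def max_sampled_concentration(duration_h, dt=0.25):
    """Maximum window-averaged concentration for a given sample duration."""
    n = int(duration_h / dt)
    kernel = np.ones(n) / n                       # boxcar sampling window
    return np.convolve(plume, kernel, mode="valid").max()

for d in (3, 6, 12, 24):
    print(d, round(max_sampled_concentration(d), 3))
```

A 3-hour sample captures nearly the full peak, while a 24-hour sample averages the short plume against clean air and reports a much lower concentration, which is the effect the report quantifies as a function of distance.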

  13. Subsurface Xenon Migration by Atmospheric Pumping Using an Implicit Non-Iterative Algorithm for a Locally 1D Dual-Porosity Model

    NASA Astrophysics Data System (ADS)

    Annewandter, R.; Kalinowski, M. B.

    2009-04-01

    An underground nuclear explosion injects radionuclides into the surrounding host rock, creating an initial radionuclide distribution. In the case of fractured permeable media, cyclical changes in atmospheric pressure can draw gaseous species upwards to the surface, establishing a ratcheting pump effect. The resulting advective transport is orders of magnitude more significant than transport by molecular diffusion. In the 1990s the US Department of Energy funded the so-called Non-Proliferation Experiment, conducted by the Lawrence Livermore National Laboratory, to investigate this barometric pumping effect for verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty. A chemical explosive of approximately 1 kt TNT-equivalent was detonated in a cavity located 390 m deep in the Rainier Mesa (Nevada Test Site), in which two tracer gases were emplaced. Within this experiment, SF6 was first detected in soil gas samples taken near fault zones after 50 days, and 3He after 325 days. For this paper a locally one-dimensional dual-porosity model for flow along the fracture and within the permeable matrix was used, after Nilson and Lie (1990). Seepage of gases and diffusion of tracers between fracture and matrix are accounted for. The advective flow along the fracture and within the matrix block is based on the FRAM filtering remedy and methodology of Chapman. The resulting system of equations is solved by an implicit non-iterative algorithm. Results on time of arrival and subsurface concentration levels for the CTBT-relevant xenon isotopes will be presented.

  14. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be complete by the end of 2009 may take significantly longer and be more expansive in scope. The Russians called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. 
For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across near-term (1-4 year) and longer-term (5-10 year) planning horizons. Some final observations include acknowledging the enduring nature of several key objectives on the Obama Administration's arms control and nonproliferation agenda. The CTBT, FMCT, bilateral nuclear arms reductions and strengthening the NPT have been sought by successive U.S. Administrations for nearly thirty years. Efforts towards negotiated arms control, although de-emphasized by the G.W. Bush Administration, have remained a pillar of U.S. national security strategy for decades and are likely to be of enduring if not increasing importance for decades to come. Therefore revitalization and expansion of USG capabilities in this area can be a positive legacy no matter what near-term arms control goals are achieved over the next four years. This is why it is important to reconstruct integrated bureaucratic, legislative, budgetary and diplomatic strategies to sustain the arms control and nonproliferation agenda. In this endeavor some past lessons must be taken to heart to avoid bureaucratic overkill and keep interagency policy-making and implementation structures lean and effective. On the technical side, a serious, sustained multilateral program to develop, down-select and performance-test nuclear weapons dismantlement verification technologies and procedures should be immediately initiated. In order to make this happen the United States and Russia should join with the UK and other interested states in creating a sustained, full-scale research and development program for verification at their respective nuclear weapons and defense establishments. 
The goals include development of effective technologies and procedures for: (1) Attribute measurement systems to certify nuclear warheads and military fissile materials; (2) Chain-of-custody methods to track items after they are authenticated and enter accountability; (3) Transportation monitoring; (4) Storage monitoring; (5) Fissile materials conversion verification. The remainder of this paper focuses on transparency and verification for nuclear arms and fissile material reductions.

  15. Lessons learned from the first US/Russian Federation joint tabletop exercise to prepare for conducting on-site inspections under the Comprehensive Nuclear Test Ban Treaty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Filarowski, C; Kreek, S; Smith, A

    1999-03-24

    A U.S./Russian Federation Joint Tabletop Exercise took place in Snezhinsk, Russia, from 19 to 24 October 1998 whose objectives were to examine the functioning of an Inspection Team (IT) in a given scenario, to evaluate the strategies and techniques employed by the IT, to identify ambiguous interpretations of treaty provisions that needed clarification, and to confirm the overall utility of tabletop exercises to assist in developing an effective Comprehensive Test Ban Treaty (CTBT) verification regime. To achieve these objectives, the United States and Russian Federation (RF) agreed that two exercises would be conducted. The first would be developed by the RF, who would act as controller and as the inspected State Party (ISP), while the United States would play the role of the IT. The roles would be reversed in the second exercise; the United States would develop the scenario and play the ISP, while the RF would play the IT. A joint control team, comprised of members of both the U.S. and RF control teams, agreed on a number of ground rules for the two exercises and established a joint Evaluation Team to evaluate both of the exercises against the stated objectives. To meet time limitations, the scope of this joint exercise needed to be limited. The joint control team decided that each of the two exercises would not go beyond the first 25 days of an on-site inspection (OSI) and that the focus would be on examining the decision-making of the IT as it utilized the various technologies to clarify whether a nuclear test explosion had taken place. Hence, issues such as logistics, restricted access, and activities prior to Point of Entry (POE) would be played only to the extent needed to provide for a realistic context for the exercises' focus on inspection procedures, sensor deployments, and data interpretation. 
Each of the exercises began at the POE and proceeded with several iterations of negotiations between the IT and ISP, instrument deployments, and data evaluation by the IT. By the end of each of the exercises, each IT had located the site of the underground nuclear explosion (UNE). While this validated the methods employed by each of the ITs, the Evaluation Team noted that each IT employed different search strategies and that each strategy had both advantages and disadvantages. The exercises also highlighted ambiguities in interpretation of certain treaty provisions related to overflights and seismic monitoring. Likewise, a substantial number of lessons were learned relating to radionuclide monitoring and the impact of logistical constraints on successful OSI execution. These lessons are discussed more fully in the body of this report. Notwithstanding the overall positive assessment by the U.S. and RF participants, as well as by the Evaluation Team, that the exercise had met its objectives, there were a variety of areas identified that could be improved in subsequent OSI exercises. Some of these included reexamination of the methods used to convey visual observation data in an exercise; the amount of time compression employed; and the need for better verification of agreements pertaining to the structure, format, and other rules of the exercise. This report summarizes the lessons learned pertaining to both the technical and operational aspects of an OSI as well as to those pertaining to the planning and execution of an OSI exercise. It concludes with comments from the Evaluation Team and proposed next steps for future U.S./RF interactions on CTBT OSIs.

  16. Dynamics of the middle atmosphere as observed by the ARISE project

    NASA Astrophysics Data System (ADS)

    Blanc, E.

    2015-12-01

    It has been strongly demonstrated that variations in the circulation of the middle atmosphere influence weather and climate all the way to the Earth's surface. A key part of this coupling occurs through the propagation and breaking of planetary and gravity waves. However, limited observations prevent numerical weather prediction and climate models from faithfully reproducing the dynamics of the middle atmosphere. The main challenge of the ARISE (Atmospheric dynamics InfraStructure in Europe) project is to combine existing national and international observation networks, including: the international infrasound monitoring system developed for CTBT (Comprehensive nuclear-Test-Ban Treaty) verification, the NDACC (Network for the Detection of Atmospheric Composition Changes) lidar network, European observation infrastructures at mid-latitudes (OHP observatory), tropics (Maïdo observatory) and high latitudes (ALOMAR and EISCAT), infrasound stations which form a dense European network, and satellites. The ARISE network is unique in its coverage (polar to equatorial regions in the European longitude sector), its altitude range (from troposphere to mesosphere and ionosphere) and the scales involved, both in time (from seconds to tens of years) and space (from tens of meters to thousands of kilometers). Advanced data products are produced with the aim of assimilating the data into weather prediction models to improve future forecasts over weekly and seasonal time scales. ARISE observations are especially relevant for the monitoring of extreme events such as thunderstorms, volcanoes, meteors and, at larger scales, deep convection and stratospheric warming events, for the description of physical processes and the study of long-term evolution with climate change. 
Among the applications, ARISE fosters the integration of innovative methods for remote detection of non-instrumented volcanoes, including distant eruption characterization, to provide notifications with reliable confidence indices to civil aviation.

  17. Participation of the NDC Austria at the NDC Preparedness Exercise 2012

    NASA Astrophysics Data System (ADS)

    Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene

    2013-04-01

    NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to practice the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (a mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the iodine isotopes I-131 and I-133 (both particulates) and the radioxenon isotopes Xe-133, Xe-133m, Xe-131m and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), the concentrations of all six isotopes that would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentrations of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • Atmospheric backtracking and data fusion to identify seismic candidate events, • Seismic analysis of candidate events within the possible source region, • Atmospheric transport modelling (forward mode) from identified candidate events, with comparison between "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
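
The data-fusion step described above can be caricatured in a few lines: given measured station concentrations and forward-ATM simulations for several candidate events, rank the candidates by how well each simulated station pattern matches the measurements. All numbers and event names below are invented for illustration; the actual NPE analysis is far more involved.

```python
import numpy as np

# Hypothetical "measured" concentrations at five IMS stations (arbitrary units)
measured = np.array([1.0, 0.4, 0.1, 0.0, 0.7])

# Hypothetical simulated station concentrations from forward ATM runs
# for three invented candidate events
simulated = {
    "event_A": np.array([0.9, 0.5, 0.2, 0.0, 0.6]),
    "event_B": np.array([0.0, 0.1, 0.9, 0.8, 0.0]),
    "event_C": np.array([0.3, 0.3, 0.3, 0.3, 0.3]),
}

def score(sim, meas):
    """Pearson correlation between simulated and measured station patterns."""
    if sim.std() == 0 or meas.std() == 0:
        return 0.0          # a flat pattern carries no ranking information
    return float(np.corrcoef(sim, meas)[0, 1])

best = max(simulated, key=lambda k: score(simulated[k], measured))
print(best)
```

Here the candidate whose simulated pattern best correlates with the measurements is retained and the others are excluded, mirroring the stated goal of identifying one event and rejecting the rest.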

  18. Evaluation of mean transit time of aerosols from the area of origin to the Arctic with 210Pb/210Po daily monitoring data.

    PubMed

    Zhang, Weihua; Sadi, Baki; Rinaldo, Christopher; Chen, Jing; Spencer, Norman; Ungar, Kurt

    2018-08-01

    In this study, the activity concentrations of 210Pb and 210Po on 22 daily air filter samples, collected at the CTBT Yellowknife station from September 2015 to April 2016, were analysed. To estimate the time scale of long-range atmospheric transport of aerosol-bound 210Pb in the Arctic during winter, the mean transit time of 210Pb-bearing aerosol from its origin was determined from the 210Po/210Pb activity ratios and the parent-progeny decay/ingrowth equation. The 210Po/210Pb activity ratios varied between 0.06 and 0.21 with a median value of 0.11. The aerosol mean transit time based on the 210Po/210Pb activity ratio suggests a longer mean transit time of 210Pb aerosols in winter (12 d) than in autumn (3.7 d) and spring (2.9 d). Four years of 210Pb and 212Pb monitoring results and meteorological conditions at the Yellowknife station indicate that the 212Pb activity is mostly of local origin, and that 210Pb aerosols in wintertime come mainly from outside the Arctic region, in common with other pollutants and sources contributing to the Arctic. The activity concentration ratios of 210Pb and 212Pb are relatively constant in summer, with a significant peak observed in winter, centered in the month of February. Comparison of the 210Pb/212Pb activity ratios with the estimated mean 210Pb transit times shows that the mean aerosol transit times reflect the atmospheric transport characteristics and can be used as a radio-chronometer for the transport of air masses to the Arctic region. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
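
The parent-progeny decay/ingrowth approach can be sketched as follows, assuming the aerosol starts with 210Pb only (no 210Bi or 210Po) and using the standard Bateman solution for the 210Pb → 210Bi → 210Po chain; the transit time is then recovered by numerically inverting the ingrowth curve. This is a simplified illustration under stated assumptions, not the paper's exact procedure or its numerical results.

```python
import math

LN2 = math.log(2)
lam_pb = LN2 / (22.3 * 365.25)   # 210Pb decay constant, per day (22.3 y half-life)
lam_bi = LN2 / 5.01              # 210Bi, per day (5.01 d half-life)
lam_po = LN2 / 138.4             # 210Po, per day (138.4 d half-life)

def po_pb_activity_ratio(t_days):
    """210Po/210Pb activity ratio after t days of ingrowth, assuming the
    aerosol initially carries 210Pb only (a simplifying assumption)."""
    l1, l2, l3 = lam_pb, lam_bi, lam_po
    n3 = l1 * l2 * (
        math.exp(-l1 * t_days) / ((l2 - l1) * (l3 - l1))
        + math.exp(-l2 * t_days) / ((l1 - l2) * (l3 - l2))
        + math.exp(-l3 * t_days) / ((l1 - l3) * (l2 - l3))
    )
    a_po = l3 * n3                          # 210Po activity (per unit initial Pb)
    a_pb = l1 * math.exp(-l1 * t_days)      # 210Pb activity
    return a_po / a_pb

def transit_time(ratio, lo=0.0, hi=400.0):
    """Invert the monotonic ingrowth curve for t (days) by bisection."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if po_pb_activity_ratio(mid) < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because the ratio increases monotonically with age, a measured 210Po/210Pb ratio maps to a unique transit time under these assumptions, which is the radio-chronometer idea the abstract describes.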

  19. Using continuous microbarom recordings for probing peri-Antarctica's atmosphere

    NASA Astrophysics Data System (ADS)

    Ceranna, Lars; Le Pichon, Alexis; Blanc, Elisabeth

    2010-05-01

    Germany is operating one of the four Antarctic infrasound stations to fulfil compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). IS27 is a nine-element array that has been in continuous operation since its deployment in January 2003. Using the PMCC detection algorithm, coherent signals are observed in the frequency range from 0.0002 to 4.0 Hz, covering a large variety of infrasound sources from low-frequency mountain-associated waves to high-frequency ice-quakes. The most prominent signals are related to microbaroms (mb) generated by the strong peri-Antarctic ocean swells. These continuous signals, with a dominant period of 5 s, show a clear trend in the direction of their detection that is well correlated with the prevailing stratospheric winds. For mb-signals, a strong increase in trace velocity along with a decrease in the number of detections was observed during the Austral summer 2006, indicating strong variations in the troposphere and the stratospheric wave duct. However, ECMWF wind speed profiles at the station give no evidence for such an anomaly. Nevertheless, a smaller El Niño event during Austral winter 2006, together with cooling in the upper stratosphere caused by the eruption of the Manam volcano in Indonesia, provides a potential explanation for the abnormal ducting conditions. This will be demonstrated with a statistical approach for the dominating ray-parameter launched from the estimated source regions towards IS27 (based on NOAA wave watch III). An increase in gravity wave activity is considered for Austral summer 2006, since a comparison of ECMWF profiles and measured radiosonde data has revealed a smoothing of the numerical profiles with respect to turbulence in the troposphere and lower stratosphere.

  20. Near- and far-field infrasound monitoring in the Mediterranean area

    NASA Astrophysics Data System (ADS)

    Campus, Paola; Marchetti, Emanuele; Le Pichon, Alexis; Wallenstein, Nicolau; Ripepe, Maurizio; Kallel, Mohamed; Mialle, Pierrick

    2013-04-01

    The Mediterranean area is characterized by a number of very interesting sources of infrasound signals and offers a promising testbed for developing a deeper understanding of such sources and of the associated propagation models. The progress in the construction and certification of infrasound arrays belonging to the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in the vicinity of this area has been complemented, in the last decade, by the construction of infrasound arrays established by several European research groups. The University of Florence (UniFi) plays a crucial role in the detection of infrasound signals in the Mediterranean area, having operated for several years two infrasound arrays on the Stromboli and Etna volcanoes and, more recently, three infrasound arrays in the Alpine area of NW Italy and one infrasound array on the Apennines (Mount Amiata), designed and established in the framework of the ARISE Project. The IMS infrasound arrays IS42 (Graciosa, Azores, Portugal) and IS48 (Kesra, Tunisia) have recorded, since the time of their certification, a number of far-field events that can be correlated with near-field records of the infrasound arrays belonging to UniFi. An analysis of the results and potential of near- and far-field infrasound source detections by the IS42, IS48 and UniFi arrays in the Mediterranean area, with special focus on volcanic events, is presented. The combined results deriving from the analysis of data recorded by the UniFi arrays and by the IS42 and IS48 arrays, in collaboration with the Department of Analyse et Surveillance (CEA/DASE), will generate a synergy which will certainly contribute to the progress of the ARISE Project.

  1. Inference and analysis of xenon outflow curves under multi-pulse injection in two-dimensional chromatography.

    PubMed

    Shu-Jiang, Liu; Zhan-Ying, Chen; Yin-Zhong, Chang; Shi-Lian, Wang; Qi, Li; Yuan-Qing, Fan

    2013-10-11

    Multidimensional gas chromatography is widely applied to atmospheric xenon monitoring for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). To improve the capability for xenon sampling from the atmosphere, sampling techniques have been investigated in detail. The sampling techniques are designed on the basis of xenon outflow curves, which are influenced by many factors; the injection condition is one of the key factors that influence the xenon outflow curves. In this paper, the xenon outflow curves of single-pulse injection in two-dimensional gas chromatography have been tested and fitted as a function of an exponentially modified Gaussian distribution. An inference formula of the xenon outflow curve for six-pulse injection is derived, and the inference formula is tested against the fitted formula of the xenon outflow curve. As a result, the curves of both the one-pulse and six-pulse injections obey the exponentially modified Gaussian distribution when the activated carbon column's temperature is 26 °C and the flow rate of the carrier gas is 35.6 mL/min. The retention time of the xenon peak for one-pulse injection is 215 min, and the peak width is 138 min. For the six-pulse injection, however, the retention time is delayed to 255 min, and the peak width broadens to 222 min. According to the inferred formula of the xenon outflow curve for the six-pulse injection, the inferred retention time is 243 min, the relative deviation of the retention time is 4.7%, and the inferred peak width is 225 min, with a relative deviation of 1.3%. Copyright © 2013 Elsevier B.V. All rights reserved.
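
A multi-pulse superposition of exponentially modified Gaussians (EMG) of the kind described can be sketched as follows; the pulse parameters and 15-minute spacing below are illustrative assumptions, not the paper's fitted values. The six-pulse curve is modelled as the mean of six time-shifted single-pulse EMGs, whose peak is delayed and broadened relative to a single pulse, qualitatively matching the reported behaviour.

```python
import math
import numpy as np

def emg(t, mu, sigma, tau):
    """Exponentially modified Gaussian (area-normalised density)."""
    z = (sigma / tau - (t - mu) / sigma) / math.sqrt(2.0)
    return (0.5 / tau) * math.exp(0.5 * (sigma / tau) ** 2 - (t - mu) / tau) * math.erfc(z)

# Illustrative single-pulse parameters in minutes (assumed, not fitted)
mu, sigma, tau = 190.0, 25.0, 40.0
t = np.arange(0.0, 600.0, 0.5)
single = np.array([emg(x, mu, sigma, tau) for x in t])

# Six-pulse injection: six identical pulses assumed 15 min apart
spacing = 15.0
six = sum(
    np.array([emg(x, mu + k * spacing, sigma, tau) for x in t])
    for k in range(6)
) / 6.0

rt_single = t[np.argmax(single)]   # single-pulse retention time
rt_six = t[np.argmax(six)]         # six-pulse retention time (later)
print(rt_single, rt_six)
```

Superposing shifted pulses necessarily moves the composite peak later and lowers its height, which is the qualitative mechanism behind the delayed, broadened six-pulse retention reported in the abstract.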

  2. The ISC Contribution to Monitoring Research

    NASA Astrophysics Data System (ADS)

    Storchak, D. A.; Bondar, I.; Harris, J.; Gaspà Rebull, O.

    2010-12-01

    The International Seismological Centre (ISC) is a non-governmental organization charged with production of the ISC Bulletin - the definitive global summary of seismicity based on reports from more than 4,500 seismic stations worldwide. The ISC data have been extensively used in preparation of the Comprehensive Test Ban Treaty (CTBT). They are now used by the CTBTO Provisional Technical Secretariat (PTS) and the States Parties as an important benchmark for assessing and monitoring detection capabilities of the International Monitoring System (IMS). The ISC also provides a valuable collection of reviewed waveform readings from academic and operational sites co-located with the IMS stations. To improve the timeliness of its Bulletin, the ISC is making a special effort to collect preliminary bulletins, available soon after seismic events occur, from a growing number of networks worldwide. Preliminary bulletins are later substituted with the final analysis data once these become available to the ISC from each network. The ISC also collects and maintains data sets that are useful for monitoring research: the IASPEI Reference Event List of globally distributed GT0-5 events, the groomed ISC bulletin (EHB), the IDC REB, and USArray phase-picking data. In cooperation with the World Data Center for Seismology, Denver (USGS), the ISC also maintains the International Seismographic Station Registry, which holds parameters of seismic stations used in international data exchange. The UK Foreign and Commonwealth Office, along with partners from several Nordic countries, is currently funding a project to link the ISC database securely with the computer facilities at the PTS and National Data Centres. The ISC Bulletin data are made available via a dedicated software link designed to offer the ISC data in a way convenient to the monitoring community.

  3. Volcanic stratigraphy and geochemical variations in Miocene-age rocks in western and southeastern Fort Irwin, California

    NASA Astrophysics Data System (ADS)

    Buesch, D.

    2015-12-01

    Lava flows and tuffaceous deposits ranging in composition from basalt to rhyolite, including basaltic trachyandesite to trachyte, are exposed in 800 km2 of the western Fort Irwin area, California, and form the eastern edge of the Eagle Crags volcanic field (ECVF). The main ECVF has 40Ar/39Ar ages from ~18.7-12.4 Ma (mostly 18.7-18.5 Ma; Sabin et al. 1994), and on Fort Irwin, the ages are from 21.0-15.8 Ma (mostly 18.6-15.8 Ma; Schermer et al. 1996). Sixty-eight samples (56 lava flows, 4 dome-collapse breccias, 3 ignimbrites, and 5 fallout tephras) were analyzed for major, minor, and trace elements. Typically, stratigraphic sequences dip <30° (mostly <15°) except near faults, with local buttress unconformities and no large unconformities. Compositions are moderate-to-high-K type, and similar in Na2O+K2O to Sabin et al. (1994) but with slightly smaller ranges. The generalized stratigraphic sequence is rhyolite (R), dacite (D), or trachyte (T) that form domes, lava flows (up to 3.5 km long), dome-collapse deposits, or pyroclastic deposits, overlain by andesite (A), trachyandesite (TA), basaltic andesite (BA), basaltic trachyandesite (BTA), or basalt (B) lava flows (up to 7 km long), and minor cinder cones. A general upward felsic-to-mafic compositional sequence occurs throughout the area, but is not continuous, as B is locally in a R-D sequence and B is at the base of and interstratified with a BA-A sequence. Also, there are compositional variations at different locations along the edges of the field. In the Goldstone Mesa, Pink Canyon, and Stone Ridge areas (~70 km2), B-BA forms the youngest lava flows, but ~21 km to the north in the Garry Owen area (~25 km2), BTA forms the youngest lava flows. Compared to the Stone Ridge area with a D-A-TA-BA trend, ~6 km west the Pioneer Plateau area is R-TA-D, ~3 km south the Pink Canyon area is R-B-BA-A, and ~8 km east Dacite Dome is D only (all areas have slightly different Na2O+K2O in each rock type). 
A non-ECVF, 5.6 Ma BA flow in SE Fort Irwin also has distinct compositions. Chemical variations indicate the region had similar general evolution of magma sources, but (1) there were numerous small, isolated chambers that fed flows along the edges of the field, (2) several tuffs are similar to local lavas but some differ and might have distant sources, and (3) basalt flows locally encroached into adjacent areas.

  4. The Nature of Scatter at the DARHT Facility and Suggestions for Improved Modeling of DARHT Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morneau, Rachel Anne; Klasky, Marc Louis

    The U.S. Stockpile Stewardship Program [1] is designed to sustain and evaluate the nuclear weapons stockpile while foregoing underground nuclear tests. The maintenance of a smaller, aging U.S. nuclear weapons stockpile without underground testing requires complex computer calculations [14]. These calculations in turn need to be verified and benchmarked [14]. A wide range of research facilities have been used to test and evaluate nuclear weapons while respecting the Comprehensive Nuclear Test-Ban Treaty (CTBT) [2]. Some of these facilities include the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, the Z machine at Sandia National Laboratories, and the Dual Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory. This research will focus largely on DARHT (although some information from Cygnus and the Los Alamos Microtron may be used) by modeling it and comparing to experimental data. DARHT is an electron accelerator that employs high-energy flash x-ray sources for imaging hydro-tests. This research proposes to address some of the issues crucial to understanding DARHT Axis II and the analysis of the radiographic images produced. Primarily, the nature of scatter at DARHT will be modeled and verified with experimental data. It will then be shown that certain design decisions can be made to optimize the scatter field for hydrotest experiments. Spectral effects will be briefly explored to determine whether changes in the energy spectrum caused by target changes have any considerable effect on the density reconstruction. Finally, a generalized scatter model will be made using results from MCNP that can be convolved with the direct transmission of an object to simulate the scatter of that object at the detector plane. The region in which this scatter model is appropriate will be explored.
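
The generalized scatter model described in the final sentence amounts to a convolution: the sketch below (toy data, with an invented Gaussian kernel standing in for an MCNP-derived one) estimates a scatter field by convolving a direct-transmission image with a point-scatter kernel and adds it to the direct image to form the total signal at the detector plane.

```python
import numpy as np

def scatter_estimate(direct, kernel):
    """Convolve the direct-transmission image with a point-scatter kernel
    (a stand-in for an MCNP-derived kernel) to estimate the scatter field."""
    n = kernel.shape[0]
    pad = n // 2
    padded = np.pad(direct, pad, mode="edge")
    out = np.zeros_like(direct)
    for i in range(direct.shape[0]):
        for j in range(direct.shape[1]):
            out[i, j] = np.sum(padded[i:i + n, j:j + n] * kernel)
    return out

# Toy direct-transmission image: bright field with an attenuating disc
y, x = np.mgrid[0:64, 0:64]
direct = np.where((x - 32) ** 2 + (y - 32) ** 2 < 15 ** 2, 0.2, 1.0)

# Broad, low-amplitude Gaussian kernel standing in for the scatter response;
# the ~10 % scatter-to-primary ratio is an assumption for illustration
k = 9
ky, kx = np.mgrid[0:k, 0:k] - k // 2
kernel = np.exp(-(kx ** 2 + ky ** 2) / 8.0)
kernel *= 0.1 / kernel.sum()

total = direct + scatter_estimate(direct, kernel)
```

Because scatter is smooth and additive in this model, it reduces contrast in the total image; subtracting an estimate of this kind is the usual motivation for building such a kernel from Monte Carlo results.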

  5. Technical Challenges for a Comprehensive Test Ban: A historical perspective to frame the future (Invited)

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.

    2013-12-01

    In the summer of 1958, scientists from the Soviet bloc and the US allies met in Geneva to discuss what it would take to monitor a forerunner to a Comprehensive Test Ban Treaty at the 'Conference of Experts to Study the Possibility of Detecting Violations of a Possible Agreement on Suspension of Nuclear Tests'. Although armed with a limited resume of observations, the conference recommended a multi-phenomenology approach (air sampling, acoustics, seismic and electromagnetic) deployed in a network of 170 sites scattered across the Northern Hemisphere, and hypothesized a detection threshold of 1 kt for atmospheric tests and 5 kt for underground explosions. The conference recommendations spurred vigorous debate, with strong disagreement over the stated detection hypothesis. Nevertheless, the technical challenges posed led to a very focused effort to improve facilities and methodologies and, most importantly, research and development on event detection, location and identification. In the ensuing 50 years the various challenges arose and were eventually 'solved'; these included quantifying yield determination to enter a Threshold Test Ban, monitoring broad areas of emerging nuclear nations, and, after the mid-1990s, lowering the global detection threshold to sub-kiloton levels for underground tests. Today there is both an international monitoring regime (i.e., the International Monitoring System, or IMS) and a group of countries that have their own national technical means (NTM). The challenges for the international regime are evolving; the IMS has established itself as a very credible monitoring system, but the demand of a CTBT to detect and identify a 'nuclear test' of diminished size (zero yield) poses new technical hurdles. These include signal processing and understanding limits of resolution, location accuracy, integration of heterogeneous data, and accurately characterizing anomalous events. It is possible to extrapolate past technical advances to predict what should be available by 2020: detection of coupled explosions down to 100s of tons for all continental areas, as well as a probabilistic assessment of event identification.

  6. From Regional Hazard Assessment to Nuclear-Test-Ban Treaty Support - InSAR Ground Motion Services

    NASA Astrophysics Data System (ADS)

    Lege, T.; Kalia, A.; Gruenberg, I.; Frei, M.

    2016-12-01

    There are numerous scientific applications of InSAR methods in tectonics, earthquake analysis and other geologic and geophysical fields. Ground motion on local and regional scales, measured and monitored with InSAR techniques, provides scientists and engineers with many new insights into subsurface processes. However, the operational use of InSAR is not yet widespread. To foster the operational use of the Copernicus Sentinel satellites in the day-to-day business of federal, state and municipal work and planning, BGR (Federal Institute for Geosciences and Natural Resources) initiated workshops with potential user groups. Through extensive reconciliation of interests and demands with scientific, technical, economic and governmental stakeholders (e.g. ministries, mining authorities, geological surveys, geodetic surveys and environmental agencies at federal and state level, SMEs, the German Aerospace Center), BGR developed the concept of the InSAR-based German National Ground Motion Service. One important backbone of the nationwide ground motion service is the so-called Persistent Scatterer Interferometry Wide Area Product (WAP) approach, developed with grants from European research funds. The presentation shows the implementation of the ground motion service and examples of product developments for the operational supervision of mining, water resources management and spatial planning. Furthermore, the contributions of Copernicus Sentinel-1 radar data in the context of the CTBT are discussed. The DInSAR processing of Sentinel-1 IW (Interferometric Wide Swath) SAR acquisitions from 1 and 13 January 2016 allows, for the first time, a near-real-time ground motion measurement of the North Korean nuclear test site. The measured ground displacements show a strong spatio-temporal correlation with the epicenter calculated from teleseismic stations. We are convinced that this space technique will soon contribute further to meeting societal information needs.
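
    The basic DInSAR measurement underlying such a service converts unwrapped interferometric phase into line-of-sight (LOS) displacement; for Sentinel-1's C-band radar the wavelength is about 5.55 cm, so one full fringe corresponds to roughly 2.8 cm of LOS motion. A minimal sketch (the sign convention is illustrative):

```python
import numpy as np

C_BAND_WAVELENGTH_M = 0.0555  # Sentinel-1 C-band, ~5.55 cm

def los_displacement(unwrapped_phase_rad):
    """Convert unwrapped differential phase to line-of-sight displacement.

    One full 2*pi fringe corresponds to lambda/2 of LOS motion because
    the radar path is traversed twice (transmit and receive).
    """
    return unwrapped_phase_rad * C_BAND_WAVELENGTH_M / (4 * np.pi)

d_fringe = los_displacement(2 * np.pi)   # ~2.8 cm of LOS motion per fringe
```

This is why even the sub-decimeter subsidence bowls above an underground test are readily resolvable in a C-band interferogram.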

  7. Being prepared to verify the CTBT-Atmospheric Transport modeling and radionuclide analysis at the Austrian National Data Centre during the NDC Preparedness Exercise 2009

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; Schraick, Irene

    2010-05-01

    An explosion in the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20 UTC was recorded by both the CTBTO seismic and infrasound networks. This event triggered a world-wide preparedness exercise among the CTBTO National Data Centres. Within an hour after the event was selected by the German NDC, a computer program developed by NDC Austria, based on weather forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) and from the U.S. National Centers for Environmental Prediction (NCEP), was started to analyse which radionuclide stations of the CTBTO International Monitoring System (IMS) could be affected by the release from a nuclear explosion at this location in the course of the following 3-10 days. These calculations were updated daily to take into account the observed state of the atmosphere instead of the predicted one. Based on these calculations, automated and reviewed radionuclide reports from the potentially affected stations, as produced by the CTBTO International Data Centre (IDC), were examined. An additional analysis of interesting spectra was provided by the Seibersdorf Laboratories. Based on all the results coming in, no evidence whatsoever was found that the explosion in Kazakhstan was nuclear. This is in accordance with ground-truth information saying that the event was caused by the detonation of more than 53 tons of explosives as part of mining operations. A number of conclusions can be drawn from this exercise. First, the international, bilateral and national mechanisms and procedures in place for such an event worked smoothly. Second, the products and services of the CTBTO IDC proved very useful in assisting the member states in their verification efforts. Last but not least, issues with the availability of data from IMS radionuclide stations do remain.
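
    The station-screening step described above can be caricatured with a constant-wind "reachability sector": given a wind speed and a direction the wind blows toward, which stations lie 3-10 days downwind? Real NDC practice uses full ECMWF/NCEP-driven transport calculations; the wind field, sector width and station coordinates below are entirely hypothetical:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2)**2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2)**2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, degrees from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def potentially_affected(src, stations, wind_ms, wind_to_deg,
                         days=(3, 10), half_sector_deg=45.0):
    """Stations inside a downwind sector reachable within `days` at `wind_ms`."""
    lo = wind_ms * 86400 * days[0] / 1000.0      # km travelled in days[0]
    hi = wind_ms * 86400 * days[1] / 1000.0
    hits = []
    for name, lat, lon in stations:
        d = haversine_km(src[0], src[1], lat, lon)
        b = bearing_deg(src[0], src[1], lat, lon)
        off = abs((b - wind_to_deg + 180) % 360 - 180)  # angular offset
        if lo <= d <= hi and off <= half_sector_deg:
            hits.append(name)
    return hits

# Hypothetical source and station coordinates, eastward wind at 5 m/s
stations = [("A", 0.0, 20.0), ("B", 20.0, 0.0), ("C", 0.0, 60.0)]
affected = potentially_affected((0.0, 0.0), stations, 5.0, 90.0)
```

Station "A" sits ~2200 km due east and falls inside the 3-10 day window; "B" is crosswind and "C" is beyond reach.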

  8. Issues Involving The OSI Concept of Operation For Noble Gas Radionuclide Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrigan, C R; Sun, Y

    2011-01-21

    The development of a technically sound protocol for detecting the subsurface release of noble gas radionuclides is critical to the successful operation of an on-site inspection (OSI) under the CTBT and has broad ramifications for all aspects of the OSI regime, including the setting of specifications for both sampling and analysis equipment used during an OSI. With NA-24 support, we are investigating a variety of issues and concerns that have significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work at LLNL focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments - a lesson we learned previously from the LLNL Non-Proliferation Experiment (NPE). Evaluation of a number of important noble gas detection issues, potentially affecting OSI policy, has awaited the US re-engagement with the OSI technical community. Thus, there have been numerous issues to address during the past 18 months. Most of our evaluations of a sampling or transport issue necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of LLNL computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied in the field, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated during the past year. We conclude the discussion with a description of a preliminary design for subsurface sampling that is intended to be a practical solution to most, if not all, of the challenges addressed here.
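
    The kind of subsurface gas-transport calculation alluded to above can be illustrated with a one-dimensional advection-diffusion-decay sketch for a Xe-133 pulse migrating up a fractured column. All transport parameters here are illustrative assumptions, not values from the LLNL models:

```python
import numpy as np

XE133_HALF_LIFE_S = 5.243 * 86400            # Xe-133 half-life in seconds
DECAY = np.log(2) / XE133_HALF_LIFE_S

def step(c, D, v, dx, dt):
    """One explicit step of 1-D diffusion + upwind advection + decay."""
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    adv = (c - np.roll(c, 1)) / dx           # upwind difference for v > 0
    c = c + dt * (D * lap - v * adv - DECAY * c)
    c[0] = c[-1] = 0.0                       # open boundaries
    return c

# Illustrative parameters: 100 m column, 1 m cells, 1 h steps (explicit
# scheme is stable here since D*dt/dx**2 and v*dt/dx are both ~0.04)
dx, dt = 1.0, 3600.0
D, v = 1e-5, 1e-5                            # m^2/s diffusion, m/s upward seepage
c = np.zeros(100)
c[40] = 1.0                                  # pulse ~60 cells below the top node
initial_mass = c.sum()
for _ in range(24 * 30):                     # evolve for 30 days
    c = step(c, D, v, dx, dt)
```

After 30 days the pulse has decayed through nearly six half-lives and drifted tens of meters toward the surface, which is the essence of why sampling timing and location matter so much in an OSI.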

  9. Detection, location, and characterization of hydroacoustic signals using seafloor cable networks offshore Japan (Invited)

    NASA Astrophysics Data System (ADS)

    Sugioka, H.; Suyehiro, K.; Shinohara, M.

    2009-12-01

    The hydroacoustic monitoring performed by the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification system utilizes hydrophone stations and seismic stations, called T-phase stations, for worldwide detection. Some signals of natural origin include those from earthquakes, submarine volcanic eruptions, or whale calls. Among artificial sources there are non-nuclear explosions and air-gun shots. It is important for the IMS to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As there are a number of seafloor cable networks operated offshore the Japanese islands, mainly facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure sensors, hydrophones and seismic sensors) may be used to verify and increase the capability of the IMS. We use these data to compare selected event parameters with those reported for the Pacific in the period from 2004 to the present. These anomalous examples, along with dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for the detection, location and characterization of anomalous signals. A seafloor cable network composed of three hydrophones and six seismometers, together with a temporary dense seismic array, detected and located hydroacoustic events offshore the Japanese islands on 12 March 2008, which had been reported by the IMS. We detected not only the hydroacoustic waves reverberating between the sea surface and the sea bottom but also the seismic waves traveling through the crust associated with the events. The determined source of the seismic waves nearly coincides with that of the hydroacoustic waves, suggesting that the seismic waves are converted very close to the origin of the hydroacoustic source. We also detected signals on 16 March 2009 very similar to those associated with the event of 12 March 2008.

  10. Grid-search Moment Tensor Estimation: Implementation and CTBT-related Application

    NASA Astrophysics Data System (ADS)

    Stachnik, J. C.; Baker, B. I.; Rozhkov, M.; Friberg, P. A.; Leifer, J. M.

    2017-12-01

    This abstract reviews work related to moment tensor estimation for Expert Technical Analysis at the Comprehensive Test Ban Treaty Organization. In this context of event characterization, estimation of key source parameters provides important insights into the nature of failure in the earth. For example, if the recovered source parameters are indicative of a shallow source with a large isotropic component, then one conclusion is that it is a human-triggered explosive event. However, an important follow-up question in this application is: does an alternative hypothesis, such as a deeper source with a large double-couple component, explain the data approximately as well as the best solution? Here we address the issue of both finding the most likely source and assessing its uncertainty. Using the uniform moment tensor discretization of Tape and Tape (2015), we exhaustively interrogate and tabulate the source eigenvalue distribution (i.e., the source characterization), tensor orientation, magnitude, and source depth. The benefit of the grid search is that we can quantitatively assess the extent to which model parameters are resolved. This provides a valuable opportunity during the assessment phase to focus interpretation on source parameters that are well resolved. Another benefit of the grid search is that it proves to be a flexible framework into which different pieces of information can be easily incorporated. To this end, this work is particularly interested in fitting teleseismic body waves and regional surface waves, as well as incorporating teleseismic first motions when available. Since the moment tensor search methodology is well established, we focus primarily on the implementation and application. We present a highly scalable strategy for systematically inspecting the entire model parameter space. We then focus on application to regional and teleseismic data recorded during a handful of natural and anthropogenic events, report on the grid-search optimum, and discuss the resolution of interesting or important recovered source properties.
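
    The exhaustive tabulation described above can be illustrated on a toy two-parameter source space. The Green's-function matrix and the parameterization below are placeholders (not the Tape and Tape discretization or real waveform kernels), but the search structure — enumerate every candidate source, tabulate its misfit, keep the optimum and the surrounding misfit surface — is the same:

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(size=(40, 6))      # toy Green's-function matrix
                                  # (40 waveform samples x 6 MT components)

def mt_from_params(iso, strike):
    """Toy two-parameter source: an isotropic part blended with a vertical
    strike-slip double couple (NOT the Tape & Tape parameterization)."""
    s, c = np.sin(2 * strike), np.cos(2 * strike)
    dc = np.array([c, -c, 0.0, s, 0.0, 0.0])     # (Mxx, Myy, Mzz, Mxy, Mxz, Myz)
    iso_part = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(3)
    m = iso * iso_part + (1 - abs(iso)) * dc / np.linalg.norm(dc)
    return m / np.linalg.norm(m)

iso_grid = np.linspace(-0.95, 0.95, 39)          # isotropic blending factor
strike_grid = np.linspace(0.0, np.pi, 60, endpoint=False)

# Synthetic "observed" data from a source that lies on the grid
true_iso, true_strike = iso_grid[33], strike_grid[10]    # ~0.70, pi/6
d = G @ mt_from_params(true_iso, true_strike)

# Exhaustive search: misfit evaluated over the whole parameter space
best = min(((i, s, np.linalg.norm(d - G @ mt_from_params(i, s)))
            for i in iso_grid for s in strike_grid),
           key=lambda t: t[2])
```

Keeping the full misfit table (rather than only the minimum) is what allows the flatness of the surface around the optimum — e.g. explosion-like versus double-couple alternatives — to be assessed quantitatively.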

  11. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    DOE PAGES

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; ...

    2017-12-28

    A method is outlined and tested to detect low-level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor, which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.
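
    The core of the method — successively regressing measurement segments against a simulated signature and tracking the fit metrics over time — can be sketched with synthetic data; the signature shape, noise level and "joint metric" below are illustrative assumptions, not the paper's calibrated statistics:

```python
import numpy as np

def signature_metrics(series, signature):
    """Regress successive measurement segments on a simulated signature.
    Returns per-offset regression slopes and correlation coefficients."""
    m = len(signature)
    x = signature - signature.mean()
    xx = x @ x
    slopes = np.empty(len(series) - m + 1)
    corrs = np.empty_like(slopes)
    for k in range(len(slopes)):
        yc = series[k:k + m] - series[k:k + m].mean()
        yy = yc @ yc
        slopes[k] = (x @ yc) / xx
        corrs[k] = (x @ yc) / np.sqrt(xx * yy) if yy > 0 else 0.0
    return slopes, corrs

# Synthetic test: bury a scaled signature at offset 30 in a noisy series
rng = np.random.default_rng(1)
signature = np.exp(-((np.arange(20) - 8.0)**2) / 10.0)  # simulated plume shape
series = rng.normal(0.0, 0.1, 100)
series[30:50] += 2.0 * signature
slopes, corrs = signature_metrics(series, signature)
joint = slopes * corrs      # crude joint metric, large where both agree
```

The correlation peaks at the true offset, and the slope there recovers the embedded scaling factor — the same screening logic used to flag RN58 data segments with a likely Punggye origin.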

  12. Frequency and Size of Strombolian Eruptions from the Phonolitic Lava Lake at Erebus Volcano, Antarctica: Insights from Infrasound and Seismic Observations on Bubble Formation and Ascent

    NASA Astrophysics Data System (ADS)

    Rotman, H. M. M.; Kyle, P. R.; Fee, D.; Curtis, A.

    2015-12-01

    Erebus, an active intraplate volcano on Ross Island, commonly produces bubble-burst Strombolian explosions from a long-lived, convecting phonolitic lava lake. Persistent lava lakes are rare and provide direct insights into their underlying magmatic systems. Erebus phonolite is H2O-poor and contains ~30% anorthoclase megacrysts. Lab measurements suggest the magma has viscosities of ~10^7 Pa s at shallow depths. This has implications for magma and bubble ascent rates through the conduit and into the lava lake. The bulk composition and matrix glass of Erebus ejecta have remained uniform for many thousands of years, but eruptive activity varies on decadal and shorter time scales. Over the last 15 years, increased activity took place in 2005-2007 and more recently in the 2013 austral summer. In the 2014 austral summer, new infrasound sensors were installed ~700 m from the summit crater hosting the lava lake. These sensors, supplemented by the Erebus network seismic stations, recorded >1000 eruptions between 1 January and 7 April 2015, with an average infrasound daily uptime of 9.6 hours. Over the same time period, the CTBT infrasound station IS55, ~25 km from Erebus, detected ~115 of the >1000 locally observed eruptions, with amplitude decreases of >100x. An additional ~200 eruptions were recorded during local infrasound downtime. This represents an unusually high level of activity for the Erebus lava lake, and while instrument noise influences the minimum observable amplitude each day, the eruption infrasound amplitudes may vary by ~3 orders of magnitude over the scale of minutes to hours. We use this heightened period of variable activity and the associated seismic and acoustic waveforms to examine mechanisms for bubble formation and ascent, such as rise-speed dependence and collapsing foam; repose times for the larger eruptions; and possible connections between eruptions and lava lake cyclicity.

  13. The Use of Explosion Aftershock Probabilities for Planning and Deployment of Seismic Aftershock Monitoring System for an On-site Inspection

    NASA Astrophysics Data System (ADS)

    Labak, P.; Ford, S. R.; Sweeney, J. J.; Smith, A. T.; Spivak, A.

    2011-12-01

    One of the four elements of the CTBT verification regime is on-site inspection (OSI). Since the sole purpose of an OSI shall be to clarify whether a nuclear weapon test explosion or any other nuclear explosion has been carried out, inspection activities can be conducted and techniques used in order to collect facts to support the findings provided in inspection reports. Passive seismological monitoring, realized by seismic aftershock monitoring (SAMS), is one of the treaty-allowed techniques during an OSI. Effective planning and deployment of SAMS during the early stages of an OSI is required, both because of the nature of the possible events recorded and because of the treaty-related constraints on the size of the inspection area, the size of the inspection team and the length of an inspection. A method that may help in planning the SAMS deployment is presented. An estimate of aftershock activity due to a theoretical underground nuclear explosion is produced using a simple aftershock rate model (Ford and Walter, 2010). The model is developed with data from the Nevada Test Site and the Semipalatinsk Test Site, which we take to represent soft-rock and hard-rock testing environments, respectively. Estimates of the expected magnitude and number of aftershocks are calculated using the models for different testing and inspection scenarios. These estimates can help in planning the SAMS deployment for an OSI by giving a probabilistic assessment of potential aftershocks in the Inspection Area (IA). The aftershock assessment, combined with an estimate of the background seismicity in the IA and an empirically derived map of threshold magnitude for the SAMS network, could aid the OSI team in reporting. We applied the hard-rock model to a scenario similar to the Integrated Field Exercise 2008 deployment in Kazakhstan and produced an estimate of possible recorded aftershock activity.
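
    Expected aftershock counts of the kind described above follow from integrating an aftershock rate model over the inspection window. Assuming a modified-Omori rate (the actual Ford and Walter (2010) model and its calibrated parameters are not reproduced here; K, c and p below are purely illustrative), a sketch is:

```python
import math

def expected_aftershocks(K, c, p, t1, t2):
    """Expected number of aftershocks in [t1, t2] days after the event,
    for a modified-Omori rate n(t) = K / (t + c)**p, integrated in
    closed form (with the special case p = 1)."""
    if p == 1.0:
        return K * (math.log(t2 + c) - math.log(t1 + c))
    return K / (1.0 - p) * ((t2 + c)**(1.0 - p) - (t1 + c)**(1.0 - p))

# Illustrative parameters: K = 5, c = 0.1 day, p = 1.1, evaluated for a
# 30-day inspection window starting 10 days after the suspected event
n_expected = expected_aftershocks(5.0, 0.1, 1.1, 10.0, 40.0)
```

Because the rate decays as a power law, delaying deployment by even a week or two sharply reduces the number of recordable events — the quantitative basis for deploying SAMS as early in an OSI as possible.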

  14. Modeling Collapse Chimney and Spall Zone Settlement as a Source of Post-Shot Subsidence Detected by Synthetic Aperture Radar Interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foxwall, W.

    2000-07-24

    Ground surface subsidence resulting from the March 1992 JUNCTION underground nuclear test at the Nevada Test Site (NTS), imaged by satellite synthetic aperture radar interferometry (InSAR), occurred wholly during a period of several months after the shot (Vincent et al., 1999) and after the main cavity collapse event. A significant portion of the subsidence associated with the small (less than 20 kt) GALENA and DIVIDER tests probably also occurred after the shots, although the deformation detected in these cases contains additional contributions from coseismic processes, since the radar scenes used to construct the deformation interferogram bracketed these two later events. The dimensions of the areas of subsidence resulting from all three events are too large to be solely accounted for by processes confined to the damage zone in the vicinity of the shot point or the collapse chimney. Rather, the subsidence closely corresponds to the spall dimensions predicted by Patton's (1990) empirical relationship between spall radius and yield. This suggests that gravitational settlement of damaged rock within the spall zone is an important source of post-shot subsidence, in addition to settlement of the rubble within the collapse chimney. These observations illustrate the potential power of InSAR as a tool for Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring and on-site inspection in that the relatively broad (~100 m to 1 km) subsidence signatures resulting from small shots detonated at normal depths of burial (or even significantly overburied) are readily detectable within large geographical areas (100 km x 100 km) under favorable observing conditions. Furthermore, the present results demonstrate the flexibility of the technique in that the two routinely gathered satellite radar images used to construct the interferogram need not necessarily capture the event itself, but can cover a time period up to several months following the shot.

  15. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    NASA Astrophysics Data System (ADS)

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; Chiswell, S. R.

    2018-03-01

    A method is outlined and tested to detect low-level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor, which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.

  16. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.

    A method is outlined and tested to detect low-level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor, which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.

  17. Underground structure characterization using motor vehicles as passive seismic sources

    NASA Astrophysics Data System (ADS)

    Kuzma, H. A.; Liu, Y.; Zhao, Y.; Rector, J.; Vaidya, S.

    2009-12-01

    The ability to detect and characterize underground voids will be critical to the success of On-Site Inspections (OSI) as mandated by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). OSIs may be conducted in order to locate the ground zero of underground tests as well as infrastructure related to testing. Recently, our team has shown the potential of a new technique to detect underground objects using the amplitude of seismic surface waves generated by motor vehicles. In an experiment conducted in June 2009, we were able to detect an abandoned railroad tunnel by recognizing a clear pattern in the surface waves scattered by the tunnel, using a signal generated by driving a car on a dirt road across the tunnel. Synthetic experiments conducted using physically realistic wave-equation models further suggest that the technique can be readily applied to detecting underground features: it may be possible to image structures of importance to an OSI simply by laying out an array of geophones (or using an array already in place for passive listening for event aftershocks) and driving vehicles around the site. We present evidence from a set of field experiments and from synthetic modeling and inversion studies to illustrate adaptations of the technique for OSI. Figure: Signature of an abandoned underground railroad tunnel at Donner Summit, CA. To produce this image, a line of geophones was placed along a dirt road perpendicular to the tunnel (black box) and a single car was driven along the road. A normalized mean power spectrum is displayed on a log scale as a function of meters from the center of the tunnel. The top of the tunnel was 18 m below the ground surface. The tunnel anomaly consists of a shadow (light) directly above the tunnel and amplitude build-up (dark) on either side of the tunnel. The size of the anomaly (6 orders of magnitude) suggests that the method can be extended to find deep structures at greater distances from the source and receivers.
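
    The normalized mean power-spectrum image described in the figure caption can be reproduced in miniature with synthetic data. The "shadow" gain profile below is a crude stand-in for the scattered wavefield above the tunnel, not a physical model:

```python
import numpy as np

def mean_power_db(traces):
    """Per-geophone mean power spectrum, normalized across the line and
    expressed in dB (log scale), as in the tunnel image described above."""
    power = (np.abs(np.fft.rfft(traces, axis=1))**2).mean(axis=1)
    return 10.0 * np.log10(power / power.max())

# Synthetic geophone line over a tunnel at offset 0 m; the tunnel's
# shadow is modeled as a local drop in vehicle-generated wave amplitude
rng = np.random.default_rng(2)
offsets = np.arange(-50, 51, 5)                  # m from tunnel axis
gain = 1.0 - 0.9 * np.exp(-(offsets / 8.0)**2)   # shadow above the tunnel
traces = gain[:, None] * rng.normal(size=(len(offsets), 2048))
db = mean_power_db(traces)                       # minimum sits over the tunnel
```

The dB profile dips sharply at zero offset, mimicking the light "shadow" band in the field image (the flanking build-up seen in the real data is not modeled here).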

  18. Use of open source information and commercial satellite imagery for nuclear nonproliferation regime compliance verification by a community of academics

    NASA Astrophysics Data System (ADS)

    Solodov, Alexander

    The proliferation of nuclear weapons is a great threat to world peace and stability. The question of strengthening the nonproliferation regime has been open for a long period of time. In 1997 the International Atomic Energy Agency (IAEA) Board of Governors (BOG) adopted the Additional Safeguards Protocol. The purpose of the protocol is to enhance the IAEA's ability to detect undeclared production of fissile materials in member states. However, the IAEA does not always have sufficient human and financial resources to accomplish this task. Developed here is a concept for making use of the human and technical resources available in academia to enhance the IAEA's mission. The objective of this research was to study the feasibility of an academic community using commercially or publicly available sources of information and products for the purpose of detecting covert facilities and activities intended for the unlawful acquisition of fissile materials or production of nuclear weapons. In this study, the availability and use of commercial satellite imagery systems, commercial computer codes for satellite imagery analysis, the Comprehensive Test Ban Treaty (CTBT) verification International Monitoring System (IMS), publicly available information sources such as watchdog groups and press reports, and Customs Services information were explored. A system for integrating these data sources to form conclusions was also developed. The results showed that publicly and commercially available sources of information and data analysis can be a powerful tool for tracking violations of the international nuclear nonproliferation regime, and a framework for implementing these tools in the academic community was developed. As a result of this study the formation of an International Nonproliferation Monitoring Academic Community (INMAC) is proposed.
This would be an independent organization consisting of academics (faculty, staff and students) from both nuclear weapon states (NWS) and non-nuclear weapon states (NNWS). The community would analyze all types of unclassified, publicly and commercially available information to aid in detecting violations of the nonproliferation regime, and would share this information with the IAEA and the public. Since INMAC would be composed solely of members of the academic community, the organization would not demonstrate any bias in its investigations or reporting.

  19. Automatic recovery of aftershock sequences at the International Data Centre: from concept to pipeline

    NASA Astrophysics Data System (ADS)

    Kitov, I.; Bobrov, D.; Rozhkov, M.

    2016-12-01

    Aftershocks of large earthquakes are an important source of information on the distribution and evolution of stresses and deformations in the pre-seismic, co-seismic and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The rate of events recorded by two or more seismic stations of the International Monitoring System (IMS) from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site-specific, as first P-waves results in numerous wrong event hypotheses in the automatic IDC pipeline. In turn, interactive review of such wrong hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool for separating coda phases from actual P-wave arrivals and for fully exploiting the repetitive character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of WCC in two important respects: they reduce the detection threshold and effectively suppress arrivals from all sources except the master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC traces into event hypotheses; building of events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis. Since global monitoring of underground nuclear tests is based on historical and synthetic data, each aftershock sequence can be tested against the evasion scenario of a CTBT violation concealed within a large earthquake.
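The core of the WCC detector described above is a normalized cross-correlation of a master-event template against continuous data. A minimal single-channel sketch (the function name and the simplification to one trace are ours, not the IDC implementation):

```python
import numpy as np

def normalized_cc(template, stream):
    """Slide a master-event template along a continuous trace and
    return the normalized cross-correlation coefficient at each lag."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(stream) - n + 1)
    for i in range(len(cc)):
        w = stream[i:i + n]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.dot(t, w - w.mean()) / s
    return cc
```

A detection is declared where the coefficient exceeds a threshold; in array processing the coefficients from many channels are stacked first, which is what suppresses arrivals unrelated to the master event.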

  20. Infrasound analysis of I18DK, northwest Greenland

    NASA Astrophysics Data System (ADS)

    Evers, L. G.; Weemstra, C.

    2010-12-01

    Within the scope of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), four methods are used to verify the treaty. One of these is based on the detection of infrasound waves generated by a nuclear explosion; seismic, hydroacoustic and radionuclide measurements are also applied. The International Monitoring System (IMS) will consist of 60 infrasound stations, of which 35 are currently operational. Data obtained from an infrasound station situated on the northwestern shoreline of Greenland are analyzed. This station is operated by Denmark and labeled I18DK. I18DK is situated in an area which receives ever-increasing attention from a geophysical perspective. I18DK has been operating continuously since April 2003. The IMS station is an infrasound array with an aperture of about 1200 meters, where air-pressure fluctuations are recorded by eight microbarometers at a sample rate of 20 Hz. The infrasonic recordings are filtered between 0.1-1.0 Hz and 1.0-6.0 Hz. In the higher frequency band, the slowness grid is searched for two different configurations: once using all eight sensors and once using only the five central sensors. Several source types are known to generate infrasound, for example calving of icebergs and glaciers, explosions, earthquakes, oceanic wave-wave interaction, volcanic eruptions and aurora. The challenge is to distinguish between these source types and use the outcome of the array analysis to better understand these phenomena. The rate of occurrence of icequakes, the calving of glaciers and the variation in extent of the sea ice in this area are of interest in relation to global warming. The processing results in the 1 to 6 Hz band show dominant back-azimuths related to these sources. The glaciers south of I18DK produce significant infrasound during summer. In addition, a direct link can be found between the number of warm days in a year and the number of infrasound detections from a north-northeast direction. These signals appear to be generated by run-off of water from the local ice cap north of I18DK.
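The slowness-grid search mentioned above is, in essence, a delay-and-sum beamforming scan over candidate plane-wave directions. A simplified sketch restricted to a back-azimuth search (the 2-degree grid, the fixed sound speed and all names are illustrative assumptions, not the I18DK processing chain):

```python
import numpy as np

def best_backazimuth(waveforms, coords, fs, speed=0.34):
    """Return the back-azimuth (degrees) whose plane-wave delay-and-sum
    beam has maximum power. coords are sensor (east, north) offsets in
    km; speed is the acoustic propagation speed in km/s."""
    best_baz, best_power = 0, -np.inf
    for baz in range(0, 360, 2):
        # slowness vector of a plane wave arriving FROM this back-azimuth
        s = -np.array([np.sin(np.radians(baz)),
                       np.cos(np.radians(baz))]) / speed
        beam = np.zeros(waveforms.shape[1])
        for xy, w in zip(coords, waveforms):
            delay = int(round(np.dot(s, xy) * fs))  # delay in samples
            beam += np.roll(w, -delay)              # undo the delay
        power = np.mean(beam ** 2)
        if power > best_power:
            best_baz, best_power = baz, power
    return best_baz
```

A full implementation would search a two-dimensional slowness grid (recovering apparent velocity as well), but the alignment principle is the same.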

  1. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false and missed detections caused by station noise and incorrect classification of arrivals remain an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS), a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT), we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique, the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. The aim is a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustics. As authorized users, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical signal pattern of these events. Finally, comparing the performance of the support-vector network with classical learning algorithms previously used in seismic detection and classification is an essential step in analyzing the advantages and disadvantages of the model.

  2. Numerical survey of pressure wave propagation around and inside an underground cavity with high order FEM

    NASA Astrophysics Data System (ADS)

    Esterhazy, Sofi; Schneider, Felix; Schöberl, Joachim; Perugia, Ilaria; Bokelmann, Götz

    2016-04-01

    Research on purely numerical methods for modeling seismic waves has intensified over recent decades. This development is mainly driven by the fact that, on the one hand, exact analytic solutions do not exist for the subsurface models of interest in exploration and global seismology, while on the other hand retrieving full seismic waveforms is important for gaining insight into spectral characteristics and for the interpretation of seismic phases and amplitudes. Furthermore, computational capacity has increased dramatically in the recent past, so that it has become worthwhile to perform computations for large-scale problems such as those arising in computational seismology. Algorithms based on the Finite Element Method (FEM) are becoming increasingly popular for the propagation of acoustic and elastic waves in geophysical models, as they provide more geometrical flexibility in terms of complexity as well as heterogeneity of the materials. In particular, we want to demonstrate the benefit of high-order FEMs, as they also provide better control of the accuracy. Our computations are done with the parallel finite element library NGSOLVE on top of the automatic 2D/3D mesh generator NETGEN (http://sourceforge.net/projects/ngsolve/). Furthermore, we are interested in the generation of synthetic seismograms, including direct, refracted and converted waves, in relation to the presence of an underground cavity, and in the detailed simulation of the full wave field inside and around such a cavity as would be created by a nuclear explosion. The motivation for this application comes from the need to find evidence of a nuclear test, since such tests are forbidden by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). With this approach we can investigate the wave field over a large range of wavenumbers. This in turn will help to provide a better understanding of the characteristic signatures of an underground cavity, improve the protocols for OSI field deployment and create solid observational strategies for detecting the presence of an underground (nuclear) cavity.
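The FEM machinery itself can be illustrated, far from the 3-D NGSolve setting above, with a toy 1-D problem: assemble P1 stiffness and mass matrices for -u'' = λu on (0,1) with Dirichlet conditions and check that the smallest discrete eigenvalue converges to π². This is only a sketch of the assemble-and-solve pattern, not the high-order method of the paper:

```python
import numpy as np

def smallest_fem_eigenvalue(n=50):
    """P1 finite elements for -u'' = lam*u on (0,1), u(0)=u(1)=0.
    Returns the smallest generalized eigenvalue of K v = lam * M v."""
    h = 1.0 / n
    m = n - 1                       # number of interior nodes
    ones = np.ones(m - 1)
    # standard P1 stiffness (tridiag 2,-1 over h) and consistent mass
    K = (2.0 * np.eye(m) - np.diag(ones, 1) - np.diag(ones, -1)) / h
    M = (4.0 * np.eye(m) + np.diag(ones, 1) + np.diag(ones, -1)) * h / 6.0
    eigs = np.linalg.eigvals(np.linalg.solve(M, K))
    return float(np.min(eigs.real))
```

Higher-order elements shrink the discretization error for a given mesh, which is the "better control on the accuracy" the abstract refers to.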

  3. Big Data Solution for CTBT Monitoring Using Global Cross Correlation

    NASA Astrophysics Data System (ADS)

    Gaillard, P.; Bobrov, D.; Dupont, A.; Grenouille, A.; Kitov, I. O.; Rozhkov, M.

    2014-12-01

    Due to the mismatch between data volume and the performance of the Information Technology infrastructure used in seismic data centers, it is becoming more and more difficult to process all the data with traditional applications in a reasonable elapsed time. To fulfill their missions, the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO/IDC) and the Département Analyse Surveillance Environnement of the Commissariat à l'Energie atomique et aux énergies alternatives (CEA/DASE) collect, process and produce complex data sets whose volume is growing exponentially. In the medium term, computer architectures, data management systems and application algorithms will require fundamental changes to meet the needs. This problem is well known and identified as a "Big Data" challenge. To tackle this major task, the CEA/DASE is taking part in the two-year "DataScale" project. Started in September 2013, DataScale gathers a large set of partners (research laboratories, SMEs and large companies). The common objective is to design efficient solutions exploiting the synergy between Big Data solutions and High Performance Computing (HPC). The project will evaluate the relevance of these technological solutions by implementing a demonstrator for seismic event detection based on massive waveform correlation. The IDC has developed expertise in such techniques, leading to an algorithm called "Master Event", and provides a high-quality dataset for an extensive cross-correlation study. The objective of the project is to enhance the Master Event algorithm and to reanalyze 10 years of waveform data from the International Monitoring System (IMS) network using a dedicated HPC infrastructure operated by the Centre de Calcul Recherche et Technologie at the CEA site of Bruyères-le-Châtel. The dataset used for the demonstrator includes more than 300,000 seismic events, tens of millions of raw detections and more than 30 terabytes of continuous seismic data from the primary IMS stations. In this talk, we present the Master Event algorithm and the associated workflow, give an overview of the designed technical solutions (from the building blocks to the global infrastructure), and show preliminary results at a regional scale.

  5. Big Data solution for CTBT monitoring: CEA-IDC joint global cross correlation project

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Bell, Randy; Brachet, Nicolas; Gaillard, Pierre; Kitov, Ivan; Rozhkov, Mikhail

    2014-05-01

    Waveform cross-correlation, when applied to historical datasets of seismic records, provides dramatic improvements in detection, location, and magnitude estimation of natural and man-made seismic events. With correlation techniques, the amplitude threshold of signal detection can be reduced globally by a factor of 2 to 3 relative to the beamforming and STA/LTA detectors currently in standard use. The gain in sensitivity corresponds to a body-wave magnitude reduction of 0.3 to 0.4 units and doubles the number of events meeting high quality requirements (e.g. detection by three or more seismic stations of the International Monitoring System (IMS)). This gain is crucial for seismic monitoring under the Comprehensive Nuclear-Test-Ban Treaty. The International Data Centre (IDC) dataset includes more than 450,000 seismic events, tens of millions of raw detections, and continuous seismic data from the primary IMS stations since 2000. This high-quality dataset is a natural candidate for an extensive cross-correlation study and the basis of further enhancements in monitoring capabilities; without this historical dataset recorded by the permanent IMS seismic network, no such improvements would be feasible. However, due to the mismatch between the volume of data and the performance of standard Information Technology infrastructure, it has become impossible to process all the data within a tolerable elapsed time. To tackle this "Big Data" problem, the CEA/DASE is part of the French project "DataScale". One objective is to reanalyze 10 years of waveform data from the IMS network with the cross-correlation technique, using a dedicated High Performance Computing (HPC) infrastructure operated by the Centre de Calcul Recherche et Technologie (CCRT) at the CEA site of Bruyères-le-Châtel. Within two years we plan to enhance the detection and phase association algorithms (also using machine learning and automatic classification) and to process about 30 terabytes of data provided by the IDC to update the world seismicity map. From the new events and those in the IDC Reviewed Event Bulletin, we will automatically create sets of master-event templates to be used for global event location by the CTBTO and CEA.
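The equivalence quoted above between an amplitude gain and a magnitude gain follows from the fact that body-wave magnitude varies with the base-10 logarithm of amplitude, so a detection-threshold gain g lowers the magnitude threshold by log10(g). A quick arithmetic check (nothing here is specific to the project):

```python
import math

def magnitude_gain(amplitude_factor):
    """Magnitude-threshold reduction implied by an amplitude-threshold
    gain, using the mb ~ log10(A/T) + corrections scaling."""
    return math.log10(amplitude_factor)

for g in (2.0, 3.0):
    print(f"amplitude gain x{g:.0f} -> mb threshold lowered by {magnitude_gain(g):.2f}")
```

A factor of 2 corresponds to 0.30 magnitude units and a factor of 3 to 0.48, in line with the 0.3 to 0.4 range cited.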

  6. Insights from the Source Physics Experiments on P/S Amplitude Ratio Methods of Identifying Explosions in a Background of Earthquakes

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Xu, H.; Pasyanos, M. E.; Pyle, M. L.; Matzel, E.; Mellors, R. J.; Hauk, T. F.

    2012-12-01

    It is well established empirically that regional-distance (200-1600 km) amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes. However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. A goal of the Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS, formerly the Nevada Test Site, NTS) is to improve our physical understanding of the mechanisms of explosion S-wave generation and to advance our ability to numerically model and predict them. Current models of explosion P/S values suggest they are frequency dependent, with poor performance below the source corner frequency and good performance above it. This leads to the expectation that small-magnitude explosions might require much higher frequencies (>10 Hz) to identify. Interestingly, the 1-ton chemical source physics explosions SPE2 and SPE3 appear to discriminate well from background earthquakes in the 6-8 Hz frequency band, where P and S signals are visible at the NVAR array located near Mina, NV, about 200 km away. NVAR is a primary seismic station in the International Monitoring System (IMS), part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The NVAR broadband element NV31 is co-located with the LLNL station MNV, which recorded many NTS nuclear tests, allowing the comparison. We find that the small SPE explosions in granite have Pn/Lg values at 6-8 Hz similar to those of past nuclear tests conducted mainly in softer rocks. We are currently examining a number of other stations in addition to NVAR, including the dedicated SPE stations that recorded the SPE explosions at much closer distances with very high sample rates, in order to better understand the observed frequency dependence as compared with the model predictions. We plan to use these observations to improve our explosion models and our ability to understand and predict where P/S methods of identifying explosions work, and to identify any circumstances where they may not. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
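A P/S discriminant of the kind described reduces, at its simplest, to a ratio of band-limited amplitudes measured in P and S (e.g. Pn and Lg) windows. A sketch using an FFT brick-wall band filter (window selection, band edges and function names are our assumptions; operational discriminants apply calibrated path and station corrections):

```python
import numpy as np

def band_rms(x, fs, f1, f2):
    """RMS amplitude of x restricted to the band [f1, f2] Hz."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(f < f1) | (f > f2)] = 0.0
    return float(np.sqrt(np.mean(np.fft.irfft(X, len(x)) ** 2)))

def p_s_ratio(p_window, s_window, fs, f1=6.0, f2=8.0):
    """High-frequency P/S amplitude ratio; explosions tend to show
    larger values than earthquakes at these frequencies."""
    return band_rms(p_window, fs, f1, f2) / band_rms(s_window, fs, f1, f2)
```

The 6-8 Hz default mirrors the band in which SPE2 and SPE3 were reported to discriminate well.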

  7. Seismological analysis of the fourth North Korean nuclear test

    NASA Astrophysics Data System (ADS)

    Hartmann, Gernot; Gestermann, Nicolai; Ceranna, Lars

    2016-04-01

    The Democratic People's Republic of Korea conducted its fourth underground nuclear explosion on 6 January 2016 at 01:30 (UTC). The explosion was clearly detected and located by the seismic network of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Additional seismic stations of international earthquake monitoring networks at regional distances, which are not part of the IMS, were used to precisely estimate the epicenter of the event in North Hamgyong province (41.38°N / 129.05°E). It lies in the area of the North Korean Punggye-ri nuclear test site, where the verified nuclear tests of 2006, 2009, and 2013 were conducted as well. The analysis of the recorded seismic signals provides evidence that the event originated from an explosive source; both the amplitudes and the spectral characteristics of the signals were examined. Furthermore, the similarity of the signals to those of the three former nuclear tests suggests a very similar source type. The seismograms at the 8,200 km distant IMS station GERES in Germany, for example, show the same P-phase signal for all four explosions, differing only in amplitude. Comparison of the measured amplitudes shows increasing magnitude over the sequence of explosions, from 2006 (mb 4.2) and 2009 (mb 4.8) to 2013 (mb 5.1), whereas the 2016 explosion had approximately the same magnitude as that of three years before. Derived from the magnitude, a yield of 14 kt TNT equivalent was estimated for both the 2013 and 2016 explosions; the 2006 and 2009 yields were 0.7 kt and 5.4 kt, respectively. However, a large inherent uncertainty in these values has to be taken into account: the estimation of the absolute yield depends very much on the local geological situation and the degree of decoupling of the explosive from the surrounding rock. Because this information is missing, reliable magnitude-yield estimation for the North Korean test site proves difficult. Direct evidence for the nuclear character of the explosion can only be found if radioactive fission products of the explosion are released into the atmosphere and detected. The corresponding analysis by atmospheric transport modelling is presented in the poster by O. Ross and L. Ceranna, which assesses the detection prospects of the IMS radionuclide stations.
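Magnitude-yield conversions of the kind used above are typically of the form mb = a + b*log10(Y) for a well-coupled explosion in hard rock. The constants below (a ≈ 4.3, b ≈ 0.7) are not from the abstract; they are chosen here so that the formula roughly reproduces the quoted magnitude-yield pairs, purely to illustrate the functional form and its sensitivity:

```python
import math

# Illustrative constants only: fitted to roughly match the quoted
# (mb, yield) pairs for a well-coupled explosion. Real studies use
# site-calibrated relations with large stated uncertainties.
A, B = 4.3, 0.7

def yield_kt(mb):
    """Invert mb = A + B*log10(Y) for the yield Y in kilotons."""
    return 10.0 ** ((mb - A) / B)

for mb in (4.2, 4.8, 5.1):
    print(f"mb {mb}: ~{yield_kt(mb):.1f} kt")
```

Because the yield enters logarithmically, a small shift in either constant (e.g. from unknown coupling or decoupling) changes the inferred yield by a large factor, which is exactly the uncertainty the abstract emphasizes.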

  8. International Monitoring System Correlation Detection at the North Korean Nuclear Test Site at Punggye-ri with Insights from the Source Physics Experiment

    DOE PAGES

    Ford, Sean R.; Walter, William R.

    2015-05-06

    Seismic waveform correlation offers the prospect of greatly reducing event detection thresholds when compared with more conventional processing methods. Correlation is applicable for seismic events that in some sense repeat, that is, they have very similar waveforms. A number of recent studies have shown that correlated seismic signals may form a significant fraction of seismicity at regional distances. For the particular case of multiple nuclear explosions at the same test site, regional-distance correlation also allows very precise relative location measurements and could offer the potential to lower thresholds when multiple events exist. Using the Comprehensive Nuclear-Test-Ban Treaty (CTBT) International Monitoring System (IMS) seismic array at Matsushiro, Japan (MJAR), Gibbons and Ringdal (2012) were able to create a multichannel correlation detector with a very low false alarm rate and a threshold below magnitude 3.0. They did this using the 2006 or 2009 Democratic People’s Republic of Korea (DPRK) nuclear explosion as a template to search through a data stream from the same station to find a match via waveform correlation. In this paper, we extend the work of Gibbons and Ringdal (2012) and measure the correlation detection threshold at several other IMS arrays. We use this to address three main points. First, we show the IMS array station at Mina, Nevada (NVAR), which is closest to the Nevada National Security Site (NNSS), is able to detect a chemical explosion well under 1 ton with the right template. Second, we examine the two IMS arrays closest to the North Korean (DPRK) test site (at Ussuriysk, Russian Federation [USRK] and Wonju, Republic of Korea [KSRS]) to show that similarly low thresholds are possible when the right templates exist. We also extend the work of Schaff et al. (2012) and measure the correlation detection threshold at the nearest Global Seismic Network (GSN) three-component station (MDJ) at Mudanjiang, Heilongjiang Province, China, from the New China Digital Seismograph Network (IC). To conclude, we use these results to explore the recent claim by Zhang and Wen (2015) that the DPRK conducted “…a low-yield nuclear test…” on 12 May 2010.

  9. On-Site inspections as a tool for nuclear explosion monitoring in the framework of the Comprehensive Nuclear Test Ban Treaty

    NASA Astrophysics Data System (ADS)

    Arndt, R.; Gaya-Pique, L.; Labak, P.; Tanaka, J.

    2009-04-01

    On-site inspections (OSIs) constitute the final verification measure under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). OSIs are launched to establish whether or not a nuclear explosion has been carried out, and thus to verify States' compliance with the Treaty. During such an inspection, facts are gathered within a limited investigation area of 1000 km2 to identify possible violators of the Treaty. The timescale, referring both to the preparation of the inspection and to the conduct of the OSI itself, is one of the challenges an inspection team has to face. Other challenges are the size of the team, which is limited to 40 inspectors, and the limitations imposed by the Treaty on the use of permitted techniques. The Integrated Field Exercise 2008 (IFE08), recently conducted in Kazakhstan, was the first large-scale, as well as the most comprehensive, on-site inspection exercise ever conducted by the Preparatory Commission of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The exercise took place in a deserted area southeast of Kurchatov, within the former Soviet Union's Semipalatinsk nuclear test site. In this paper we provide an overview of the technical activities conducted by the inspection team during IFE08 in order to collect evidence of a hypothetical nuclear test explosion. The techniques applied can be grouped into four blocks: visual observation (to look for man-made changes in the geomorphology as well as anthropogenic features related to an underground nuclear explosion, UNE); passive seismic monitoring (to identify possible aftershocks triggered by the UNE); radionuclide measurements (to collect evidence of radionuclide isotopes related to a nuclear explosion); and geophysical surveys (to identify geophysical signatures of a UNE in terms of changes to the geological strata and the hydrogeological regime, and of the shallow remains of the infrastructure deployed during the preparation and monitoring of the test). The data collected during IFE08, together with data from previous exercises, lay the foundations of an invaluable database to be used by the CTBTO for a better understanding of the phenomenology of a nuclear explosion.

  10. Improvement of IDC/CTBTO Event Locations in Latin America and the Caribbean Using a Regional Seismic Travel Time Model

    NASA Astrophysics Data System (ADS)

    Given, J. W.; Guendel, F.

    2013-05-01

    The International Data Centre (IDC) is a vital element of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification mechanism. The fundamental mission of the IDC is to collect, process, and analyze monitoring data and to present the results as event bulletins to Member States. For the IDC, and in particular for the waveform technologies, a key measure of the quality of its products is the accuracy with which every detected event is located. Accurate event location is crucial for purposes of an On-Site Inspection (OSI), which would confirm the conduct of a nuclear test. It is therefore important for IDC monitoring and data analysis to adopt new processing algorithms that improve the accuracy of event location. Among these, new algorithms that compute regional seismic travel times through 3-dimensional models have greatly increased the IDC's location precision and reduced computational time, allowing forward and inverse modeling of large data sets. One such algorithm is the Regional Seismic Travel Time (RSTT) model of Myers et al. (2011). The RSTT model is nominally a global model; however, it currently covers only North America and Eurasia in sufficient detail. It is the intention of the CTBTO's Provisional Technical Secretariat and the IDC to extend the RSTT model to other regions of the earth, e.g. Latin America and the Caribbean, Africa and Asia. This is particularly important for the IDC location procedure, as there are regions of the earth for which crustal models are not well constrained. For this purpose the IDC has launched an RSTT initiative. In May 2012, a technical meeting was held in Vienna under the auspices of the CTBTO. The purpose of this meeting was to invite National Data Centre experts as well as network operators from Africa, Europe, the Middle East, Asia, Australia, Latin America and North America to discuss the context under which a project to extend the RSTT model would be implemented. A total of 41 participants from 32 Member States were present. The Latin America and Caribbean region, with its rapidly expanding group of advanced seismic networks, was selected as a pilot for the implementation of a regional RSTT initiative. This poster describes the actions taken by the IDC to advance the RSTT project in the Latin America and Caribbean region.
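The kind of regional travel time RSTT predicts can be illustrated with the textbook head-wave (Pn) formula for a single-layer crust over a mantle half-space; RSTT generalizes this to a laterally varying layered model. The crustal thickness and velocities below are generic illustrative values, not RSTT parameters:

```python
import math

def pn_travel_time(x_km, h_km=35.0, v_crust=6.0, v_mantle=8.0):
    """Head-wave (Pn) travel time in seconds at epicentral distance
    x_km, for a surface source and receiver over a homogeneous crust
    of thickness h_km above a faster mantle half-space."""
    # head waves only exist beyond the critical crossover distance
    if x_km < 2.0 * h_km * math.tan(math.asin(v_crust / v_mantle)):
        raise ValueError("distance too short for a Pn head wave")
    return x_km / v_mantle + 2.0 * h_km * math.sqrt(
        1.0 / v_crust ** 2 - 1.0 / v_mantle ** 2)
```

Where the crustal model is poorly constrained, errors in h_km and the velocities map directly into travel-time and hence location errors, which is why extending a calibrated model to new regions matters.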

  11. Three-dimensional parabolic equation models of the acoustic coverage of the CTBT hydrophone station at Crozet

    NASA Astrophysics Data System (ADS)

    Zampolli, Mario; Haralabus, Georgios; Prior, Mark K.; Heaney, Kevin D.; Campbell, Richard

    2014-05-01

    Hydrophone stations of the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) International Monitoring System (IMS), with the exception of one in Australia, comprise two triplets of submerged moored hydrophones, one north and one south of the island from which the respective system is deployed. Triplet distances vary between approximately 50 and 100 km from the island, with each triplet connected to the receiving shore equipment by fibre-optic submarine data cables. Once deployed, the systems relay underwater acoustic waveforms in the band 1-100 Hz in real time to Vienna via a shore-based satellite link. The design life of hydroacoustic stations is at least 20 years, with no maintenance of the underwater system. The re-establishment of hydrophone monitoring station HA04 at Crozet (French Southern and Antarctic Territories) in the South-Western Indian Ocean is currently being investigated. In order to determine appropriate locations and depths for the installation of the hydrophones, a number of constraints need to be taken into account and balanced against each other. The most important of these are (i) hydrophone depth in a region where the sound-speed profile is mostly upward refracting and the Sound Fixing and Ranging (SOFAR) channel is not well defined, (ii) a safe distance from the surface currents which occupy the first few hundred meters of the water column, (iii) seabed slopes that enable the safe deployment of the hydrophone mooring bases, (iv) avoidance of regions of high internal tide activity, (v) choice of locations that optimize basin and cross-basin scale acoustic coverage of each triplet, and (vi) redundancy considerations, so that one triplet can partially cover for the other in case of necessity. A state-of-the-art three-dimensional (3-D) parabolic equation acoustic propagation model was used to model the propagation for a number of potential triplet locations. Criteria for short-listing candidate triplet locations were based on acoustic coverage towards the north and south, as well as overall acoustic coverage, taking into account different scales of source strength. An increase in predicted area coverage compared with predictions based on 2-D modelling was observed and attributed to diffraction around sharp localized features such as islands or seamounts.

  12. T-phase and tsunami signals recorded by IMS hydrophone triplets during the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Matsumoto, H.; Haralabus, G.; Zampolli, M.; Ozel, N. M.; Yamada, T.; Mark, P. K.

    2016-12-01

    A hydrophone station of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) is used to estimate the back-azimuth of T-phase signals generated by the 2011 Tohoku earthquake. Among the 6 IMS hydrophone stations required by the Treaty, 5 stations consist of two triplets, with the exception of HA1 (Australia), which has only one. The hydrophones of each triplet are suspended in the SOFAR channel and arranged to form an equilateral triangle with each side being approximately two kilometers long. The waveforms from the Tohoku earthquake were received at HA11, located on Wake Island, which is located approximately 3100 km south-east of the earthquake epicenter. The frequency range used in the array analysis was chosen to be less than 0.375 Hz, which assumed the target phase velocity to be 1.5 km/s for T-phases. The T-phase signals that originated from the seismic source however show peaks in the frequency band above one Hz. As a result of the inter-element distances of 2 km, spatial aliasing is observed in the frequency-wavenumber analysis (F-K analysis) if the entire 100 Hz bandwidth of the hydrophones is used. This spatial aliasing is significant because the distance between hydrophones in the triplet is large in comparison to the ratio between the phase velocity of T-phase signals and the frequency. To circumvent this spatial aliasing problem, a three-step processing technique used in seismic array analysis is applied: (1) high-pass filtering above 1 Hz to retrieve the T-phase, followed by (2) extraction of the envelope of this signal to highlight the T-phase contribution, and finally (3) low-pass filtering of the envelope below 0.375 Hz. The F-K analysis provides accurate back-azimuth and slowness estimations without spatial aliasing. 
Deconvolved waveforms are also processed to retrieve tsunami components by using a three-pole model of the frequency-amplitude-phase (FAP) response below 0.1 Hz and the measured sensor response for higher frequencies. It is also shown that short-period pressure fluctuations recorded by the IMS hydrophones correspond to theoretical dispersion curves of tsunamis. Thus, short-period dispersive tsunami signals can be identified by the IMS hydrophone triplets.
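The three-step processing described above can be sketched as follows; only the 1 Hz and 0.375 Hz corner frequencies come from the abstract, while the filter orders and the scipy-based implementation are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def tphase_envelope(trace, fs, hp_corner=1.0, lp_corner=0.375):
    """Three-step pre-processing of a hydrophone record before F-K analysis."""
    # 1) high-pass to isolate the T-phase energy above 1 Hz
    b, a = butter(4, hp_corner / (fs / 2), btype="high")
    hp = filtfilt(b, a, trace)
    # 2) envelope of the analytic signal highlights the T-phase packet
    env = np.abs(hilbert(hp))
    # 3) low-pass the envelope below the spatial-aliasing limit
    b, a = butter(4, lp_corner / (fs / 2), btype="low")
    return filtfilt(b, a, env)
```

The point of the envelope step is that it moves the arrival-time information of the high-frequency T-phase packet into the low-frequency band where the 2 km triplet spacing no longer aliases.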

  13. Results of Infrasound Interferometry in the Netherlands

    NASA Astrophysics Data System (ADS)

    Fricke, J. T.; Ruigrok, E. N.; Evers, L. G.; Simons, D. G.; Wapenaar, K.

    2012-04-01

    The travel time of infrasound through the atmosphere depends on temperature and wind. These atmospheric conditions can be estimated by measuring travel times between different receivers (microbarometers). Such an estimation requires an inverse model of infrasound propagation through the atmosphere; as a first step it is useful to build a forward model. The inputs of our ray-tracing model are the atmospheric conditions and the positions of source and receiver. The model consists of three elements: the source, the channel and the receiver. The source is a blast wave or microbaroms. The channel is the atmosphere; it accounts for the travel time along each eigenray, the attenuation of the different atmospheric layers, the spreading of the rays and the influence of caustics. Each receiver is reached by different rays (eigenrays), and determining these eigenrays is part of the receiver element. As output the model generates synthetic barograms. The synthetic barograms can be used to explain measured barograms and to evaluate the determination of the travel time. An accurate travel time is essential as input for the inverse model, since small changes in travel time lead to large changes in the output (temperature and wind). The travel time between two receivers is determined by cross-correlating the barograms of these two receivers. This technique has already been applied successfully in the troposphere (Haney, 2009). We show that the same can be achieved with more complicated stratospheric phases. We then compare the cross-correlation of synthetic barograms with the cross-correlation of measured barograms. These barograms are measured with the 'Large Aperture Infrasound Array' (LAIA). LAIA is being installed by the Royal Netherlands Meteorological Institute (KNMI) in the framework of the radio-astronomical 'Low Frequency Array' (LOFAR) initiative. 
LAIA will consist of thirty microbarometers with an aperture of around 100 km. The in-house developed microbarometers are able to measure infrasound up to a period of 1000 seconds, which is in the acoustic-gravity wave regime. The results will also be directly applicable to the verification of the 'Comprehensive Nuclear-Test-Ban Treaty' (CTBT), where uncertainties in the atmospheric propagation of infrasound play a dominant role. This research is made possible by the support of the 'Netherlands Organisation for Scientific Research' (NWO). Haney, M., 2009. Infrasonic ambient noise interferometry from correlations of microbaroms, Geophysical Research Letters, 36, L19808, doi:10.1029/2009GL040179
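In its simplest form, the travel-time measurement by cross-correlation reduces to finding the lag that maximizes the correlation of two pressure records. This minimal sketch (the function name and the noise-free setting are illustrative) shows the principle:

```python
import numpy as np

def lag_seconds(rec_a, rec_b, fs):
    """Estimate the inter-receiver travel time as the lag (in seconds)
    that maximizes the cross-correlation of two barogram records.
    A positive result means rec_a is a delayed copy of rec_b."""
    xc = np.correlate(rec_a, rec_b, mode="full")
    lag = np.argmax(xc) - (len(rec_b) - 1)
    return lag / fs
```

In practice the records are noisy and the correlation peak is broadened by dispersion, so the lag is usually refined by sub-sample interpolation around the maximum.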

  14. Seismicity of the North Atlantic as measured by the International Data Centre using waveform cross correlation

    NASA Astrophysics Data System (ADS)

    Given, J. W.; Bobrov, D.; Kitov, I. O.; Spiliopoulos, S.

    2012-12-01

    The Technical Secretariat (TS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) will carry out verification of the CTBT, which obligates each State Party not to carry out any nuclear explosion, regardless of its size and purpose. The International Data Centre (IDC) receives, collects, processes, analyses, reports on and archives data from the International Monitoring System (IMS). The IDC is responsible for automatic and interactive processing of the IMS data and for standard IDC products. The IDC is also required by the Treaty to progressively enhance its technical capabilities. In this study, we use waveform cross correlation as a technique to improve the detection capability and reliability of the seismic part of the IMS. To quantitatively estimate the gain cross correlation provides over the current sensitivity of automatic and interactive processing, we compared seismic bulletins built for the North Atlantic (NA), an isolated region with earthquakes concentrated around the Mid-Atlantic Ridge. This avoids the influence of adjacent seismic regions on the final bulletins: the Reviewed Event Bulletin (REB) issued by the International Data Centre and the cross correlation Standard Event List (XSEL). We have cross correlated waveforms from ~1500 events reported in the REB since 2009. The resulting cross correlation matrix revealed the best candidates for master events. High-quality signals (SNR>5.0) recorded at eighteen array stations from approximately 50 master events evenly distributed over the seismically active zone in the NA were selected as templates. These templates have been used for continuous calculation of cross correlation coefficients since 2011. All detections obtained by cross correlation are then used to build events according to the current IDC definition, i.e. at least three primary stations with accurate arrival times, azimuth and slowness estimates. The qualified event hypotheses populate the XSEL. 
    To confirm the XSEL events not found in the REB, a portion of the newly built events was reviewed interactively by experienced analysts. The influence of all defining parameters (cross correlation coefficient threshold and SNR, F-statistics and f-k analysis, azimuth and slowness estimates, relative magnitude, etc.) on the final XSEL has been studied using the relevant frequency distributions for all detections versus only those associated with XSEL events. These distributions are also station and master dependent. This allows estimating the thresholds for all defining parameters, which may be adjusted to balance the rate of missed events and false alarms.
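The core of the master-event technique is sliding a template along continuous station data and thresholding the normalized correlation coefficient. This is a minimal sketch of that step; the threshold value and function names are illustrative, not IDC parameters:

```python
import numpy as np

def cc_detections(stream, template, threshold=0.7):
    """Return (sample_offset, cc) pairs where the normalized correlation
    coefficient between a master-event template and the continuous data
    exceeds the detection threshold."""
    n = len(template)
    tpl = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(stream) - n + 1):
        win = stream[i:i + n]
        sd = win.std()
        if sd == 0:          # skip flat (dead) segments
            continue
        cc = np.dot((win - win.mean()) / sd, tpl) / n
        if cc > threshold:
            hits.append((i, cc))
    return hits
```

An operational implementation would compute this in the frequency domain for speed and combine detections across stations before forming event hypotheses, but the discriminating quantity is the same coefficient.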

  15. Identification of mine collapses, explosions and earthquakes using INSAR: a preliminary investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foxall, B; Sweeney, J J; Walter, W R

    1998-07-07

    Interferograms constructed from satellite-borne synthetic aperture radar images are capable of mapping sub-cm ground surface deformation over areas on the order of 100 x 100 km with a spatial resolution on the order of 10 meters. We investigate the utility of synthetic aperture radar interferometry (InSAR), used in conjunction with regional seismic methods, for detecting and discriminating different types of seismic events in the context of special event analysis for the CTBT. For this initial study, we carried out elastic dislocation modeling of underground explosions, mine collapses and small (M<5.5) shallow earthquakes to produce synthetic interferograms, and then analyzed satellite radar data for a large mine collapse. The synthetic modeling shows that, for a given magnitude, each type of event produces a distinctive pattern of ground deformation that can be recognized in, and recovered from, the corresponding interferogram. These diagnostic characteristics include not only differences in the polarities of surface displacements but also differences in displacement amplitudes from the different sources. The technique is especially sensitive to source depth, a parameter that is crucial in discriminating earthquakes from the other event types but is often very poorly constrained by regional seismic data alone. The ERS radar data analyzed are from an ML 5.2 seismic event that occurred in southwestern Wyoming on February 3, 1995. Although seismic data from the event have some characteristics of an underground explosion, based on seismological and geodetic data it has been identified as being caused by a large underground collapse in the Solvay Mine. Several pairs of before-collapse and after-collapse radar images were phase processed to obtain interferograms. The minimum time separation for a before-collapse and after-collapse pair was 548 days. 
    Even with this long time separation, phase coherence between the image pairs was acceptable and a deformation map was successfully obtained. Two images, separated by 1 day and acquired after the mine collapse, were used to form a digital elevation map (DEM) that was used to correct for topography. The interferograms identify the large deformation at the Solvay Mine as well as some areas of lesser deformation near other mines in the area. The large deformation at the Solvay Mine was identified but, as predicted by our dislocation modeling, could not be quantified absolutely because of the incoherent interference pattern it produced.
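For reference, the conversion from unwrapped interferometric phase to line-of-sight displacement that underlies such deformation maps can be sketched as follows; the sign convention and the quoted ERS C-band wavelength are assumptions:

```python
import numpy as np

# ERS C-band wavelength in metres (approximate).
ERS_WAVELENGTH = 0.0566

def los_displacement(unwrapped_phase, wavelength=ERS_WAVELENGTH):
    """Convert unwrapped interferometric phase (radians) to line-of-sight
    displacement in metres. The round trip of the radar signal means one
    2*pi fringe corresponds to half a wavelength of ground motion."""
    return unwrapped_phase * wavelength / (4.0 * np.pi)
```

One full fringe therefore maps to about 2.8 cm of line-of-sight motion for ERS, which is why interferograms resolve sub-centimeter deformation once the phase is unwrapped and topography is removed.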

  16. U.S. Nuclear Weapons Modernization - the Stockpile Life Extension Program

    NASA Astrophysics Data System (ADS)

    Cook, Donald

    2016-03-01

    Underground nuclear testing of U.S. nuclear weapons was halted by President George H.W. Bush in 1992 when he announced a moratorium. In 1993, the moratorium was extended by President Bill Clinton and, in 1995, a program of Stockpile Stewardship was put in its place. In 1996, President Clinton signed the Comprehensive Nuclear Test Ban Treaty (CTBT). Twenty years have passed since then. Over the same time, the average age of a nuclear weapon in the stockpile has increased from 6 years (1992) to nearly 29 years (2015). At its inception, achievement of the objectives of the Stockpile Stewardship Program (SSP) appeared possible but very difficult. The cost to design and construct several large facilities for precision experimentation in hydrodynamics and high energy density physics was large. The practical steps needed to move from computational platforms of less than 100 megaflops to 10 teraflops and beyond were unknown. Today, most of the required facilities for SSP are in place and computational speed has been increased by more than six orders of magnitude. These facilities, and the physicists and engineers in the complex of labs and plants within the National Nuclear Security Administration (NNSA) who put them in place, have been the basis for an annual decision, made by the weapons lab directors for each of the past 20 years, that resort to underground nuclear testing is not needed for maintaining confidence in the safety and reliability of the U.S. stockpile. A key part of that decision has been annual assessment of the physical changes in stockpiled weapons. These weapons, quite simply, are systems that invariably and unstoppably age in the internal weapon environment of radioactive materials and complex interfaces of highly dissimilar organic and inorganic materials. 
Without an ongoing program to rebuild some components and replace other components to increase safety or security, i.e., life extending these weapons, either underground testing would again be required to assess many changes at once, or confidence in these weapons would be reduced. The strategy and details of the U.S. Stockpile Life Extension Program will be described in this talk. In brief, the strategy is to reduce the number of weapons in the stockpile while increasing confidence in the weapons that remain and, where possible, increase their safety, increase their security, and reduce their nuclear material quantities and yields. A number of ``myths'' pertaining to nuclear weapons, the SSP, and the Stockpile Life Extension Program will be explored.

  17. Regional Seismic Methods of Identifying Explosions

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Hauk, T. F.

    2013-12-01

    A lesson from the 2006, 2009 and 2013 DPRK declared nuclear explosion Ms:mb observations is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, we need to put our empirical methods on a firmer physical footing. Here we review two of the main identification methods: 1) P/S ratios and 2) moment tensor techniques, which can be applied at regional distances (200-1600 km) to very small events, improving nuclear explosion monitoring and confidence in verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Amplitude ratios of seismic P-to-S waves at sufficiently high frequencies (~>2 Hz) can identify explosions among a background of natural earthquakes (e.g. Walter et al., 1995). However, the physical basis for the generation of explosion S-waves, and therefore the predictability of this P/S technique as a function of event properties such as size, depth, geology and path, remains incompletely understood. Calculated intermediate-period (10-100 s) waveforms from regional 1-D models can match data and provide moment tensor results that separate explosions from earthquakes and cavity collapses (e.g. Ford et al. 2009). However, it has long been observed that some nuclear tests produce large Love waves and reversed Rayleigh waves that complicate moment tensor modeling. Again, the physical basis for the generation of these effects from explosions remains incompletely understood. We are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site, NNSS). Newer relative amplitude techniques can be employed to better quantify differences between explosions and to understand those differences in terms of depth, media and other properties. 
We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Our goal is to improve our explosion models and our ability to understand and predict where P/S and moment tensor methods of identifying explosions work, and any circumstances where they may not. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
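The P/S discriminant can be sketched as a band-passed amplitude ratio between a P-wave window and an S-wave window; only the >2 Hz requirement comes from the abstract, while the 2-8 Hz corners, the window convention and the filter order are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def p_s_ratio(trace, fs, p_window, s_window, band=(2.0, 8.0)):
    """Band-pass a seismogram and return the ratio of peak P amplitude to
    peak S amplitude. Windows are (start_s, end_s) tuples in seconds.
    Explosions tend to give larger ratios than earthquakes at >2 Hz."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filt = filtfilt(b, a, trace)

    def peak(w):
        i0, i1 = int(w[0] * fs), int(w[1] * fs)
        return np.abs(filt[i0:i1]).max()

    return peak(p_window) / peak(s_window)
```

Operationally, the windows are set from predicted regional phase arrival times (Pn/Pg versus Sn/Lg), and ratios are calibrated per station and path before being used as a discriminant.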

  18. Knowledge service decision making in business incubators based on the supernetwork model

    NASA Astrophysics Data System (ADS)

    Zhao, Liming; Zhang, Haihong; Wu, Wenqing

    2017-08-01

    As valuable resources for incubating firms, knowledge resources have received gradually increasing attention from all types of business incubators, and business incubators use a variety of knowledge services to stimulate rapid growth in incubating firms. Based on previous research, we identify knowledge transfer and knowledge networking services as the two main forms of knowledge services and further divide knowledge transfer services into knowledge depth services and knowledge breadth services. Then, we construct the business incubators' knowledge supernetwork model, describe the evolution mechanism among heterogeneous agents and utilize a simulation to explore the performance variance of different business incubators' knowledge services. The simulation results show that knowledge stock increases faster when business incubators are able to provide knowledge services to more incubating firms and that the degree of discrepancy in the knowledge stock increases during the process of knowledge growth. Further, knowledge transfer services lead to greater differences in the knowledge structure, while knowledge networking services lead to smaller differences. Regarding the two types of knowledge transfer services, knowledge depth services are more conducive to knowledge growth than knowledge breadth services, but knowledge depth services lead to greater gaps in knowledge stocks and greater differences in knowledge structures. Overall, it is optimal for business incubators to select a single knowledge service or portfolio strategy based on the amount of time and energy expended on the two types of knowledge services.

  19. Content-Related Knowledge of Biology Teachers from Secondary Schools: Structure and learning opportunities

    NASA Astrophysics Data System (ADS)

    Großschedl, Jörg; Mahler, Daniela; Kleickmann, Thilo; Harms, Ute

    2014-09-01

    Teachers' content-related knowledge is a key factor influencing the learning progress of students. Different models of content-related knowledge have been proposed by educational researchers; most of them take into account three categories: content knowledge, pedagogical content knowledge, and curricular knowledge. As there is no consensus yet about the empirical separability (i.e. empirical structure) of content-related knowledge, a total of 134 biology teachers from secondary schools completed three tests intended to capture each of the three categories of content-related knowledge. The empirical structure of content-related knowledge was analyzed by Rasch analysis, which suggests content-related knowledge to be composed of (1) content knowledge, (2) pedagogical content knowledge, and (3) curricular knowledge. Pedagogical content knowledge and curricular knowledge are highly related (r_latent = .70). The latent correlations between content knowledge and pedagogical content knowledge (r_latent = .48) and between content knowledge and curricular knowledge (r_latent = .35) are moderate to low (all ps < .001). Beyond the empirical structure of content-related knowledge, different learning opportunities for teachers were investigated with regard to their relationship to content knowledge, pedagogical content knowledge, and curricular knowledge acquisition. Our results show that in-depth training in teacher education, professional development, and teacher self-study are positively related to particular categories of content-related knowledge. Furthermore, our results indicate that teaching experience is negatively related to curricular knowledge, while showing no significant relationship with content knowledge and pedagogical content knowledge.

  20. Knowledge Management: An Introduction.

    ERIC Educational Resources Information Center

    Mac Morrow, Noreen

    2001-01-01

    Discusses issues related to knowledge management and organizational knowledge. Highlights include types of knowledge; the knowledge economy; intellectual capital; knowledge and learning organizations; knowledge management strategies and processes; organizational culture; the role of technology; measuring knowledge; and the role of the information…

  1. An Analysis of Social Studies Teachers' Perception Levels Regarding Web Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Yesiltas, Erkan

    2016-01-01

    Web pedagogical content knowledge generally takes pedagogical knowledge, content knowledge, and Web knowledge as basis. It is a structure emerging through the interaction of these three components. Content knowledge refers to knowledge of subjects to be taught. Pedagogical knowledge involves knowledge of process, implementation, learning methods,…

  2. Knowledge Management: A Skeptic's Guide

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2006-01-01

    A viewgraph presentation discussing knowledge management is shown. The topics include: 1) What is Knowledge Management? 2) Why Manage Knowledge? The Presenting Problems; 3) What Gets Called Knowledge Management? 4) Attempts to Rethink Assumptions about Knowledge; 5) What is Knowledge? 6) Knowledge Management and Institutional Memory; 7) Knowledge Management and Culture; 8) To solve a social problem, it's easier to call for cultural rather than organizational change; 9) Will the Knowledge Management Effort Succeed? and 10) Backup: Metrics for Valuing Intellectual Capital, i.e. Knowledge.

  3. Embedding Open-domain Common-sense Knowledge from Text

    PubMed Central

    Goodwin, Travis; Harabagiu, Sanda

    2017-01-01

    Our ability to understand language often relies on common-sense knowledge – background information the speaker can assume is known by the reader. Similarly, our comprehension of the language used in complex domains relies on access to domain-specific knowledge. Capturing common-sense and domain-specific knowledge can be achieved by taking advantage of recent advances in open information extraction (IE) techniques and, more importantly, of knowledge embeddings, which are multi-dimensional representations of concepts and relations. Building a knowledge graph for representing common-sense knowledge, in which concepts discerned from noun phrases are cast as vertices and lexicalized relations are cast as edges, leads to learning the embeddings of common-sense knowledge accounting for semantic compositionality as well as implied knowledge. Common-sense knowledge is acquired from a vast collection of blogs and books as well as from WordNet. Similarly, medical knowledge is learned from two large sets of electronic health records. The evaluation results of these two forms of knowledge are promising: the same knowledge acquisition methodology based on learning knowledge embeddings works well both for common-sense knowledge and for medical knowledge. Interestingly, the common-sense knowledge that we have acquired was evaluated as being less neutral than the medical knowledge, as it often reflected the opinion of the knowledge utterer. In addition, the acquired medical knowledge was evaluated as more plausible than the common-sense knowledge, reflecting the complexity of acquiring common-sense knowledge due to the pragmatics and economicity of language. PMID:28649676
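Knowledge embeddings over (head, relation, tail) triples can be illustrated with a toy translational model, in which each relation is a vector that translates head entities toward tail entities. This TransE-style sketch is an illustrative stand-in, not the authors' method:

```python
import numpy as np

def transe_embeddings(triples, dim=16, epochs=200, lr=0.05, seed=0):
    """Learn toy embeddings so that head + relation ~= tail for every
    (head, relation, tail) triple, by gradient descent on the squared
    translation error."""
    rng = np.random.default_rng(seed)
    ents = sorted({h for h, _, _ in triples} | {t for _, _, t in triples})
    rels = sorted({r for _, r, _ in triples})
    E = {e: rng.normal(0, 0.1, dim) for e in ents}
    R = {r: rng.normal(0, 0.1, dim) for r in rels}
    for _ in range(epochs):
        for h, r, t in triples:
            grad = 2 * (E[h] + R[r] - E[t])  # d/dx ||h + r - t||^2
            E[h] -= lr * grad
            R[r] -= lr * grad
            E[t] += lr * grad
    return E, R
```

Real systems add negative sampling and margin losses so that implausible triples score poorly, but the geometric idea of relations as translations is the same.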

  4. The Relationship between Immediate Relevant Basic Science Knowledge and Clinical Knowledge: Physiology Knowledge and Transthoracic Echocardiography Image Interpretation

    ERIC Educational Resources Information Center

    Nielsen, Dorte Guldbrand; Gotzsche, Ole; Sonne, Ole; Eika, Berit

    2012-01-01

    Two major views on the relationship between basic science knowledge and clinical knowledge stand out: the two-world view, seeing basic science and clinical science as two separate knowledge bases, and the encapsulated knowledge view, stating that basic science knowledge plays an overt role, being encapsulated in the clinical knowledge. However, recent…

  5. Scotland's Knowledge Network: translating knowledge into action to improve quality of care.

    PubMed

    Wales, A; Graham, S; Rooney, K; Crawford, A

    2012-11-01

    The Knowledge Network (www.knowledge.scot.nhs.uk) is Scotland's online knowledge service for health and social care. It is designed to support practitioners to apply knowledge in frontline delivery of care, helping to translate knowledge into better health-care outcomes through safe, effective, person-centred care. The Knowledge Network helps to combine the worlds of evidence-based practice and quality improvement by providing access to knowledge about the effectiveness of clinical interventions ('know-what') and knowledge about how to implement this knowledge to support individual patients in working health-care environments ('know-how'). An 'evidence and guidance' search enables clinicians to quickly access quality-assured evidence and best practice, while point of care and mobile solutions provide knowledge in actionable formats to embed in clinical workflow. This research-based knowledge is complemented by social networking services and improvement tools which support the capture and exchange of knowledge from experience, facilitating practice change and systems improvement. In these cases, the Knowledge Network supports key components of the knowledge-to-action cycle--acquiring, creating, sharing and disseminating knowledge to improve performance and innovate. It provides a vehicle for implementing the recommendations of the national Knowledge into Action review, which outlines a new national approach to embedding knowledge in frontline practice and systems improvement.

  6. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  7. a Conceptual Framework for Virtual Geographic Environments Knowledge Engineering

    NASA Astrophysics Data System (ADS)

    You, Lan; Lin, Hui

    2016-06-01

    VGE geographic knowledge refers to the abstract and repeatable geo-information related to geo-science problems, geographical phenomena and geographical laws supported by VGE. It includes expert experience, evolution rules, simulation processes and prediction results in VGE. This paper proposes a conceptual framework for VGE knowledge engineering in order to effectively manage and use geographic knowledge in VGE. Our approach relies on previously well-established theories of knowledge engineering and VGE. The main contributions of this report are the following: (1) clearly defined concepts of VGE knowledge and VGE knowledge engineering; (2) features that distinguish VGE knowledge from common knowledge; (3) a geographic knowledge evolution process that helps users rapidly acquire knowledge in VGE; and (4) a conceptual framework for VGE knowledge engineering that provides the supporting methodology for building an intelligent VGE. This conceptual framework systematically describes the related VGE knowledge theories and key technologies, which will promote the rapid transformation from geodata to geographic knowledge and further reduce the gap between the data explosion and knowledge absence.

  8. The influence of prior knowledge on the retrieval-directed function of note taking in prior knowledge activation.

    PubMed

    Wetzels, Sandra A J; Kester, Liesbeth; van Merriënboer, Jeroen J G; Broers, Nick J

    2011-06-01

    Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in supporting prior knowledge activation if available prior knowledge is limited. This study investigates the effects of the retrieval-directed function of note taking depending on learners' level of prior knowledge. It is hypothesized that the effectiveness of note taking is influenced by the amount of prior knowledge learners already possess. Sixty-one high school students participated in this study. A prior knowledge test was used to ascertain differences in level of prior knowledge and assign participants to a low or a high prior knowledge group. A 2×2 factorial design was used to investigate the effects of note taking during prior knowledge activation (yes, no) depending on learners' level of prior knowledge (low, high) on mental effort, performance, and mental efficiency. Note taking during prior knowledge activation lowered mental effort and increased mental efficiency for high prior knowledge learners. For low prior knowledge learners, note taking had the opposite effect on mental effort and mental efficiency. The effects of the retrieval-directed function of note taking are influenced by learners' level of prior knowledge. Learners with high prior knowledge benefit from taking notes while activating prior knowledge, whereas note taking has no beneficial effects for learners with limited prior knowledge. ©2010 The British Psychological Society.

  9. Formalization of the engineering science discipline - knowledge engineering

    NASA Astrophysics Data System (ADS)

    Peng, Xiao

    Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books and journals. However, those media have innate weaknesses. For example, four generations of flying wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB and many others) were mostly developed in isolation. The subsequent engineers were not aware of the previous developments, because these projects were documented in a way that prevented the next generation of engineers from benefiting from the previous lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of 'knowledge' is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities still needs to be developed. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. 
    Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach to facilitating knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust and high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge proves to significantly improve the effectiveness of aerospace knowledge retention and utilization.

  10. Design of customer knowledge management system for Aglaonema Nursery in South Tangerang, Indonesia

    NASA Astrophysics Data System (ADS)

    Sugiarto, D.; Mardianto, I.; Dewayana, TS; Khadafi, M.

    2017-12-01

    The purpose of this paper is to describe the design of a customer knowledge management system to support customer relationship management activities for an aglaonema nursery in South Tangerang, Indonesia. The steps were knowledge identification (knowledge about customers, knowledge from customers, knowledge for customers), knowledge capture, codification, analysis of system requirements, and creation of use case and activity diagrams. The results showed that some key knowledge concerned supporting customers in plant care (know-how) and the types of aglaonema, including their prices (know-what). That knowledge for customers was then codified and shared on a knowledge portal website integrated with social media. Knowledge about customers concerned customers and their behaviour in purchasing aglaonema. Knowledge from customers concerned feedback, favorites and customer experience. Codified knowledge was placed and shared using a content management system based on WordPress.

  11. False Beliefs in Unreliable Knowledge Networks

    NASA Astrophysics Data System (ADS)

    Ioannidis, Evangelos; Varsakelis, Nikos; Antoniou, Ioannis

    2017-03-01

    The aims of this work are: (1) to extend knowledge dynamics analysis in order to assess the influence of false beliefs and unreliable communication channels, (2) to investigate the impact of selection rule-policy for knowledge acquisition, (3) to investigate the impact of targeted link attacks ("breaks" or "infections") of certain "healthy" communication channels. We examine the knowledge dynamics analytically, as well as by simulations on both artificial and real organizational knowledge networks. The main findings are: (1) False beliefs have no significant influence on knowledge dynamics, while unreliable communication channels result in non-monotonic knowledge updates ("wild" knowledge fluctuations may appear) and in significant elongation of knowledge attainment. Moreover, false beliefs may emerge during knowledge evolution, due to the presence of unreliable communication channels, even if they were not present initially, (2) Changing the selection rule-policy, by raising the awareness of agents to avoid the selection of unreliable communication channels, results in monotonic knowledge upgrade and in faster knowledge attainment, (3) "Infecting" links is more harmful than "breaking" links, due to "wild" knowledge fluctuations and due to the elongation of knowledge attainment. Moreover, attacking even a "small" percentage of links (≤5%) with high knowledge transfer, may result in dramatic elongation of knowledge attainment (over 100%), as well as in delays of the onset of knowledge attainment. Hence, links of high knowledge transfer should be protected, because in Information Warfare and Disinformation, these links are the "best targets".
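
    A minimal simulation sketch of the kind of dynamics described above (the update rule, the channel model and all parameters are illustrative assumptions, not the authors' actual equations):

```python
import random

def simulate(n=20, steps=400, p_bad=0.3, avoid=False, seed=7):
    """Toy model of pairwise knowledge diffusion with unreliable channels.
    An agent polls a more knowledgeable peer and adopts what arrives; a
    'bad' channel delivers a corrupted value, which can drag the receiver
    down (non-monotonic updates, possible false beliefs). The 'avoid'
    policy skips channels known to be bad."""
    rng = random.Random(seed)
    k = [rng.random() for _ in range(n)]
    bad = {(i, j): rng.random() < p_bad
           for i in range(n) for j in range(n) if i != j}
    avg_history = []
    for _ in range(steps):
        i = rng.randrange(n)
        peers = [j for j in range(n) if j != i]
        if avoid:  # awareness policy: prefer reliable channels
            peers = [j for j in peers if not bad[(i, j)]] or peers
        j = rng.choice(peers)
        if k[j] > k[i]:  # seek a more knowledgeable peer
            k[i] = rng.random() if (bad[(i, j)] and not avoid) else k[j]
        avg_history.append(sum(k) / n)
    return k, avg_history
```

    With `avoid=True` the average-knowledge trace is non-decreasing; with `avoid=False` corrupted messages can be adopted, producing the fluctuations and emergent false beliefs the abstract describes.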

  12. The effects of activating prior topic and metacognitive knowledge on text comprehension scores.

    PubMed

    Kostons, Danny; van der Werf, Greetje

    2015-09-01

    Research on prior knowledge activation has consistently shown that activating learners' prior knowledge has beneficial effects on learning. If learners activate their prior knowledge, this activated knowledge serves as a framework for establishing relationships between the knowledge they already possess and new information provided to them. Thus far, prior knowledge activation has dealt primarily with topic knowledge in specific domains. Students, however, likely also possess at least some metacognitive knowledge useful in those domains, which, when activated, should aid in the deployment of helpful strategies during reading. In this study, we investigated the effects of both prior topic knowledge activation (PTKA) and prior metacognitive knowledge activation (PMKA) on text comprehension scores. Eighty-eight students in primary education were randomly distributed amongst the conditions of the 2 × 2 (PTKA yes/no × PMKA yes/no) designed experiment. Results show that activating prior metacognitive knowledge had a beneficial effect on text comprehension, whereas activating prior topic knowledge, after correcting for the amount of prior knowledge, did not. Most studies deal with explicit instruction of metacognitive knowledge, but our results show that this may not be necessary, specifically in the case of students who already have some metacognitive knowledge. However, existing metacognitive knowledge needs to be activated in order for students to make better use of this knowledge. © 2015 The British Psychological Society.

  13. Team knowledge representation: a network perspective.

    PubMed

    Espinosa, J Alberto; Clark, Mark A

    2014-03-01

    We propose a network perspective of team knowledge that offers both conceptual and methodological advantages, expanding explanatory value through representation and measurement of component structure and content. Team knowledge has typically been conceptualized and measured with relatively simple aggregates, without fully accounting for differing knowledge configurations among team members. Teams with similar aggregate values of team knowledge may have very different team dynamics depending on how knowledge isolates, cliques, and densities are distributed across the team; which members are the most knowledgeable; who shares knowledge with whom; and how knowledge clusters are distributed. We illustrate our proposed network approach through a sample of 57 teams, including how to compute, analyze, and visually represent team knowledge. Team knowledge network structures (isolation, centrality) are associated with outcomes of, respectively, task coordination, strategy coordination, and the proportion of team knowledge cliques, all after controlling for shared team knowledge. Network analysis helps to represent, measure, and understand the relationship of team knowledge to outcomes of interest to team researchers, members, and managers. Our approach complements existing team knowledge measures. Researchers and managers can apply network concepts and measures to help understand where team knowledge is held within a team and how this relational structure may influence team coordination, cohesion, and performance.
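
    The structural measures mentioned above (knowledge isolates, centrality, density) can be computed from a who-shares-with-whom edge list; a sketch with invented team data, not the authors' instrument:

```python
def knowledge_network_metrics(edges, members):
    """Structural measures of a team knowledge-sharing network.
    `edges` holds undirected (a, b) pairs meaning a and b share knowledge."""
    neigh = {m: set() for m in members}
    for a, b in edges:
        neigh[a].add(b)
        neigh[b].add(a)
    n = len(members)
    isolates = [m for m in members if not neigh[m]]           # knowledge isolates
    centrality = {m: len(neigh[m]) / (n - 1) for m in members}  # degree centrality
    density = 2 * len(edges) / (n * (n - 1))                  # share of possible ties
    return isolates, centrality, density

# Hypothetical four-person team: dee holds knowledge no one taps.
team = ["ana", "ben", "chen", "dee"]
ties = {("ana", "ben"), ("ben", "chen")}
isolates, centrality, density = knowledge_network_metrics(ties, team)
```

    Two teams with the same aggregate knowledge can differ sharply on these measures, which is the point of the network view.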

  14. The Emerging Phenomenon of Knowledge Management.

    ERIC Educational Resources Information Center

    Broadbent, Marianne

    1997-01-01

    Clarifies the meaning of knowledge management and gives examples of organizations that overtly practice it. Outlines four steps in knowledge management: (1) making knowledge visible; (2) building knowledge intensity; (3) building knowledge infrastructure; and (4) developing a knowledge culture. Discusses managing people as assets, librarians as…

  15. 'Ethos' Enabling Organisational Knowledge Creation

    NASA Astrophysics Data System (ADS)

    Matsudaira, Yoshito

    This paper examines knowledge creation in relation to improvements on the production line in the manufacturing department of Nissan Motor Company, and aims to clarify the embodied knowledge observed in the actions of organisational members who enable knowledge creation. For that purpose, this study adopts an approach that adds first-, second-, and third-person viewpoints to the theory of knowledge creation. The embodied knowledge observed in the actions of organisational members who enable knowledge creation is the continued practice of 'ethos' (in Greek), founded in the Nissan Production Way as an ethical basis. Ethos is an intangible knowledge asset for knowledge-creating companies. Substantiated analysis classifies ethos into three categories: the individual, the team and the organisation. This indicates the precise actions of the organisational members in each category during the knowledge creation process. This research succeeds in showing the indispensability of ethos, a new concept of knowledge assets that enables knowledge creation, for future knowledge-based management in the knowledge society.

  16. A network model of knowledge accumulation through diffusion and upgrade

    NASA Astrophysics Data System (ADS)

    Zhuang, Enyu; Chen, Guanrong; Feng, Gang

    2011-07-01

    In this paper, we introduce a model to describe knowledge accumulation through knowledge diffusion and knowledge upgrade in a multi-agent network. Here, knowledge diffusion refers to the distribution of existing knowledge in the network, while knowledge upgrade means the discovery of new knowledge. It is found that the population of the network and the number of each agent’s neighbors affect the speed of knowledge accumulation. Four different policies for updating the neighboring agents are thus proposed, and their influence on the speed of knowledge accumulation and the topology evolution of the network are also studied.
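
    The two mechanisms can be sketched with assumed update rules (the paper's exact equations may differ): diffusion moves an agent part-way toward a more knowledgeable neighbor, and upgrade lets the most knowledgeable agent discover new knowledge:

```python
import random

def step(k, neighbors, diffuse=0.5, upgrade=0.05, rng=random):
    """One round of the model sketched here: each agent may copy part of a
    randomly chosen neighbor's surplus knowledge (diffusion), then the most
    knowledgeable agent discovers new knowledge (upgrade)."""
    new = list(k)
    for i, nbrs in enumerate(neighbors):
        if nbrs:
            j = rng.choice(nbrs)
            if k[j] > k[i]:                 # diffusion of existing knowledge
                new[i] = k[i] + diffuse * (k[j] - k[i])
    top = max(range(len(new)), key=new.__getitem__)
    new[top] += upgrade                     # upgrade: discovery of new knowledge
    return new
```

    Iterating `step` shows how population size and neighborhood size shape accumulation speed, and the neighbor lists can be rewired between rounds to mimic different neighbor-update policies.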

  17. A Survey of Knowledge Management Research & Development at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This chapter catalogs knowledge management research and development activities at NASA Ames Research Center as of April 2002. A general categorization scheme for knowledge management systems is first introduced. This categorization scheme divides knowledge management capabilities into five broad categories: knowledge capture, knowledge preservation, knowledge augmentation, knowledge dissemination, and knowledge infrastructure. Each of nearly 30 knowledge management systems developed at Ames is then classified according to this system. Finally, a capsule description of each system is presented along with information on deployment status, funding sources, contact information, and both published and internet-based references.

  18. The Roles of Knowledge Professionals for Knowledge Management.

    ERIC Educational Resources Information Center

    Kim, Seonghee

    This paper starts by exploring the definition of knowledge and knowledge management; examples of acquisition, creation, packaging, application, and reuse of knowledge are provided. It then considers the partnership for knowledge management and especially how librarians as knowledge professionals, users, and technology experts can contribute to…

  19. Domain knowledge patterns in pedagogical diagnostics

    NASA Astrophysics Data System (ADS)

    Miarka, Rostislav

    2017-07-01

    This paper proposes a representation of knowledge patterns in the RDF(S) language. Knowledge patterns are used for the reuse of knowledge. They can be divided into two groups: top-level knowledge patterns and domain knowledge patterns. Pedagogical diagnostics is aimed at testing the knowledge of students at primary and secondary schools. An example of a domain knowledge pattern from pedagogical diagnostics is part of this paper.
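
    To illustrate, a domain knowledge pattern of this kind can be written as RDF(S)-style triples in plain Python (the class names are invented for illustration, not taken from the paper); the subclass closure shows the inference that lets a top-level pattern be reused by more specific domain patterns:

```python
# RDF(S)-style triples for a hypothetical pedagogical-diagnostics pattern.
triples = {
    ("Question", "rdfs:subClassOf", "TestItem"),
    ("MultipleChoiceQuestion", "rdfs:subClassOf", "Question"),
    ("q1", "rdf:type", "MultipleChoiceQuestion"),
    ("q1", "assessesKnowledgeOf", "Fractions"),
}

def superclasses(cls, triples):
    """Transitive closure over rdfs:subClassOf."""
    out, frontier = set(), {cls}
    while frontier:
        nxt = {o for (s, p, o) in triples
               if p == "rdfs:subClassOf" and s in frontier}
        frontier = nxt - out
        out |= nxt
    return out
```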

  20. Examining the development of knowledge for teaching a novel introductory physics curriculum

    NASA Astrophysics Data System (ADS)

    Seung, Eulsun

    The purpose of this study was to investigate how graduate physics teaching assistants (TAs) develop professional knowledge for teaching a new undergraduate introductory physics curriculum, Matter and Interactions (M&I). M&I has recently been adopted as a novel introductory physics course that focuses on the application of a small number of fundamental physical principles to the atomic and molecular nature of matter. In this study, I examined the process of five TAs' development of knowledge for implementing the M&I course, from the time they engaged in an M&I content and methods workshop through their first semester as TAs for the course. Through a qualitative, multiple case study research design, data were collected from multiple sources: non-participant observations, digitally recorded video, semi-structured interviews, TAs' written reflections, and field notes. The data were analyzed using the constant comparative method. The TAs' knowledge for teaching M&I was identified in three domains: pedagogical content knowledge, pedagogical knowledge, and subject matter knowledge. First, the three components of TAs' pedagogical content knowledge were identified: knowledge of the goals of M&I, knowledge of instructional strategies, and knowledge of students' learning. Second, pedagogical knowledge that the TAs demonstrated during the study fell predominantly into the category of classroom management and organization. The knowledge of classroom management and organization was categorized into two components: time management skills and group composition. Last, the TAs' subject matter knowledge that they developed through their M&I teaching experience was described in terms of the conceptual structure of the M&I curriculum, the new approach of the M&I curriculum, and specific topic knowledge. 
The TAs' knowledge for teaching developed from propositional knowledge to personal practical knowledge, and the process of knowledge development consisted of three phases: accepting, actualizing, and internalizing. In addition, the TAs' knowledge developed by combining various components of knowledge for teaching. Two factors that influenced the TAs' knowledge development were identified. First, the TAs' goals in the recitation class and their goals for the checkpoints in the laboratory class influenced their knowledge development. Second, various dilemmas that the TAs encountered during their teaching practice affected the TAs' knowledge development.

  1. From knowledge presentation to knowledge representation to knowledge construction: Future directions for hypermedia

    NASA Technical Reports Server (NTRS)

    Palumbo, David B.

    1990-01-01

    Relationships between human memory systems and hypermedia systems are discussed with particular emphasis on the underlying importance of associational memory. The distinctions between knowledge presentation, knowledge representation, and knowledge constructions are addressed. Issues involved in actually developing individualizable hypermedia based knowledge construction tools are presented.

  2. Distinguishing Knowledge-Sharing, Knowledge-Construction, and Knowledge-Creation Discourses

    ERIC Educational Resources Information Center

    van Aalst, Jan

    2009-01-01

    The study reported here sought to obtain the clear articulation of asynchronous computer-mediated discourse needed for Carl Bereiter and Marlene Scardamalia's knowledge-creation model. Distinctions were set up between three modes of discourse: knowledge sharing, knowledge construction, and knowledge creation. These were applied to the asynchronous…

  3. A Conceptual Framework for Examining Knowledge Management in Higher Education Contexts

    ERIC Educational Resources Information Center

    Lee, Hae-Young; Roth, Gene L.

    2009-01-01

    Knowledge management is an on-going process that involves varied activities: diagnosis, design, and implementation of knowledge creation, knowledge transfer, and knowledge sharing. The primary goal of knowledge management, like other management theories or models, is to identify and leverage organizational and individual knowledge for the…

  4. Knowledge repositories for multiple uses

    NASA Technical Reports Server (NTRS)

    Williamson, Keith; Riddle, Patricia

    1991-01-01

    In the life cycle of a complex physical device or part, for example, the docking bay door of the Space Station, there are many uses for knowledge about the device or part. The same piece of knowledge might serve several uses. Given the quantity and complexity of the knowledge that must be stored, it is critical to maintain the knowledge in one repository, in one form. At the same time, because of the quantity and complexity of knowledge that must be used in life cycle applications such as cost estimation, re-design, and diagnosis, it is critical to automate such knowledge uses. For each specific use, a knowledge base must be available and must be in a form that promotes the efficient performance of that knowledge base. However, without a single source knowledge repository, the cost of maintaining consistent knowledge between multiple knowledge bases increases dramatically; as facts and descriptions change, they must be updated in each individual knowledge base. A use-neutral representation of a hydraulic system for the F-111 aircraft was developed. The ability to derive portions of four different knowledge bases from this use-neutral representation is demonstrated: one knowledge base is for re-design of the device using a model-based reasoning problem solver; two knowledge bases, at different levels of abstraction, are for diagnosis using a model-based reasoning solver; and one knowledge base is for diagnosis using an associational reasoning problem solver. It was shown how updates issued against the single source use-neutral knowledge repository can be propagated to the underlying knowledge bases.
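
    The idea of deriving use-specific knowledge bases from one use-neutral repository can be sketched as follows (component names and fields are invented for illustration; the paper's F-111 hydraulic representation is far richer):

```python
# Single-source, use-neutral device description.
repository = {
    "pump": {"type": "hydraulic-pump", "feeds": ["valve"], "max_psi": 3000},
    "valve": {"type": "relief-valve", "feeds": ["actuator"], "max_psi": 3200},
    "actuator": {"type": "actuator", "feeds": [], "max_psi": 2800},
}

def diagnosis_view(repo):
    """Derive an associational diagnosis KB: symptom -> suspect upstream parts."""
    return {f"low pressure at {name}": [p for p, d in repo.items()
                                        if name in d["feeds"]]
            for name in repo}

def redesign_view(repo):
    """Derive a redesign KB: components whose pressure rating limits the system."""
    limit = min(d["max_psi"] for d in repo.values())
    return [n for n, d in repo.items() if d["max_psi"] == limit]
```

    When a fact in `repository` changes, re-running the view functions propagates the update to every derived knowledge base, which is the consistency argument made above.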

  5. Language knowledge and event knowledge in language use.

    PubMed

    Willits, Jon A; Amato, Michael S; MacDonald, Maryellen C

    2015-05-01

    This paper examines how semantic knowledge is used in language comprehension and in making judgments about events in the world. We contrast knowledge gleaned from prior language experience ("language knowledge") and knowledge coming from prior experience with the world ("world knowledge"). In two corpus analyses, we show that previous research linking verb aspect and event representations has confounded language and world knowledge. Then, using carefully chosen stimuli that remove this confound, we performed four experiments that manipulated the degree to which language knowledge or world knowledge should be salient and relevant to performing a task, finding in each case that participants use the type of knowledge most appropriate to the task. These results provide evidence for a highly context-sensitive and interactionist perspective on how semantic knowledge is represented and used during language processing. Copyright © 2015. Published by Elsevier Inc.

  6. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict the traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, can also express process knowledge, and at the same time has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
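
    One possible reading of the input-processing-output structure, sketched with invented names (an assumption about how such a model might be encoded, not the paper's formalism):

```python
class KnowledgeUnit:
    """A knowledge unit as an input -> processing -> output triple."""
    def __init__(self, inputs, processing, outputs):
        self.inputs = inputs          # observed state of the world
        self.processing = processing  # the change/rule the knowledge captures
        self.outputs = outputs        # resulting state

    def apply(self, state):
        """Use the unit as process knowledge: transform a matching state."""
        if all(state.get(k) == v for k, v in self.inputs.items()):
            return {**state, **self.outputs}
        return state

# Hypothetical example of process knowledge.
boil = KnowledgeUnit({"water": "cold"}, "heat to 100 C", {"water": "boiling"})
```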

  7. Issues on the use of meta-knowledge in expert systems

    NASA Technical Reports Server (NTRS)

    Facemire, Jon; Chen, Imao

    1988-01-01

    Meta-knowledge is knowledge about knowledge: knowledge that is not domain specific but is concerned instead with its own internal structure. Several past systems have used meta-knowledge to improve the nature of the user interface, to maintain the knowledge base, and to control the inference engine. More extensive use of meta-knowledge is probable in the future as larger scale problems are considered. A proposed system architecture is presented and discussed in terms of meta-knowledge applications. The principal components of this system (the user support subsystem, the control structure, the knowledge base, the inference engine, and a learning facility) are all outlined and discussed in light of the use of meta-knowledge. Problems with meta-constructs are also mentioned, but it is concluded that the use of meta-knowledge is crucial for increasingly autonomous operations.

  8. Excellence in clinical teaching: knowledge transformation and development required.

    PubMed

    Irby, David M

    2014-08-01

    Clinical teachers in medicine face the daunting task of mastering the many domains of knowledge needed for practice and teaching. The breadth and complexity of this knowledge continue to increase, as does the difficulty of transforming the knowledge into concepts that are understandable to learners. Properly targeted faculty development has the potential to expedite the knowledge transformation process for clinical teachers. Based on my own research in clinical teaching and faculty development, as well as the work of others, I describe the unique forms of clinical teacher knowledge, the transformation of that knowledge for teaching purposes and implications for faculty development. The following forms of knowledge for clinical teaching in medicine need to be mastered and transformed: (i) knowledge of medicine and patients; (ii) knowledge of context; (iii) knowledge of pedagogy and learners, and (iv) knowledge integrated into teaching scripts. This knowledge is employed and conveyed through the parallel processes of clinical reasoning and clinical instructional reasoning. Faculty development can facilitate this knowledge transformation process by: (i) examining, deconstructing and practising new teaching scripts; (ii) focusing on foundational concepts; (iii) demonstrating knowledge-in-use, and (iv) creating a supportive organisational climate for clinical teaching. To become an excellent clinical teacher in medicine requires the transformation of multiple forms of knowledge for teaching purposes. These domains of knowledge allow clinical teachers to provide tailored instruction to learners at varying levels in the context of fast-paced and demanding clinical practice. Faculty development can facilitate this knowledge transformation process. © 2014 John Wiley & Sons Ltd.

  9. Not a One-Way Street: Bidirectional Relations between Procedural and Conceptual Knowledge of Mathematics

    ERIC Educational Resources Information Center

    Rittle-Johnson, Bethany; Schneider, Michael; Star, Jon R.

    2015-01-01

    There is a long-standing and ongoing debate about the relations between conceptual and procedural knowledge (i.e., knowledge of concepts and procedures). Although there is broad consensus that conceptual knowledge supports procedural knowledge, there is controversy over whether procedural knowledge supports conceptual knowledge and how instruction…

  10. Disciplinary Knowledge Revisited: The Social Construction of Sociology

    ERIC Educational Resources Information Center

    Cole, Stephen

    2006-01-01

    In "Making Science" (1992) I make the distinction between two types of knowledge: research frontier knowledge and core knowledge. Core knowledge is the small body of knowledge that the entire scientific community treats as indisputable fact. The research frontier is all new knowledge which makes claim to being fact, but in practice there is…

  11. The Applicability of Incoherent Array Processing to IMS Seismic Array Stations

    NASA Astrophysics Data System (ADS)

    Gibbons, S. J.

    2012-04-01

    The seismic arrays of the International Monitoring System for the CTBT differ greatly in size and geometry, with apertures ranging from below 1 km to over 60 km. Large and medium aperture arrays with large inter-site spacings complicate the detection and estimation of high frequency phases since signals are often incoherent between sensors. Many such phases, typically from events at regional distances, remain undetected since pipeline algorithms often consider only frequencies low enough to allow coherent array processing. High frequency phases that are detected are frequently attributed qualitatively incorrect backazimuth and slowness estimates and are consequently not associated with the correct event hypotheses. This can lead to missed events both due to a lack of contributing phase detections and by corruption of event hypotheses by spurious detections. Continuous spectral estimation can be used for phase detection and parameter estimation on the largest aperture arrays, with phase arrivals identified as local maxima on beams of transformed spectrograms. The estimation procedure in effect measures group velocity rather than phase velocity and the ability to estimate backazimuth and slowness requires that the spatial extent of the array is large enough to resolve time-delays between envelopes with a period of approximately 4 or 5 seconds. The NOA, AKASG, YKA, WRA, and KURK arrays have apertures in excess of 20 km and spectrogram beamforming on these stations provides high quality slowness estimates for regional phases without additional post-processing. Seven arrays with aperture between 10 and 20 km (MJAR, ESDC, ILAR, KSRS, CMAR, ASAR, and EKA) can provide robust parameter estimates subject to a smoothing of the resulting slowness grids, most effectively achieved by convolving the measured slowness grids with the array response function for a 4 or 5 second period signal. 
The MJAR array in Japan recorded high SNR Pn signals for both the 2006 and 2009 North Korea nuclear tests but, due to signal incoherence, failed to contribute to the automatic event detections. It is demonstrated that the smoothed incoherent slowness estimates for the MJAR Pn phases for both tests indicate unambiguously the correct type of phase and a backazimuth estimate within 5 degrees of the great-circle backazimuth. The detection part of the algorithm is applicable to all IMS arrays, and spectrogram-based processing may offer a reduction in the false alarm rate for high frequency signals. Significantly, the local maxima of the scalar functions derived from the transformed spectrogram beams provide good estimates of the signal onset time. High frequency energy is of greater significance for lower event magnitudes and in, for example, the cavity decoupling detection evasion scenario. There is a need to characterize propagation paths with low attenuation of high frequency energy and situations in which parameter estimation on array stations fails.
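
    The core of the incoherent processing described above, delay-and-sum over smoothed envelopes with a grid search in slowness, can be sketched on synthetic data (the geometry, sample interval and envelope width below are invented for the sketch, not IMS parameters):

```python
import math

def envelope_beam_power(env, positions, sx, dt):
    """Delay-and-sum over incoherent envelopes (e.g. spectrogram-derived)
    for a candidate eastward slowness sx [s/km]; returns the peak beam value."""
    n = len(env[0])
    beam = [0.0] * n
    for x, e in zip(positions, env):
        shift = int(round(sx * x / dt))   # samples of moveout at offset x
        for t in range(n):
            src = t + shift
            if 0 <= src < n:
                beam[t] += e[src]
    return max(beam)

# Synthetic case: a ~5 s wide envelope crossing a 20 km E-W line of three
# sensors at 8 km/s apparent velocity (slowness 0.125 s/km).
dt = 0.25
positions = [-10.0, 0.0, 10.0]            # sensor offsets along E-W line, km
true_sx = 0.125
env = [[math.exp(-((t * dt - 30.0 - true_sx * x) / 5.0) ** 2)
        for t in range(200)] for x in positions]
grid = [i * 0.025 for i in range(-8, 9)]  # candidate slownesses, s/km
best_sx = max(grid, key=lambda s: envelope_beam_power(env, positions, s, dt))
```

    Because the envelopes are several seconds wide, the slowness resolution is coarse, which is why the abstract recommends smoothing the slowness grids with the array response for a 4-5 s period signal.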

  12. Power and knowledge in psychiatry and the troubling case of Dr Osheroff.

    PubMed

    Robertson, Michael

    2005-12-01

    To consider the state of knowledge in psychiatry with reference to the 'Osheroff debate' about the treatment of depression. A review of the key philosophical issues regarding the nature of knowledge, applied to the Osheroff case. There is an apparent dichotomy between knowledge derived from a reductionist scientific method, as manifest in evidence-based medicine, and that of a narrative form of knowledge derived from clinical experience. The Foucauldian notion of knowledge/power and knowledge as discourse suggests that scientific knowledge dominates over narrative knowledge in psychiatry. The implication of this applied to the Osheroff case is the potential annihilation of all forms of knowledge other than science. Knowledge in psychiatry is a pluralist, rather than singularly scientific, enterprise. In the Osheroff case, the potential for scientific knowledge to abolish other forms of knowledge posed a serious threat of weakening the profession. In light of the current debate about best practice, there is a need to reconsider the implications of Osheroff.

  13. Improved knowledge diffusion model based on the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo

    2015-06-01

    The process of absorbing knowledge has become an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge spreads from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ2(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. By analyzing different modes of knowledge diffusion, different ways of selecting the highly knowledgeable nodes, and different hypernetwork sizes and structures, we find that the diffusion speed of the IKDH model is 3.64 times that of the traditional knowledge diffusion (TKDH) model. Moreover, knowledge diffuses three times faster when "expert" nodes are selected at random than when large-hyperdegree nodes are selected as "expert" nodes. Furthermore, either a closer network structure or a smaller network size results in faster knowledge diffusion.
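
    A toy version of hyperedge-based diffusion with the V(t)/c(t) style of evaluation (the update rule and numbers are illustrative assumptions, not the IKDH equations):

```python
import statistics

def hyperedge_diffuse(stock, hyperedges, rounds=1):
    """Each round, within every hyperedge the most knowledgeable member
    spreads knowledge to the others (assumed rule: laggards rise halfway
    to that maximum)."""
    for _ in range(rounds):
        new = list(stock)
        for edge in hyperedges:
            top = max(stock[i] for i in edge)
            for i in edge:
                if stock[i] < top:
                    new[i] = max(new[i], stock[i] + 0.5 * (top - stock[i]))
        stock = new
    return stock

def variance_coefficient(stock):
    """c(t) = sigma(t) / V(t): spread of knowledge relative to the mean stock."""
    return statistics.pstdev(stock) / statistics.fmean(stock)
```

    Tracking `variance_coefficient` over rounds shows knowledge stocks homogenizing as diffusion proceeds, the quantity the abstract uses to compare diffusion policies.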

  14. Exploring knowledge exchange: a useful framework for practice and policy.

    PubMed

    Ward, Vicky; Smith, Simon; House, Allan; Hamer, Susan

    2012-02-01

    Knowledge translation is underpinned by a dynamic and social knowledge exchange process but there are few descriptions of how this unfolds in practice settings. This has hampered attempts to produce realistic and useful models to help policymakers and researchers understand how knowledge exchange works. This paper reports the results of research which investigated the nature of knowledge exchange. We aimed to understand whether dynamic and fluid definitions of knowledge exchange are valid and to produce a realistic, descriptive framework of knowledge exchange. Our research was informed by a realist approach. We embedded a knowledge broker within three service delivery teams across a mental health organisation in the UK, each of whom was grappling with specific challenges. The knowledge broker participated in the team's problem-solving process and collected observational fieldnotes. We also interviewed the team members. Observational and interview data were analysed quantitatively and qualitatively in order to determine and describe the nature of the knowledge exchange process in more detail. This enabled us to refine our conceptual framework of knowledge exchange. We found that knowledge exchange can be understood as a dynamic and fluid process which incorporates distinct forms of knowledge from multiple sources. Quantitative analysis illustrated that five broadly-defined components of knowledge exchange (problem, context, knowledge, activities, use) can all be in play at any one time and do not occur in a set order. Qualitative analysis revealed a number of distinct themes which better described the nature of knowledge exchange. By shedding light on the nature of knowledge exchange, our findings problematise some of the linear, technicist approaches to knowledge translation. 
The revised model of knowledge exchange which we propose here could therefore help to reorient thinking about knowledge exchange and act as a starting point for further exploration and evaluation of the knowledge exchange process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. On the different "worlds" of intra-organizational knowledge management: Understanding idiosyncratic variation in MNC cross-site knowledge-sharing practices.

    PubMed

    Kasper, Helmut; Lehrer, Mark; Mühlbacher, Jürgen; Müller, Barbara

    2013-02-01

    This qualitative field study investigated cross-site knowledge sharing in a small sample of multinational corporations in three different MNC business contexts (global, multidomestic, transnational). The results disclose heterogeneous "worlds" of MNC knowledge sharing, ultimately raising the question as to whether the whole concept of MNC knowledge sharing covers a sufficiently unitary phenomenon to be meaningful. We derive a non-exhaustive typology of MNC knowledge-sharing practices: self-organizing knowledge sharing, technocratic knowledge sharing, and best practice knowledge sharing. Despite its limitations, this typology helps to elucidate a number of issues, including the latent conflict between two disparate theories of MNC knowledge sharing, namely "sender-receiver" and "social learning" theories (Noorderhaven & Harzing, 2009). More generally, we develop the term "knowledge contextualization" to highlight the way that firm-specific organizational features pre-define which knowledge is considered to be of special relevance for intra-organizational sharing.

  16. The usefulness of science knowledge for parents of hearing-impaired children.

    PubMed

    Shauli, Sophie; Baram-Tsabari, Ayelet

    2018-04-01

    Hearing-impaired children's chances of integrating into hearing society largely depend on their parents, who need to learn vast amounts of science knowledge in the field of hearing. This study characterized the role played by science knowledge in the lives of nonscientists faced with science-related decisions by examining the interactions between general science knowledge, contextual science knowledge in the field of hearing, and parents' advocacy knowledge and attitudes. Based on six semi-structured interviews and 115 questionnaires completed by parents of hearing-impaired children, contextual science knowledge emerged as the only predictor for having slightly better advocacy attitudes and knowledge (5.5% explained variance). Although general science knowledge was the best predictor of contextual knowledge (14% of explained variance), it was not a direct predictor of advocacy knowledge and attitudes. Science knowledge plays some role in the lives of hearing-impaired families, even if they do not list it as a resource for successful rehabilitation.

  17. An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process

    NASA Astrophysics Data System (ADS)

    Nguyen, ThanhDat; Kifor, Claudiu Vasile

    2015-09-01

    DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance process quality on the basis of knowledge. However, DMAIC knowledge is difficult to access: conventional approaches struggle to structure and reuse it because it is not represented and organized systematically. In this article, we address this problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to use ontologies to represent the knowledge generated in each DMAIC phase. We build five knowledge bases, one per phase, storing all DMAIC knowledge with the support of appropriate information-technology tools and techniques. These knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution, so that existing knowledge can be shared and reused.
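    The phase-per-knowledge-base idea can be illustrated with a minimal sketch: one ontology-style triple store per DMAIC phase, queried by subject/predicate/object patterns. This is an invented illustration, not the authors' implementation; all class, phase, and triple names are assumptions.

```python
# Minimal sketch (not the paper's implementation): one triple store per
# DMAIC phase, with a simple pattern query over (subject, predicate, object).
PHASES = ("Define", "Measure", "Analyze", "Improve", "Control")

class PhaseKnowledgeBase:
    """Ontology-style store: knowledge held as (subject, predicate, object) triples."""
    def __init__(self, phase):
        self.phase = phase
        self.triples = []

    def add(self, subject, predicate, obj):
        self.triples.append((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard, as in a basic triple-pattern match.
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

# One knowledge base per DMAIC phase, as the model proposes.
kbs = {phase: PhaseKnowledgeBase(phase) for phase in PHASES}
kbs["Measure"].add("defect_rate", "measuredBy", "control_chart")
kbs["Analyze"].add("defect_rate", "rootCause", "tool_wear")

print(kbs["Analyze"].query(predicate="rootCause"))
```

    A real system would back such stores with an ontology language such as OWL rather than in-memory tuples, but the access pattern is the same.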

  18. A knowledge creation info-structure to acquire and crystallize the tacit knowledge of health-care experts.

    PubMed

    Abidi, Syed Sibte Raza; Cheah, Yu-N; Curran, Janet

    2005-06-01

    Tacit knowledge of health-care experts is an important source of experiential know-how, yet due to various operational and technical reasons, such health-care knowledge is not entirely harnessed and put into professional practice. Emerging knowledge-management (KM) solutions suggest strategies to acquire the seemingly intractable and nonarticulated tacit knowledge of health-care experts. This paper presents a KM methodology, together with its computational implementation, to 1) acquire the tacit knowledge possessed by health-care experts; 2) represent the acquired tacit health-care knowledge in a computational formalism--i.e., clinical scenarios--that allows the reuse of stored knowledge to acquire tacit knowledge; and 3) crystallize the acquired tacit knowledge so that it is validated for health-care decision-support and medical education systems.

  19. The research on construction and application of machining process knowledge base

    NASA Astrophysics Data System (ADS)

    Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai

    2018-03-01

    In order to apply knowledge in machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical knowledge classification structure is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to this representation. This paper gives the definition and classification of machining process knowledge, the knowledge model, and the application flow of knowledge-based process design, and demonstrates the main steps of machine tool selection as an application of the knowledge base.
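    A production-rule representation of the kind described above can be sketched as IF-conditions over part features that select a machining operation. The rules, features, and operations below are invented for illustration and do not come from the paper.

```python
# Hypothetical sketch of production-rule knowledge for machining process
# design: the first rule whose conditions all match the part feature fires.
RULES = [
    {"if": {"feature": "hole", "tolerance": "fine"},   "then": "reaming"},
    {"if": {"feature": "hole"},                        "then": "drilling"},
    {"if": {"feature": "flat_surface"},                "then": "face_milling"},
]

def select_operation(part_feature):
    """Return the operation of the first rule whose conditions all match."""
    for rule in RULES:
        if all(part_feature.get(k) == v for k, v in rule["if"].items()):
            return rule["then"]
    return None  # no applicable knowledge

print(select_operation({"feature": "hole", "tolerance": "fine"}))   # reaming
print(select_operation({"feature": "hole", "tolerance": "rough"}))  # drilling
```

    Rule order encodes specificity here (the more specific hole rule precedes the general one); a fuller CAPP system would resolve conflicts explicitly.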

  20. English Learners' Knowledge of Prepositions: Collocational Knowledge or Knowledge Based on Meaning?

    ERIC Educational Resources Information Center

    Mueller, Charles M.

    2011-01-01

    Second language (L2) learners' successful performance in an L2 can be partly attributed to their knowledge of collocations. In some cases, this knowledge is accompanied by knowledge of the semantic and/or grammatical patterns that motivate the collocation. At other times, collocational knowledge may serve a compensatory role. To determine the…

  1. Application of Knowledge Management: Pressing questions and practical answers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FROMM-LEWIS,MICHELLE

    2000-02-11

    Sandia National Laboratories is working on ways to increase production using Knowledge Management. Knowledge Management is: finding ways to create, identify, capture, and distribute organizational knowledge to the people who need it; helping information and knowledge flow to the right people at the right time so they can act more efficiently and effectively; recognizing, documenting, and distributing explicit knowledge (explicit knowledge is quantifiable and definable; it makes up reports, manuals, instructional materials, etc.) and tacit knowledge (tacit knowledge is doing and performing; it is a combination of experience, hunches, intuition, emotions, and beliefs) in order to improve organizational performance; and taking a systematic approach to finding, understanding, and using knowledge to create value.

  2. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  3. Incorporating World Knowledge to Document Clustering via Heterogeneous Information Networks.

    PubMed

    Wang, Chenguang; Song, Yangqiu; El-Kishky, Ahmed; Roth, Dan; Zhang, Ming; Han, Jiawei

    2015-08-01

    One of the key obstacles in making learning protocols realistic in applications is the need to supervise them, a costly process that often requires hiring domain experts. We consider a framework that uses world knowledge as indirect supervision. World knowledge is general-purpose knowledge that is not designed for any specific domain. The key challenges are then how to adapt world knowledge to domains and how to represent it for learning. In this paper, we provide an example of using world knowledge for domain-dependent document clustering. We provide three ways to specify world knowledge to domains by resolving the ambiguity of entities and their types, and represent the data with world knowledge as a heterogeneous information network. We then propose a clustering algorithm that can cluster multiple types and incorporate the sub-type information as constraints. In the experiments, we use two existing knowledge bases as our sources of world knowledge. One is Freebase, which is collaboratively collected knowledge about entities and their organizations. The other is YAGO2, a knowledge base automatically extracted from Wikipedia that maps knowledge to the linguistic knowledge base WordNet. Experimental results on two text benchmark datasets (20newsgroups and RCV1) show that incorporating world knowledge as indirect supervision can significantly outperform state-of-the-art clustering algorithms as well as clustering algorithms enhanced with world-knowledge features.
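    The representation step can be illustrated with a toy sketch: link documents to the knowledge-base entities they mention, forming a heterogeneous (document-entity) network, then group documents by the entity types they share. The tiny entity table stands in for a real source such as Freebase or YAGO2; all names are invented.

```python
# Toy sketch of world knowledge as indirect supervision for clustering.
from collections import defaultdict

# Stand-in for a world knowledge base: entity -> type.
KB_ENTITIES = {"insulin": "Medicine", "neuron": "Medicine",
               "congress": "Politics", "senate": "Politics"}

docs = {
    "d1": "insulin dosing and neuron response",
    "d2": "neuron imaging study",
    "d3": "congress passed the bill in the senate",
}

# Document-entity edges: the document side of the heterogeneous network.
edges = {d: {w for w in text.split() if w in KB_ENTITIES}
         for d, text in docs.items()}

# Group documents whose mentioned entities share a knowledge-base type.
clusters = defaultdict(list)
for d, ents in edges.items():
    types = frozenset(KB_ENTITIES[e] for e in ents)
    clusters[types].append(d)

print(dict(clusters))
```

    The paper's algorithm does much more (type disambiguation, sub-type constraints, multi-type clustering), but the network of typed entity links is the common starting point.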

  4. Incorporating World Knowledge to Document Clustering via Heterogeneous Information Networks

    PubMed Central

    Wang, Chenguang; Song, Yangqiu; El-Kishky, Ahmed; Roth, Dan; Zhang, Ming; Han, Jiawei

    2015-01-01

    One of the key obstacles in making learning protocols realistic in applications is the need to supervise them, a costly process that often requires hiring domain experts. We consider a framework that uses world knowledge as indirect supervision. World knowledge is general-purpose knowledge that is not designed for any specific domain. The key challenges are then how to adapt world knowledge to domains and how to represent it for learning. In this paper, we provide an example of using world knowledge for domain-dependent document clustering. We provide three ways to specify world knowledge to domains by resolving the ambiguity of entities and their types, and represent the data with world knowledge as a heterogeneous information network. We then propose a clustering algorithm that can cluster multiple types and incorporate the sub-type information as constraints. In the experiments, we use two existing knowledge bases as our sources of world knowledge. One is Freebase, which is collaboratively collected knowledge about entities and their organizations. The other is YAGO2, a knowledge base automatically extracted from Wikipedia that maps knowledge to the linguistic knowledge base WordNet. Experimental results on two text benchmark datasets (20newsgroups and RCV1) show that incorporating world knowledge as indirect supervision can significantly outperform state-of-the-art clustering algorithms as well as clustering algorithms enhanced with world-knowledge features. PMID:26705504

  5. The elements of design knowledge capture

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1988-01-01

    This paper will present the basic constituents of a design knowledge capture effort. This will include a discussion of the types of knowledge to be captured in such an effort and the difference between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.

  6. Subjective knowledge of AIDS and use of HIV testing.

    PubMed

    Phillips, K A

    1993-10-01

    Increasing knowledge is an important goal of human immunodeficiency virus (HIV) prevention strategies, although increased knowledge may not be associated with increased preventive behaviors. This study examines the association of (1) objective and subjective acquired immunodeficiency syndrome (AIDS) knowledge, and (2) both objective and subjective AIDS knowledge with HIV testing use. Data are from the 1988 National Health Interview Survey. Objective and subjective knowledge were only moderately correlated. In regression analyses, higher subjective knowledge was significantly associated with higher testing use, but objective knowledge was not. The results are relevant to other preventive behaviors for which knowledge is an important factor.

  7. An Object-Oriented Software Architecture for the Explorer-2 Knowledge Management Environment

    PubMed Central

    Tarabar, David B.; Greenes, Robert A.; Slosser, Eric T.

    1989-01-01

    Explorer-2 is a workstation based environment to facilitate knowledge management. It provides consistent access to a broad range of knowledge on the basis of purpose, not type. We have developed a software architecture based on Object-Oriented programming for Explorer-2. We have defined three classes of program objects: Knowledge ViewFrames, Knowledge Resources, and Knowledge Bases. This results in knowledge management at three levels: the screen level, the disk level and the meta-knowledge level. We have applied this design to several knowledge bases, and believe that there is a broad applicability of this design.

  8. Children's religious knowledge: implications for understanding satanic ritual abuse allegations.

    PubMed

    Goodman, G S; Quas, J A; Bottoms, B L; Qin, J; Shaver, P R; Orcutt, H; Shapiro, C

    1997-11-01

    The goals of the present study were to examine the extent of children's religious, especially satanic, knowledge and to understand the influence of children's age, religious training, family, and media exposure on that knowledge. Using a structured interview, 48 3- to 16-year-old children were questioned about their knowledge of: (a) religion and religious worship; (b) religion-related symbols and pictures; and (c) movies, music, and television shows with religious and horror themes. Although few children evinced direct knowledge of ritual abuse, many revealed general knowledge of satanism and satanic worship. With age, children's religious knowledge increased and became more sophisticated. Increased exposure to nonsatanic horror media was associated with more nonreligious knowledge that could be considered precursory to satanic knowledge, and increased exposure to satanic media was associated with more knowledge related to satanism. Our results suggest that children do not generally possess sufficient knowledge of satanic ritual abuse to make up false allegations on their own. However, many children have knowledge of satanism as well as nonreligious knowledge of violence, death, and illegal activities. It is possible that such knowledge could prompt an investigation of satanic ritual abuse or possibly serve as a starting point from which an allegation is erected.

  9. The critical success factors and impact of prior knowledge to nursing students when transferring nursing knowledge during nursing clinical practise.

    PubMed

    Tsai, Ming-Tien; Tsai, Ling-Long

    2005-11-01

    Nursing practise plays an important role in transferring nursing knowledge to nursing students. The literature review shows that prior knowledge affects how learners gain new knowledge. There has been no direct examination of the interaction effect of prior knowledge on students' performance, or of its influence on how nursing students evaluate the success factors of knowledge transfer. This study explores (1) the critical success factors in transferring nursing knowledge, and (2) the impact of prior knowledge when evaluating the success factors for transferring nursing knowledge. This research used in-depth interviews to probe the initial success-factor phase. A total of 422 valid questionnaires were collected by the authors. The data were analysed by comparing mean scores and t-tests between the two groups. Seventeen critical success factors were identified by the two groups of students. Twelve items were selected to examine the diversity between the two groups. Students with prior knowledge were more independent than the other group, and preferred self-directed learning more than students without prior knowledge did. Students without prior knowledge were eager to take every opportunity to gain experience and more readily adopted new knowledge.

  10. Local knowledge: Empirical Fact to Develop Community Based Disaster Risk Management Concept for Community Resilience at Mangkang Kulon Village, Semarang City

    NASA Astrophysics Data System (ADS)

    Kapiarsa, A. B.; Sariffuddin, S.

    2018-02-01

    Local knowledge should not be neglected when developing community resilience to disasters. The circular relation between humans and their living habitat, together with community social relations, has developed three forms of local knowledge: specialized knowledge, shared knowledge, and common knowledge. Its correlation with community-based disaster management has become an important discussion, particularly in answering the question: can local knowledge underlie the development of a community-based disaster risk reduction concept? To answer this question, this research used a mixed-method approach. Interviews and cross-tabulation of responses from 73 respondents, at a 90% confidence level, were used to determine the correlation between local knowledge and community characteristics. This research found that shared knowledge dominated community local knowledge (77%), while common knowledge and specialized knowledge accounted for 8% and 15%, respectively. The high share of shared knowledge (77%) indicated that local knowledge occurred at the household level and was not yet evident at the community level. Shared knowledge was found in all three phases of community resilience in dealing with disaster: mitigation, emergency response, and recovery. This research therefore opens a new scientific discussion on the roles of the self-help and community-help concepts in CBDRM concept development in Indonesia.

  11. Evidence-based decision-making 7: Knowledge translation.

    PubMed

    Manns, Braden J

    2015-01-01

    There is a significant gap between what is known and what is implemented by key stakeholders in practice (the evidence to practice gap). The primary purpose of knowledge translation is to address this gap, bridging evidence to clinical practice. The knowledge to action cycle is one framework for knowledge translation that integrates policy-makers throughout the research cycle. The knowledge to action cycle begins with the identification of a problem (usually a gap in care provision). After identification of the problem, knowledge creation is undertaken, depicted at the center of the cycle as a funnel. Knowledge inquiry is at the wide end of the funnel, and moving down the funnel, the primary data is synthesized into knowledge products in the form of educational materials, guidelines, decision aids, or clinical pathways. The remaining components of the knowledge to action cycle refer to the action of applying the knowledge that has been created. This includes adapting knowledge to local context, assessing barriers to knowledge use, selecting, tailoring, and implementing interventions, monitoring knowledge use, evaluating outcomes, and sustaining knowledge use. Each of these steps is connected by bidirectional arrows and ideally involves healthcare decision-makers and key stakeholders at each transition.

  12. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  13. Ethical Perspectives on Knowledge Translation in Rehabilitation

    PubMed Central

    Banja, John D.; Eisen, Arri

    2013-01-01

    Although the literature on the ethical dimensions of knowledge creation, use, and dissemination is voluminous, it has not particularly examined the ethical dimensions of knowledge translation in rehabilitation. Yet, whether research is done in a wet lab or treatments are provided to patients in therapeutic settings, rehabilitation professionals commonly use (as well as create) knowledge and disseminate it to peers, patients, and various others. This article will refer to knowledge creation, use, and transfer as knowledge translation and examine some of its numerous ethical challenges. Three ethical dimensions of knowledge translation will particularly attract our attention: (1) the quality of knowledge disseminated to rehabilitationists; (2) ethical challenges in being too easily persuaded by or unreasonably resistant to putative knowledge; and (3) organizational barriers to knowledge translation. We will conclude with some recommendations on facilitating the ethical soundness of knowledge translation in rehabilitation. PMID:23168302

  14. Expert knowledge maps for knowledge management: a case study in Traditional Chinese Medicine research.

    PubMed

    Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan

    2013-10-01

    To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
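    A literature-based expert knowledge map of the kind the EKM describes can be sketched as a small bipartite structure: papers annotated with (author, topic) pairs yield an expert graph and the organization's "hot topics". The data below is invented for illustration and is not from the case study.

```python
# Illustrative sketch of an expert knowledge map built from literature
# annotations: count who has published on which topics, then rank topics.
from collections import Counter, defaultdict

annotations = [
    ("Dr. Li",  "acupuncture"), ("Dr. Li",  "herbal formulas"),
    ("Dr. Wu",  "acupuncture"), ("Dr. Wu",  "acupuncture"),
    ("Dr. Han", "herbal formulas"),
]

# Expert graph: expert -> topic -> annotation count.
expert_graph = defaultdict(Counter)
for author, topic in annotations:
    expert_graph[author][topic] += 1

# Hot topics for the organization: annotation counts across all experts.
hot_topics = Counter(topic for _, topic in annotations).most_common()
print(hot_topics)  # [('acupuncture', 3), ('herbal formulas', 2)]
```

    Per-expert counts over time would similarly give the "expert-knowledge biography" the abstract mentions.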

  15. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    PubMed

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  16. The biomedical disciplines and the structure of biomedical and clinical knowledge.

    PubMed

    Nederbragt, H

    2000-11-01

    The relation between biomedical knowledge and clinical knowledge is discussed by comparing their respective structures. The knowledge of a disease as a biological phenomenon is constructed by the interaction of facts and theories from the main biomedical disciplines: epidemiology, diagnostics, clinical trial, therapy development and pathogenesis. Although these facts and theories are based on probabilities and extrapolations, the interaction provides a reliable and coherent structure, comparable to a Kuhnian paradigm. In the structure of clinical knowledge, i.e. knowledge of the patient with the disease, not only biomedical knowledge contributes to the structure but also economic and social relations, ethics and personal experience. However, the interaction between each of the participating "knowledges" in clinical knowledge is not based on mutual dependency and accumulation of different arguments from each, as in biomedical knowledge, but on competition and partial exclusion. Therefore, the structure of biomedical knowledge is different from that of clinical knowledge. This difference is used as the basis for a discussion in which the place of technology, evidence-based medicine and the gap between scientific and clinical knowledge are evaluated.

  17. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed

    Schmidmaier, Ralf; Eiber, Stephan; Ebersbach, Rene; Schiller, Miriam; Hege, Inga; Holzer, Matthias; Fischer, Martin R

    2013-02-22

    Medical knowledge encompasses both conceptual (facts or "what" information) and procedural knowledge ("how" and "why" information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula.

  18. Conceptualising GP teachers' knowledge: a pedagogical content knowledge perspective.

    PubMed

    Cantillon, Peter; de Grave, Willem

    2012-05-01

    Most teacher development initiatives focus on enhancing knowledge of teaching (pedagogy), whilst largely ignoring other important features of teacher knowledge such as subject matter knowledge and awareness of the learning context. Furthermore, teachers' ability to learn from faculty development interventions is limited by their existing (often implicit) pedagogical knowledge and beliefs. Pedagogical content knowledge (PCK) represents a model of teacher knowledge incorporating what they know about subject matter, pedagogy and context. PCK can be used to explore teachers' prior knowledge and to structure faculty development programmes so that they take account of a broader range of teachers' knowledge. We set out to examine the application of a PCK model in a general practice education setting. This study is part of a larger study that employed a mixed method approach (concept mapping, phenomenological interviews and video-stimulated recall) to explore features of GP teachers' subject matter knowledge, pedagogical knowledge and knowledge of the learning environment in the context of a general practice tutorial. This paper presents data on GP teachers' pedagogical and context knowledge. There was considerable overlap between different GP teachers' knowledge and beliefs about learners and the clinical learning environment (i.e. knowledge of context). The teachers' beliefs about learners were largely based on assumptions derived from their own student experiences. There were stark differences, however, between teachers in terms of pedagogical knowledge, particularly in terms of their teaching orientations (i.e. transmission or facilitation orientation) and this was manifest in their teaching behaviours. PCK represents a useful model for conceptualising clinical teacher prior knowledge in three domains, namely subject matter, learning context and pedagogy. It can and should be used as a simple guiding framework by faculty developers to inform the design and delivery of their faculty development programmes.

  19. Autonomously acquiring declarative and procedural knowledge for ICAT systems

    NASA Technical Reports Server (NTRS)

    Kovarik, Vincent J., Jr.

    1993-01-01

    The construction of Intelligent Computer Aided Training (ICAT) systems is critically dependent on the ability to define and encode knowledge. This knowledge engineering effort can be broadly divided into two categories: domain knowledge and expert or task knowledge. Domain knowledge refers to the physical environment or system with which the expert interacts. Expert knowledge consists of the set of procedures and heuristics employed by the expert in performing their task. Both these areas are a significant bottleneck in the acquisition of knowledge for ICAT systems. This paper presents a research project in the area of autonomous knowledge acquisition using a passive observation concept. The system observes an expert and then generalizes the observations into production rules representing the domain expert's knowledge.

  20. An Extended Model of Knowledge Governance

    NASA Astrophysics Data System (ADS)

    Karvalics, Laszlo Z.; Dalal, Nikunj

    In current times, we are seeing the emergence of a new paradigm to describe, understand, and analyze the expanding "knowledge domain". This overarching framework - called knowledge governance - draws from and builds upon knowledge management and may be seen as a kind of meta-layer of knowledge management. The emerging knowledge governance approach deals with issues that lie at the intersection of organization and knowledge processes. Knowledge governance has two main interpretation levels in the literature: the company- (micro-) and the national (macro-) level. We propose a three-layer model instead of the previous two-layer version, adding a layer of "global" knowledge governance. Analyzing and separating the main issues in this way, we can re-formulate the focus of knowledge governance research and practice in all layers.

  1. Parental Knowledge of Adolescents' Online Content and Contact Risks.

    PubMed

    Symons, Katrien; Ponnet, Koen; Emmery, Kathleen; Walrave, Michel; Heirman, Wannes

    2017-02-01

    Parental knowledge about adolescents' activities is an identified protective factor in terms of adolescent adjustment. While research on parental knowledge has focused on adolescents' offline behavior, there is little empirical understanding of parental knowledge about adolescents' online behavior. This study investigates parental knowledge about adolescents' online activities and experiences with online risks, as well as the correlates of such knowledge. Building on former research, open communication and knowledge-generating monitoring practices are investigated as potential correlates of parental knowledge. The study uses triadic data, relying on reports from children aged 13 to 18, mothers, and fathers within the same family (N = 357 families; 54.9% female adolescents). The results showed that parents have little knowledge about the occurrence of online risks and their children's online activities. While mothers did not have more accurate knowledge than fathers, they did perceive themselves to be more knowledgeable. Associations between parental knowledge and hypothesized correlates were tested by means of one-way ANOVA tests and stepwise logistic regression models. Limited evidence was found for associations with parents' accurate knowledge about the occurrence of online risks. Engagement in knowledge-generating monitoring practices was linked to mothers' and fathers' self-perceived knowledge about their children's online activities. For mothers, open communication with the child was also linked to self-perceived knowledge. The findings suggest that parents need to be more aware of the possibility that online risks might occur, and that more research is needed to understand what parents can do to improve their accurate knowledge.

  2. Adaptive Knowledge Management of Project-Based Learning

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of students' knowledge, accountability skills, and collaborative skills. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…

  3. New Structures for the Effective Dissemination of Knowledge in an Enterprise.

    ERIC Educational Resources Information Center

    Kok, J. Andrew

    2000-01-01

    Discusses the creation of knowledge enterprises. Highlights include knowledge creation and sharing; networked organizational structures; structures of knowledge organization; competitive strategies; new structures to manage knowledge; boundary crossing; multi-skilled teams; communities of interest or practice; and dissemination of knowledge in an…

  4. Knowledge Discovery from Posts in Online Health Communities Using Unified Medical Language System.

    PubMed

    Chen, Donghua; Zhang, Runtong; Liu, Kecheng; Hou, Lei

    2018-06-19

    Patient-reported posts in Online Health Communities (OHCs) contain a variety of valuable information that can help establish knowledge-based online support for online patients. However, utilizing these reports to improve online patient services in the absence of appropriate medical and healthcare expert knowledge is difficult. Thus, we propose a comprehensive knowledge discovery method that is based on the Unified Medical Language System for the analysis of narrative posts in OHCs. First, we propose a domain-knowledge support framework for OHCs to provide a basis for post analysis. Second, we develop a Knowledge-Involved Topic Modeling (KI-TM) method to extract and expand explicit knowledge within the text. We propose four metrics, namely, explicit knowledge rate, latent knowledge rate, knowledge correlation rate, and perplexity, for the evaluation of the KI-TM method. Our experimental results indicate that our proposed method outperforms existing methods in terms of providing knowledge support. Our method enhances knowledge support for online patients and can help develop intelligent OHCs in the future.
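
    Of the four evaluation metrics, perplexity is the one standard held-out measure from topic modeling; the other three are specific to the paper. As a minimal illustrative sketch (the function name and interface are my own, not the paper's), perplexity is the exponential of the negative mean log-likelihood per token:

```python
import math

def unigram_perplexity(token_probs):
    """Perplexity of a text given the model's probability for each token.

    perplexity = exp(-(1/N) * sum(log p_i)); lower values mean the model
    finds the text less surprising.
    """
    n = len(token_probs)
    log_likelihood = sum(math.log(p) for p in token_probs)
    return math.exp(-log_likelihood / n)

# A model that assigns probability 1/4 to each of four tokens has
# perplexity 4: it is as uncertain as a uniform choice among 4 outcomes.
print(unigram_perplexity([0.25, 0.25, 0.25, 0.25]))
```

    Topic models are typically scored this way on held-out documents, with each token's probability marginalized over the inferred topics.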

  5. Secular trends on traditional ecological knowledge: An analysis of different domains of knowledge among Tsimane' men.

    PubMed

    Reyes-García, Victoria; Luz, Ana C; Gueze, Maximilien; Paneque-Gálvez, Jaime; Macía, Manuel J; Orta-Martínez, Martí; Pino, Joan

    2013-10-01

    Empirical research provides contradictory evidence of the loss of traditional ecological knowledge across societies. Researchers have argued that culture, methodological differences, and site-specific conditions are responsible for such contradictory evidences. We advance and test a third explanation: the adaptive nature of traditional ecological knowledge systems. Specifically, we test whether different domains of traditional ecological knowledge experience different secular changes and analyze trends in the context of other changes in livelihoods. We use data collected among 651 Tsimane' men (Bolivian Amazon). Our findings indicate that different domains of knowledge follow different secular trends. Among the domains of knowledge analyzed, medicinal and wild edible knowledge appear as the most vulnerable; canoe building and firewood knowledge seem to remain constant across generations; whereas house building knowledge seems to experience a slight secular increase. Our analysis reflects on the adaptive nature of traditional ecological knowledge, highlighting how changes in this knowledge system respond to the particular needs of a society in a given point of time.

  6. On the different “worlds” of intra-organizational knowledge management: Understanding idiosyncratic variation in MNC cross-site knowledge-sharing practices

    PubMed Central

    Kasper, Helmut; Lehrer, Mark; Mühlbacher, Jürgen; Müller, Barbara

    2013-01-01

    This qualitative field study investigated cross-site knowledge sharing in a small sample of multinational corporations in three different MNC business contexts (global, multidomestic, transnational). The results disclose heterogeneous “worlds” of MNC knowledge sharing, ultimately raising the question as to whether the whole concept of MNC knowledge sharing covers a sufficiently unitary phenomenon to be meaningful. We derive a non-exhaustive typology of MNC knowledge-sharing practices: self-organizing knowledge sharing, technocratic knowledge sharing, and best practice knowledge sharing. Despite its limitations, this typology helps to elucidate a number of issues, including the latent conflict between two disparate theories of MNC knowledge sharing, namely “sender–receiver” and “social learning” theories (Noorderhaven & Harzing, 2009). More generally, we develop the term “knowledge contextualization” to highlight the way that firm-specific organizational features pre-define which knowledge is considered to be of special relevance for intra-organizational sharing. PMID:27087759

  7. Enabling Security, Stability, Transition, and Reconstruction Operations through Knowledge Management

    DTIC Science & Technology

    2009-03-18

    strategy. Overall, the cultural barriers to knowledge sharing center on knowledge creation and capture. The primary barrier to knowledge sharing is lack ... Lacking a shared identity decreases the likelihood of knowledge sharing, which is essential to effective collaboration.84 Related to collaboration...to adapt, develop, and change based on experience-derived knowledge.90 A second cultural barrier to knowledge acquisition is the lack of receptiveness

  8. Building capacity for knowledge translation in occupational therapy: learning through participatory action research.

    PubMed

    Bennett, Sally; Whitehead, Mary; Eames, Sally; Fleming, Jennifer; Low, Shanling; Caldwell, Elizabeth

    2016-10-01

    There has been widespread acknowledgement of the need to build capacity in knowledge translation; however, much of the existing work focuses on building capacity amongst researchers rather than with clinicians directly. This paper's aim is to describe a research project for developing a knowledge translation capacity-building program for occupational therapy clinicians. Participatory action research methods were used to both develop and evaluate the knowledge translation capacity-building program. Participants were occupational therapists from a large metropolitan hospital in Australia. Researchers and clinicians worked together to use the action cycle of the Knowledge to Action Framework to increase use of knowledge translation itself within the department in general, within their clinical teams, and to facilitate knowledge translation becoming part of the department's culture. Barriers and enablers to using knowledge translation were identified through a survey based on the Theoretical Domains Framework and through focus groups. Multiple interventions were used to develop a knowledge translation capacity-building program. Fifty-two occupational therapists participated initially, but only 20 across the first 18 months of the project. Barriers and enablers were identified across all domains of the Theoretical Domains Framework. Interventions selected to address these barriers or facilitate enablers were grouped into ten categories: educational outreach; teams working on clinical knowledge translation case studies; identifying time blocks for knowledge translation; mentoring; leadership strategies; communication strategies; documentation and resources to support knowledge translation; funding a knowledge translation champion one day per week; setting goals for knowledge translation; and knowledge translation reporting strategies. Use of these strategies was, and continues to be, monitored.
Participants continue to be actively involved in learning and shaping the knowledge translation program across the department and within their specific clinical areas. To build capacity for knowledge translation, it is important to involve clinicians. The action cycle of the Knowledge to Action framework is a useful guide to introduce the knowledge translation process to clinicians. It may be used to engage the department as a whole, and facilitate the learning and application of knowledge translation within specific clinical areas. Research evaluating this knowledge translation program is being conducted.

  9. Can knowledge exchange support the implementation of a health-promoting schools approach? Perceived outcomes of knowledge exchange in the COMPASS study.

    PubMed

    Brown, Kristin M; Elliott, Susan J; Robertson-Wilson, Jennifer; Vine, Michelle M; Leatherdale, Scott T

    2018-03-13

    Despite the potential population-level impact of a health-promoting schools approach, schools face challenges in implementation, indicating a gap between school health research and practice. Knowledge exchange provides an opportunity to reduce this gap; however, there has been limited evaluation of these initiatives. This research explored researchers' and knowledge users' perceptions of outcomes associated with a knowledge exchange initiative within COMPASS, a longitudinal study of Canadian secondary students and schools. Schools received annual tailored summaries of their students' health behaviours and suggestions for action and were linked with knowledge brokers to support them in taking action to improve student health. Qualitative semi-structured interviews were conducted with COMPASS researchers (n = 13), school staff (n = 13), and public health stakeholders (n = 4) to explore their experiences with COMPASS knowledge exchange. Key issues included how knowledge users used school-specific findings, perceived outcomes of knowledge exchange, and suggestions for change. Outcomes for both knowledge users and researchers were identified; interestingly, knowledge users attributed more outcomes to using school-specific findings than knowledge brokering. School and public health participants indicated school-specific findings informed their programming and planning. Importantly, knowledge exchange provided a platform for partnerships between researchers, schools, and public health units. Knowledge brokering allowed researchers to gain feedback from knowledge users to enhance the study and a better understanding of the school environment. Interestingly, COMPASS knowledge exchange outcomes aligned with Samdal and Rowling's eight theory-driven implementation components for health-promoting schools. Hence, knowledge exchange may provide a mechanism to help schools implement a health-promoting schools approach. 
This research contributes to the limited literature regarding outcomes of knowledge brokering in public health and knowledge exchange in school health research. However, since not all schools engaged in knowledge brokering, and not all schools that engaged discussed these outcomes, further research is needed to determine the amount of engagement required for change and examine the process of COMPASS knowledge brokering to consider how to increase school engagement.

  10. Does nurses' perceived burn prevention knowledge and ability to teach burn prevention correlate with their actual burn prevention knowledge?

    PubMed

    Lehna, Carlee; Myers, John

    2010-01-01

    The purpose of this study was to explore the relationship among nurses' perceived burn prevention knowledge, their perceived ability to teach about burn prevention, and their actual burn prevention knowledge and to test if their actual burn knowledge could be predicted by these perceived measures. A two-page, anonymous survey that included a 10-item burn prevention knowledge test and an assessment of nurses' perceived knowledge of burn prevention and their perceived ability to teach burn prevention was administered to 313 nurses. Actual burn prevention knowledge was determined, as was the correlation among actual burn prevention knowledge, perceived knowledge, and perceived ability to teach. Differences in these outcome variables based on specialty area were tested using analysis of variance techniques. Generalized linear modeling techniques were used to investigate which variables significantly predict a nurse's actual burn prevention knowledge. Tests for interaction effects were performed, and significance was set at .05. Responding nurses (N = 265) described practicing in a variety of settings, such as pediatric settings (40.2%, n = 105), emergency departments (25.4%, n = 86), medical/surgical settings (8.4%, n = 22), and one pediatric burn setting (4.1%, n = 14), with all specialty areas having similar actual burn prevention knowledge (P = .052). Seventy-seven percent of the nurses said they never taught about burn prevention (n = 177). Perceived knowledge and actual knowledge (r = .124, P = .046) as well as perceived knowledge and perceived ability were correlated (r = .799, P < .001). Significant predictors of actual knowledge were years in practice (beta = -0.063, P = .034), years in current area (beta = 0.072, P = .003), perceived knowledge (beta = 0.109, P = .042), and perceived ability (beta = 0.137, P = .019). 
All nurses, regardless of specialty area, have poor burn prevention knowledge, which is correlated with their perceived lack of knowledge of burn prevention. In addition, nurses' perceived burn knowledge and ability predict their actual burn knowledge. This is a fruitful area that merits further research and exploration.

  11. The relationship between immediate relevant basic science knowledge and clinical knowledge: physiology knowledge and transthoracic echocardiography image interpretation.

    PubMed

    Nielsen, Dorte Guldbrand; Gotzsche, Ole; Sonne, Ole; Eika, Berit

    2012-10-01

    Two major views on the relationship between basic science knowledge and clinical knowledge stand out: the two-world view, which sees basic science and clinical science as two separate knowledge bases, and the encapsulated knowledge view, which holds that basic science knowledge plays an overt role, being encapsulated in the clinical knowledge. However, recent research has implied that a more complex relationship between the two knowledge bases exists. In this study, we explore the relationship between immediate relevant basic science (physiology) and clinical knowledge within a specific domain of medicine (echocardiography). Twenty-eight third-year medical students and 45 physicians (15 interns, 15 cardiology residents and 15 cardiology consultants) took a multiple-choice test of physiology knowledge. The physicians also viewed images of a transthoracic echocardiography (TTE) examination and completed a checklist of possible pathologies found. A total score for each participant was calculated for the physiology test, and for all physicians also for the TTE checklist. Consultants scored significantly higher on the physiology test than did medical students and interns. A significant correlation between physiology test scores and TTE checklist scores was found for the cardiology residents only. Basic science knowledge of immediate relevance for daily clinical work expands with increased work experience within a specific domain. Consultants showed no relationship between physiology knowledge and TTE interpretation, indicating that experts do not use basic science knowledge in routine daily practice, but knowledge of immediate relevance remains ready for use.

  12. Knowledge that Acts: Evaluating the Outcomes of a Knowledge Brokering Intervention in Western Australia's Ningaloo Region

    NASA Astrophysics Data System (ADS)

    Chapman, Kelly; Boschetti, Fabio; Fulton, Elizabeth; Horwitz, Pierre; Jones, Tod; Scherrer, Pascal; Syme, Geoff

    2017-11-01

    Knowledge exchange involves a suite of strategies used to bridge the divides between research, policy and practice. The literature is increasingly focused on the notion that knowledge generated by research is more useful when there is significant interaction and knowledge sharing between researchers and research recipients (i.e., stakeholders). This is exemplified by increasing calls for the use of knowledge brokers to facilitate interaction and flow of information between scientists and stakeholder groups, and the integration of scientific and local knowledge. However, most of the environmental management literature focuses on explicit forms of knowledge, leaving unmeasured the tacit relational and reflective forms of knowledge that lead people to change their behaviour. In addition, despite the high transaction costs of knowledge brokering and related stakeholder engagement, there is little research on its effectiveness. We apply Park's tri-partite knowledge typology (Manag Learn 30(2), 141-157, 1999; Knowledge and Participatory Research, London: SAGE Publications, 2006) as a basis for evaluating the effectiveness of knowledge brokering in the context of a large multi-agency research programme in Australia's Ningaloo coastal region, and for testing the assumption that higher levels of interaction between scientists and stakeholders lead to improved knowledge exchange. While the knowledge brokering intervention substantively increased relational networks between scientists and stakeholders, it did not generate anticipated increases in stakeholder knowledge or research application, indicating that more prolonged stakeholder engagement was required, and/or that there was a flaw in the assumptions underpinning our conceptual framework.

  13. Research and application of knowledge resources network for product innovation.

    PubMed

    Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian

    2015-01-01

    In order to enhance the knowledge service capabilities of a product innovation design service platform, a method is proposed for acquiring knowledge resources that support product innovation from the Internet and for actively pushing knowledge to users. Through ontology-based knowledge modeling for product innovation, an integrated architecture of the knowledge resources network is put forward. The technology for acquiring network knowledge resources based on a focused crawler and web services is studied. Active knowledge push is provided for users through user behavior analysis and knowledge evaluation, in order to improve users' enthusiasm for participating in the platform. Finally, an application example is illustrated to prove the effectiveness of the method.
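
    The frontier logic behind a focused crawler can be sketched as a priority queue that orders candidate URLs by topical relevance, so pages likely to hold product-innovation knowledge are fetched first. The keyword-overlap score and all names below are illustrative assumptions, not the paper's implementation:

```python
import heapq

# Hypothetical topic vocabulary for a product-innovation crawl.
TOPIC_KEYWORDS = {"innovation", "design", "patent", "mechanism"}

def relevance(anchor_text):
    """Fraction of topic keywords appearing in a link's anchor text."""
    words = set(anchor_text.lower().split())
    return len(words & TOPIC_KEYWORDS) / len(TOPIC_KEYWORDS)

class Frontier:
    """Crawl frontier: the highest-relevance unseen URL is fetched first."""

    def __init__(self):
        self._heap = []   # (negated score, url) so heapq pops the best
        self._seen = set()

    def push(self, url, anchor_text):
        if url not in self._seen:
            self._seen.add(url)
            heapq.heappush(self._heap, (-relevance(anchor_text), url))

    def pop(self):
        neg_score, url = heapq.heappop(self._heap)
        return url, -neg_score

frontier = Frontier()
frontier.push("http://example.com/news", "latest celebrity news")
frontier.push("http://example.com/cad", "patent search for innovation design")
print(frontier.pop()[0])  # the topic-relevant page comes out first
```

    A full crawler would fetch the popped URL, extract its outgoing links, and push them back with their own scores, stopping at a relevance threshold or page budget.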

  14. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    PubMed

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors. 2013 APA, all rights reserved

  15. Ethical perspectives on knowledge translation in rehabilitation.

    PubMed

    Banja, John D; Eisen, Arri

    2013-01-01

    Although the literature on the ethical dimensions of knowledge creation, use, and dissemination is voluminous, it has not particularly examined the ethical dimensions of knowledge translation in rehabilitation. Yet, whether research is done in a wet lab or treatments are provided to patients in therapeutic settings, rehabilitation professionals commonly use (as well as create) knowledge and disseminate it to peers, patients, and various others. This article will refer to knowledge creation, use, and transfer as knowledge translation and examine some of its numerous ethical challenges. Three ethical dimensions of knowledge translation will particularly attract our attention: (1) the quality of knowledge disseminated to rehabilitationists; (2) ethical challenges in being too easily persuaded by or unreasonably resistant to putative knowledge; and (3) organizational barriers to knowledge translation. We will conclude with some recommendations on facilitating the ethical soundness of knowledge translation in rehabilitation. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  16. Age differences in suggestibility to contradictions of demonstrated knowledge: the influence of prior knowledge.

    PubMed

    Umanath, Sharda

    2016-11-01

    People maintain intact general knowledge into very old age and use it to support remembering. Interestingly, when older and younger adults encounter errors that contradict general knowledge, older adults suffer fewer memorial consequences: Older adults use fewer recently-encountered errors as answers for later knowledge questions. Why do older adults show this reduced suggestibility, and what role does their intact knowledge play? In three experiments, I examined suggestibility following exposure to errors in fictional stories that contradict general knowledge. Older adults consistently demonstrated more prior knowledge than younger adults but also gained access to even more across time. Additionally, they did not show a reduction in new learning from the stories, indicating lesser involvement of episodic memory failures. Critically, when knowledge was stably accessible, older adults relied more heavily on that knowledge compared to younger adults, resulting in reduced suggestibility. Implications for the broader role of knowledge in aging are discussed.

  17. A survey of noninteractive zero knowledge proof system and its applications.

    PubMed

    Wu, Huixin; Wang, Feng

    2014-01-01

    Zero knowledge proof systems, which have received extensive attention since they were first proposed, are an important branch of cryptography and computational complexity theory. Among them, a noninteractive zero knowledge proof system involves only a single message sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its strong privacy and authentication guarantees and its low interaction complexity. This paper reviews and analyzes the basic principles of noninteractive zero knowledge proof systems, and summarizes the research progress achieved on the following aspects: the definition and related models of noninteractive zero knowledge proof systems; noninteractive zero knowledge proofs of NP problems; noninteractive statistical and perfect zero knowledge; the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps; and the specific applications of noninteractive zero knowledge proof systems. This paper also points out future research directions.
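
    The single-message structure described above can be made concrete with the classic Fiat-Shamir construction in the random-oracle model: a Schnorr proof of knowledge of a discrete logarithm in which the verifier's challenge is replaced by a hash of the public values. The following is a toy sketch for intuition only; the parameter choices and names are illustrative, not a vetted implementation:

```python
import hashlib
import secrets

p = 2**255 - 19   # a large prime; we work in the multiplicative group mod p
g = 2             # group element used as the base
q = p - 1         # exponents are reduced modulo the group order

def challenge(*values):
    """Fiat-Shamir: derive the challenge by hashing the public values."""
    data = b"".join(v.to_bytes(32, "big") for v in values)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x):
    """Produce the single proof message for knowledge of x in y = g^x mod p."""
    y = pow(g, x, p)
    r = secrets.randbelow(q)      # fresh randomness keeps x hidden
    t = pow(g, r, p)              # commitment
    c = challenge(g, y, t)        # noninteractive challenge
    s = (r + c * x) % q           # response
    return y, (t, s)

def verify(y, proof):
    """Check g^s == t * y^c (mod p) using the recomputed challenge."""
    t, s = proof
    c = challenge(g, y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

y, proof = prove(secrets.randbelow(q))
print(verify(y, proof))  # True: verifier accepts without learning x
```

    Soundness here rests on the hash behaving like a random oracle: because the challenge is bound to the commitment, a tampered response fails the check.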

  18. Knowledge of Alzheimer's disease among Vietnamese Americans and correlates of their knowledge about Alzheimer's disease.

    PubMed

    Lee, Sang E; Casado, Banghwa Lee

    2017-01-01

    The present study examined the knowledge of Alzheimer's disease and correlates of the disease knowledge among Vietnamese Americans. Cross-sectional survey interviews were conducted with 95 middle-aged and older Vietnamese Americans. Vietnamese Americans showed limited knowledge about Alzheimer's disease. Normalization of Alzheimer's disease in old age was prevalent. They lacked knowledge about treatment and cure of Alzheimer's disease. Those who reside longer in the U.S. and are more exposed to Alzheimer's disease are likely to have higher levels of Alzheimer's disease knowledge. Our study identified current Alzheimer's disease knowledge level and status, and areas of misconceptions and knowledge gaps among Vietnamese Americans, calling for urgent needs for educational outreach to improve knowledge about Alzheimer's disease among Vietnamese Americans. Information about who can be more or less knowledgeable about Alzheimer's disease can be used to strategize and tailor outreach efforts for different segments of the Vietnamese American population.

  19. Technology Focus: Enhancing Conceptual Knowledge of Linear Programming with a Flash Tool

    ERIC Educational Resources Information Center

    Garofalo, Joe; Cory, Beth

    2007-01-01

    Mathematical knowledge can be categorized in different ways. One commonly used way is to distinguish between procedural mathematical knowledge and conceptual mathematical knowledge. Procedural knowledge of mathematics refers to formal language, symbols, algorithms, and rules. Conceptual knowledge is essential for meaningful understanding of…

  20. Ubiquitous Mobile Knowledge Construction in Collaborative Learning Environments

    PubMed Central

    Baloian, Nelson; Zurita, Gustavo

    2012-01-01

    Knowledge management is a critical activity for any organization. It has been said to be a differentiating factor and an important source of competitiveness if this knowledge is constructed and shared among its members, thus creating a learning organization. Knowledge construction is critical for any collaborative organizational learning environment. Nowadays workers must perform knowledge creation tasks while in motion, not just in static physical locations; therefore it is also required that knowledge construction activities be performed in ubiquitous scenarios, and supported by mobile and pervasive computational systems. These knowledge creation systems should help people in or outside organizations convert their tacit knowledge into explicit knowledge, thus supporting the knowledge construction process. Therefore in our understanding, we consider highly relevant that undergraduate university students learn about the knowledge construction process supported by mobile and ubiquitous computing. This has been a little explored issue in this field. This paper presents the design, implementation, and an evaluation of a system called MCKC for Mobile Collaborative Knowledge Construction, supporting collaborative face-to-face tacit knowledge construction and sharing in ubiquitous scenarios. The MCKC system can be used by undergraduate students to learn how to construct knowledge, allowing them anytime and anywhere to create, make explicit and share their knowledge with their co-learners, using visual metaphors, gestures and sketches to implement the human-computer interface of mobile devices (PDAs). PMID:22969333

  2. Developing a framework for transferring knowledge into action: a thematic analysis of the literature

    PubMed Central

    Ward, Vicky; House, Allan; Hamer, Susan

    2010-01-01

    Objectives: Although there is widespread agreement about the importance of transferring knowledge into action, we still lack high quality information about what works, in which settings and with whom. Whilst there are a large number of models and theories for knowledge transfer interventions, they are untested, meaning that their applicability and relevance is largely unknown. This paper describes the development of a conceptual framework of translating knowledge into action and discusses how it can be used for developing a useful model of the knowledge transfer process. Methods: A narrative review of the knowledge transfer literature identified 28 different models which explained all or part of the knowledge transfer process. The models were subjected to a thematic analysis to identify individual components and the types of processes used when transferring knowledge into action. The results were used to build a conceptual framework of the process. Results: Five common components of the knowledge transfer process were identified: problem identification and communication; knowledge/research development and selection; analysis of context; knowledge transfer activities or interventions; and knowledge/research utilization. We also identified three types of knowledge transfer processes: a linear process; a cyclical process; and a dynamic multidirectional process. From these results a conceptual framework of knowledge transfer was developed. The framework illustrates the five common components of the knowledge transfer process and shows that they are connected via a complex, multidirectional set of interactions. As such the framework allows for the individual components to occur simultaneously or in any given order and to occur more than once during the knowledge transfer process. Conclusion: Our framework provides a foundation for gathering evidence from case studies of knowledge transfer interventions. 
We propose that future empirical work is designed to test and refine the relevant importance and applicability of each of the components in order to build more useful models of knowledge transfer which can serve as a practical checklist for planning or evaluating knowledge transfer activities. PMID:19541874

  3. Knowledge Acquisition Using Linguistic-Based Knowledge Analysis

    Treesearch

    Daniel L. Schmoldt

    1998-01-01

    Most knowledge-based system development efforts include acquiring knowledge from one or more sources. The difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them...

  4. Effects of Prior Knowledge on Memory: Implications for Education

    ERIC Educational Resources Information Center

    Shing, Yee Lee; Brod, Garvin

    2016-01-01

    The encoding, consolidation, and retrieval of events and facts form the basis for acquiring new skills and knowledge. Prior knowledge can enhance those memory processes considerably and thus foster knowledge acquisition. But prior knowledge can also hinder knowledge acquisition, in particular when the to-be-learned information is inconsistent with…

  5. Knowledge Management in Libraries in the 21st Century.

    ERIC Educational Resources Information Center

    Shanhong, Tang

    This paper begins with a section that describes characteristics of knowledge management in libraries, including: human resource management is the core of knowledge management in libraries; the objective of knowledge management in libraries is to promote knowledge innovation; and information technology is a tool for knowledge management in…

  6. Doctoring the Knowledge Worker

    ERIC Educational Resources Information Center

    Tennant, Mark

    2004-01-01

    In this paper I examine the impact of the new 'knowledge economy' on contemporary doctoral education. I argue that the knowledge economy promotes a view of knowledge and knowledge workers that fundamentally challenges the idea of a university as a community of autonomous scholars transmitting and adding to society's 'stock of knowledge'. The paper…

  7. The Meta-Ontology Model of the Fishdisease Diagnostic Knowledge Based on Owl

    NASA Astrophysics Data System (ADS)

    Shi, Yongchang; Gao, Wen; Hu, Liang; Fu, Zetian

    To improve the availability and reusability of knowledge in the fish disease diagnosis (FDD) domain and to facilitate knowledge acquisition, an ontology model of FDD knowledge was developed in OWL according to the FDD knowledge model. It includes the terminology of FDD knowledge and the hierarchies of its classes.
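    The core of such an OWL model is a subclass hierarchy over domain terms. A minimal sketch of that idea follows; the disease and symptom class names are invented for illustration (they are not taken from the paper), and a real implementation would use an OWL toolkit rather than plain Python.

```python
# Sketch of the kind of class/term hierarchy an OWL model of
# fish-disease-diagnosis (FDD) knowledge encodes. Class names here
# are hypothetical illustrations, not taken from the paper.

class Ontology:
    def __init__(self):
        self.parents = {}  # subclass -> superclass

    def add_subclass(self, child, parent):
        self.parents[child] = parent

    def ancestors(self, term):
        """Walk up the subclass chain, as an OWL reasoner would."""
        chain = []
        while term in self.parents:
            term = self.parents[term]
            chain.append(term)
        return chain

fdd = Ontology()
fdd.add_subclass("BacterialDisease", "FishDisease")
fdd.add_subclass("Furunculosis", "BacterialDisease")
fdd.add_subclass("Symptom", "DiagnosticConcept")
fdd.add_subclass("SkinUlcer", "Symptom")

print(fdd.ancestors("Furunculosis"))  # ['BacterialDisease', 'FishDisease']
```

    In a full OWL model the same hierarchy would be expressed with `rdfs:subClassOf` axioms, which also lets a reasoner infer class membership of diagnosed cases.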

  8. Knowledge Construction and Knowledge Representation in High School Students' Design of Hypermedia Documents

    ERIC Educational Resources Information Center

    Chen, Pearl; McGrath, Diane

    2003-01-01

    This study documented the processes of knowledge construction and knowledge representation in high school students' hypermedia design projects. Analysis of knowledge construction in linking and structural building yielded distinct types and subtypes of hypermedia documents, which were characterized by four features of knowledge representation: (a)…

  9. Depth of Teachers' Knowledge: Frameworks for Teachers' Knowledge of Mathematics

    ERIC Educational Resources Information Center

    Holmes, Vicki-Lynn

    2012-01-01

    This article describes seven teacher knowledge frameworks and relates these frameworks to the teaching and assessment of elementary teacher's mathematics knowledge. The frameworks classify teachers' knowledge and provide a vocabulary and common language through which knowledge can be discussed and assessed. These frameworks are categorized into…

  10. Examination of Mathematics Teachers' Pedagogical Content Knowledge of Probability

    ERIC Educational Resources Information Center

    Danisman, Sahin; Tanisli, Dilek

    2017-01-01

    The aim of this study is to explore the probability-related pedagogical content knowledge (PCK) of secondary school mathematics teachers in terms of content knowledge, curriculum knowledge, student knowledge, and knowledge of teaching methods and strategies. Case study design, a qualitative research model, was used in the study, and the…

  11. The Case for Creative Abrasion: Experts Speak Out on Knowledge Management.

    ERIC Educational Resources Information Center

    Cowley-Durst, Barbara; Christensen, Hal D.; Degler, Duane; Weidner, Douglas; Feldstein, Michael

    2001-01-01

    Five knowledge management (KM) experts discuss answers to six fundamental issues of KM that address: a definition of knowledge and KM; relationship between business and KM; whether technology has helped the knowledge worker; relationship between learning, performance, knowledge, and community; the promise of knowledge ecology or ecosystem and…

  12. Research on the construction of three level customer service knowledge graph

    NASA Astrophysics Data System (ADS)

    Cheng, Shi; Shen, Jiajie; Shi, Quan; Cheng, Xianyi

    2017-09-01

    With the explosion of enterprise knowledge and information, and the growing demand for intelligent knowledge management to improve business performance, enterprise knowledge representation and processing have become a hot topic. Addressing problems in the theory and methods for building an electric-marketing customer service knowledge map, a three-level customer service knowledge map is discussed, and knowledge reasoning is realized with Neo4j, achieving good results in practical application.
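    The reasoning over a three-level knowledge map amounts to multi-hop traversal from a domain node through topic nodes to concrete answers. A toy sketch of that traversal follows; the node names, relation types, and exact level split are illustrative assumptions, and the paper's system stores the graph in Neo4j and would express the same query in Cypher.

```python
# Sketch of a three-level customer-service knowledge graph and a simple
# two-hop reasoning query. Node and relation names are hypothetical.

edges = {
    # level 1: business domain -> level 2: service topic
    ("ElectricMarketing", "HAS_TOPIC", "Billing"),
    ("ElectricMarketing", "HAS_TOPIC", "OutageReport"),
    # level 2: service topic -> level 3: concrete answer/solution
    ("Billing", "ANSWERED_BY", "ExplainTariffSteps"),
    ("OutageReport", "ANSWERED_BY", "DispatchRepairCrew"),
}

def answers_for(domain):
    """Two-hop reasoning: domain -> topics -> answers."""
    topics = {t for (s, r, t) in edges if s == domain and r == "HAS_TOPIC"}
    return sorted(t for (s, r, t) in edges
                  if s in topics and r == "ANSWERED_BY")

print(answers_for("ElectricMarketing"))
# ['DispatchRepairCrew', 'ExplainTariffSteps']
```

    In Neo4j the equivalent pattern would be a variable-length match such as `MATCH (d)-[:HAS_TOPIC]->()-[:ANSWERED_BY]->(a) RETURN a`.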

  13. Incorporating linguistic knowledge for learning distributed word representations.

    PubMed

    Wang, Yan; Liu, Zhiyuan; Sun, Maosong

    2015-01-01

    Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representation achieves better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining.
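    The intuition behind knowledge regularization of this kind is a penalty term that pulls the vectors of words linked in a knowledge base toward each other. A toy gradient-descent sketch follows; the words, the link, and the learning rate are illustrative assumptions, and the actual KRWR models combine such a penalty with a full neural language-model objective rather than optimizing it alone.

```python
import math
import random

# Toy sketch of link-based knowledge regularization: repeatedly shrink
# the distance between vectors of linked words. All data are invented.

random.seed(0)
DIM = 4
vecs = {w: [random.gauss(0, 1) for _ in range(DIM)]
        for w in ["car", "automobile", "banana"]}
links = [("car", "automobile")]  # link-based knowledge from a KB

def dist(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def regularize(vecs, links, lr=0.1, steps=50):
    """Gradient steps on sum of ||v_a - v_b||^2 over linked pairs."""
    for _ in range(steps):
        for a, b in links:
            for i in range(DIM):
                d = vecs[a][i] - vecs[b][i]
                vecs[a][i] -= lr * d
                vecs[b][i] += lr * d
    return vecs

before = dist(vecs["car"], vecs["automobile"])
regularize(vecs, links)
after = dist(vecs["car"], vecs["automobile"])
print(after < before)  # True: linked words moved closer together
```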

  14. Comparison of clinical knowledge bases for summarization of electronic health records.

    PubMed

    McCoy, Allison B; Sittig, Dean F; Wright, Adam

    2013-01-01

    Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that is difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The number of overlapping pairs across knowledge bases was low, with one knowledge base having half of the pairs overlapping with another knowledge base, and most having less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods to integrate the knowledge bases.
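    The overlap comparison the abstract describes reduces to set intersection over (problem, medication) pairs. A minimal sketch follows; the example pairs are invented for illustration and are not drawn from any of the five knowledge bases studied.

```python
# Sketch of comparing problem-medication knowledge bases by overlap.
# Each KB is a set of (problem, medication) pairs; the example pairs
# are hypothetical.

kb_a = {("diabetes", "metformin"), ("hypertension", "lisinopril"),
        ("asthma", "albuterol")}
kb_b = {("diabetes", "metformin"), ("asthma", "albuterol"),
        ("gerd", "omeprazole")}

def overlap(kb1, kb2):
    """Fraction of kb1's pairs that also appear in kb2."""
    return len(kb1 & kb2) / len(kb1)

print(overlap(kb_a, kb_b))  # 2 of 3 pairs shared -> 0.666...
```

    Note that the measure is asymmetric: a small KB can overlap heavily with a large one while the reverse fraction stays low, which matches the paper's observation that most KBs shared less than a third of their pairs.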

  15. Incorporating Linguistic Knowledge for Learning Distributed Word Representations

    PubMed Central

    Wang, Yan; Liu, Zhiyuan; Sun, Maosong

    2015-01-01

    Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representation achieves better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining. PMID:25874581

  16. Scotland's knowledge network: a progress report on Knowledge into Action.

    PubMed

    Wales, Ann; Boyle, Derek

    2015-11-01

    Launched in 2012, Knowledge into Action is the national knowledge management strategy for the health and social care workforce in Scotland. It is transforming the role of the national digital knowledge service--NHS Education for Scotland's Knowledge Network--and the NHSS librarian role to offer more active, tailored support for translating knowledge into frontline clinical practice. This includes the development of a national evidence search and summary service, help with converting knowledge into practical and usable formats for easy use at point of care, and with using digital tools to share clinicians' learning, experience and expertise. Through this practical support, Knowledge into Action is contributing to quality and safety outcomes across NHS Scotland, building clinicians' capacity and capability in applying knowledge in frontline practice and service improvement. © The Author(s) 2015.

  17. Sexual knowledge and victimization in adults with autism spectrum disorders.

    PubMed

    Brown-Lavoie, S M; Viecili, M A; Weiss, J A

    2014-09-01

    There is a significant gap in understanding the risk of sexual victimization in individuals with autism spectrum disorders (ASD) and the variables that contribute to risk. Age appropriate sexual interest, limited sexual knowledge and experiences, and social deficits, may place adults with ASD at increased risk. Ninety-five adults with ASD and 117 adults without ASD completed questionnaires regarding sexual knowledge sources, actual knowledge, perceived knowledge, and sexual victimization. Individuals with ASD obtained less of their sexual knowledge from social sources, more sexual knowledge from non-social sources, had less perceived and actual knowledge, and experienced more sexual victimization than controls. The increased risk of victimization by individuals with ASD was partially mediated by their actual knowledge. The link between knowledge and victimization has important clinical implications for interventions.

  18. On the acquisition and representation of procedural knowledge

    NASA Technical Reports Server (NTRS)

    Saito, T.; Ortiz, C.; Loftin, R. B.

    1992-01-01

    Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, with a focus on knowledge acquisition for procedural tasks and special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), are described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulties, and costs of developing knowledge-based systems for the performance of procedural tasks.
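    Production rules of the kind TARGET generates are typically executed by forward chaining: fire any rule whose conditions are satisfied by the current facts, add its conclusion, and repeat until nothing new fires. A minimal sketch follows; the rules and facts are made-up illustrations of a procedural task, not taken from TARGET itself.

```python
# Minimal forward-chaining sketch over production rules of the form
# "IF conditions THEN conclusion". Rules and facts are hypothetical.

rules = [
    ({"power_on"}, "run_self_test"),
    ({"run_self_test", "test_passed"}, "load_procedure"),
]
facts = {"power_on", "test_passed"}

def forward_chain(rules, facts):
    """Fire rules until no new facts are derived (fixed point)."""
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(forward_chain(rules, set(facts))))
# ['load_procedure', 'power_on', 'run_self_test', 'test_passed']
```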

  19. Sexually active older Australians' knowledge of sexually transmitted infections and safer sexual practices.

    PubMed

    Lyons, Anthony; Heywood, Wendy; Fileborn, Bianca; Minichiello, Victor; Barrett, Catherine; Brown, Graham; Hinchliff, Sharron; Malta, Sue; Crameri, Pauline

    2017-06-01

    Rates of sexually transmitted infections (STIs) are rising among older Australians. We conducted a large survey of older people's knowledge of STIs and safer sexual practices. A total of 2,137 Australians aged 60 years and older completed the survey, which included 15 questions assessing knowledge of STIs and safer sexual practices. We examined both levels of knowledge and factors associated with an overall knowledge score. In total, 1,652 respondents reported having sex in the past five years and answered all knowledge questions. This group had good general knowledge but poorer knowledge in areas such as the protection offered by condoms and potential transmission modes for specific STIs. Women had better knowledge than men. Men in their 60s, men with higher education levels, and men who thought they were at risk of STIs reported better knowledge than other men. Knowledge was also better among men and women who had been tested for STIs or reported 'other' sources of knowledge on STIs. Many older Australians lack knowledge of STIs and safer sexual practices. Implications for public health: To reverse current trends toward increasing STI diagnoses in this population, policies and education campaigns aimed at improving knowledge levels may need to be considered. © 2017 The Authors.

  20. Local ecological knowledge among Baka children: a case of “children's culture” ?

    PubMed Central

    Gallois, Sandrine; Duda, Romain; Reyes-García, Victoria

    2016-01-01

    Childhood is an extensive life period specific to the human species and a key stage for development. Considering the importance of childhood for cultural transmission, we test the existence of a 'children's culture', or child-specific knowledge and practices not necessarily shared with adults, among the Baka in Southeast Cameroon. Using structured questionnaires, we collected data among 69 children and 175 adults to assess the ability to name, identify, and conceptualize animals and wild edibles. We found that some of the ecological knowledge related to small mammals and birds reported by Baka children was not reported by adults. We also found similarities between children's and adults' knowledge, both regarding the content of knowledge and how knowledge is distributed. Thus, children in middle childhood hold knowledge similar to that of adults, especially related to wild edibles. Moreover, as children age, they start shedding child-specific knowledge and holding more adult knowledge. Additionally, echoing the gendered knowledge distribution present in adulthood, from middle childhood onward there are differences in the knowledge held by boys and girls. We discuss our results, highlighting the existence of specific ecological knowledge held by Baka children, the overlap between children's and adults' knowledge, and the changes in children's ecological knowledge as they move into adulthood. PMID:28386157

  1. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to the evaluation of DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is to provide a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method which can provide an objective metric and a selection basis for DR knowledge reuse during the product design process. In addition, the method proves to give more effective guidance and support for the application and management of DR knowledge.

  2. Knowledge Creation in Nursing Education

    PubMed Central

    Hassanian, Zahra Marzieh; Ahanchian, Mohammad Reza; Ahmadi, Suleiman; Gholizadeh, Rezvan Hossein; Karimi-Moonaghi, Hossein

    2015-01-01

    In today’s society, knowledge is recognized as a valuable social asset and the educational system is in search of a new strategy that allows them to construct their knowledge and experience. The purpose of this study was to explore the process of knowledge creation in nursing education. In the present study, the grounded theory approach was used. This method provides a comprehensive approach to collecting, organizing, and analyzing data. Data were obtained through 17 semi-structured interviews with nursing faculties and nursing students. Purposeful and theoretical sampling was conducted. Based on the method of Strauss and Corbin, the data were analyzed using fragmented, deep, and constant-comparative methods. The main categories included striving for growth and reduction of ambiguity, use of knowledge resources, dynamism of mind and social factors, converting knowledge, and creating knowledge. Knowledge was converted through mind processes, individual and group reflection, praxis and research, and resulted in the creation of nursing knowledge. Discrete nursing knowledge is gained through disconformity research in order to gain more individual advantages. The consequence of this analysis was gaining new knowledge. Knowledge management must be included in the mission and strategic planning of nursing education, and it should be planned through operational planning in order to create applicable knowledge. PMID:25716383

  3. A microgenetic study of learning about the molecular theory of matter and chemical reactions

    NASA Astrophysics Data System (ADS)

    Chinn, Clark Allen

    This paper reports the results of an experimental microgenetic study of children learning complex knowledge from text and experiments. The study had two goals. The first was to investigate fine-grained, moment-to-moment changes in knowledge as middle-school students learned about molecules and chemical reactions over thirteen sessions. The second was to investigate the effects of two instructional treatments, one using implicit textbook explanations and one using explicit explanations developed according to a theory of how scientific knowledge is structured. In the study, 61 sixth- and seventh-graders worked one on one with undergraduate instructors in eleven sessions of about 50 to 80 minutes. The instructors guided the students in conducting experiments and thinking out loud about texts. Topics studied included molecules, states of matter, chemical reactions, and heat transfer. A dense array of questions provided a detailed picture of children's moment-to-moment and day-to-day changes in knowledge. Three results chapters address students' preinstructional knowledge, the effects of the experimental treatment at posttest, and five detailed case studies of students' step-by-step knowledge change over eleven sessions. The chapter on preinstructional knowledge discussed three aspects of global knowledge change: conceptual change, coherence, and entrenchment. Notably, this chapter provides systematic evidence that children's knowledge was fragmented and that consistency with general unifying principles did not guarantee a highly coherent body of knowledge. The experimental manipulation revealed a strong advantage for explicit explanations over implicit textbook explanations. Multiple explicit explanations (e.g., highly explicit explanations of three or four chemical reactions) appeared to be necessary for students to master key concepts. 
Microgenetic analyses of five cases addressed eight empirical issues that should be addressed by any theory of knowledge acquisition: (a) the nature of the overall knowledge change, (b) the progression of intermediate states during knowledge change, (c) initiators of knowledge change, (d) interactions of prior background knowledge and prior domain knowledge during knowledge changes, (e) the fate of old and new knowledge, (f) the relationship between belief and knowledge, (g) changes in meta-awareness, and (h) factors that influenced the course of knowledge change.

  4. Combining factual and heuristic knowledge in knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Hull, Richard; Karr, Clark; Hosken, Bruce; Verhagen, William

    1992-01-01

    A knowledge acquisition technique that combines heuristic and factual knowledge represented as two hierarchies is described. These ideas were applied to the construction of a knowledge acquisition interface to the Expert System Analyst (OPERA). The goal of OPERA is to improve the operations support of the computer network in the space shuttle launch processing system. The knowledge acquisition bottleneck lies in gathering knowledge from human experts and transferring it to OPERA. OPERA's knowledge acquisition problem is approached as a classification problem-solving task, combining this approach with the use of factual knowledge about the domain. The interface was implemented in a Symbolics workstation making heavy use of windows, pull-down menus, and other user-friendly devices.

  5. A Discussion of Knowledge Based Design

    NASA Technical Reports Server (NTRS)

    Wood, Richard M.; Bauer, Steven X. S.

    1999-01-01

    A discussion of knowledge and Knowledge-Based design as related to the design of aircraft is presented. The paper discusses the perceived problem with existing design studies and introduces the concepts of design and knowledge for a Knowledge-Based design system. A review of several Knowledge-Based design activities is provided. A Virtual Reality, Knowledge-Based system is proposed and reviewed. The feasibility of Virtual Reality to improve the efficiency and effectiveness of aerodynamic and multidisciplinary design, evaluation, and analysis of aircraft through the coupling of virtual reality technology and a Knowledge-Based design system is also reviewed. The final section of the paper discusses future directions for design and the role of Knowledge-Based design.

  6. Terminological reference of a knowledge-based system: the data dictionary.

    PubMed

    Stausberg, J; Wormek, A; Kraut, U

    1995-01-01

    The development of open and integrated knowledge bases makes new demands on the definition of the terminology used. The definition should be realized in a data dictionary separated from the knowledge base. Within the work done on a reference model of medical knowledge, a data dictionary has been developed and used in different applications: a term definition shell, a documentation tool and a knowledge base. The data dictionary includes that part of the terminology which is largely independent of a certain knowledge model. For that reason, the data dictionary can be used as a basis for integrating knowledge bases into information systems, for knowledge sharing and reuse, and for modular development of knowledge-based systems.

  7. Research and Application of Knowledge Resources Network for Product Innovation

    PubMed Central

    Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian

    2015-01-01

    In order to enhance the knowledge service capabilities of a product innovation design service platform, a method of acquiring knowledge resources supporting product innovation from the Internet and providing active knowledge push is proposed. Through knowledge modeling for product innovation based on ontology, the integrated architecture of the knowledge resources network is put forward. The technology for the acquisition of network knowledge resources based on a focused crawler and web services is studied. Active knowledge push is provided for users through user behavior analysis and knowledge evaluation, in order to improve users' enthusiasm for participating in the platform. Finally, an application example is illustrated to prove the effectiveness of the method. PMID:25884031
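    A focused crawler differs from a general one by scoring each candidate page against the target topic and only fetching pages above a threshold. A minimal sketch of that scoring decision follows; the topic keywords, the scoring rule, and the threshold are illustrative assumptions, not the paper's actual relevance model.

```python
# Sketch of the relevance gate in a focused crawler: score a page
# against topic keywords and crawl only if it passes a threshold.
# Keywords, scoring rule, and threshold are hypothetical.

TOPIC_KEYWORDS = {"innovation", "design", "patent", "product"}

def relevance(page_text, keywords=TOPIC_KEYWORDS):
    """Share of topic keywords that appear as words in the page."""
    words = set(page_text.lower().split())
    return len(keywords & words) / len(keywords)

def should_crawl(page_text, threshold=0.5):
    return relevance(page_text) >= threshold

text = "New patent filed for an innovative product design"
print(relevance(text))      # 3 of 4 keywords present -> 0.75
print(should_crawl(text))   # True
```

    Real focused crawlers usually replace this keyword share with a trained topic classifier or TF-IDF similarity, but the fetch/skip decision has the same shape.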

  8. Level and determinants of diabetes knowledge in patients with diabetes in Zimbabwe: a cross-sectional study

    PubMed Central

    Mufunda, Esther; Wikby, Kerstin; Björn, Albin; Hjelm, Katarina

    2012-01-01

    Introduction A previous study of beliefs about health and illness in Zimbabweans with diabetes mellitus indicated limited knowledge about diabetes and the body, affecting self-care and health-care seeking behaviour. The aim of this study was to assess the level of diabetes knowledge in Zimbabwean adults with diabetes mellitus, to determine the main gaps in knowledge and identify the socio-demographic and diabetes-related determinants that predict diabetes awareness and self-care practices. Methods A cross-sectional descriptive study was performed using a standardized self-report Diabetes Knowledge Test questionnaire (DKT) of 58 respondents, 32 women and 26 men. Results were analysed with descriptive and analytic statistical methods. Results The majority of the respondents scored average knowledge on all three sub-scales: general knowledge, insulin use and total knowledge, with an overall score of 63.1 ± 14.2%. Major knowledge gaps were in areas related to diet, insulin use and glycaemic control. No significant differences in mean scores were detected in the diabetes knowledge sub-scales when comparisons were made of mean knowledge scores in relation to socio-demographic and diabetes-related characteristics. However, diabetes-related complications were significantly associated with lower total and general diabetes knowledge, and female gender was an independent determinant of low general knowledge. Conclusion Knowledge gaps were evident in areas regarding insulin use, diet and glycaemic control. Low diabetes knowledge was associated with female gender and could be a risk factor for development of diabetes-related complications. Knowledge gaps need to be addressed in diabetes education to prevent development of diabetes-related complications. PMID:23396799

  9. Developing a geoscience knowledge framework for a national geological survey organisation

    NASA Astrophysics Data System (ADS)

    Howard, Andrew S.; Hatton, Bill; Reitsma, Femke; Lawrie, Ken I. G.

    2009-04-01

    Geological survey organisations (GSOs) are established by most nations to provide a geoscience knowledge base for effective decision-making on mitigating the impacts of natural hazards and global change, and on sustainable management of natural resources. The value of the knowledge base as a national asset is continually enhanced by the exchange of knowledge between GSOs as data and information providers and the stakeholder community as knowledge 'users and exploiters'. Geological maps and associated narrative texts typically form the core of national geoscience knowledge bases, but have some inherent limitations as methods of capturing and articulating knowledge. Much knowledge about the three-dimensional (3D) spatial interpretation and its derivation and uncertainty, and the wider contextual value of the knowledge, remains intangible in the minds of the mapping geologist in implicit and tacit form. To realise the value of these knowledge assets, the British Geological Survey (BGS) has established a workflow-based cyber-infrastructure to enhance its knowledge management and exchange capability. Future geoscience surveys in the BGS will contribute to a national, 3D digital knowledge base on UK geology, with the associated implicit and tacit information captured as metadata, qualitative assessments of uncertainty, and documented workflows and best practice. Knowledge-based decision-making at all levels of society requires both the accessibility and reliability of knowledge to be enhanced in the grid-based world. Establishment of collaborative cyber-infrastructures and ontologies for geoscience knowledge management and exchange will ensure that GSOs, as knowledge-based organisations, can make their contribution to this wider goal.

  10. Exchanging clinical knowledge via Internet.

    PubMed

    Buchan, I E; Hanka, R

    1997-11-01

    The need for effective and efficient exchange of clinical knowledge is increasing. Paper-based methods for managing clinical knowledge are not meeting the demand for knowledge, and this has undoubtedly contributed to the widely reported failures of clinical guidelines. The Internet affords both opportunities and dangers for clinical knowledge. Systems such as Wax have demonstrated the importance of intuitive structure in the management of knowledge. We report on a new initiative for the global management of clinical knowledge.

  11. Case Study: The Transfer of Tacit Knowledge from Community College Full-Time to Adjunct Faculty

    ERIC Educational Resources Information Center

    Guzzo, Linda R.

    2013-01-01

    Knowledge is a valuable resource that fosters innovation and growth in organizations. There are two forms of knowledge: explicit knowledge or documented information and tacit knowledge or undocumented information which resides in individuals' minds. There is heightened interest in knowledge management and specifically the transfer of tacit…

  12. An Effective Assessment of Knowledge Sharing and E-Learning Portals

    ERIC Educational Resources Information Center

    Subramanian, D. Venkata; Geetha, Angelina; Shankar, P.

    2015-01-01

    In recent years, most of the companies have increasingly realized the importance of the knowledge sharing portal and E-Learning portals to provide competitive knowledge for their employees. The knowledge stored in these portals varies from technical, process and project knowledge functional or domain specific knowledge to face the competitiveness…

  13. University, Knowledge and Regional Development: Factors Affecting Knowledge Transfer in a Developing Region

    ERIC Educational Resources Information Center

    Fongwa, Neba Samuel; Marais, Lochner

    2016-01-01

    The role of knowledge in the current knowledge economy cannot be overly emphasised. Successful regions are continuously being linked to excellence in the production, accumulation, and application of knowledge. Universities have increasingly been at the centre of such knowledge production, application and transfer. Yet, there is little research and…

  14. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    ERIC Educational Resources Information Center

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused for innovative design work to support designers with product design knowledge and help designers who lack rich experiences to improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  15. Factors Influencing Knowledge Sharing among Undergraduate Students: A Malaysian Perspective

    ERIC Educational Resources Information Center

    Ong, Hway-Boon; Yeap, Peik-Foong; Tan, Siow-Hooi; Chong, Lee-Lee

    2011-01-01

    Knowledge sharing can enhance learning and help to build the knowledge workforce. This paper reports on a study of knowledge sharing behaviour among undergraduate students in Malaysia. Knowledge sharing was found to be influenced by the mechanisms used, various barriers to communication and the motivations behind knowledge sharing. The mechanisms…

  16. Relationship of the Cognitive Functions of Prospective Science Teachers and Their Knowledge, Knowledge Levels, Success and Success Levels

    ERIC Educational Resources Information Center

    Ismail, Yilmaz

    2017-01-01

    This study reveals the transformation of prospective science teachers into knowledgeable individuals through classical, combination, and information theories. It distinguishes between knowledge and success, and between knowledge levels and success levels, each calculated through the three theories. The relation between the knowledge of prospective…

  17. Knowledge-for-Action Theories in Evaluation: Knowledge Utilization, Diffusion, Implementation, Transfer, and Translation

    ERIC Educational Resources Information Center

    Ottoson, Judith M.

    2009-01-01

    Five knowledge-for-action theories are summarized and compared in this chapter for their evaluation implications: knowledge utilization, diffusion, implementation, transfer, and translation. Usually dispersed across multiple fields and disciplines, these theories are gathered here for a common focus on knowledge and change. Knowledge in some form…

  18. More Stake, Less Gravy? Issues of Knowledge and Power in Higher Education.

    ERIC Educational Resources Information Center

    Bak, Nelleke; Paterson, Andrew

    1997-01-01

    Two perspectives on higher education's stakeholders and their involvement in the development of knowledge in universities are examined and contrasted: (1) that the "stakeholder" notion of knowledge doesn't allow for critical engagement with knowledge, and (2) that the "stakeholder" view of knowledge acknowledges clear links between knowledge and…

  19. Prompted Journal Writing Supports Preservice History Teachers in Drawing on Multiple Knowledge Domains for Designing Learning Tasks

    ERIC Educational Resources Information Center

    Wäschle, Kristin; Lehmann, Thomas; Brauch, Nicola; Nückles, Matthias

    2015-01-01

    Becoming a history teacher requires the integration of pedagogical knowledge, pedagogical content knowledge, and content knowledge. Because the integration of knowledge from different disciplines is a complex task, we investigated prompted learning journals as a method to support teacher students' knowledge integration. Fifty-two preservice…

  20. Content-Related Knowledge of Biology Teachers from Secondary Schools: Structure and Learning Opportunities

    ERIC Educational Resources Information Center

    Großschedl, Jörg; Mahler, Daniela; Kleickmann, Thilo; Harms, Ute

    2014-01-01

    Teachers' content-related knowledge is a key factor influencing the learning progress of students. Different models of content-related knowledge have been proposed by educational researchers; most of them take into account three categories: content knowledge, pedagogical content knowledge, and curricular knowledge. As there is no consensus about…

  1. Exploring Biology Teachers' Pedagogical Content Knowledge in the Teaching of Genetics in Swaziland Science Classrooms

    ERIC Educational Resources Information Center

    Mthethwa-Kunene, Eunice; Onwu, Gilbert Oke; de Villiers, Rian

    2015-01-01

    This study explored the pedagogical content knowledge (PCK) and its development of four experienced biology teachers in the context of teaching school genetics. PCK was defined in terms of teacher content knowledge, pedagogical knowledge and knowledge of students' preconceptions and learning difficulties. Data sources of teacher knowledge base…

  2. Measuring Knowledge Elaboration Based on a Computer-Assisted Knowledge Map Analytical Approach to Collaborative Learning

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Huang, Ronghuai; Hwang, Gwo-Jen; Yang, Kaicheng

    2015-01-01

    The purpose of this study is to quantitatively measure the level of knowledge elaboration and explore the relationships between prior knowledge of a group, group performance, and knowledge elaboration in collaborative learning. Two experiments were conducted to investigate the level of knowledge elaboration. The collaborative learning objective in…

  3. Building Knowledge Cultures: Education and Development in the Age of Knowledge Capitalism

    ERIC Educational Resources Information Center

    Peters, Michael A.; Besley, A.C.

    2006-01-01

    This book develops the notion of "knowledge cultures" as a basis for understanding the possibilities of education and development in the age of knowledge capitalism. "Knowledge cultures" point to the significance of cultural preconditions in the new production of knowledge and how they are based on shared practices, embodying culturally preferred…

  4. World Knowledge and Global Citizenship: Factual and Perceived World Knowledge as Predictors of Global Citizenship Identification

    ERIC Educational Resources Information Center

    Reysen, Stephen; Katzarska-Miller, Iva; Gibson, Shonda A.; Hobson, Braken

    2013-01-01

    We examine the influence of factual and perceived world knowledge on global citizenship identification. Perceived world knowledge directly predicted global citizenship identification, while factual world knowledge did not (Study 1). Students' factual (Study 1) and perceived (Study 2) world knowledge predicted students' normative environment…

  5. Knowledge management in healthcare: towards 'knowledge-driven' decision-support services.

    PubMed

    Abidi, S S

    2001-09-01

    In this paper, we highlight the involvement of Knowledge Management in a healthcare enterprise. We argue that the 'knowledge quotient' of a healthcare enterprise can be enhanced by procuring diverse facets of knowledge from the seemingly placid healthcare data repositories, and subsequently operationalising the procured knowledge to derive a suite of Strategic Healthcare Decision-Support Services that can impact strategic decision-making, planning and management of the healthcare enterprise. In this paper, we firstly present a reference Knowledge Management environment, a Healthcare Enterprise Memory, with the functionality to acquire, share and operationalise the various modalities of healthcare knowledge. Next, we present the functional and architectural specification of a Strategic Healthcare Decision-Support Services Info-structure, which effectuates a synergy between knowledge procurement (vis-à-vis Data Mining) and knowledge operationalisation (vis-à-vis Knowledge Management) techniques to generate a suite of strategic knowledge-driven decision-support services. In conclusion, we argue that the proposed Healthcare Enterprise Memory is an attempt to rethink the possible sources of leverage to improve healthcare delivery, thereby providing a valuable strategic planning and management resource to healthcare policy makers.

  6. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

    PubMed Central

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314

  7. A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public.

    PubMed

    Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

    2016-01-01

    The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge.

  8. The relationship between knowledge of leadership and knowledge management practices in the food industry in Kurdistan province, Iran.

    PubMed

    Jad, Seyyed Mohammad Moosavi; Geravandi, Sahar; Mohammadi, Mohammad Javad; Alizadeh, Rashin; Sarvarian, Mohammad; Rastegarimehr, Babak; Afkar, Abolhasan; Yari, Ahmad Reza; Momtazan, Mahboobeh; Valipour, Aliasghar; Mahboubi, Mohammad; Karimyan, Azimeh; Mazraehkar, Alireza; Nejad, Ali Soleimani; Mohammadi, Hafez

    2017-12-01

    The aim of this study was to identify the relationship between knowledge-oriented leadership and knowledge management practices. In its quantitative strategy and data-collection procedure, the research is descriptive and correlational. The statistical population consists of all employees of the food industry in Kurdistan province, Iran, who were employed in 2016, about 1800 people in total. Using the Cochran formula, 316 employees of the Kurdistan food industry (Kurdistan FI) were selected. A non-random sampling method and valid, standardized questions were used to collect the data; reliability and validity were confirmed. Statistical analysis was carried out using SPSS 16. The analysis of the collected data showed a relationship between knowledge-oriented leadership and knowledge management activities as mediator variables. The results of the hypothesis tests suggest that knowledge management activities play an important role in product innovation performance, and that the knowledge management activities (knowledge transfer, knowledge storage, knowledge application, knowledge creation) affect the performance of product innovation.
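
    The 316-person sample drawn from a population of about 1800 is what Cochran's classic sample-size formula yields at a 95% confidence level and a 5% margin of error. A minimal sketch (the function name and default parameters are illustrative, not taken from the paper):

    ```python
    import math

    def cochran_sample_size(population, z=1.96, p=0.5, e=0.05):
        """Cochran's sample-size formula with finite-population correction.

        z -- z-score for the confidence level (1.96 for 95%)
        p -- estimated population proportion (0.5 is the conservative choice)
        e -- desired margin of error
        """
        n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population sample size
        return n0 / (1 + (n0 - 1) / population)  # finite-population correction

    # Population of roughly 1800 employees, as reported in the study:
    print(math.ceil(cochran_sample_size(1800)))  # about 316-317
    ```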

  9. Language knowledge and event knowledge in language use

    PubMed Central

    Willits, Jon A.; Amato, Michael S.; MacDonald, Maryellen C.

    2018-01-01

    This paper examines how semantic knowledge is used in language comprehension and in making judgments about events in the world. We contrast knowledge gleaned from prior language experience (“language knowledge”) and knowledge coming from prior experience with the world (“world knowledge”). In two corpus analyses, we show that previous research linking verb aspect and event representations have confounded language and world knowledge. Then, using carefully chosen stimuli that remove this confound, we performed four experiments that manipulated the degree to which language knowledge or world knowledge should be salient and relevant to performing a task, finding in each case that participants use the type of knowledge most appropriate to the task. These results provide evidence for a highly context-sensitive and interactionist perspective on how semantic knowledge is represented and used during language processing. PMID:25791750

  10. Perspectives on knowledge in engineering design

    NASA Technical Reports Server (NTRS)

    Rasdorf, W. J.

    1985-01-01

    Various perspectives are given on the knowledge currently used in engineering design, specifically dealing with knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.

  11. A Knowledge-Base for a Personalized Infectious Disease Risk Prediction System.

    PubMed

    Vinarti, Retno; Hederman, Lucy

    2018-01-01

    We present a knowledge-base to represent collated infectious disease risk (IDR) knowledge. The knowledge concerns the personal and contextual risk of contracting an infectious disease, obtained from declarative sources (e.g. Atlas of Human Infectious Diseases). Automated prediction requires encoding this knowledge in a form that can produce risk probabilities (e.g. Bayesian Network - BN). The knowledge-base presented in this paper feeds an algorithm that can auto-generate the BN. Knowledge from 234 infectious diseases was compiled. From this compilation, we designed an ontology and five rule types for modelling IDR knowledge in general. The evaluation aims to assess whether the knowledge-base structure, and its application to three disease-country contexts, meets the needs of a personalized IDR prediction system. The evaluation results show that the knowledge-base conforms to the system's purpose: personalized infectious disease risk prediction.

  12. Computer-assisted knowledge acquisition for hypermedia systems

    NASA Technical Reports Server (NTRS)

    Steuck, Kurt

    1990-01-01

    The usage of procedural and declarative knowledge to set up the structure or 'web' of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both procedural and prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate the link between blank, but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material concerning each step and declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily useable by computerized systems.

  13. Major accident prevention through applying safety knowledge management approach.

    PubMed

    Kalatpour, Omid

    2016-01-01

    Many scattered resources of knowledge are available for chemical accident prevention purposes. The common approach to process safety management, which includes using databases and referring to the available knowledge, has some drawbacks. The main goal of this article was to devise a new knowledge base (KB) for the chemical accident prevention domain. The scattered sources of safety knowledge were identified and scanned. Then, the collected knowledge was formalized through a computerized program. The Protégé software was used to formalize and represent the stored safety knowledge. The domain knowledge, along with related data and information, was retrieved. This approach improved the safety and health knowledge management (KM) process and resolved some typical problems in the KM process. Upgrading traditional safety databases into KBs can improve the interaction between users and the knowledge repository.

  14. What Did They Learn? Effects of a Brief Cognitive Behavioral Therapy Workshop on Community Therapists' Knowledge.

    PubMed

    Scott, Kelli; Klech, David; Lewis, Cara C; Simons, Anne D

    2016-11-01

    Knowledge gain has been identified as necessary but not sufficient for therapist behavior change. Declarative knowledge, or factual knowledge, is thought to serve as a prerequisite for procedural knowledge, the how to knowledge system, and reflective knowledge, the skill refinement system. The study aimed to examine how a 1-day workshop affected therapist cognitive behavioral therapy declarative knowledge. Participating community therapists completed a test before and after training that assessed cognitive behavioral therapy knowledge. Results suggest that the workshop significantly increased declarative knowledge. However, post-training total scores remained moderately low, with several questions answered incorrectly despite content coverage in the workshop. These findings may have important implications for structuring effective cognitive behavioral therapy training efforts and for the successful implementation of cognitive behavioral therapy in community settings.

  15. Team knowledge research: emerging trends and critical needs.

    PubMed

    Wildman, Jessica L; Thayer, Amanda L; Pavlas, Davin; Salas, Eduardo; Stewart, John E; Howse, William R

    2012-02-01

    This article provides a systematic review of the team knowledge literature and guidance for further research. Recent research has called attention to the need for the improved study and understanding of team knowledge. Team knowledge refers to the higher level knowledge structures that emerge from the interactions of individual team members. We conducted a systematic review of the team knowledge literature, focusing on empirical work that involves the measurement of team knowledge constructs. For each study, we extracted author degree area, study design type, study setting, participant type, task type, construct type, elicitation method, aggregation method, measurement timeline, and criterion domain. Our analyses demonstrate that many of the methodological characteristics of team knowledge research can be linked back to the academic training of the primary author and that there are considerable gaps in our knowledge with regard to the relationships between team knowledge constructs, the mediating mechanisms between team knowledge and performance, and relationships with criteria outside of team performance, among others. We also identify categories of team knowledge not yet examined based on an organizing framework derived from a synthesis of the literature. There are clear opportunities for expansion in the study of team knowledge; the science of team knowledge would benefit from a more holistic theoretical approach. Human factors researchers are increasingly involved in the study of teams. This review and the resulting organizing framework provide researchers with a summary of team knowledge research over the past 10 years and directions for improving further research.

  16. Whose Knowledge, Whose Development? Use and Role of Local and External Knowledge in Agroforestry Projects in Bolivia.

    PubMed

    Jacobi, Johanna; Mathez-Stiefel, Sarah-Lan; Gambon, Helen; Rist, Stephan; Altieri, Miguel

    2017-03-01

    Agroforestry often relies on local knowledge, which is gaining recognition in development projects. However, how local knowledge can articulate with external and scientific knowledge is little known. Our study explored the use and integration of local and external knowledge in agroforestry projects in Bolivia. In 42 field visits and 62 interviews with agroforestry farmers, civil society representatives, and policymakers, we found a diverse knowledge base. We examined how local and external knowledge contribute to livelihood assets and tree and crop diversity. Projects based predominantly on external knowledge tended to promote a single combination of tree and crop species and targeted mainly financial capital, whereas projects with a local or mixed knowledge base tended to focus on food security and increased natural capital (e.g., soil restoration) and used a higher diversity of trees and crops than those with an external knowledge base. The integration of different forms of knowledge can enable farmers to better cope with new challenges emerging as a result of climate change, fluctuating market prices for cash crops, and surrounding destructive land use strategies such as uncontrolled fires and aerial fumigation with herbicides. However, many projects still tended to prioritize external knowledge and undervalue local knowledge, a tendency that has long been institutionalized in the formal educational system and in extension services. More dialogue is needed between different forms of knowledge, which can be promoted by strengthening local organizations and their networks, reforming agricultural educational institutions, and working in close interaction with policymakers.

  17. Evaluating Environmental Knowledge Dimension Convergence to Assess Educational Programme Effectiveness

    NASA Astrophysics Data System (ADS)

    Liefländer, Anne K.; Bogner, Franz X.; Kibbe, Alexandra; Kaiser, Florian G.

    2015-03-01

    One aim of environmental education is fostering sustainable environmental action. Some environmental behaviour models suggest that this can be accomplished in part by improving people's knowledge. Recent studies have identified a distinct, psychometrically supported environmental knowledge structure consisting of system, action-related and effectiveness knowledge. Besides system knowledge, which is most often the focus of such studies, incorporating the other knowledge dimensions as well was suggested to enhance effectiveness. Our study is among the first to implement these dimensions together in an educational campaign and to use them to evaluate the effectiveness of a programme on water issues. We designed a four-day environmental education programme on water issues for students at an educational field centre. We applied a newly developed multiple-choice instrument using a pre-, post-, retention test design. The knowledge scales were calibrated with the Rasch model. In addition to the commonly assessed individual change in knowledge level, we also measured the change in knowledge convergence, the extent to which the knowledge dimensions merge as a person's environmental knowledge increases, as an innovative indicator of educational success. Following programme participation, students significantly improved in terms of amount learned in each knowledge dimension and in terms of integration of the knowledge dimensions. The effectiveness knowledge shows the least gain, persistence and convergence, which we explain by considering the dependence of the knowledge dimensions on each other. Finally, we discuss emerging challenges for educational researchers and practical implications for environmental educators.
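
    The Rasch calibration mentioned above rests on a one-parameter logistic model: the probability of a correct response depends only on the difference between a person's ability and the item's difficulty. A minimal sketch (names are illustrative, not taken from the paper):

    ```python
    import math

    def rasch_probability(theta, b):
        """Probability of a correct response under the dichotomous Rasch model,
        for a person of ability `theta` on an item of difficulty `b`."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # When ability equals item difficulty, the success probability is exactly 0.5:
    print(rasch_probability(0.7, 0.7))  # 0.5
    # A more able person has a higher success probability on the same item:
    print(rasch_probability(1.5, 0.7) > rasch_probability(0.0, 0.7))  # True
    ```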

  18. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed Central

    2013-01-01

    Background Medical knowledge encompasses both conceptual (facts or “what” information) and procedural knowledge (“how” and “why” information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Methods Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Conclusions Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula. PMID:23433202

  19. Relationships between core factors of knowledge management in hospital nursing organisations and outcomes of nursing performance.

    PubMed

    Lee, Eun Ju; Kim, Hong Soon; Kim, Hye Young

    2014-12-01

    The study was conducted to investigate the levels of implementation of knowledge management and outcomes of nursing performance, to examine the relationships between core knowledge management factors and nursing performance outcomes and to identify core knowledge management factors affecting these outcomes. Effective knowledge management is very important to achieve strong organisational performance. The success or failure of knowledge management depends on how effectively an organisation's members share and use their knowledge. Because knowledge management plays a key role in enhancing nursing performance, identifying the core factors and investigating the level of knowledge management in a given hospital are priorities to ensure a high quality of nursing for patients. The study employed a descriptive research procedure. The study sample consisted of 192 nurses registered in three large healthcare organisations in South Korea. The variables demographic characteristics, implementation of core knowledge management factors and outcomes of nursing performance were examined and analysed in this study. The relationships between the core knowledge management factors and outcomes of nursing performance as well as the factors affecting the performance outcomes were investigated. A knowledge-sharing culture and organisational learning were found to be core factors affecting nursing performance. The study results provide basic data that can be used to formulate effective knowledge management strategies for enhancing nursing performance in hospital nursing organisations. In particular, prioritising the adoption of a knowledge-sharing culture and organisational learning in knowledge management systems might be one method for organisations to more effectively manage their knowledge resources and thus to enhance the outcomes of nursing performance and achieve greater business competitiveness. The study results can contribute to the development of effective and efficient knowledge management systems and strategies for enhancing knowledge-sharing culture and organisational learning that can improve both the productivity and competitiveness of healthcare organisations. © 2014 John Wiley & Sons Ltd.

  20. Knowledge management: implications for human service organizations.

    PubMed

    Austin, Michael J; Claassen, Jennette; Vu, Catherine M; Mizrahi, Paola

    2008-01-01

    Knowledge management has recently taken a more prominent role in the management of organizations as worker knowledge and intellectual capital are recognized as critical to organizational success. This analysis explores the literature of knowledge management including the individual level of tacit and explicit knowledge, the networks and social interactions utilized by workers to create and share new knowledge, and the multiple organizational and managerial factors associated with effective knowledge management systems. Based on the role of organizational culture, structure, leadership, and reward systems, six strategies are identified to assist human service organizations with implementing new knowledge management systems.

  1. Architecture and Initial Development of a Knowledge-as-a-Service Activator for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Boisvert, Peter; Gittlen, Nate; Gross, Colin; Iott, Brad; Lagoze, Carl; Meng, George; Friedman, Charles P

    2018-01-01

    The Knowledge Grid (KGrid) is a research and development program toward infrastructure capable of greatly decreasing latency between the publication of new biomedical knowledge and its widespread uptake into practice. KGrid comprises digital knowledge objects, an online Library to store them, and an Activator that uses them to provide Knowledge-as-a-Service (KaaS). KGrid's Activator enables computable biomedical knowledge, held in knowledge objects, to be rapidly deployed at Internet-scale in cloud computing environments for improved health. Here we present the Activator, its system architecture and primary functions.

  2. Applications of Ontologies in Knowledge Management Systems

    NASA Astrophysics Data System (ADS)

    Rehman, Zobia; Kifor, Claudiu V.

    2014-12-01

    Enterprises are realizing that their core asset in the 21st century is knowledge. In an organization, knowledge resides in databases, knowledge bases, filing cabinets, and people's heads. Organizational knowledge is distributed in nature, and its poor management causes repetition of activities across the enterprise. To get true benefits from this asset, it is important for an organization to "know what they know". That is why many organizations are investing heavily in managing their knowledge. Artificial intelligence techniques have made a huge contribution to organizational knowledge management. In this article we review the applications of ontologies in the knowledge management realm.

  3. Collaborative research, knowledge and emergence.

    PubMed

    Zittoun, Tania; Baucal, Aleksandar; Cornish, Flora; Gillespie, Alex

    2007-06-01

    We use the notion of emergence to consider the sorts of knowledge that can be produced in a collaborative research project. The notion invites us to see collaborative work as a developmental dynamic system in which various changes constantly occur. Among these we examine two sorts of knowledge that can be produced: scientific knowledge, and collaborative knowledge. We argue that collaborative knowledge can enable researchers to reflectively monitor their collaborative project, so as to encourage its most productive changes. On the basis of examples taken from this special issue, we highlight four modes of producing collaborative knowledge and discuss the possible uses of such knowledge.

  4. Knowledge Integration in Global R&D Networks

    NASA Astrophysics Data System (ADS)

    Erkelens, Rose; van den Hooff, Bart; Vlaar, Paul; Huysman, Marleen

    This paper reports a qualitative study conducted at multinational organizations' R&D departments on their process of knowledge integration. Taking into account the knowledge-based view (KBV) of the firm and the practice-based view of knowledge, and building on the literatures concerning specialization and integration of knowledge in organizations, we explore which factors significantly influence the integration of knowledge between R&D units. The findings indicate (1) the relevant factors influencing knowledge integration processes and (2) that a thoughtful balance between engineering and emergent approaches is helpful in understanding and overcoming knowledge integration issues.

  5. 'Best practice' development and transfer in the NHS: the importance of process as well as product knowledge.

    PubMed

    Newell, Sue; Edelman, Linda; Scarbrough, Harry; Swan, Jacky; Bresnen, Mike

    2003-02-01

    A core prescription from the knowledge management movement is that the successful management of organizational knowledge will prevent firms from 'reinventing the wheel', in particular through the transfer of 'best practices'. Our findings challenge this logic. They suggest instead that knowledge is emergent and enacted in practice, and that normally those involved in a given practice have only a partial understanding of the overall practice. Generating knowledge about current practice is therefore a precursor to changing that practice. In this sense, knowledge transfer does not occur independently of or in sequence to knowledge generation, but instead the process of knowledge generation and its transfer are inexorably intertwined. Thus, rather than transferring 'product' knowledge about the new 'best practice' per se, our analysis suggests that it is more useful to transfer 'process' knowledge about effective ways to generate the knowledge of existing practice, which is the essential starting point for attempts to change that practice.

  6. Knowledge management systems success in healthcare: Leadership matters.

    PubMed

    Ali, Nor'ashikin; Tretiakov, Alexei; Whiddett, Dick; Hunter, Inga

    2017-01-01

    To deliver high-quality healthcare doctors need to access, interpret, and share appropriate and localised medical knowledge. Information technology is widely used to facilitate the management of this knowledge in healthcare organisations. The purpose of this study is to develop a knowledge management systems success model for healthcare organisations. A model was formulated by extending an existing generic knowledge management systems success model by including organisational and system factors relevant to healthcare. It was tested by using data obtained from 263 doctors working within two district health boards in New Zealand. Of the system factors, knowledge content quality was found to be particularly important for knowledge management systems success. Of the organisational factors, leadership was the most important, and more important than incentives. Leadership promoted knowledge management systems success primarily by positively affecting knowledge content quality. Leadership also promoted knowledge management use for retrieval, which should lead to the use of that better quality knowledge by the doctors, ultimately resulting in better outcomes for patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Recent study, but not retrieval, of knowledge protects against learning errors.

    PubMed

    Mullet, Hillary G; Umanath, Sharda; Marsh, Elizabeth J

    2014-11-01

    Surprisingly, people incorporate errors into their knowledge bases even when they have the correct knowledge stored in memory (e.g., Fazio, Barber, Rajaram, Ornstein, & Marsh, 2013). We examined whether heightening the accessibility of correct knowledge would protect people from later reproducing misleading information that they encountered in fictional stories. In Experiment 1, participants studied a series of target general knowledge questions and their correct answers either a few minutes (high accessibility of knowledge) or 1 week (low accessibility of knowledge) before exposure to misleading story references. In Experiments 2a and 2b, participants instead retrieved the answers to the target general knowledge questions either a few minutes or 1 week before the rest of the experiment. Reading the relevant knowledge directly before the story-reading phase protected against reproduction of the misleading story answers on a later general knowledge test, but retrieving that same correct information did not. Retrieving stored knowledge from memory might actually enhance the encoding of relevant misinformation.

  8. On knowledge transfer management as a learning process for ad hoc teams

    NASA Astrophysics Data System (ADS)

    Iliescu, D.

    2017-08-01

    Knowledge management represents an emerging domain that is becoming more and more important. Concepts like knowledge codification and personalisation, the knowledge life-cycle, social and technological dimensions, knowledge transfer and learning management are integral parts of it. The focus here is on the process of knowledge transfer in the case of ad hoc teams. The social dimension of knowledge transfer plays an important role: no single individual actor is involved in the process, but a collective one representing the organisation. It is critically important for knowledge to be managed from the life-cycle point of view, and a complex communication network needs to be in place to support the process of knowledge transfer. Two particular concepts, the bridge tie and transactive memory, can eventually enhance this communication. The paper focuses on an informational communication platform supporting collaborative work on knowledge transfer. The platform facilitates the creation of a topic language to be used in knowledge modelling, storage and reuse by ad hoc teams.

  9. Knowledge translation is the use of knowledge in health care decision making.

    PubMed

    Straus, Sharon E; Tetroe, Jacqueline M; Graham, Ian D

    2011-01-01

    To provide an overview of the science and practice of knowledge translation. Narrative review outlining what knowledge translation is and a framework for its use. Knowledge translation is defined as the use of knowledge in practice and decision making by the public, patients, health care professionals, managers, and policy makers. Failures to use research evidence to inform decision making are apparent across all these key decision maker groups. There are several proposed theories and frameworks for achieving knowledge translation. A conceptual framework developed by Graham et al., termed the knowledge-to-action cycle, provides an approach that builds on the commonalities found in an assessment of planned action theories. Review of the evidence base for the science and practice of knowledge translation has identified several gaps including the need to develop valid strategies for assessing the determinants of knowledge use and for evaluating sustainability of knowledge translation interventions. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. A Survey of Noninteractive Zero Knowledge Proof System and Its Applications

    PubMed Central

    Wu, Huixin; Wang, Feng

    2014-01-01

    Zero knowledge proof systems, which have received extensive attention since they were proposed, are an important branch of cryptography and computational complexity theory. In particular, a noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interactive complexity. This paper reviews and analyzes the basic principles of noninteractive zero knowledge proof systems, and summarizes the research progress achieved on the following aspects: the definition and related models of noninteractive zero knowledge proof systems, noninteractive zero knowledge proof systems for NP problems, noninteractive statistical and perfect zero knowledge, the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps, and the specific applications of noninteractive zero knowledge proof systems. This paper also points out future research directions. PMID:24883407
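
    The one-message structure described above can be illustrated with the classic Fiat-Shamir construction: an interactive Schnorr proof of knowledge of a discrete logarithm becomes noninteractive by deriving the challenge from a hash of the transcript. The sketch below uses toy group parameters chosen for readability, not security.

```python
import hashlib
import secrets

# Toy noninteractive zero-knowledge proof of knowledge of a discrete log:
# the Schnorr protocol made noninteractive via the Fiat-Shamir heuristic.
# The group parameters are tiny and for illustration only; real systems
# use groups of cryptographic size.
p = 2039   # safe prime: p = 2q + 1
q = 1019   # prime order of the subgroup generated by g
g = 4      # generator of the order-q subgroup mod p

def fiat_shamir_challenge(t, y):
    # The verifier's random challenge is replaced by a transcript hash.
    data = f"{g}:{y}:{t}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x):
    """Prove knowledge of x with y = g^x mod p, without revealing x."""
    y = pow(g, x, p)
    r = secrets.randbelow(q - 1) + 1   # ephemeral nonce
    t = pow(g, r, p)                   # commitment
    c = fiat_shamir_challenge(t, y)    # noninteractive challenge
    s = (r + c * x) % q                # response
    return y, (t, s)

def verify(y, proof):
    t, s = proof
    c = fiat_shamir_challenge(t, y)
    # Accept iff g^s == t * y^c (mod p).
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

    The entire proof is the single pair (t, s): the prover sends one message and the verifier checks it locally, which is exactly the noninteractive property the survey discusses.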

  11. Communication Policies in Knowledge Networks

    NASA Astrophysics Data System (ADS)

    Ioannidis, Evangelos; Varsakelis, Nikos; Antoniou, Ioannis

    2018-02-01

    Faster knowledge attainment within organizations leads to improved innovation, and therefore competitive advantage. Interventions on the organizational network may be risky or costly or time-demanding. We investigate several communication policies in knowledge networks, which reduce the knowledge attainment time without interventions. We examine the resulting knowledge dynamics for real organizational networks, as well as for artificial networks. More specifically, we investigate the dependence of knowledge dynamics on: (1) the Selection Rule of agents for knowledge acquisition, and (2) the Order of implementation of "Selection" and "Filtering". Significant decrease of the knowledge attainment time (up to -74%) can be achieved by: (1) selecting agents of both high knowledge level and high knowledge transfer efficiency, and (2) implementing "Selection" after "Filtering" in contrast to the converse, implicitly assumed, conventional prioritization. The Non-Commutativity of "Selection" and "Filtering" reveals a Non-Boolean Logic of the Network Operations. The results demonstrate that significant improvement of knowledge dynamics can be achieved by implementing "fruitful" communication policies, by raising the awareness of agents, without any intervention on the network structure.
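
    The reported non-commutativity of "Selection" and "Filtering" can be illustrated with a toy example (all agents and numbers invented): filtering out inefficient agents before selecting the most knowledgeable one yields a different choice than selecting first and filtering afterwards.

```python
# Schematic illustration, with made-up numbers, of why "Selection" and
# "Filtering" of knowledge sources do not commute. Filtering keeps agents
# whose transfer efficiency passes a threshold; Selection picks the agent
# with the highest knowledge level.

agents = {
    "A": {"knowledge": 0.9, "efficiency": 0.2},
    "B": {"knowledge": 0.7, "efficiency": 0.8},
    "C": {"knowledge": 0.5, "efficiency": 0.9},
}
THRESHOLD = 0.5

def filter_agents(pool):
    return {k: v for k, v in pool.items() if v["efficiency"] >= THRESHOLD}

def select_agent(pool):
    return max(pool, key=lambda k: pool[k]["knowledge"]) if pool else None

# Filtering first, then Selection: the most knowledgeable among the
# efficient agents is chosen.
filter_then_select = select_agent(filter_agents(agents))

# Selection first, then Filtering: the most knowledgeable agent "A" is
# chosen but then discarded for being an inefficient transmitter.
chosen = select_agent(agents)
select_then_filter = chosen if chosen in filter_agents(agents) else None
```

    With these numbers, filter-then-select yields agent "B" while select-then-filter yields no agent at all, mirroring the paper's point that the order of the two operations changes the outcome.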

  12. Knowledge Resources - A Knowledge Management Approach for Digital Ecosystems

    NASA Astrophysics Data System (ADS)

    Kurz, Thomas; Eder, Raimund; Heistracher, Thomas

    The paper at hand presents an innovative approach for the conception and implementation of knowledge management in Digital Ecosystems. Based on a reflection of Digital Ecosystem research of the past years, an architecture is outlined which utilizes Knowledge Resources as the central and simplest entities of knowledge transfer. After the discussion of the related conception, the result of a first prototypical implementation is described that helps the transformation of implicit knowledge to explicit knowledge for wide use.

  13. New knowledge-based genetic algorithm for excavator boom structural optimization

    NASA Astrophysics Data System (ADS)

    Hua, Haiyan; Lin, Shuwen

    2014-03-01

    Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to effectively solve the excavator boom structural optimization problem. To improve optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with the genetic algorithm is established to extract, handle and utilize shallow and deep implicit constraint knowledge to cyclically guide the optimal search. Based on this dual evolution mechanism, knowledge evolution and population evolution are connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. New knowledge-based selection, crossover and mutation operators are then proposed to integrate optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight test algorithms, which include different genetic operators, are used to solve the structural optimization of a medium-sized excavator boom. A comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and searching ability more markedly than the other test algorithms, which demonstrates the effectiveness of knowledge in guiding the optimal search. By combining multi-level knowledge evolution with numerical optimization, the proposed knowledge-based genetic algorithm provides a new effective method for solving complex engineering optimization problems.
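
    The dual-evolution idea, knowledge extracted during the search guiding the genetic operators, can be sketched generically. The code below is not the paper's operators or its excavator boom model; it is a minimal stand-in in which statistics of elite solutions (the "knowledge") bias mutation on a standard test function.

```python
import random

# Schematic real-coded genetic algorithm in which a simple "knowledge"
# archive (per-variable means of elite solutions) guides mutation. This
# is a generic illustration of knowledge-guided search, not the paper's
# actual operators or objective.

def sphere(x):
    # Stand-in objective: minimize the sum of squares (optimum at 0).
    return sum(v * v for v in x)

def knowledge_guided_ga(dim=5, pop_size=30, generations=100, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sphere)
        elites = pop[: pop_size // 5]
        # "Knowledge": per-variable mean of the elite solutions.
        centroid = [sum(e[i] for e in elites) / len(elites)
                    for i in range(dim)]
        children = elites[:]  # elitist survival of the best solutions
        while len(children) < pop_size:
            a, b = rng.sample(elites, 2)
            # Arithmetic crossover between two elite parents.
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]
            # Knowledge-guided mutation: drift toward the elite centroid
            # plus a small random perturbation.
            child = [ci + 0.5 * (mi - ci) + rng.gauss(0, 0.1)
                     for ci, mi in zip(child, centroid)]
            children.append(child)
        pop = children
    return min(pop, key=sphere)

best = knowledge_guided_ga()
```

    The point of the sketch is the feedback loop: the archive is re-extracted every generation from the evolving population, so the operators are steered by knowledge that itself evolves.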

  14. Experts’ and Novices’ Perception of Ignorance and Knowledge in Different Research Disciplines and Its Relation to Belief in Certainty of Knowledge

    PubMed Central

    Hansson, Isabelle; Buratti, Sandra; Allwood, Carl Martin

    2017-01-01

    Assessments of the extent of knowledge in a domain can be important since non-identified lack of knowledge may lead to decisions that do not consider the effect of relevant factors. Two studies examined experts’ and novices’ perception of their own ignorance and knowledge out of everything there is to know within their own and other disciplines and their assessments of their discipline’s, and other disciplines’ knowledge of all there is to know in each discipline. In total 380 experts and 401 students from the disciplines of history, medicine, physics, and psychology participated. The results for ignorance and knowledge assessments of one’s own knowledge were similar. Novices reported more ignorance and less knowledge in their own discipline than experts, but no differences were found in the assessments of how much is known in each discipline. General belief in certainty of knowledge was associated with the knowledge assessments and level of expertise. Finally, disciplinary differences were found both for the knowledge assessments and for belief in certainty of knowledge. Historians and physicists assessed that less was known in their own discipline out of all there is to know (approximately 40%), compared to the medics (about 50%). Historians believed least in certainty of knowledge and physicists most. Our results have practical implications for higher educational teaching and interdisciplinary collaboration. PMID:28367132

  15. The Google-ization of Knowledge

    ERIC Educational Resources Information Center

    Larson, Natasja; Parsons, Jim; Servage, Laura

    2007-01-01

    How has GOOGLE shaped knowledge? How has it shaped those who use it? This article considers the impact of online knowledge upon the content of knowledge and upon the people who seek it and create it. The authors suggest that 1. Google-ization is reshaping knowledge. 2. Google-ization is changing how knowledge counts as important. 3. Google-ization…

  16. Mapping out the Integration of the Components of Pedagogical Content Knowledge (PCK): Examples from High School Biology Classrooms

    ERIC Educational Resources Information Center

    Park, Soonhye; Chen, Ying-Chih

    2012-01-01

    This study explored the nature of the integration of the five components of pedagogical content knowledge (PCK): (a) Orientations toward Teaching Science, (b) Knowledge of Student Understanding, (c) Knowledge of Instructional Strategies and Representations, (d) Knowledge of Science Curriculum, and (e) Knowledge of Assessment of Science Learning.…

  17. Knowledge Exchange in the Shrines of Knowledge: The "How's" and "Where's" of Knowledge Sharing Processes

    ERIC Educational Resources Information Center

    Reychav, Iris; Te'eni, Dov

    2009-01-01

    Academic conferences are places of situated learning dedicated to the exchange of knowledge. Knowledge is exchanged between colleagues who are looking to enhance their future research by taking part in several formal and informal settings (lectures, discussions and social events). We studied the processes of knowledge sharing and the influence of…

  18. Teacher Knowledge of Attention Deficit Hyperactivity Disorder among Middle School Students in South Texas

    ERIC Educational Resources Information Center

    Guerra, Fred R., Jr.; Brown, Michelle S.

    2012-01-01

    This quantitative study examined the knowledge levels middle school teachers in South Texas have in relation to attention deficit hyperactivity disorder (ADHD). The study specifically compared teacher knowledge levels among three specific ADHD knowledge areas: (a) general knowledge of ADHD, (b) knowledge of symptoms/diagnosis of ADHD, and (c)…

  19. Using Doubly Latent Multilevel Analysis to Elucidate Relationships between Science Teachers' Professional Knowledge and Students' Performance

    ERIC Educational Resources Information Center

    Mahler, Daniela; Großschedl, Jörg; Harms, Ute

    2017-01-01

    Teachers make a difference for the outcome of their students in science classrooms. One focus in this context lies on teachers' professional knowledge. We describe this knowledge according to three domains, namely (1) content knowledge (CK), (2) pedagogical content knowledge (PCK), and (3) curricular knowledge (CuK). We hypothesise a positive…

  20. Approaching Knowledge Management through the Lens of the Knowledge Life Cycle: A Case Study Investigation

    ERIC Educational Resources Information Center

    Fowlin, Julaine M.; Cennamo, Katherine S.

    2017-01-01

    More organizational leaders are recognizing that their greatest competitive advantage is the knowledge base of their employees and for organizations to thrive knowledge management (KM) systems need to be in place that encourage the natural interplay and flow of tacit and explicit knowledge. Approaching KM through the lens of the knowledge life…

  1. Tell Me Why! Content Knowledge Predicts Process-Orientation of Math Researchers' and Math Teachers' Explanations

    ERIC Educational Resources Information Center

    Lachner, Andreas; Nückles, Matthias

    2016-01-01

    In two studies, we investigated the impact of instructors' different knowledge bases on the quality of their instructional explanations. In Study 1, we asked 20 mathematics teachers (with high pedagogical content knowledge, but lower content knowledge) and 15 mathematicians (with lower pedagogical content knowledge, but high content knowledge) to…

  2. Building Scalable Knowledge Graphs for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick; Zhang, Jia; Duan, Xiaoyi; Miller, J. J.; Bugbee, Kaylin; Christopher, Sundar; Freitag, Brian

    2017-01-01

    Knowledge Graphs link key entities in a specific domain with other entities via relationships. From these relationships, researchers can query knowledge graphs for probabilistic recommendations to infer new knowledge. Scientific papers are an untapped resource which knowledge graphs could leverage to accelerate research discovery. Goal: Develop an end-to-end (semi) automated methodology for constructing Knowledge Graphs for Earth Science.
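
    A knowledge graph of the kind described can be sketched as a store of (subject, relation, object) triples with simple link-following queries. The Earth science entities below are invented examples, not output of the actual NASA methodology.

```python
from collections import defaultdict

# Minimal sketch of a knowledge graph: entities linked by named relations,
# with a relation query and a crude multi-hop "related entities" search of
# the kind a recommender could build on. Entity names are invented.

class KnowledgeGraph:
    def __init__(self):
        self._out = defaultdict(list)  # subject -> [(relation, object)]

    def add(self, subject, relation, obj):
        self._out[subject].append((relation, obj))

    def query(self, subject, relation):
        """Return all objects linked to `subject` via `relation`."""
        return [o for r, o in self._out[subject] if r == relation]

    def related(self, subject, max_hops=2):
        """Entities reachable from `subject` within `max_hops` links."""
        seen, frontier = set(), {subject}
        for _ in range(max_hops):
            frontier = {o for s in frontier
                        for _, o in self._out[s]} - seen
            seen |= frontier
        return seen

kg = KnowledgeGraph()
kg.add("Paper-42", "measures", "precipitation")
kg.add("Paper-42", "uses_instrument", "GPM-DPR")
kg.add("GPM-DPR", "flies_on", "GPM-satellite")
```

    Even this toy graph supports the inference pattern the abstract mentions: a paper is linked to a satellite it never names directly, via the instrument relation.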

  3. Early Predictors of Middle School Fraction Knowledge

    PubMed Central

    Bailey, Drew H.; Siegler, Robert S.; Geary, David C.

    2014-01-01

    Recent findings that earlier fraction knowledge predicts later mathematics achievement raise the question of what predicts later fraction knowledge. Analyses of longitudinal data indicated that whole number magnitude knowledge in first grade predicted knowledge of fraction magnitudes in middle school, controlling for whole number arithmetic proficiency, domain general cognitive abilities, parental income and education, race, and gender. Similarly, knowledge of whole number arithmetic in first grade predicted knowledge of fraction arithmetic in middle school, controlling for whole number magnitude knowledge in first grade and the other control variables. In contrast, neither type of early whole number knowledge uniquely predicted middle school reading achievement. We discuss the implications of these findings for theories of numerical development and for improving mathematics learning. PMID:24576209

  4. Knowledge acquisition for temporal abstraction.

    PubMed

    Stein, A; Musen, M A; Shahar, Y

    1996-01-01

    Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
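
    The core task, turning raw time-stamped data into interval-based abstractions using domain knowledge, can be sketched as follows. The glucose-style thresholds are invented for illustration and are not from the PROTEGE-II tool.

```python
# Sketch of knowledge-based temporal abstraction: time-stamped raw values
# are classified by domain knowledge (here, invented threshold values) and
# merged into maximal intervals that share the same qualitative state.

def classify(value, low=70, high=140):
    # Domain knowledge: map a raw value to a qualitative state.
    if value < low:
        return "LOW"
    if value > high:
        return "HIGH"
    return "NORMAL"

def temporal_abstract(samples):
    """samples: list of (timestamp, value) pairs in time order.
    Returns a list of (start, end, state) interval abstractions."""
    intervals = []
    for t, v in samples:
        state = classify(v)
        if intervals and intervals[-1][2] == state:
            # Same abstract state as before: extend the current interval.
            start, _, s = intervals[-1]
            intervals[-1] = (start, t, s)
        else:
            intervals.append((t, t, state))
    return intervals

readings = [(0, 95), (1, 110), (2, 150), (3, 160), (4, 120)]
abstractions = temporal_abstract(readings)
```

    Five raw readings collapse into three meaningful intervals, which is the kind of interval-based summary the acquisition tool's knowledge is meant to drive.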

  5. Toward a Conceptual Knowledge Management Framework in Health

    PubMed Central

    Lau, Francis

    2004-01-01

    This paper describes a conceptual organizing scheme for managing knowledge within the health setting. First, a brief review of the notions of knowledge and knowledge management is provided. This is followed by a detailed depiction of our proposed knowledge management framework, which focuses on the concepts of production, use, and refinement of three specific knowledge sources-policy, evidence, and experience. These concepts are operationalized through a set of knowledge management methods and tools tailored for the health setting. We include two case studies around knowledge translation on parent-child relations and virtual networks in community health research to illustrate how this knowledge management framework can be operationalized within specific contexts and the issues involved. We conclude with the lessons learned and implications. PMID:18066388

  6. Exploring linkages between research, policy and practice in the Netherlands: perspectives on sexual and reproductive health and rights knowledge flows.

    PubMed

    de Haas, Billie; van der Kwaak, Anke

    2017-05-12

    The attention to and demand for stronger linkages between research, policy and practice are increasing, especially in fields concerned with sensitive and challenging issues such as sexual and reproductive health and rights (SRHR). The study described in this article was conducted in the Netherlands among actors working in international development, especially the domain of SRHR. It explores the perceived flow of knowledge between research, policy and practice, the perceived impeding factors, and suggested strategies for improvement. A narrative literature review was performed and 28 key informants were interviewed between May and August 2015. Most interviewees were either active or passive members of Share-Net Netherlands, an SRHR knowledge platform. All interviews, which lasted 70 minutes on average, were recorded, transcribed verbatim and coded in MAXQDA. Linkages between research, policy and practice are many and diffuse. The demands for and supplies of knowledge within and across the fields vary and do not always match, which is shown by participants' research purposes and approaches. Participants identified various barriers to strengthening knowledge flows, including a lack of familiarity with practices in other fields, power relations and the undervaluation of tacit knowledge. They suggested a more visible and concrete demand for and supply of knowledge, the development of a joint knowledge agenda, more opportunities for the interdisciplinary creation of knowledge, and the development of a system for learning and sharing knowledge. This study shows the willingness to undertake, and the perceived advantages of, interdisciplinary dialogues and joint creation of knowledge to advance SRHR research, policies and practices. Whereas barriers to the flow of knowledge may maintain present understandings of knowledge and of whose knowledge is valid, enabling factors, such as interactions between research, policy and practice in knowledge-sharing activities, may challenge such perceptions and create an enabling environment for generating innovative knowledge and increasing knowledge use. Knowledge platforms are recommended to place more emphasis on sharing and documenting tacit knowledge through interdisciplinary dialogues, to address power relations and to set criteria for interdisciplinary funding.

  7. Knowledge Management Implementation and the Tools Utilized in Healthcare for Evidence-Based Decision Making: A Systematic Review.

    PubMed

    Shahmoradi, Leila; Safadari, Reza; Jimma, Worku

    2017-09-01

    Healthcare is a knowledge-driven process, and thus knowledge management and the tools to manage knowledge in the healthcare sector are gaining attention. The aim of this systematic review is to investigate knowledge management implementation and the knowledge management tools used in healthcare for informed decision making. Three databases, two journal websites and Google Scholar were used as sources for the review. The key terms used to search relevant articles include: "Healthcare and Knowledge Management"; "Knowledge Management Tools in Healthcare" and "Community of Practices in Healthcare". It was found that utilization of knowledge management in healthcare is encouraging. There exist a number of opportunities for knowledge management implementation, though there are some barriers as well. Some of the opportunities that can transform healthcare are advances in health information and communication technology, clinical decision support systems, electronic health record systems, communities of practice and advanced care planning. Providing the right knowledge at the right time, i.e., at the point of decision making, by implementing knowledge management in healthcare is paramount. To do so, it is very important to use appropriate tools for knowledge management and user-friendly systems, because they can significantly improve the quality and safety of care provided for patients both at hospital and home settings.

  8. Knowledge diffusion in the collaboration hypernetwork

    NASA Astrophysics Data System (ADS)

    Yang, Guang-Yong; Hu, Zhao-Long; Liu, Jian-Guo

    2015-02-01

    As knowledge constitutes a primary productive force, it is important to understand the performance of knowledge diffusion. In this paper, we present a knowledge diffusion model based on the local-world non-uniform hypernetwork, which introduces the preferential diffusion mechanism and the knowledge absorptive capability α_j, where α_j is correlated with the hyperdegree d_H(j) of node j. At each time step, we randomly select a node i as the sender; a receiver node is selected from the set of nodes that the sender i has published with previously, with probability proportional to the number of papers they have published together. Applying the average knowledge stock V̄(t), the variance σ²(t) and the variance coefficient c(t) of knowledge stock to measure the growth and diffusion of knowledge and the adequacy of knowledge diffusion, we conducted three groups of comparative experiments to investigate how different network structures, hypernetwork sizes and knowledge evolution mechanisms affect knowledge diffusion. Because the diffusion mechanisms combine the hypernetwork structure with the hyperdegree of each node, the hypernetwork is well suited to investigating the performance of knowledge diffusion. The proposed model could therefore be helpful for deeply understanding the process of knowledge diffusion in the collaboration hypernetwork.
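
    A much-simplified simulation can follow the abstract's outline. The co-author weights, the functional form of α_j, and the absorption rule below are illustrative assumptions, not the paper's model.

```python
import random
from statistics import mean, pstdev

# Simplified knowledge diffusion over a collaboration structure: a random
# sender passes knowledge to a past co-author chosen with probability
# proportional to their joint papers, and the receiver absorbs with a
# capability alpha_j tied to its degree. Data and forms are toy assumptions.

coauthors = {   # node -> {co-author: number of joint papers}
    0: {1: 3, 2: 1},
    1: {0: 3, 2: 2},
    2: {0: 1, 1: 2, 3: 1},
    3: {2: 1},
}

def alpha(j):
    # Assumed absorptive capability: grows with the node's degree.
    d = len(coauthors[j])
    d_max = max(len(v) for v in coauthors.values())
    return 0.2 + 0.6 * d / d_max

def simulate(steps=2000, seed=7):
    rng = random.Random(seed)
    stock = {0: 5.0, 1: 2.0, 2: 1.0, 3: 0.5}  # initial knowledge stocks
    for _ in range(steps):
        i = rng.choice(list(stock))                    # random sender
        partners = list(coauthors[i])
        weights = [coauthors[i][j] for j in partners]
        j = rng.choices(partners, weights=weights)[0]  # preferential pick
        if stock[i] > stock[j]:
            stock[j] += alpha(j) * (stock[i] - stock[j])  # absorption
    return stock

final = simulate()
levels = list(final.values())
variance_coefficient = pstdev(levels) / mean(levels)
```

    In this toy run the variance coefficient of knowledge stock falls close to zero as every node's stock converges toward the initial maximum, the qualitative behavior the paper's measures V̄(t), σ²(t) and c(t) are designed to track.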

  9. Knowledge management impact of information technology Web 2.0/3.0. The case study of agent software technology usability in knowledge management system

    NASA Astrophysics Data System (ADS)

    Sołtysik-Piorunkiewicz, Anna

    2015-02-01

    How can we measure the impact of Web 2.0/3.0 internet technology on knowledge management? How can we use Web 2.0/3.0 technologies for generating, evaluating, sharing and organizing knowledge in a knowledge-based organization? How can we evaluate this from a user-centered perspective? This article aims to provide a method for evaluating the usability of web technologies to support knowledge management in knowledge-based organizations at the various stages of the knowledge management cycle (generating knowledge, evaluating knowledge, sharing knowledge, etc.) for modern Internet technologies, based on the example of agent technologies. The method focuses on five areas of evaluation: GUI, functional structure, the way content is published, the organizational aspect and the technological aspect. It is based on proposed indicators that assess these specific areas of evaluation, taking into account the individual characteristics of the scoring. Each feature identified in the evaluation is first judged point-wise; this score is then verified and refined by means of appropriate indicators for the given feature. The article proposes indicators to measure the impact of Web 2.0/3.0 technologies on knowledge management and verifies them on an example of agent technology usability in a knowledge management system.

  10. Policy impacts of ecosystem services knowledge

    PubMed Central

    Posner, Stephen M.; McKenzie, Emily; Ricketts, Taylor H.

    2016-01-01

    Research about ecosystem services (ES) often aims to generate knowledge that influences policies and institutions for conservation and human development. However, we have limited understanding of how decision-makers use ES knowledge or what factors facilitate use. Here we address this gap and report on, to our knowledge, the first quantitative analysis of the factors and conditions that explain the policy impact of ES knowledge. We analyze a global sample of cases where similar ES knowledge was generated and applied to decision-making. We first test whether attributes of ES knowledge themselves predict different measures of impact on decisions. We find that legitimacy of knowledge is more often associated with impact than either the credibility or salience of the knowledge. We also examine whether predictor variables related to the science-to-policy process and the contextual conditions of a case are significant in predicting impact. Our findings indicate that, although many factors are important, attributes of the knowledge and aspects of the science-to-policy process that enhance legitimacy best explain the impact of ES science on decision-making. Our results are consistent with both theory and previous qualitative assessments in suggesting that the attributes and perceptions of scientific knowledge and process within which knowledge is coproduced are important determinants of whether that knowledge leads to action. PMID:26831101

  11. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. The knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use are discussed. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  12. Invisible Brain: Knowledge in Research Works and Neuron Activity.

    PubMed

    Segev, Aviv; Curtis, Dorothy; Jung, Sukhwan; Chae, Suhyun

    2016-01-01

    If the market has an invisible hand, does knowledge creation and representation have an "invisible brain"? While knowledge is viewed as a product of neuron activity in the brain, can we identify knowledge that is outside the brain but reflects the activity of neurons in the brain? This work suggests that the patterns of neuron activity in the brain can be seen in the representation of knowledge-related activity. Here we show that the neuron activity mechanism seems to represent much of the knowledge learned in the past decades based on published articles, in what can be viewed as an "invisible brain" or collective hidden neural networks. Similar results appear when analyzing knowledge activity in patents. Our work also tries to characterize knowledge increase as neuron network activity growth. The results propose that knowledge-related activity can be seen outside of the neuron activity mechanism. Consequently, knowledge might exist as an independent mechanism.

  13. EXPECT: Explicit Representations for Flexible Acquisition

    NASA Technical Reports Server (NTRS)

    Swartout, Bill; Gil, Yolanda

    1995-01-01

    To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we need to change the architecture of the knowledge-based systems we create so that their structure will provide better support for acquisition. Current acquisition tools permit users to modify factual knowledge, but they provide limited support for modifying problem-solving knowledge. In this paper, the authors argue that these limitations stem from the use of incomplete models of problem-solving knowledge and inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture, which addresses these problems by providing an explicit representation for problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself, EXPECT supports more flexible and powerful knowledge acquisition.

  14. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. This paper discusses the knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  15. Invisible Brain: Knowledge in Research Works and Neuron Activity

    PubMed Central

    Segev, Aviv; Curtis, Dorothy; Jung, Sukhwan; Chae, Suhyun

    2016-01-01

    If the market has an invisible hand, does knowledge creation and representation have an “invisible brain”? While knowledge is viewed as a product of neuron activity in the brain, can we identify knowledge that is outside the brain but reflects the activity of neurons in the brain? This work suggests that the patterns of neuron activity in the brain can be seen in the representation of knowledge-related activity. Here we show that the neuron activity mechanism seems to represent much of the knowledge learned in the past decades based on published articles, in what can be viewed as an “invisible brain” or collective hidden neural networks. Similar results appear when analyzing knowledge activity in patents. Our work also tries to characterize knowledge increase as neuron network activity growth. The results propose that knowledge-related activity can be seen outside of the neuron activity mechanism. Consequently, knowledge might exist as an independent mechanism. PMID:27439199

  16. How Structure Shapes Dynamics: Knowledge Development in Wikipedia - A Network Multilevel Modeling Approach

    PubMed Central

    Halatchliyski, Iassen; Cress, Ulrike

    2014-01-01

    Using a longitudinal network analysis approach, we investigate the structural development of the knowledge base of Wikipedia in order to explain the appearance of new knowledge. The data consists of the articles in two adjacent knowledge domains: psychology and education. We analyze the development of networks of knowledge consisting of interlinked articles at seven snapshots from 2006 to 2012 with an interval of one year between them. Longitudinal data on the topological position of each article in the networks is used to model the appearance of new knowledge over time. Thus, the structural dimension of knowledge is related to its dynamics. Using multilevel modeling as well as eigenvector and betweenness measures, we explain the significance of pivotal articles that are either central within one of the knowledge domains or boundary-crossing between the two domains at a given point in time for the future development of new knowledge in the knowledge base. PMID:25365319
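
    The study's eigenvector and betweenness measures capture, respectively, centrality within one domain and a boundary-crossing position between domains. A minimal sketch of that distinction on an invented toy link graph (the article names and NetworkX usage are illustrative, not the study's data or code):

```python
import networkx as nx

# Toy link graph, invented for illustration: two three-article knowledge
# domains joined by a single boundary-crossing article "x".
G = nx.Graph()
G.add_edges_from([("a", "b"), ("a", "c"), ("b", "c")])  # domain 1
G.add_edges_from([("d", "e"), ("d", "f"), ("e", "f")])  # domain 2
G.add_edges_from([("c", "x"), ("x", "d")])              # boundary article

eig = nx.eigenvector_centrality(G, max_iter=1000)  # centrality within domains
btw = nx.betweenness_centrality(G)                 # boundary-crossing position

# Every shortest path between the two domains runs through "x", so it has
# the highest betweenness even though it has few links overall.
print(max(btw, key=btw.get))  # x
```

    Articles like "x" score low on degree but high on betweenness, which is exactly the pivotal, boundary-crossing role the study models.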

  17. The knowledge-value chain: A conceptual framework for knowledge translation in health.

    PubMed

    Landry, Réjean; Amara, Nabil; Pablos-Mendes, Ariel; Shademani, Ramesh; Gold, Irving

    2006-08-01

    This article briefly discusses knowledge translation and lists the problems associated with it. Then it uses knowledge-management literature to develop and propose a knowledge-value chain framework in order to provide an integrated conceptual model of knowledge management and application in public health organizations. The knowledge-value chain is a non-linear concept and is based on the management of five dyadic capabilities: mapping and acquisition, creation and destruction, integration and sharing/transfer, replication and protection, and performance and innovation.

  18. The Power of a Question: A Case Study of Two Organizational Knowledge Capture Systems

    NASA Technical Reports Server (NTRS)

    Cooper, Lynn P.

    2003-01-01

    This document is a presentation on organizational knowledge capture systems delivered at the HICSS-36 conference, held January 6-9, 2003. An exploratory case study of two knowledge resources is offered. Two organizational knowledge capture systems are then briefly described: knowledge transfer from practitioners and the use of questions to represent knowledge. Finally, the creation of a database of peer review questions is suggested as a method of promoting organizational discussion, knowledge representation, and exchange.

  19. Automated knowledge base development from CAD/CAE databases

    NASA Technical Reports Server (NTRS)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge-based systems that are inherently free of error.

  20. Formalizing nursing knowledge: from theories and models to ontologies.

    PubMed

    Peace, Jane; Brennan, Patricia Flatley

    2009-01-01

    Knowledge representation in nursing is poised to address the depth of nursing knowledge about the specific phenomena of importance to nursing. Nursing theories and models may provide a starting point for making this knowledge explicit in representations. We combined knowledge building methods from nursing and ontology design methods from biomedical informatics to create a nursing representation of family health history. Our experience provides an example of how knowledge representations may be created to facilitate electronic support for nursing practice and knowledge development.

  1. The knowledge-value chain: A conceptual framework for knowledge translation in health.

    PubMed Central

    Landry, Réjean; Amara, Nabil; Pablos-Mendes, Ariel; Shademani, Ramesh; Gold, Irving

    2006-01-01

    This article briefly discusses knowledge translation and lists the problems associated with it. Then it uses knowledge-management literature to develop and propose a knowledge-value chain framework in order to provide an integrated conceptual model of knowledge management and application in public health organizations. The knowledge-value chain is a non-linear concept and is based on the management of five dyadic capabilities: mapping and acquisition, creation and destruction, integration and sharing/transfer, replication and protection, and performance and innovation. PMID:16917645

  2. Social Ontology Documentation for Knowledge Externalization

    NASA Astrophysics Data System (ADS)

    Aranda-Corral, Gonzalo A.; Borrego-Díaz, Joaquín; Jiménez-Mavillard, Antonio

    Knowledge externalization and organization is a major challenge that companies must face; they must also ask whether its management can be enhanced. Mechanical processing of information offers a chance to carry out these tasks, as well as to turn intangible knowledge assets into real assets. Machine-readable knowledge provides a basis for enhancing knowledge management. A promising approach is empowering knowledge externalization by the community (users, employees). In this paper, a social semantic tool (called OntoxicWiki) for enhancing the quality of knowledge is presented.

  3. Knowledge Engineering and Education.

    ERIC Educational Resources Information Center

    Lopez, Antonio M., Jr.; Donlon, James

    2001-01-01

    Discusses knowledge engineering, computer software, and possible applications in the field of education. Highlights include the distinctions between data, information, and knowledge; knowledge engineering as a subfield of artificial intelligence; knowledge acquisition; data mining; ontology development for subject terms; cognitive apprentices; and…

  4. "Catching the Knowledge Wave" Redefining Knowledge for the Post-Industrial Age

    ERIC Educational Resources Information Center

    Gilbert, Jane

    2007-01-01

    Over the last five years, people have heard a great deal about something called the Knowledge Society. The term "knowledge" is appearing in places they would not have expected to see it a decade or so ago. The media is full of references to the knowledge economy and the knowledge revolution; business discussions now routinely talk about knowledge…

  5. Creating Illusions of Knowledge: Learning Errors that Contradict Prior Knowledge

    ERIC Educational Resources Information Center

    Fazio, Lisa K.; Barber, Sarah J.; Rajaram, Suparna; Ornstein, Peter A.; Marsh, Elizabeth J.

    2013-01-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks…

  6. [The aesthetic character of caring knowledge].

    PubMed

    Tsai, Cheng-Yun

    2013-08-01

    The identity of nursing is founded on caring knowledge, which is derived from our understanding of its experience-revealed essence. This purposive knowledge differs from scientific knowledge because validity guides the latter and ethics guides the former. Therefore, justifying the objectivity of caring knowledge should be based on the aesthetic character of this knowledge rather than on a general social-science explanation.

  7. Knowledge Valorisation: A Route of Knowledge That Ends In Surplus Value (An Example of The Netherlands)

    ERIC Educational Resources Information Center

    Hladchenko, Myroslava

    2016-01-01

    Purpose: The purpose of this paper is to explore the reasons of the success of the Netherlands in knowledge valorisation: what are the actors that participate in knowledge valorisation process and what are their functions; what is the route of knowledge in valorisation; what "surplus value" does knowledge gain in the valorisation…

  8. Improving drivers' knowledge of road rules using digital games.

    PubMed

    Li, Qing; Tay, Richard

    2014-04-01

    Although a proficient knowledge of the road rules is important to safe driving, many drivers do not retain the knowledge acquired after they have obtained their licenses. Hence, more innovative and appealing methods are needed to improve drivers' knowledge of the road rules. This study examines the effect of game based learning on drivers' knowledge acquisition and retention. We find that playing an entertaining game that is designed to impart knowledge of the road rules not only improves players' knowledge but also helps them retain such knowledge. Hence, learning by gaming appears to be a promising learning approach for driver education. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Design of a Knowledge Driven HIS

    PubMed Central

    Pryor, T. Allan; Clayton, Paul D.; Haug, Peter J.; Wigertz, Ove

    1987-01-01

    The design of the software architecture for a knowledge-driven HIS is presented. In our design, the frame is used as the basic unit of knowledge representation. The structure of the frame is designed to be sufficiently universal to contain the knowledge required to implement not only expert systems but almost all traditional HIS functions, including ADT, order entry, and results review. The design incorporates a two-level format for the knowledge: the first level, ASCII records, is used to maintain the knowledge base, while the second level, converted by special knowledge compilers into standard computer languages, is used for efficient implementation of knowledge applications.
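
    The two-level idea, an ASCII form for maintenance and a compiled form for execution, can be sketched roughly as follows. The frame syntax, field names, and the clinical rule below are invented for illustration and are not the paper's actual format:

```python
# Level 1: an ASCII frame record (invented minimal syntax).
frame_source = """
name: tachycardia_alert
if: heart_rate > 120
then: alert
"""

def compile_frame(text):
    """Level 1 -> level 2: parse an ASCII frame into a Python predicate."""
    fields = dict(line.split(":", 1) for line in text.strip().splitlines())
    condition = fields["if"].strip()
    action = fields["then"].strip()

    def rule(record):
        # Evaluate the condition string against a patient-record dict.
        return action if eval(condition, {}, dict(record)) else None

    return fields["name"].strip(), rule

name, rule = compile_frame(frame_source)
print(name, rule({"heart_rate": 140}))  # tachycardia_alert alert
```

    The maintained ASCII level stays human-editable, while the compiled level runs efficiently at application time, mirroring the two-level split the abstract describes.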

  10. Tacit knowledge: A refinement and empirical test of the Academic Tacit Knowledge Scale.

    PubMed

    Insch, Gary S; McIntyre, Nancy; Dawley, David

    2008-11-01

    Researchers have linked tacit knowledge to improved organizational performance, but research on how to measure tacit knowledge is scarce. In the present study, the authors proposed and empirically tested a model of tacit knowledge and an accompanying measurement scale of academic tacit knowledge. They present 6 hypotheses that support the proposed tacit knowledge model regarding the role of cognitive (self-motivation, self-organization); technical (individual task, institutional task); and social (task-related, general) skills. The authors tested these hypotheses with 542 responses to the Academic Tacit Knowledge Scale, which included the respondents' grade point average as the performance variable. All 6 hypotheses were supported.

  11. Understanding the Financial Knowledge Gap: A New Dimension of Inequality in Later Life.

    PubMed

    Khan, Mohammad Nuruzzaman; Rothwell, David W; Cherney, Katrina; Sussman, Tamara

    2017-01-01

    To understand individuals' financial behaviors, it is important to understand the financial knowledge gap, the distance between one's objective and subjective financial knowledge. Overestimating one's financial knowledge can lead to risky financial behaviors. To date, limited empirical work has examined how the financial knowledge gap varies across age groups. We analyze the size and nature of the financial knowledge gap and its variation across age groups. Using nationally representative data, we find robust evidence that older adults overestimate their financial knowledge. Social workers can assess the financial knowledge gap and educate their clients to protect them from financial fraud, exploitation, and abuse.

  12. Smoking, health knowledge, and anti-smoking campaigns: an empirical study in Taiwan.

    PubMed

    Hsieh, C R; Yen, L L; Liu, J T; Lin, C J

    1996-02-01

    This paper uses a measure of health knowledge of smoking hazards to investigate the determinants of health knowledge and its effect on smoking behavior. In our analysis, two equations are estimated: smoking participation and health knowledge. The simultaneity problem in estimating smoking behavior and health knowledge is also considered. Overall, the estimated results suggest that anti-smoking campaigns have a significantly positive effect on the public's health knowledge, and this health knowledge, in turn, has a significantly negative effect on smoking participation. The health knowledge elasticities of smoking participation are -0.48 and -0.56 for all adults and adult males, respectively.
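
    For a logit specification of smoking participation, the point elasticity with respect to knowledge works out to b1 * K * (1 - P). The sketch below uses invented coefficients chosen only so the elasticity lands near the paper's reported values; it is not the study's estimated model or data:

```python
import math

# Invented logit coefficients for illustration (not the study's estimates).
BETA0, BETA1 = 1.0, -0.2

def participation_prob(knowledge):
    """Logit model: P(smoke) = 1 / (1 + exp(-(b0 + b1 * K)))."""
    return 1.0 / (1.0 + math.exp(-(BETA0 + BETA1 * knowledge)))

def knowledge_elasticity(knowledge):
    """Point elasticity dP/dK * K/P, which simplifies to b1 * K * (1 - P)."""
    return BETA1 * knowledge * (1.0 - participation_prob(knowledge))

print(round(knowledge_elasticity(5.0), 2))  # -0.5
```

    A negative elasticity of about -0.5 means a 1% increase in health knowledge is associated with roughly a 0.5% decrease in smoking participation, matching the scale of the paper's estimates.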

  13. Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)

    NASA Technical Reports Server (NTRS)

    Johannes, James D.; Everetts, Clark

    1989-01-01

    The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.

  14. Knowledge acquisition and representation using fuzzy evidential reasoning and dynamic adaptive fuzzy Petri nets.

    PubMed

    Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan

    2013-06-01

    The two most important issues of expert systems are the acquisition of domain experts' professional knowledge and the representation and reasoning of the knowledge rules that have been identified. First, during expert knowledge acquisition, the domain expert panel often demonstrates different experience and knowledge from one another and produces different types of knowledge information, such as complete and incomplete, precise and imprecise, and known and unknown, because of its cross-functional and multidisciplinary nature. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer from a couple of deficiencies: the parameters in current FPN models cannot accurately represent increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks cannot be dynamically adjusted according to variation in the propositions, as in human cognition and thinking. In this paper, we present a knowledge acquisition and representation approach using fuzzy evidential reasoning and dynamic adaptive FPNs to solve the problems mentioned above. As illustrated by a numerical example, the proposed approach can capture experts' diverse experience well, enhance knowledge representation power, and reason over rule-based knowledge more intelligently.
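
    In a conventional (static) fuzzy Petri net, a rule "IF p1 AND p2 THEN p3" fires by taking the minimum of the antecedent truth degrees, comparing it to a threshold, and weighting the result by the rule's certainty factor. A minimal sketch of that firing step, with invented numbers and without the paper's dynamic adaptive extensions:

```python
# Generic fuzzy-Petri-net firing step; numbers are illustrative only.
def fire(places, transitions):
    """One reasoning pass over rules of the form IF p1 AND p2 THEN p3."""
    for inputs, output, threshold, certainty in transitions:
        degree = min(places[p] for p in inputs)  # fuzzy AND of antecedents
        if degree >= threshold:
            # Consequent degree: antecedent degree weighted by rule certainty.
            places[output] = max(places[output], degree * certainty)
    return places

places = {"p1": 0.9, "p2": 0.8, "p3": 0.0}
transitions = [(("p1", "p2"), "p3", 0.5, 0.9)]  # threshold 0.5, CF 0.9
fire(places, transitions)
print(round(places["p3"], 2))  # 0.72
```

    The paper's criticism is that the threshold and certainty parameters above are fixed; its dynamic adaptive variant lets them adjust as propositions change.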

  15. Evaluating Three Dimensions of Environmental Knowledge and Their Impact on Behaviour

    NASA Astrophysics Data System (ADS)

    Braun, Tina; Dierkes, Paul

    2017-09-01

    This research evaluates the development of three environmental knowledge dimensions of secondary school students after participation in a singular 1-day outdoor education programme. Applying a cross-national approach, system, action-related and effectiveness knowledge levels of students educated in Germany and Singapore were assessed before and after intervention participation. Correlations between single knowledge dimensions and behaviour changes due to the environmental education intervention were examined. The authors applied a pre-, post- and retention test design and developed a unique multiple-choice instrument. Results indicate significant baseline differences in the prevalence of the different knowledge dimensions between subgroups. Both intervention subsamples showed a low presence of all baseline knowledge dimensions. Action-related knowledge levels were higher than those of system and effectiveness knowledge. Subsample-specific differences in performed pro-environmental behaviour were also significant. Both experimental groups showed significant immediate and sustained knowledge increases in the three dimensions after programme participation. Neither of the two control cohorts showed any significant increase in any knowledge dimension. Effectiveness knowledge improved most. The amount of demonstrated environmental actions increased significantly in both intervention groups. Both control cohorts did not show shifts in environmental behaviour. Yet, only weak correlations between any knowledge dimension and behaviour could be found.

  16. Enhancing health care professionals' and trainees' knowledge of physical activity guidelines for adults with and without SCI.

    PubMed

    Shirazipour, Celina H; Tomasone, Jennifer R; Martin Ginis, Kathleen A

    2018-01-11

    Health care providers (HCPs) are preferred sources of physical activity (PA) information; however, minimal research has explored HCPs' knowledge of spinal cord injury (SCI) PA guidelines, and no research has examined HCP trainees' PA guideline knowledge. The current study explored HCPs' and trainees' initial knowledge of PA guidelines for both adults with SCI and the general population, and the utility of an event-based intervention for improving this knowledge. Participants (HCPs n = 129; trainees n = 573) reported guideline knowledge for both sets of guidelines (SCI and general population) immediately after, one-month, and six-months following the intervention. Frequencies determined guideline knowledge at each timepoint, while chi-squared tests examined differences in knowledge of both guidelines, as well as knowledge differences in the short- and long-term. Results demonstrated that HCPs and trainees lack knowledge of PA guidelines, particularly guidelines for adults with SCI. The results further suggest that a single event-based intervention is not effective for improving long-term guideline knowledge. Suggestions are made for future research with the aim of improving interventions that target HCP and HCP trainees' long-term guideline knowledge for adults with SCI and the general population.

  17. Knowledge Management

    NASA Technical Reports Server (NTRS)

    Shariq, Syed Z.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The emergence of rapidly expanding technologies for the distribution and dissemination of information and knowledge has brought into focus opportunities for the development of knowledge-based networks, knowledge dissemination and knowledge management technologies, and their potential applications for enhancing the productivity of knowledge work. The challenging and complex problems of the future can best be addressed by developing knowledge management as a new discipline based on an integrative synthesis of hard and soft sciences. A knowledge management professional society can provide a framework for catalyzing the development of the proposed synthesis, as well as serve as a focal point for coordinating professional activities in the strategic areas of education, research, and technology development. Preliminary concepts for the development of the knowledge management discipline and the professional society are explored. Within this context, potential opportunities can be explored for applying information technologies to deliver or transfer information and knowledge more effectively (e.g., results from NASA's Mission to Planet Earth) for the development of policy options in critical areas of national and global importance (e.g., economic and environmental policy decisions), particularly where a global collaborative knowledge network is likely to be critical to the acceptance of the policies.

  18. Technological Pedagogical Content Knowledge of Prospective Mathematics Teacher in Three Dimensional Material Based on Sex Differences

    NASA Astrophysics Data System (ADS)

    Aqib, M. A.; Budiarto, M. T.; Wijayanti, P.

    2018-01-01

    The effectiveness of learning in this era can be seen from three factors, technology, content, and pedagogy, which are covered by Technological Pedagogical Content Knowledge (TPCK). This qualitative study aimed to describe each domain of TPCK: Content Knowledge, Pedagogical Knowledge, Pedagogical Content Knowledge, Technological Knowledge, Technological Content Knowledge, Technological Pedagogical Knowledge, and Technological, Pedagogical, and Content Knowledge. The subjects were male and female mathematics college students, in at least their 5th semester, with nearly the same ability in courses such as innovative learning, innovative learning II, school mathematics I, school mathematics II, computer applications, and instructional media. The research began by distributing a questionnaire to the subjects, then continued with an assignment and interviews. The obtained data were validated by time triangulation. The research found that male and female prospective teachers were relatively similar in the Content Knowledge and Pedagogical Knowledge domains but differed in the Technological Knowledge domain. The difference in this domain naturally has an impact on the other domains that include a technology component, although it can be minimized by familiarizing the teachers with the technology.

  19. Translating three states of knowledge--discovery, invention, and innovation

    PubMed Central

    2010-01-01

    Background Knowledge Translation (KT) has historically focused on the proper use of knowledge in healthcare delivery. A knowledge base has been created through empirical research and resides in scholarly literature. Some knowledge is amenable to direct application by stakeholders who are engaged during or after the research process, as shown by the Knowledge to Action (KTA) model. Other knowledge requires multiple transformations before achieving utility for end users. For example, conceptual knowledge generated through science or engineering may become embodied as a technology-based invention through development methods. The invention may then be integrated within an innovative device or service through production methods. To what extent is KT relevant to these transformations? How might the KTA model accommodate these additional development and production activities while preserving the KT concepts? Discussion Stakeholders adopt and use knowledge that has perceived utility, such as a solution to a problem. Achieving a technology-based solution involves three methods that generate knowledge in three states, analogous to the three classic states of matter. Research activity generates discoveries that are intangible and highly malleable like a gas; development activity transforms discoveries into inventions that are moderately tangible yet still malleable like a liquid; and production activity transforms inventions into innovations that are tangible and immutable like a solid. The paper demonstrates how the KTA model can accommodate all three types of activity and address all three states of knowledge. Linking the three activities in one model also illustrates the importance of engaging the relevant stakeholders prior to initiating any knowledge-related activities. Summary Science and engineering focused on technology-based devices or services change the state of knowledge through three successive activities. Achieving knowledge implementation requires methods that accommodate these three activities and knowledge states. Accomplishing beneficial societal impacts from technology-based knowledge involves the successful progression through all three activities, and the effective communication of each successive knowledge state to the relevant stakeholders. The KTA model appears suitable for structuring and linking these processes. PMID:20205873

  20. A Knowledge-Based System Developer for aerospace applications

    NASA Technical Reports Server (NTRS)

    Shi, George Z.; Wu, Kewei; Fensky, Connie S.; Lo, Ching F.

    1993-01-01

    A prototype Knowledge-Based System Developer (KBSD) has been developed for aerospace applications by utilizing artificial intelligence technology. The KBSD directly acquires knowledge from domain experts through a graphical interface and then builds expert systems from that knowledge. This raises the state of the art of knowledge acquisition/expert system technology to a new level by lessening the need for skilled knowledge engineers. The feasibility, applicability, and efficiency of the proposed concept were established, justifying a continuation that would develop the prototype into a full-scale, general-purpose knowledge-based system developer. The KBSD has great commercial potential. It will provide a marketable software shell that alleviates the need for knowledge engineers and increases productivity in the workplace. The KBSD will therefore make knowledge-based systems available to a large portion of industry.

  1. Case-based tutoring from a medical knowledge base.

    PubMed

    Chin, H L; Cooper, G F

    1989-01-01

    The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base, in the area of cardiovascular diseases, into a probabilistic format, and the use of the resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator, for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.

  2. Family Context, Mexican-Origin Adolescent Mothers' Parenting Knowledge, and Children's Subsequent Developmental Outcomes

    PubMed Central

    Jahromi, Laudan B.; Guimond, Amy B.; Umaña-Taylor, Adriana J.; Updegraff, Kimberly A.; Toomey, Russell B.

    2014-01-01

    This study examined parenting knowledge among Mexican-origin adolescent mothers (N = 191; M age = 16.26 years), family contextual factors associated with adolescents' parenting knowledge, and toddlers' (M age = 2.01 years) subsequent developmental outcomes. Data came from home interviews and direct child assessments. Adolescents both under- and over-estimated children's developmental timing, and showed differences in their knowledge of specific developmental domains. Instrumental support from mother figures was positively linked to adolescents' knowledge accuracy, whereas emotional support was negatively related to adolescents' knowledge confidence. Furthermore, whereas mother figures' autonomy-granting was positively linked to knowledge confidence, psychological control was associated with less accurate adolescent parenting knowledge. Toddlers of adolescents with more accurate knowledge showed positive developmental functioning. Intervention implications are discussed. PMID:24004448

  3. Integrated learning: ways of fostering the applicability of teachers’ pedagogical and psychological knowledge

    PubMed Central

    Harr, Nora; Eichler, Andreas; Renkl, Alexander

    2015-01-01

    In teacher education, general pedagogical and psychological knowledge (PPK) is often taught separately from the teaching subject itself, potentially leading to inert knowledge. In an experimental study with 69 mathematics student teachers, we tested the benefits of fostering the integration of pedagogical content knowledge (PCK) and general PPK with respect to knowledge application. Integration was fostered either by integrating the contents or by prompting the learners to integrate separately taught knowledge. Fostering integration, as compared to a separate presentation without integration help, led to more applicable PPK and greater simultaneous application of PPK and PCK. The advantages of fostering knowledge integration were not moderated by the student teachers’ prior knowledge or working memory capacity. A disadvantage of integrating different knowledge types was increased learning time. PMID:26082740

  4. Applying IT Governance Concepts and Elements to Knowledge Governance: An Initial Approach

    NASA Astrophysics Data System (ADS)

    Rouyet, Juan Ignacio; Joyanes, Luis

    As the era of the knowledge-based economy emerges, the importance of knowledge governance is gradually increasing. The question of how governance mechanisms influence knowledge transactions is becoming increasingly relevant. However, theoretical approaches have yet to resolve outstanding issues, such as how micro-level governance mechanisms influence knowledge processes or what kinds of organizational hazard could decrease the benefits from knowledge processes. Furthermore, empirical studies addressing these issues are arguably needed. This paper proposes a knowledge governance framework to assist effectively in the implementation of governance mechanisms for knowledge management processes. Additionally, it shows how the framework may be implemented in a knowledge-intensive firm and proposes specific structures and governance mechanisms.

  5. Whose Knowledge, Whose Development? Use and Role of Local and External Knowledge in Agroforestry Projects in Bolivia

    NASA Astrophysics Data System (ADS)

    Jacobi, Johanna; Mathez-Stiefel, Sarah-Lan; Gambon, Helen; Rist, Stephan; Altieri, Miguel

    2017-03-01

    Agroforestry often relies on local knowledge, which is gaining recognition in development projects. However, how local knowledge can articulate with external and scientific knowledge is little known. Our study explored the use and integration of local and external knowledge in agroforestry projects in Bolivia. In 42 field visits and 62 interviews with agroforestry farmers, civil society representatives, and policymakers, we found a diverse knowledge base. We examined how local and external knowledge contribute to livelihood assets and tree and crop diversity. Projects based predominantly on external knowledge tended to promote a single combination of tree and crop species and targeted mainly financial capital, whereas projects with a local or mixed knowledge base tended to focus on food security and increased natural capital (e.g., soil restoration) and used a higher diversity of trees and crops than those with an external knowledge base. The integration of different forms of knowledge can enable farmers to better cope with new challenges emerging as a result of climate change, fluctuating market prices for cash crops, and surrounding destructive land use strategies such as uncontrolled fires and aerial fumigation with herbicides. However, many projects still tended to prioritize external knowledge and undervalue local knowledge—a tendency that has long been institutionalized in the formal educational system and in extension services. More dialogue is needed between different forms of knowledge, which can be promoted by strengthening local organizations and their networks, reforming agricultural educational institutions, and working in close interaction with policymakers.

  6. [Tacit Knowledge: Characteristics in nursing practice].

    PubMed

    Pérez-Fuillerat, Natalia; Solano-Ruiz, M Carmen; Amezcua, Manuel

    2018-01-20

    Tacit knowledge can be defined as knowledge that is used intuitively and unconsciously and acquired through personal experience; it is characterized by being personal and contextual. Terms such as 'intuition', 'know-how' and 'implicit knowledge' have been used to describe it. Disciplines in the fields of management and health have studied tacit knowledge, identifying it as a powerful tool for knowledge creation and clinical decision-making. The aim of this review is to analyse the definition and characteristics of tacit knowledge and to determine the role it plays in the nursing discipline. An integrative review was undertaken of the literature published up to November 2016 in the databases CUIDEN, SciELO, PubMed, Cochrane and CINAHL. The synthesis and interpretation of the data was performed by two researchers through content analysis. From a total of 819 articles located, 35 articles on tacit knowledge and nursing were chosen. There is no consensus on the naming and description of tacit knowledge. Its main characteristics are its personal and social nature and its use through an organised mental structure called a mindline, a structure that relates to the use of tacit knowledge in clinical decision-making. Previous studies on tacit knowledge and nursing offer the nursing community useful perspectives but do not explore them in depth. The development of a framework is suggested, as it would clarify the concepts involved and the role of tacit knowledge in the management of nursing knowledge. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  7. Knowledge Management.

    ERIC Educational Resources Information Center

    1999

    The first of the four papers in this symposium, "Knowledge Management and Knowledge Dissemination" (Wim J. Nijhof), presents two case studies exploring the strategies companies use in sharing and disseminating knowledge and expertise among employees. "A Theory of Knowledge Management" (Richard J. Torraco), develops a conceptual…

  8. Website Quality, Expectation, Confirmation, and End User Satisfaction: The Knowledge-Intensive Website of the Korean National Cancer Information Center

    PubMed Central

    Koo, Chulmo; Wati, Yulia; Park, Keeho

    2011-01-01

    Background The fact that patient satisfaction with primary care clinical practices and physician-patient communications has decreased gradually has brought a new opportunity to the online channel as a supplementary service to provide additional information. Objective In this study, our objectives were to examine the process of cognitive knowledge expectation-confirmation from eHealth users and to recommend the attributes of a “knowledge-intensive website.” Knowledge expectation can be defined as users’ existing attitudes or beliefs regarding expected levels of knowledge they may gain by accessing the website. Knowledge confirmation is the extent to which users’ knowledge expectation of information systems use is realized during actual use. In our hypothesized research model, perceived information quality, presentation and attractiveness as well as knowledge expectation influence knowledge confirmation, which in turn influences perceived usefulness and end user satisfaction, which feeds back to knowledge expectation. Methods An empirical study was conducted at the National Cancer Center (NCC), Republic of Korea (South Korea), by evaluating its official website. A user survey was administered containing items to measure subjectively perceived website quality and expectation-confirmation attributes. A study sample of 198 usable responses was used for further analysis. We used the structural equation model to test the proposed research model. Results Knowledge expectation exhibited a positive effect on knowledge confirmation (beta = .27, P < .001). The paths from information quality, information presentation, and website attractiveness to knowledge confirmation were also positive and significant (beta = .24, P < .001; beta = .29, P < .001; beta = .18, P < .001, respectively). Moreover, the effect of knowledge confirmation on perceived usefulness was also positively significant (beta = .64, P < .001).
Knowledge expectation together with knowledge confirmation and perceived usefulness also significantly affected end user satisfaction (beta = .22, P < .001; beta = .39, P < .001; beta = .25, P < .001, respectively). Conclusions Theoretically, this study has (1) identified knowledge-intensive website attributes, (2) enhanced the theoretical foundation of eHealth from the information systems (IS) perspective by adopting the expectation-confirmation theory (ECT), and (3) examined the importance of information and knowledge attributes and explained their impact on user satisfaction. Practically, our empirical results suggest that perceived website quality (ie, information quality, information presentation, and website attractiveness) is a core requirement for knowledge building. In addition, our study has also shown that knowledge confirmation has a greater effect on satisfaction than both knowledge expectation and perceived usefulness. PMID:22047810

  9. Designing for health in school buildings: between research and practice.

    PubMed

    Kirkeby, Inge Mette; Jensen, Bjarne Bruun; Larsen, Kristian; Kural, René

    2015-05-01

    To investigate the kinds of knowledge practitioners use when planning and designing for health in school buildings. Twelve semi-structured qualitative interviews were conducted with architects, teachers and officials to investigate use of knowledge in the making of school buildings. Practitioners drew on many kinds and sources of knowledge, but in particular they made use of concepts, examples or pictures, and thought-provoking knowledge. However, the interviews indicate a number of hurdles for efficient knowledge sharing between research and practice: (1) a considerable discrepancy between kinds of knowledge used by practice and knowledge traditionally produced by research; (2) research-knowledge and practice-knowledge form two circuits and the flow from one circuit to the other is weak; (3) practitioners' knowledge was often based on experience and therefore person-dependent, which makes the knowledge vulnerable. Research should pay special attention to concepts and principles that can guide decision-making in practice. It is further recommended that new kinds of collaboration between researchers and practitioners be considered. © 2015 the Nordic Societies of Public Health.

  10. Exploring the Associations Among Nutrition, Science, and Mathematics Knowledge for an Integrative, Food-Based Curriculum.

    PubMed

    Stage, Virginia C; Kolasa, Kathryn M; Díaz, Sebastián R; Duffrin, Melani W

    2018-01-01

    Explore associations between nutrition, science, and mathematics knowledge to provide evidence that integrating food/nutrition education in the fourth-grade curriculum may support gains in academic knowledge. Secondary analysis of a quasi-experimental study. Sample included 438 students in 34 fourth-grade classrooms across North Carolina and Ohio; mean age 10 years old; gender (I = 53.2% female; C = 51.6% female). Dependent variable = post-test-nutrition knowledge; independent variables = baseline-nutrition knowledge, and post-test science and mathematics knowledge. Analyses included descriptive statistics and multiple linear regression. The hypothesized model predicted post-nutrition knowledge (F(437) = 149.4, p < .001; Adjusted R² = .51). All independent variables were significant predictors with positive association. Science and mathematics knowledge were predictive of nutrition knowledge, indicating that use of an integrative science and mathematics curriculum to improve academic knowledge may also simultaneously improve nutrition knowledge among fourth-grade students. Teachers can benefit from integration by meeting multiple academic standards, efficiently using limited classroom time, and increasing nutrition education provided in the classroom. © 2018, American School Health Association.
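    As a minimal sketch of the kind of multiple linear regression described above, one can fit ordinary least squares and compute the adjusted R² statistic the abstract reports. The data, variable names, and coefficients below are simulated for illustration only and are not taken from the study:

```python
import numpy as np

# Hypothetical data: one row per student. Predict post-test nutrition
# knowledge from baseline nutrition, post-test science, and post-test
# mathematics knowledge, mirroring the model in the abstract.
rng = np.random.default_rng(0)
n = 438
baseline = rng.normal(50, 10, n)
science = rng.normal(60, 12, n)
math_k = rng.normal(55, 11, n)
post_nutrition = (5 + 0.4 * baseline + 0.3 * science + 0.2 * math_k
                  + rng.normal(0, 5, n))

# Ordinary least squares fit (intercept plus three predictors).
X = np.column_stack([np.ones(n), baseline, science, math_k])
coef, *_ = np.linalg.lstsq(X, post_nutrition, rcond=None)

# Adjusted R^2: R^2 penalized for the number of fitted parameters.
resid = post_nutrition - X @ coef
ss_res = resid @ resid
ss_tot = ((post_nutrition - post_nutrition.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - X.shape[1])
print(round(adj_r2, 2))
```

    With simulated data the recovered coefficients approximate the generating values; a full analysis of this kind would also report the overall F statistic and per-coefficient p-values, e.g. via a statistics package.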

  11. Auditing Knowledge toward Leveraging Organizational IQ in Healthcare Organizations.

    PubMed

    Shahmoradi, Leila; Karami, Mahtab; Farzaneh Nejad, Ahmadreza

    2016-04-01

    In this study, a knowledge audit was conducted based on organizational intelligence quotient (OIQ) principles of Iran's Ministry of Health and Medical Education (MOHME) to determine levers that can enhance OIQ in healthcare. The mixed method study was conducted within the MOHME. The study population consisted of 15 senior managers and policymakers. A tool based on literature review and panel expert opinions was developed to perform a knowledge audit. The significant results of this auditing revealed the following: lack of defined standard processes for organizing knowledge management (KM), lack of a knowledge map, absence of a trustee to implement KM, absence of specialists to produce a knowledge map, individuals' unwillingness to share knowledge, implicitness of knowledge format, occasional nature of knowledge documentation for repeated use, lack of a mechanism to determine repetitive tasks, lack of a reward system for the formation of communities, groups and networks, non-updatedness of the available knowledge, and absence of commercial knowledge. The analysis of the audit findings revealed that three levers for enhancing OIQ, including structure and process, organizational culture, and information technology must be created or modified.

  12. Auditing Knowledge toward Leveraging Organizational IQ in Healthcare Organizations

    PubMed Central

    Shahmoradi, Leila; Farzaneh Nejad, Ahmadreza

    2016-01-01

    Objectives In this study, a knowledge audit was conducted based on organizational intelligence quotient (OIQ) principles of Iran's Ministry of Health and Medical Education (MOHME) to determine levers that can enhance OIQ in healthcare. Methods The mixed method study was conducted within the MOHME. The study population consisted of 15 senior managers and policymakers. A tool based on literature review and panel expert opinions was developed to perform a knowledge audit. Results The significant results of this auditing revealed the following: lack of defined standard processes for organizing knowledge management (KM), lack of a knowledge map, absence of a trustee to implement KM, absence of specialists to produce a knowledge map, individuals' unwillingness to share knowledge, implicitness of knowledge format, occasional nature of knowledge documentation for repeated use, lack of a mechanism to determine repetitive tasks, lack of a reward system for the formation of communities, groups and networks, non-updatedness of the available knowledge, and absence of commercial knowledge. Conclusions The analysis of the audit findings revealed that three levers for enhancing OIQ, including structure and process, organizational culture, and information technology must be created or modified. PMID:27200221

  13. The Relationship between Feelings-of-Knowing and Partial Knowledge for General Knowledge Questions

    PubMed Central

    Norman, Elisabeth; Blakstad, Oskar; Johnsen, Øivind; Martinsen, Stig K.; Price, Mark C.

    2016-01-01

    Feelings of knowing (FoK) are introspective self-report ratings of the felt likelihood that one will be able to recognize a currently unrecallable memory target. Previous studies have shown that FoKs are influenced by retrieved fragment knowledge related to the target, which is compatible with the accessibility hypothesis that FoK is partly based on currently activated partial knowledge about the memory target. However, previous results have been inconsistent as to whether or not FoKs are influenced by the accuracy of such information. In our study (N = 26), we used a recall-judge-recognize procedure where stimuli were general knowledge questions. The measure of partial knowledge was wider than those applied previously, and FoK was measured before rather than after partial knowledge. The accuracy of reported partial knowledge was positively related to subsequent recognition accuracy, and FoK only predicted recognition on trials where there was correct partial knowledge. Importantly, FoK was positively related to the amount of correct partial knowledge, but did not show a similar incremental relation with incorrect knowledge. PMID:27445950

  14. Women care about local knowledge, experiences from ethnomycology

    PubMed Central

    2012-01-01

    Gender is one of the main variables that influence the distribution of local knowledge. We carried out a literature review concerning local mycological knowledge, paying special attention to data concerning women’s knowledge and comparative gender data. We found that unique features of local mycological knowledge allow people to successfully manage mushrooms. Women are involved in every stage of mushroom utilization from collection to processing and marketing. Local mycological knowledge includes the use of mushrooms as food, medicine, and recreational objects as well as an aid to seasonal household economies. In many regions of the world, women are often the main mushroom collectors and possess a vast knowledge about mushroom taxonomy, biology, and ecology. Local experts play a vital role in the transmission of local mycological knowledge. Women participate in the diffusion of this knowledge as well as in its enrichment through innovation. Female mushroom collectors appreciate their mycological knowledge and pursue strategies and organization to reproduce it in their communities. Women mushroom gatherers are conscious of their knowledge, value its contribution in their subsistence systems, and proudly incorporate it in their cultural identity. PMID:22809491

  15. Women care about local knowledge, experiences from ethnomycology.

    PubMed

    Garibay-Orijel, Roberto; Ramírez-Terrazo, Amaranta; Ordaz-Velázquez, Marisa

    2012-07-18

    Gender is one of the main variables that influence the distribution of local knowledge. We carried out a literature review concerning local mycological knowledge, paying special attention to data concerning women's knowledge and comparative gender data. We found that unique features of local mycological knowledge allow people to successfully manage mushrooms. Women are involved in every stage of mushroom utilization from collection to processing and marketing. Local mycological knowledge includes the use of mushrooms as food, medicine, and recreational objects as well as an aid to seasonal household economies. In many regions of the world, women are often the main mushroom collectors and possess a vast knowledge about mushroom taxonomy, biology, and ecology. Local experts play a vital role in the transmission of local mycological knowledge. Women participate in the diffusion of this knowledge as well as in its enrichment through innovation. Female mushroom collectors appreciate their mycological knowledge and pursue strategies and organization to reproduce it in their communities. Women mushroom gatherers are conscious of their knowledge, value its contribution in their subsistence systems, and proudly incorporate it in their cultural identity.

  16. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  17. Facilitating Naval Knowledge Flow

    DTIC Science & Technology

    2001-07-01

    flow theory and its application to very-large enterprises such as the Navy. Without such basic understanding, one cannot expect to design effective...understanding knowledge flow? Informed by advances in knowledge-flow theory, this work can propel knowledge management toward the methods and tools...address the phenomenology of knowledge flow well, nor do we have the benefit of knowledge-flow theory and its application to very-large enterprises

  18. The Wiki as Knowledge Repository: Using a Wiki in a Community of Practice to Strengthen K-12 Education

    ERIC Educational Resources Information Center

    Sheehy, Geofrey

    2008-01-01

    The concept of managing an organization's knowledge has caught on in recent years (Sallis & Jones, 2002). Dubbed knowledge management, the field has grown as it addresses key characteristics of knowledge, like the concept that knowledge cannot be separated from a knower and the idea that there are two types of knowledge: tacit, which is intangible…

  19. I Learned More than I Taught: The Hidden Dimension of Learning in Intercultural Knowledge Transfer

    ERIC Educational Resources Information Center

    Chen, Fang; Bapuji, Hari; Dyck, Bruno; Wang, Xiaoyun

    2012-01-01

    Purpose: Although knowledge transfer is generally conceived as a two-way process in which knowledge is transferred to and from the knowledge source, research has tended to focus on the first part of the process and neglect the second part. This study aims to examine the feedback loop and how knowledge is transferred from the knowledge receiver to…

  20. The Use of Clinical Interviews to Develop Inservice Secondary Science Teachers' Nature of Science Knowledge and Assessment of Student Nature of Science Knowledge

    ERIC Educational Resources Information Center

    Peters-Burton, Erin E.

    2013-01-01

    To fully incorporate nature of science knowledge into classrooms, teachers must be both proficient in their own nature of science knowledge, but also skillful in translating their knowledge into a learning environment which assesses student knowledge. Twenty-eight inservice teachers enrolled in a graduate course which in part required a clinical…

  1. Ontologies, Knowledge Bases and Knowledge Management

    DTIC Science & Technology

    2002-07-01

    AFRL-IF-RS-TR-2002-163 Final Technical Report July 2002 ONTOLOGIES, KNOWLEDGE BASES AND KNOWLEDGE MANAGEMENT USC Information ...and layer additional information necessary to make specific uses of the knowledge in this core. Finally, while we were able to find adequate solutions... knowledge base and inference engine. Figure 3.2: SDA Editor Interface Although the SDA has access to information about the situation, we wanted the user

  2. The power of techknowledgy.

    PubMed

    Kabachinski, Jeff

    2010-01-01

    Knowledge can range from complex, accumulated expertise (tacit knowledge) to structured explicit content like service procedures. For most of us, knowledge management should only be one of many collaborative means to an end, not the end in itself (unless you are the corporate knowledge management director or chief knowledge officer). For that reason, KM is important only to the extent that it improves an organization's capability and capacity to deal with, and develop in, the four dimensions of capturing, codifying, storing, and using knowledge. Knowledge that is more or less explicit can be embedded in procedures or represented in documents and databases and transferred with reasonable accuracy. Tacit knowledge transfer generally requires extensive personal contact. Take for example troubleshooting circuits. While troubleshooting can be procedural to an extent, it is still somewhat of an art that pulls from experience and training. This is the kind of tacit knowledge where partnerships, mentoring, or an apprenticeship, are most effective. The most successful organizations are those where knowledge management is part of everyone's job. Tacit, complex knowledge that is developed and internalized over a long period of time is almost impossible to reproduce in a document, database, or expert system. Even before the days of "core competencies", the learning organization, expert systems, and strategy focus, good managers valued the experience and know-how of employees. Today, many are recognizing that what is needed is more than a casual approach to corporate knowledge if they are to succeed. In addition, the aging population of the baby boomers may require means to capture their experience and knowledge before they leave the workforce. There is little doubt that knowledge is one of any organization's most important resources, or that knowledge workers' roles will grow in importance in the years ahead. 
Why would an organization believe that knowledge and knowledge workers are important, yet not advocate active management of knowledge itself? Taking advantage of already accumulated corporate intellectual property is by far the most low-cost way to increase capability and competitive stature. These are all good reasons why it might pay to take a look at your KM usage.

  3. Using knowledge translation as a framework for the design of a research protocol.

    PubMed

    Fredericks, Suzanne; Martorella, Géraldine; Catallo, Cristina

    2015-05-01

    Knowledge translation has been defined as the synthesis, dissemination, exchange and ethically sound application of knowledge to improve health, resulting in a stronger health-care system. Using KT activities to aid in the adoption of evidence into practice can address current health-care challenges such as increasing organizational practice standards, alleviating the risk for adverse events and meeting practitioner needs for evidence at the bedside. Two general forms of KT have been identified: integrated KT and end-of-grant KT. Integrated KT involves the knowledge users in the research team and in the majority of stages of the research process. End-of-grant KT relates to the translation of findings through a well-developed dissemination plan. This paper describes the process of using an integrated knowledge translation approach to design a research protocol that will examine the effectiveness of a web-based patient educational intervention. It begins with a description of integrated knowledge translation, followed by the presentation of a specific case example in which integrated knowledge translation is used to develop a nursing intervention. The major elements of integrated knowledge translation pertain to the need for a knowledge user who represents the broad target user group, who is knowledgeable in the area under investigation, and who has authority to enact changes to practice. Use of knowledge users as equal partners within the research team, exploring all feasible opportunities for knowledge exchange, and working with knowledge users to identify all outcomes related to knowledge translation are the other major elements of integrated knowledge translation that are addressed throughout this paper. Furthermore, the relevance of psychosocial or educational interventions to knowledge translation is also discussed as a source of knowledge. 
In summary, integrated knowledge translation is an important tool for the development of new interventions, as it helps to apply science to practice accurately. It supports the elaboration of the design while enhancing the relevance of the intervention through the validation of feasibility and acceptability with clinicians and patients. © 2015 Wiley Publishing Asia Pty Ltd.

  4. Effects of informed consent for individual genome sequencing on relevant knowledge.

    PubMed

    Kaphingst, K A; Facio, F M; Cheng, M-R; Brooks, S; Eidem, H; Linn, A; Biesecker, B B; Biesecker, L G

    2012-11-01

    Increasing availability of individual genomic information suggests that patients will need knowledge about genome sequencing to make informed decisions, but prior research is limited. In this study, we examined genome sequencing knowledge before and after informed consent among 311 participants enrolled in the ClinSeq™ sequencing study. An exploratory factor analysis of knowledge items yielded two factors (sequencing limitations knowledge; sequencing benefits knowledge). In multivariable analysis, high pre-consent sequencing limitations knowledge scores were significantly related to education [odds ratio (OR): 8.7, 95% confidence interval (CI): 2.45-31.10 for post-graduate education, and OR: 3.9; 95% CI: 1.05, 14.61 for college degree compared with less than college degree] and race/ethnicity (OR: 2.4, 95% CI: 1.09, 5.38 for non-Hispanic Whites compared with other racial/ethnic groups). Mean values increased significantly between pre- and post-consent for the sequencing limitations knowledge subscale (6.9-7.7, p < 0.0001) and sequencing benefits knowledge subscale (7.0-7.5, p < 0.0001); increase in knowledge did not differ by sociodemographic characteristics. This study highlights gaps in genome sequencing knowledge and underscores the need to target educational efforts toward participants with less education or from minority racial/ethnic groups. The informed consent process improved genome sequencing knowledge. Future studies could examine how genome sequencing knowledge influences informed decision making. © 2012 John Wiley & Sons A/S.

  5. A community of practice for knowledge translation trainees: an innovative approach for learning and collaboration.

    PubMed

    Urquhart, Robin; Cornelissen, Evelyn; Lal, Shalini; Colquhoun, Heather; Klein, Gail; Richmond, Sarah; Witteman, Holly O

    2013-01-01

    A growing number of researchers and trainees identify knowledge translation (KT) as their field of study or practice. Yet, KT educational and professional development opportunities and established KT networks remain relatively uncommon, making it challenging for trainees to develop the necessary skills, networks, and collaborations to optimally work in this area. The Knowledge Translation Trainee Collaborative is a trainee-initiated and trainee-led community of practice established by junior knowledge translation researchers and practitioners to: examine the diversity of knowledge translation research and practice, build networks with other knowledge translation trainees, and advance the field through knowledge generation activities. In this article, we describe how the collaborative serves as an innovative community of practice for continuing education and professional development in knowledge translation and present a logic model that provides a framework for designing an evaluation of its impact as a community of practice. The expectation is that formal and informal networking will lead to knowledge sharing and knowledge generation opportunities that improve individual members' competencies (eg, combination of skills, abilities, and knowledge) in knowledge translation research and practice and contribute to the development and advancement of the knowledge translation field. Copyright © 2013 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.

  6. A Systematic Review of Athletes’ and Coaches’ Nutrition Knowledge and Reflections on the Quality of Current Nutrition Knowledge Measures

    PubMed Central

    Trakman, Gina L.; Forsyth, Adrienne; Devlin, Brooke L.; Belski, Regina

    2016-01-01

    Context: Nutrition knowledge can influence dietary choices and impact athletic performance. Valid and reliable measures are needed to assess the nutrition knowledge of athletes and coaches. Objectives: (1) To systematically review the published literature on nutrition knowledge of adult athletes and coaches and (2) to assess the quality of measures used to assess nutrition knowledge. Data Sources: MEDLINE, CINAHL, SPORTDiscus, Web of Science, and SCOPUS. Study Selection: 36 studies that provided a quantitative measure of nutrition knowledge and described the measurement tool that was used were included. Data Extraction: Participant description, questionnaire description, results (mean correct and responses to individual items), study quality, and questionnaire quality. Data Synthesis: All studies were of neutral quality. Tools used to measure knowledge did not consider health literacy, were outdated with regard to consensus recommendations, and lacked appropriate and adequate validation. The current status of nutrition knowledge in athletes and coaches is difficult to ascertain. Gaps in knowledge also remain unclear, but it is likely that energy density, the need for supplementation, and the role of protein are frequently misunderstood. Conclusions: Previous reports of nutrition knowledge need to be interpreted with caution. A new, universal, up-to-date, validated measure of general and sports nutrition knowledge is required to allow for assessment of nutrition knowledge. PMID:27649242

  7. An exploratory analysis of the nature of informal knowledge underlying theories of planned action used for public health oriented knowledge translation.

    PubMed

    Kothari, Anita; Boyko, Jennifer A; Campbell-Davison, Andrea

    2015-09-09

    Informal knowledge is used in public health practice to make sense of research findings. Although knowledge translation theories highlight the importance of informal knowledge, it is not clear to what extent the same literature provides guidance in terms of how to use it in practice. The objective of this study was to address this gap by exploring what planned action theories suggest in terms of using three types of informal knowledge: local, experiential and expert. We carried out an exploratory secondary analysis of the planned action theories that informed the development of a popular knowledge translation theory. Our sample included twenty-nine (n = 29) papers. We extracted information from these papers about sources of and guidance for using informal knowledge, and then carried out a thematic analysis. We found that theories of planned action provide guidance (including sources of, methods for identifying, and suggestions for use) for using local, experiential and expert knowledge. This study builds on previous knowledge translation related work to provide insight into the practical use of informal knowledge. Public health practitioners can refer to the guidance summarized in this paper to inform their decision-making. Further research about how to use informal knowledge in public health practice is needed given the value being accorded to using informal knowledge in public health decision-making processes.

  8. Knowledge Creation in Constructivist Learning

    ERIC Educational Resources Information Center

    Jaleel, Sajna; Verghis, Alie Molly

    2015-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Constructivism holds that learners learn actively…

  9. Knowledge Management in Pursuit of Performance: The Challenge of Context.

    ERIC Educational Resources Information Center

    Degler, Duane; Battle, Lisa

    2000-01-01

    Discusses the integration of knowledge management into business applications. Topics include the difference between knowledge and information; performance-centered design (PCD); applying knowledge to support business outcomes, including context, experience, and information quality; techniques for merging PCD and knowledge management, including…

  10. A Biological Conception of Knowledge: One Problematic Consequence.

    ERIC Educational Resources Information Center

    Haroutunian, Sophie

    1980-01-01

    Piaget's use of the equilibrium model to define knowledge results in a cybernetic conception of knowledge that cannot explain how knowledge becomes possible. The knowledge that behaviors apply discriminately must be acquired, and cannot be programmed, and therefore cannot be learned. (FG)

  11. Eliciting and Representing High-Level Knowledge Requirements to Discover Ecological Knowledge in Flower-Visiting Data

    PubMed Central

    2016-01-01

    Observations of individual organisms (data) can be combined with expert ecological knowledge of species, especially causal knowledge, to model and extract from flower-visiting data useful information about behavioral interactions between insect and plant organisms, such as nectar foraging and pollen transfer. We describe and evaluate a method to elicit and represent such expert causal knowledge of behavioral ecology, and discuss the potential for wider application of this method to the design of knowledge-based systems for knowledge discovery in biodiversity and ecosystem informatics. PMID:27851814

  12. Making Teamwork Work: Team Knowledge for Team Effectiveness.

    PubMed

    Guchait, Priyanko; Lei, Puiwa; Tews, Michael J

    2016-01-01

    This study examined the impact of two types of team knowledge on team effectiveness. The study assessed the impact of taskwork knowledge and teamwork knowledge on team satisfaction and performance. A longitudinal study was conducted with 27 service-management teams involving 178 students in a real-life restaurant setting. Teamwork knowledge was found to impact both team outcomes. Furthermore, team learning behavior was found to mediate the relationships between teamwork knowledge and team outcomes. Educators and managers should therefore ensure these types of knowledge are developed in teams along with learning behavior for maximum effectiveness.

  13. Emotion Knowledge in Young Neglected Children

    PubMed Central

    Sullivan, Margaret W.; Bennett, David S.; Carpenter, Kim; Lewis, Michael

    2013-01-01

    Young neglected children may be at risk for emotion knowledge deficits. Children with histories of neglect or with no maltreatment were initially seen at age 4 and again 1 year later to assess their emotion knowledge. Higher IQ was associated with better emotion knowledge, but neglected children had consistently poorer emotion knowledge over time compared to non-neglected children after controlling for IQ. Because both neglected status and IQ may contribute to deficits in emotional knowledge, both should be assessed when evaluating these children to appropriately design and pace emotion knowledge interventions. PMID:18299632

  14. Emotion knowledge in young neglected children.

    PubMed

    Sullivan, Margaret W; Bennett, David S; Carpenter, Kim; Lewis, Michael

    2008-08-01

    Young neglected children may be at risk for emotion knowledge deficits. Children with histories of neglect or with no maltreatment were initially seen at age 4 and again 1 year later to assess their emotion knowledge. Higher IQ was associated with better emotion knowledge, but neglected children had consistently poorer emotion knowledge over time compared to non-neglected children after controlling for IQ. Because both neglected status and IQ may contribute to deficits in emotional knowledge, both should be assessed when evaluating these children to appropriately design and pace emotion knowledge interventions.

  15. Concept Formation in Scientific Knowledge Discovery from a Constructivist View

    NASA Astrophysics Data System (ADS)

    Peng, Wei; Gero, John S.

    The central goal of scientific knowledge discovery is to learn cause-effect relationships among natural phenomena presented as variables and the consequences of their interactions. Scientific knowledge is normally expressed as scientific taxonomies and qualitative and quantitative laws [1]. This type of knowledge represents intrinsic regularities of the observed phenomena that can be used to explain and predict behaviors of the phenomena. It is a generalization that is abstracted and externalized from a set of contexts and applicable to a broader scope. Scientific knowledge is a type of third-person knowledge, i.e., knowledge that is independent of a specific enquirer. Artificial intelligence approaches, particularly data mining algorithms that are used to identify meaningful patterns from large data sets, are approaches that aim to facilitate the knowledge discovery process [2]. A broad spectrum of algorithms has been developed to address classification, associative learning, and clustering problems. However, their linkages to the people who use them have not been adequately explored. Issues in relation to supporting the interpretation of the patterns, applying prior knowledge to the data mining process, and addressing user interactions remain challenges for building knowledge discovery tools [3]. As a consequence, scientists rely on their experience to formulate problems, evaluate hypotheses, reason about untraceable factors, and derive new problems. This type of knowledge, which they have developed during their careers, is called “first-person” knowledge. The formation of scientific knowledge (third-person knowledge) is highly influenced by the enquirer’s first-person knowledge construct, which is a result of his or her interactions with the environment. There have been attempts to craft automatic knowledge discovery tools, but these systems are limited in their capabilities to handle the dynamics of personal experience.
There are now trends in developing approaches to assist scientists in applying their expertise to model formation, simulation, and prediction in various domains [4], [5]. On the other hand, first-person knowledge becomes third-person theory only if it proves to be general by evidence and is acknowledged by a scientific community. Researchers have started to focus on building interactive cooperation platforms [1] to accommodate different views into the knowledge discovery process. There are some fundamental questions in relation to scientific knowledge development. What are the major components for knowledge construction, and how do people construct their knowledge? How is this personal construct assimilated and accommodated into a scientific paradigm? How can one design a computational system to facilitate these processes? This chapter does not attempt to answer all these questions but serves as a basis to foster thinking along these lines. A brief literature review about how people develop their knowledge is carried out through a constructivist view. A hydrological modeling scenario is presented to elucidate the approach.

  17. Critical inquiry and knowledge translation: exploring compatibilities and tensions

    PubMed Central

    Reimer-Kirkham, Sheryl; Varcoe, Colleen; Browne, Annette J.; Lynam, M. Judith; Khan, Koushambhi Basu; McDonald, Heather

    2016-01-01

    Knowledge translation has been widely taken up as an innovative process to facilitate the uptake of research-derived knowledge into health care services. Drawing on a recent research project, we engage in a philosophic examination of how knowledge translation might serve as a vehicle for the transfer of critically oriented knowledge regarding social justice, health inequities, and cultural safety into clinical practice. Through an explication of what might be considered disparate traditions (those of critical inquiry and knowledge translation), we identify compatibilities and discrepancies both within the critical tradition, and between critical inquiry and knowledge translation. The ontological and epistemological origins of the knowledge to be translated carry implications for the synthesis and translation phases of knowledge translation. In our case, the studies we synthesized were informed by various critical perspectives and hence we needed to reconcile differences that exist within the critical tradition. A review of the history of critical inquiry served to articulate the nature of these differences while identifying common purposes around which to strategically coalesce. Other challenges arise when knowledge translation and critical inquiry are brought together. Critique is one of the hallmark methods of critical inquiry and, yet, the engagement required for knowledge translation between researchers and health care administrators, practitioners, and other stakeholders makes an antagonistic stance of critique problematic. While knowledge translation offers expanded views of evidence and the complex processes of knowledge exchange, we have been alerted to the continual pull toward epistemologies and methods reminiscent of the positivist paradigm by their instrumental views of knowledge and assumptions of objectivity and political neutrality.
These types of tensions have been productive for us as a research team in prompting a critical reconceptualization of knowledge translation. PMID:19527437

  18. Critical inquiry and knowledge translation: exploring compatibilities and tensions.

    PubMed

    Reimer-Kirkham, Sheryl; Varcoe, Colleen; Browne, Annette J; Lynam, M Judith; Khan, Koushambhi Basu; McDonald, Heather

    2009-07-01

    Knowledge translation has been widely taken up as an innovative process to facilitate the uptake of research-derived knowledge into health care services. Drawing on a recent research project, we engage in a philosophic examination of how knowledge translation might serve as a vehicle for the transfer of critically oriented knowledge regarding social justice, health inequities, and cultural safety into clinical practice. Through an explication of what might be considered disparate traditions (those of critical inquiry and knowledge translation), we identify compatibilities and discrepancies both within the critical tradition, and between critical inquiry and knowledge translation. The ontological and epistemological origins of the knowledge to be translated carry implications for the synthesis and translation phases of knowledge translation. In our case, the studies we synthesized were informed by various critical perspectives and hence we needed to reconcile differences that exist within the critical tradition. A review of the history of critical inquiry served to articulate the nature of these differences while identifying common purposes around which to strategically coalesce. Other challenges arise when knowledge translation and critical inquiry are brought together. Critique is one of the hallmark methods of critical inquiry and, yet, the engagement required for knowledge translation between researchers and health care administrators, practitioners, and other stakeholders makes an antagonistic stance of critique problematic. While knowledge translation offers expanded views of evidence and the complex processes of knowledge exchange, we have been alerted to the continual pull toward epistemologies and methods reminiscent of the positivist paradigm by their instrumental views of knowledge and assumptions of objectivity and political neutrality.
These types of tensions have been productive for us as a research team in prompting a critical reconceptualization of knowledge translation.

  19. Examining Challenges Related to the Production of Actionable Climate Knowledge for Adaptation Decision-Making: A Focus on Climate Knowledge System Producers

    NASA Astrophysics Data System (ADS)

    Ernst, K.; Preston, B. L.; Tenggren, S.; Klein, R.; Gerger-Swartling, Å.

    2017-12-01

    Many challenges to adaptation decision-making and action have been identified across peer-reviewed and gray literature. These challenges have primarily focused on the use of climate knowledge for adaptation decision-making, the process of adaptation decision-making, and the needs of the decision-maker. Studies on climate change knowledge systems often discuss the imperative role of climate knowledge producers in adaptation decision-making processes and stress the need for producers to engage in knowledge co-production activities and to more effectively meet decision-maker needs. While the influence of climate knowledge producers on the co-production of science for adaptation decision-making is well-recognized, hardly any research has taken a direct approach to analyzing the challenges that climate knowledge producers face when undertaking science co-production. Those challenges can influence the process of knowledge production and may hinder the creation, utilization, and dissemination of actionable knowledge for adaptation decision-making. This study involves semi-structured interviews, focus groups, and participant observations to analyze, identify, and contextualize the challenges that climate knowledge producers in Sweden face as they endeavor to create effective climate knowledge systems for multiple contexts, scales, and levels across the European Union. Preliminary findings identify complex challenges related to education, training, and support; motivation, willingness, and culture; varying levels of prioritization; professional roles and responsibilities; the type and amount of resources available; and professional incentive structures. These challenges exist at varying scales and levels across individuals, organizations, networks, institutions, and disciplines. This study suggests that the creation of actionable knowledge for adaptation decision-making is not supported across scales and levels in the climate knowledge production landscape. 
Additionally, enabling the production of actionable knowledge for adaptation decision-making requires multi-level effort beyond the individual level.

  20. Comparing Children with ASD and Their Peers' Growth in Print Knowledge.

    PubMed

    Dynia, Jaclyn M; Brock, Matthew E; Logan, Jessica A R; Justice, Laura M; Kaderavek, Joan N

    2016-07-01

    Many children with autism spectrum disorder (ASD) struggle with reading. An increased focus on emergent literacy skills, particularly print knowledge, might improve later reading outcomes. We analyzed longitudinal measures of print knowledge (i.e., alphabet knowledge and print-concept knowledge) for 35 preschoolers with ASD relative to a sample of 35 typically developing peers. Through multilevel growth curve analysis, we found that relative to their peers, children with ASD had comparable alphabet knowledge, lower print-concept knowledge, and acquired both skills at a similar rate. These findings suggest that children with ASD are unlikely to acquire print-concept knowledge commensurate to their peers without an increased emphasis on high-quality instruction that targets this skill.

  1. A relational data-knowledge base system and its potential in developing a distributed data-knowledge system

    NASA Technical Reports Server (NTRS)

    Rahimian, Eric N.; Graves, Sara J.

    1988-01-01

    A new approach used in constructing a relational data knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.
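
    The fragmentation and fragmentation-transparency properties mentioned in this abstract can be sketched as follows. The relation, the region-based split, and the query predicate are hypothetical illustrations, not details from the paper.

```python
# Minimal sketch of horizontal fragmentation with fragmentation
# transparency. The relation, fragments, and split predicate are
# hypothetical illustrations, not the paper's actual system.

# One logical relation stored as two fragments, split by region.
FRAGMENT_EAST = [
    {"id": 1, "region": "east", "value": 10},
    {"id": 2, "region": "east", "value": 20},
]
FRAGMENT_WEST = [
    {"id": 3, "region": "west", "value": 30},
]

def query(predicate):
    """Query the logical relation; the union over fragments is
    hidden from callers (fragmentation transparency)."""
    return [row
            for fragment in (FRAGMENT_EAST, FRAGMENT_WEST)
            for row in fragment
            if predicate(row)]

# Callers never name a fragment, so fragments could be moved to
# different sites without changing any query.
rows = query(lambda r: r["value"] >= 20)
print([r["id"] for r in rows])  # → [2, 3]
```

    Because queries address only the logical relation, redistributing fragments across nodes is invisible to clients, which is the property that makes the relational model attractive for a distributed data-knowledge base.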

  2. Neuro-Fuzzy Support of Knowledge Management in Social Regulation

    NASA Astrophysics Data System (ADS)

    Petrovic-Lazarevic, Sonja; Coghill, Ken; Abraham, Ajith

    2002-09-01

    The aim of the paper is to demonstrate the neuro-fuzzy support of knowledge management in social regulation. Knowledge could be understood for social regulation purposes as explicit and tacit. Explicit knowledge relates to the community culture, indicating how things work in the community based on social policies and procedures. Tacit knowledge is the ethics and norms of the community. The former can be codified, stored, and transferred in order to support decision making, while the latter, being based on personal knowledge, experience, and judgments, is difficult to codify and store. Tacit knowledge expressed through linguistic information can be stored and used to support knowledge management in social regulation through the application of fuzzy and neuro-fuzzy logic.
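
    The idea of storing tacit knowledge expressed as linguistic information can be sketched with ordinary fuzzy membership functions. The linguistic variable, term names, and breakpoints below are illustrative assumptions, not the authors' model.

```python
# Minimal sketch of encoding a linguistic (tacit) notion with fuzzy
# membership functions. Variable, terms, and breakpoints are
# illustrative assumptions, not from the paper.

def triangular(a, b, c):
    """Triangular membership function with support (a, c), peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Linguistic terms for "community norm compliance" on a 0-10 scale.
terms = {
    "weak": triangular(-1, 0, 5),
    "moderate": triangular(2, 5, 8),
    "strong": triangular(5, 10, 11),
}

# A single observation belongs to several terms to varying degrees,
# which is how vague linguistic judgments become storable values.
x = 6.0
degrees = {name: round(mu(x), 2) for name, mu in terms.items()}
print(degrees)  # → {'weak': 0.0, 'moderate': 0.67, 'strong': 0.2}
```

    In a neuro-fuzzy system, the breakpoints of such membership functions would additionally be tuned from data by a neural network rather than fixed by hand.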

  3. How do we Remain Us in a Time of Change: Culture and Knowledge Management at NASA

    NASA Technical Reports Server (NTRS)

    Linde, Charlotte

    2003-01-01

    This viewgraph representation presents an overview of findings of a NASA agency-wide Knowledge Management Team considering culture and knowledge management issues at the agency. Specific issues identified by the team include: (1) NASA must move from being a knowledge hoarding culture to a knowledge sharing culture; (2) NASA must move from being center focused to being Agency focused; (3) NASA must capture the knowledge of a departing workforce. Topics considered include: what NASA must know to remain NASA, what the previous forms of knowledge reproduction were and how technological innovations have changed these systems, and how changes in funding and in relationships between contractors and NASA have affected knowledge reproduction.

  4. A bilateral integrative health-care knowledge service mechanism based on 'MedGrid'.

    PubMed

    Liu, Chao; Jiang, Zuhua; Zhen, Lu; Su, Hai

    2008-04-01

    Current health-care organizations are encountering a paucity of medical knowledge. This paper classifies medical knowledge with new scopes. The discovery of a health-care 'knowledge flow' initiates a bilateral integrative health-care knowledge service, and we make medical knowledge 'flow' around and gain comprehensive effectiveness through six operations (such as knowledge refreshing...). Seizing on the active demand of the Chinese health-care revolution, this paper presents 'MedGrid', a platform with medical ontology and knowledge content services. Each level and its detailed contents are described in the MedGrid info-structure. Moreover, a new diagnosis and treatment mechanism is formed by technically connecting with electronic health-care records (EHRs).

  5. Description of pedagogical content knowledge (PCK) and content knowledge on Muhammadiyah Semarang University's preservice teacher

    NASA Astrophysics Data System (ADS)

    Astuti, Andari Puji; Wijayatiningsih, Testiana Deni; Azis, Abdul; Sumarti, Sri Susilogati; Barati, Dwi Anggani Linggar

    2017-12-01

    One of the competencies teachers must master under the constitution is pedagogic competence. This study aims to provide an overview of the pedagogic competence of preservice teachers through their mastery of Pedagogical Content Knowledge (PCK) and Content Knowledge (CK). The research method used is descriptive qualitative, with data collected through essay tests, questionnaires, and interviews. The results showed that of the five PCK indicators, only knowledge of learning strategies to teach chemistry was in the high category. For Content Knowledge, preservice teachers are in the middle category for the indicator of knowledge of disciplinary content, whereas knowledge that alternative frameworks for thinking about the content exist and knowledge of the relationship between big ideas and supporting ideas in a content area are in the fair category.

  6. Personal Knowledge Management for Employee Commoditization

    ERIC Educational Resources Information Center

    Schild, Susie A.

    2013-01-01

    Knowledge management thinking has resulted in the perception that the organization is the relevant beneficiary of knowledge. Individual approaches to and experiences with personal knowledge management are not well documented in empirical studies, which uncovered the specific problem that the situatedness of knowledge worker contemporaries within…

  7. 49 CFR 383.111 - Required knowledge.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 5 2011-10-01 2011-10-01 false Required knowledge. 383.111 Section 383.111... STANDARDS; REQUIREMENTS AND PENALTIES Required Knowledge and Skills § 383.111 Required knowledge. (a) All CMV operators must have knowledge of the following 20 general areas: (1) Safe operations regulations...

  8. 49 CFR 383.111 - Required knowledge.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Required knowledge. 383.111 Section 383.111... STANDARDS; REQUIREMENTS AND PENALTIES Required Knowledge and Skills § 383.111 Required knowledge. All commercial motor vehicle operators must have knowledge of the following general areas: (a) Safe operations...

  9. Knowledge-Sparse and Knowledge-Rich Learning in Information Retrieval.

    ERIC Educational Resources Information Center

    Rada, Roy

    1987-01-01

    Reviews aspects of the relationship between machine learning and information retrieval. Highlights include learning programs that extend from knowledge-sparse learning to knowledge-rich learning; the role of the thesaurus; knowledge bases; artificial intelligence; weighting documents; word frequency; and merging classification structures. (78…

  10. Knowledge Searching and Sharing on Virtual Networks.

    ERIC Educational Resources Information Center

    Helokunnas, Tuija; Herrala, Juha

    2001-01-01

    Describes searching and sharing of knowledge on virtual networks, based on experiences gained when hosting virtual knowledge networks at Tampere University of Technology in Finland. Discusses information and knowledge management studies; role of information technology in knowledge searching and sharing; implementation and experiences of the…

  11. Knowledge at Work.

    ERIC Educational Resources Information Center

    Scribner, Sylvia

    1985-01-01

    Activity theory posits that culturally organized actions guide the acquisition and organization of knowledge. This theory was applied to the organization of knowledge within a large milk processing plant. The dairy was found to be organized by social knowledge, yet individuals creatively synthesized several domains of knowledge to organize their…

  12. Teacher Subject Matter Knowledge of Number Sense

    ERIC Educational Resources Information Center

    Briand-Newman, Hannah; Wong, Monica; Evans, David

    2012-01-01

    Pedagogical content knowledge has been widely acknowledged by researchers and practitioners as a significant factor for improving student knowledge, understanding and achievement. Recently, the knowledge teachers need for teaching has expanded to include teacher horizon content knowledge, "an awareness of how mathematical topics are related…

  13. Investigating the Knowledge Management Culture

    ERIC Educational Resources Information Center

    Stylianou, Vasso; Savva, Andreas

    2016-01-01

    Knowledge Management (KM) efforts aim at leveraging an organization into a knowledge organization thereby presenting knowledge employees with a very powerful tool; organized valuable knowledge accessible when and where needed in flexible, technologically-enhanced modes. The attainment of this aim, i.e., the transformation into a knowledge…

  14. Early Predictors of Middle School Fraction Knowledge

    ERIC Educational Resources Information Center

    Bailey, Drew H.; Siegler, Robert S.; Geary, David C.

    2014-01-01

    Recent findings that earlier fraction knowledge predicts later mathematics achievement raise the question of what predicts later fraction knowledge. Analyses of longitudinal data indicated that whole number magnitude knowledge in first grade predicted knowledge of fraction magnitudes in middle school, controlling for whole number arithmetic…

  15. The Framework of Knowledge Creation for Online Learning Environments

    ERIC Educational Resources Information Center

    Huang, Hsiu-Mei; Liaw, Shu-Sheng

    2004-01-01

    In today's competitive global economy characterized by knowledge acquisition, the concept of knowledge management has become increasingly prevalent in academic and business practices. Knowledge creation is an important factor and remains a source of competitive advantage over knowledge management. Information technology facilitates knowledge…

  16. Curriculum Design and Epistemic Ascent

    ERIC Educational Resources Information Center

    Winch, Christopher

    2013-01-01

    Three kinds of knowledge usually recognised by epistemologists are identified and their relevance for curriculum design is discussed. These are: propositional knowledge, know-how and knowledge by acquaintance. The inferential nature of propositional knowledge is argued for and it is suggested that propositional knowledge in fact presupposes the…

  17. Athletic Trainers' Knowledge Regarding Airway Adjuncts

    ERIC Educational Resources Information Center

    Edler, Jessica R.; Eberman, Lindsey E.; Kahanov, Leamor; Roman, Christopher; Mata, Heather Lynne

    2015-01-01

    Context: Research suggests that knowledge gaps regarding the appropriate use of airway adjuncts exist among various health care practitioners, and that knowledge is especially limited within athletic training. Objective: To determine the relationship between perceived knowledge (PK) and actual knowledge (AK) of airway adjunct use and the…

  18. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    ERIC Educational Resources Information Center

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  19. Parental nutrition knowledge and attitudes as predictors of 5-6-year-old children's healthy food knowledge.

    PubMed

    Zarnowiecki, Dorota; Sinn, Natalie; Petkov, John; Dollman, James

    2012-07-01

    Young children's knowledge about healthy food may influence the formation of their eating behaviours, and parents have a major influence on the development of children's knowledge in the early years. We investigated the extent to which parental nutrition knowledge and attitudes around food predicted young children's knowledge of healthy foods, controlling for other influences such as socio-economic status (SES) and parent education levels in a cross-sectional research design. Children were given a healthy food knowledge activity and parents completed questionnaires. The study was conducted in twenty primary schools in Adelaide, Australia, stratified by SES. We recruited 192 children aged 5-6 years and their parents. Structural equation modelling showed that parent nutrition knowledge predicted children's nutrition knowledge (r = 0.30, P < 0.001) independently of attitudes, SES and education level. Nutrition education for parents, targeted at low-SES areas at higher risk for obesity, may contribute to the development of healthy food knowledge in young children.

  20. Towards Knowledge Management for Smart Manufacturing.

    PubMed

    Feng, Shaw C; Bernstein, William Z; Hedberg, Thomas; Feeney, Allison Barnard

    2017-09-01

    The need for capturing knowledge in the digital form in design, process planning, production, and inspection has increasingly become an issue in manufacturing industries as the variety and complexity of product lifecycle applications increase. Both knowledge and data need to be well managed for quality assurance, lifecycle-impact assessment, and design improvement. Some technical barriers exist today that inhibit industry from fully utilizing design, planning, processing, and inspection knowledge. The primary barrier is a lack of a well-accepted mechanism that enables users to integrate data and knowledge. This paper prescribes knowledge management to address a lack of mechanisms for integrating, sharing, and updating domain-specific knowledge in smart manufacturing. Aspects of the knowledge constructs include conceptual design, detailed design, process planning, material property, production, and inspection. The main contribution of this paper is to provide a methodology on what knowledge manufacturing organizations access, update, and archive in the context of smart manufacturing. The case study in this paper provides some example knowledge objects to enable smart manufacturing.