Proposed Conceptual Requirements for the CTBT Knowledge Base,
1995-08-14
knowledge available to automated processing routines and human analysts are significant, and solving these problems is an essential step in ensuring...knowledge storage in a CTBT system. In addition to providing regional knowledge to automated processing routines, the knowledge base will also address
Summary report of the workshop on the U.S. use of surface waves for monitoring the CTBT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritzwoller, M; Walter, W R
1998-09-01
The workshop addressed the following general research goals of relevance to monitoring and verifying the Comprehensive Test Ban Treaty (CTBT): A) To apprise participants of current and planned research in order to facilitate information exchange, collaboration, and peer review. B) To compare and discuss techniques for data selection, measurement, error assessment, modeling methodologies, etc.; to compare results in regions where they overlap and understand the causes of observed differences. C) To hear about the U.S. research customers' (AFTAC and the DOE Knowledge Base) current and anticipated interests in surface wave research. D) To discuss information flow and integration: how can research results be prepared for efficient use and integration into operational systems? E) To identify and discuss fruitful future directions for research.
Policy issues facing the Comprehensive Test Ban Treaty and prospects for the future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweeney, J.
1999-04-01
This report is divided into the following 5 sections: (1) Background; (2) Major Issues Facing Ratification of CTBT; (3) Current Status on CTBT Ratification; (4) Status of CTBT Signatories and Ratifiers; and (5) CTBT Activities Not Prohibited. The major issues facing ratification of CTBT discussed here are: impact on CTBT of START II and ABM ratification; impact of India and Pakistan nuclear tests; CTBT entry into force; and establishment of the Comprehensive Nuclear Test-Ban Treaty Organization.
NASA Astrophysics Data System (ADS)
Rashid, F. I. A.; Zolkaffly, M. Z.; Jamal, N.
2018-01-01
In order to keep abreast of issues related to the CTBT in Malaysia, the Malaysian Nuclear Agency (Nuklear Malaysia), as the CTBT National Authority in Malaysia, has collaborated with local partners to implement various stakeholder engagement programmes. This paper aims at highlighting Malaysia's approach to promoting the CTBT through stakeholder engagement programmes targeted at multilevel stakeholders, both national and international. Such programmes include participation in international forums, inter-agency meetings, awareness seminars, training courses, technical visits to IMS stations, promoting civil and scientific applications of International Monitoring System (IMS) data and International Data Centre (IDC) products using the Virtual Data Exploitation Center (vDEC), inviting youth groups to participate in the CTBTO Youth Group, and publications on CTBT-related topics. This approach has successfully fortified Malaysia's commitments at the international level, enhanced national awareness of the global multilateral framework, increased stakeholders' awareness of their roles related to the CTBT, and built domestic capacity on CTBT matters. In conclusion, stakeholder engagement is crucial in promoting and enhancing stakeholders' understanding of the CTBT. Continuous engagement with relevant stakeholders will enable effective dissemination and smooth implementation of CTBT-related matters and will eventually support the global universalization of the CTBT.
DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,
1995-08-14
processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated
On-Site Inspection RadioIsotopic Spectroscopy (Osiris) System Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caffrey, Gus J.; Egger, Ann E.; Krebs, Kenneth M.
2015-09-01
We have designed and tested hardware and software for the acquisition and analysis of high-resolution gamma-ray spectra during on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The On-Site Inspection RadioIsotopic Spectroscopy—Osiris—software filters the spectral data to display only radioisotopic information relevant to CTBT on-site inspections, e.g., 132I. A set of over 100 fission-product spectra was employed for Osiris testing. These spectra were measured, where possible, or generated by modeling. The synthetic test spectral compositions include non-nuclear-explosion scenarios, e.g., a severe nuclear reactor accident, and nuclear-explosion scenarios such as a vented underground nuclear test. Comparing its computer-based analyses to expert visual analyses of the test spectra, Osiris correctly identifies CTBT-relevant fission-product isotopes at the 95% level or better. The Osiris gamma-ray spectrometer is a mechanically cooled, battery-powered ORTEC Transpec-100, chosen to avoid the need for liquid nitrogen during on-site inspections. The spectrometer was used successfully during the 2014 CTBT Integrated Field Exercise in Jordan. The spectrometer is controlled, and the spectral data analyzed, by a Panasonic Toughbook notebook computer. To date, software development has been the main focus of the Osiris project. In FY2016-17, we plan to modify the Osiris hardware, integrate the Osiris software and hardware, and conduct rigorous field tests to ensure that the Osiris system will function correctly during CTBT on-site inspections. The planned development will raise Osiris to technology readiness level TRL-8, transfer the Osiris technology to a commercial manufacturer, and demonstrate Osiris to potential CTBT on-site inspectors.
NASA Astrophysics Data System (ADS)
Ringbom, A.
2010-12-01
A detailed knowledge of both the spatial and isotopic distribution of anthropogenic radioxenon is essential in investigations of the performance of the radioxenon part of the IMS, as well as in the development of techniques to discriminate radioxenon signatures of a nuclear explosion from other sources. Further, the production processes in the facilities causing the radioxenon background have to be understood and be compatible with simulations. In this work, several aspects of the observed atmospheric radioxenon background are investigated, including the global distribution as well as the current understanding of the observed isotopic ratios. Analyzed radioxenon data from the IMS, as well as from other measurement stations, are used to create an up-to-date description of the global radioxenon background, including all four CTBT-relevant xenon isotopes (133Xe, 131mXe, 133mXe, and 135Xe). In addition, measured isotopic ratios will be compared to simulations of neutron-induced fission of 235U, and the uncertainties will be discussed. Finally, the impact of the radioxenon background on the detection capability of the IMS will be investigated. This work is a continuation of studies [1,2] that were presented at the International Scientific Studies conference held in Vienna in 2009. [1] A. Ringbom, et al., "Characterization of the global distribution of atmospheric radioxenons", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009. [2] R. D'Amours and A. Ringbom, "A study on the global detection capability of IMS for all CTBT relevant xenon isotopes", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009.
A high-efficiency HPGe coincidence system for environmental analysis.
Britton, R; Davies, A V; Burnett, J L; Jackson, M J
2015-08-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is supported by a network of certified laboratories which must meet certain sensitivity requirements for CTBT-relevant radionuclides. At the UK CTBT Radionuclide Laboratory (GBL15), a high-efficiency, dual-detector gamma spectroscopy system has been developed to improve the sensitivity of measurements for treaty compliance, greatly reducing the time required for each sample. Utilising list-mode acquisition, each sample can be counted once and processed multiple times to further improve sensitivity. For the 8 key radionuclides considered, Minimum Detectable Activities (MDAs) were improved by up to 37% in standard mode (when compared to a typical CTBT detector system), with the acquisition time required to achieve the CTBT sensitivity requirements reduced from 6 days to only 3. When utilising the system in coincidence mode, the MDA for 60Co in a high-activity source was improved by a factor of 34 when compared to a standard CTBT detector, and a factor of 17 when compared to the dual-detector system operating in standard mode. These MDA improvements will allow the accurate and timely quantification of radionuclides that decay via both singular and cascade γ emission, greatly enhancing the effectiveness of CTBT laboratories.
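The MDA gains reported above follow from how the Currie detection limit scales with background counts. A minimal sketch of that scaling, with efficiency, branching-ratio, background, and count-time values that are illustrative assumptions rather than figures from the paper:

```python
import math

def currie_mda_bq(background_counts, efficiency, branching_ratio, live_time_s):
    # Currie (1968) detection limit in counts, converted to activity in Bq.
    ld = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld / (efficiency * branching_ratio * live_time_s)

t = 3 * 24 * 3600.0   # a 3-day count
# Hypothetical singles vs. coincidence-mode backgrounds and efficiencies:
print(currie_mda_bq(5000.0, efficiency=0.05, branching_ratio=0.999, live_time_s=t))
print(currie_mda_bq(50.0, efficiency=0.02, branching_ratio=0.999, live_time_s=t))
```

Even with the lower coincidence efficiency assumed here, the hundredfold-smaller background term dominates, which is the mechanism behind the factor-of-tens MDA improvements the abstract reports.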
Technical Issues Related to the Comprehensive Nuclear Test Ban Treaty
NASA Astrophysics Data System (ADS)
Garwin, Richard L.
2003-04-01
The National Academy of Sciences recently published a detailed study of technical factors related to the Comprehensive Nuclear Test Ban Treaty (CTBT), with emphasis on those issues that arose when the Senate declined to ratify the Treaty in 1999. The study considered (1) the capacity of the United States to maintain confidence in the safety and reliability of its nuclear weapons without nuclear testing; (2) the capabilities of the international nuclear-test monitoring system; and (3) the advances in nuclear weapons capabilities that other countries might make through low-yield testing that might escape detection. Excluding political factors, the committee considered three possible future worlds: (1) a world without a CTBT; (2) a world in which the signatories comply with a CTBT; and (3) a world in which the signatories evade its strictures within the limits set by the detection system. The talk and ensuing discussion will elaborate on the study. The principal conclusion of the report, based solely on technical reasons, is that the national security of the United States is better served with a CTBT in force than without it, whether or not other signatories conduct low level but undetected tests in violation of the treaty. Moreover, the study finds that nuclear testing would not add substantially to the US Stockpile Stewardship Program in allowing the United States to maintain confidence in the assessment of its existing nuclear weapons.
Comprehensive Test Ban Treaty (CTBT): Current Status
NASA Astrophysics Data System (ADS)
Chaturvedi, Ram
2003-04-01
After an effort of nearly half a century, the CTBT was approved by the U.N. on September 10, 1996. Out of 185 member nations (at the time), 158 voted in favor, 3 against, and the remaining either abstained or were diplomatically absent. In spite of such overwhelming support from the international community, the CTBT may well remain on paper. The reason is that one of the opposing nations, India, is considered a "threshold nuclear nation" whose approval is required for the treaty to enter into force under the rules of the Conference on Disarmament (CD). India's U.N. representative said that her country would "never sign this unequal treaty, not now, not later." "Unequal" because it does not provide a timetable for the elimination of existing nuclear weapons, the testing of weapons, etc., which favors the nuclear states. This paper will provide details of the above issues and the current status of the CTBT.
Technology Innovation for the CTBT, the National Laboratory Contribution
NASA Astrophysics Data System (ADS)
Goldstein, W. H.
2016-12-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.
Cosmic veto gamma-spectrometry for Comprehensive Nuclear-Test-Ban Treaty samples
NASA Astrophysics Data System (ADS)
Burnett, J. L.; Davies, A. V.
2014-05-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is supported by a global network of monitoring stations that perform high-resolution gamma-spectrometry on air filter samples for the identification of 85 radionuclides. At the UK CTBT Radionuclide Laboratory (GBL15), a novel cosmic veto gamma-spectrometer has been developed to improve the sensitivity of station measurements, providing a mean background reduction of 80.8% with mean MDA improvements of 45.6%. The CTBT laboratory MDA requirement for 140Ba is achievable after 1.5 days of counting, compared to 5-7 days using conventional systems. The system consists of plastic scintillation plates that detect coincident cosmic-ray interactions within an HPGe gamma-spectrometer using the Canberra LynxTM multi-channel analyser. The detector is remotely configurable using a TCP/IP interface and requires no dedicated coincidence electronics. It would be especially useful in preventing false positives at remote station locations (e.g. Halley, Antarctica) where sample transfer to certified laboratories is logistically difficult. The improved sensitivity has been demonstrated for a CTBT air filter sample collected after the Fukushima incident.
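A quick consistency check of the quoted figures, assuming the background-dominated Currie limit in which the count time needed to reach a fixed MDA scales roughly linearly with the background rate; only the 80.8% reduction and the 5-7 day baseline come from the abstract:

```python
# Time to reach the same MDA scales with the background rate, here cut by 80.8%.
for days in (5.0, 7.0):
    print(days * (1 - 0.808))   # ~1.0-1.3 days, close to the quoted 1.5 days
```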
Travel-time source-specific station correction improves location accuracy
NASA Astrophysics Data System (ADS)
Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo
2013-04-01
Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may shift the computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. Large mislocations of seismic events are particularly critical in the context of CTBT verification, where they bear on the decision to trigger a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km2 and that its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network and using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
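A minimal sketch of the core idea, deriving a station correction as the mean travel-time residual over well-located calibration events; the data structures and values are hypothetical, and a real implementation would compute the predicted times from IASPEI91 or a similar model:

```python
from collections import defaultdict

def station_corrections(picks):
    """picks: iterable of (station, observed_tt_s, predicted_tt_s) tuples
    from ground-truth events in the calibration region."""
    residuals = defaultdict(list)
    for station, t_obs, t_pred in picks:
        residuals[station].append(t_obs - t_pred)
    # The mean residual per station is subtracted from future observations.
    return {sta: sum(r) / len(r) for sta, r in residuals.items()}

picks = [("ABC", 412.3, 410.9), ("ABC", 398.7, 397.5), ("XYZ", 512.0, 513.8)]
print(station_corrections(picks))   # e.g. {'ABC': ~1.3, 'XYZ': -1.8} seconds
```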
Seismological investigation of the National Data Centre Preparedness Exercise 2013
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Hartmann, Gernot; Ross, J. Ole; Ceranna, Lars
2015-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions conducted on Earth - underground, underwater or in the atmosphere. The verification regime of the CTBT is designed to detect any treaty violation. While the data of the International Monitoring System (IMS) are collected, processed and technically analyzed at the International Data Centre (IDC) of the CTBT-Organization, National Data Centres (NDC) of the member states provide interpretation and advice to their governments concerning suspicious detections. The NDC Preparedness Exercises (NPE) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies. These exercises should help to evaluate the effectiveness of analysis procedures applied at NDCs and, for example, the quality, completeness and usefulness of IDC products. The exercise trigger of NPE2013 is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) is given, and the region is communicated in a map showing the fictitious state of "Frisia" at the coast of the North Sea in Central Europe. The potential connection between the waveform and radionuclide evidence remains unclear for exercise participants. The verification task was to identify the waveform event and to investigate potential sources of the radionuclide findings. The final question was whether the findings are CTBT-relevant and justify a request for an On-Site Inspection in "Frisia". The seismic event was not included in the Reviewed Event Bulletin (REB) of the IDC. The available detections from the closest seismic IMS stations lead to an epicenter accuracy of about 24 km, which is not sufficient to specify the 1000 km2 inspection area in case of an OSI. With the use of data from local stations and adjusted velocity models, the epicenter accuracy could be improved to less than 2 km, which demonstrates the crucial role of national technical means for verification tasks. The seismic NPE2013 event could be identified as induced by natural gas production in the source region. Similar waveforms and spectral characteristics comparable to those of a set of events in the same region are clear indications. The scenario of a possible treaty violation at the location of the seismic NPE2013 event could thus be disproved.
Atmospheric Transport Modelling and Radionuclide Analysis for the NPE 2015 scenario
NASA Astrophysics Data System (ADS)
Ross, J. Ole; Bollhöfer, Andreas; Heidmann, Verena; Krais, Roman; Schlosser, Clemens; Gestermann, Nicolai; Ceranna, Lars
2017-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. The International Monitoring System (IMS) is in place and about 90% complete to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydroacoustic and infrasonic signals for the detection, localization, and characterization of explosions. For practicing CTBT verification procedures and the interplay between the International Data Centre (IDC) and National Data Centres (NDC), preparedness exercises (NPE) are regularly performed with selected events of fictitious CTBT violation. The German NDC's expertise for radionuclide analyses and the operation of station RN33 is provided by the Federal Office for Radiation Protection (BfS), while Atmospheric Transport Modelling (ATM) for CTBT purposes is performed at the Federal Institute for Geosciences and Natural Resources (BGR) for the combination of the radionuclide findings with waveform evidence. The radionuclide part of the NPE 2015 scenario is tackled in a joint effort by BfS and BGR. First, the NPE 2015 spectra are analysed, fission products are identified, and respective activity concentrations are derived. Special focus is on isotopic ratios, which allow for source characterization and event timing. For atmospheric backtracking, the binary coincidence method is applied both to SRS fields from the IDC and WMO-RSMCs and to in-house backward simulations in higher resolution for the first affected samples. Results are compared with the WebGrape PSR, and the spatio-temporal domain with high atmospheric release probability is determined. The ATM results together with the radionuclide fingerprint are used for the identification of waveform candidate events. Comparative forward simulations of atmospheric dispersion for candidate events are performed. Finally, the overall consistency of various source scenarios is assessed and a fictitious government briefing on the findings is given.
Le Petit, G; Cagniant, A; Morelle, M; Gross, P; Achim, P; Douysset, G; Taffary, T; Moulin, C
The verification regime of the Comprehensive Test Ban Treaty (CTBT) is based on a network of three different waveform technologies together with global monitoring of aerosols and noble gases in order to detect, locate and identify a nuclear weapon explosion down to 1 kt TNT equivalent. In the case of a low-intensity underground or underwater nuclear explosion, it appears that only radioactive gases, especially the noble gases, which are difficult to contain, will allow identification of weak-yield nuclear tests. Four radioactive xenon isotopes, 131mXe, 133mXe, 133Xe and 135Xe, are sufficiently produced in fission reactions and exhibit suitable half-lives and radiation emissions to be detected in the atmosphere at low levels far away from the release site. Four different CTBT monitoring systems, ARIX, ARSA, SAUNA, and SPALAX™, have been developed to sample and measure them with high sensitivity. The latest, developed by the French Atomic Energy Commission (CEA), is likely to be drastically improved in detection sensitivity (especially for the metastable isotopes) through a higher sampling rate, when equipped with a new conversion electron (CE)/X-ray coincidence spectrometer. This new spectrometer is based on two combined detectors, both exhibiting very low radioactive background: a well-type NaI(Tl) detector for photon detection surrounding a gas cell equipped with two large passivated implanted planar silicon chips for electron detection. It is characterized by a low electron energy threshold and a much better energy resolution for the CE than those usually measured with the existing CTBT equipment. Furthermore, the compact geometry of the spectrometer provides high efficiency for X-rays and for the CE associated with the decay modes of the four relevant radioxenons. The paper focuses on the design of this new spectrometer and presents the spectroscopic performance of a prototype based on recent results achieved from both radioactive xenon standards and air sample measurements. Major improvements in detection sensitivity have been reached and quantified, especially for the metastable radioactive isotopes 131mXe and 133mXe, with gains in minimum detectable activity (about 2 × 10⁻³ Bq) relative to the current CTBT SPALAX™ system (air sampling frequency normalized to 8 h) of about 70 and 30, respectively.
NASA Astrophysics Data System (ADS)
Becker, A.; Wotawa, G.; de Geer, L.
2006-05-01
The Provisional Technical Secretariat (PTS) of the CTBTO Preparatory Commission maintains and permanently updates a source-receptor matrix (SRM) describing the global monitoring capability of a highly sensitive 80-station radionuclide (RN) network, in order to verify States Signatories' compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). This is done by means of receptor-oriented Lagrangian particle dispersion modeling (LPDM) to help determine the region from which suspicious radionuclides may originate. The LPDM FLEXPART 5.1 is integrated backward in time based on global analysis wind fields, yielding global source-receptor sensitivity (SRS) fields stored at three-hour frequency and 1° horizontal resolution. A database of these SRS fields substantially helps in improving the interpretation of RN sample measurements and categorizations, because it enables the testing of source hypotheses later on in a pure post-processing (SRM inversion) step that is feasible on hardware comparable to currently sold PCs or notebooks and at any place (decentralized), provided access to the SRS fields is warranted. Within the CTBT environment it is important to quickly achieve decision-makers' confidence in the SRM-based backtracking products issued by the PTS in the case of the occurrence of treaty-relevant radionuclides. Therefore the PTS has set up a highly automated response system together with the Regional Specialized Meteorological Centers of the World Meteorological Organization in the field of dispersion modeling, which have committed themselves to provide the PTS with the same standard SRS fields as calculated by their systems for CTBT-relevant cases. This system was utilized twice in 2005 to perform adjoint ensemble dispersion modeling (EDM) and demonstrated the potential of EDM-based backtracking to improve the accuracy of the source location for singular nuclear events, thus serving as the backward analogue to the ensemble dispersion modeling efforts of Galmarini et al., 2004 (Atmos. Env. 38, 4607-4617). As the scope of the adjoint EDM methodology is not limited to CTBT verification but can be applied to any kind of nuclear event monitoring and location, it bears the potential to improve the design of manifold emergency response systems towards preparedness concepts as needed for the mitigation of disasters (like Chernobyl) and the pre-emptive estimation of pollution hazards.
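A minimal sketch of how stored SRS fields support decentralized hypothesis testing, here extended to a toy three-member ensemble; the arrays are random placeholders for the 1°, 3-hourly fields described above, and the grid indices and release term are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_lat, n_lon = 8, 180, 360                    # 3-hourly, 1-degree grid
srs_fields = [rng.random((n_time, n_lat, n_lon)) for _ in range(3)]  # 3 centres

def predicted_concentration(srs, i_lat, i_lon, release_per_interval):
    # Concentration a hypothesised release at grid cell (i_lat, i_lon) would
    # produce at the receptor: SRS at the source cell, summed over time,
    # times the release in each interval.
    return float(np.sum(srs[:, i_lat, i_lon]) * release_per_interval)

preds = [predicted_concentration(f, 95, 200, 1e13) for f in srs_fields]
print(np.mean(preds), np.std(preds))   # ensemble mean and spread
```

Comparing such predictions against the measured sample activity is the post-processing step the abstract describes as feasible on ordinary hardware.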
NASA Astrophysics Data System (ADS)
Zucca, J. J.
2014-05-01
On-site inspection (OSI) is a critical part of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The OSI verification regime provides for international inspectors to make a suite of measurements and observations on site at the location of an event of interest. The other critical component of the verification regime is the International Monitoring System (IMS), which is a globally distributed network of monitoring stations. The IMS, along with technical monitoring data from CTBT member countries as appropriate, will be used to trigger an OSI. After the decision is made to carry out an OSI, it is important for the inspectors to deploy to the field site rapidly to be able to detect short-lived phenomena such as the aftershocks that may be observable after an underground nuclear explosion. The inspectors will be on site from weeks to months and will be working with many tens of tons of equipment. Parts of the OSI regime will be tested in a field exercise in the country of Jordan late in 2014. The build-up of the OSI regime has been proceeding steadily since the CTBT was signed in 1996 and is on track to become a deterrent to anyone considering conducting a nuclear explosion in violation of the Treaty.
NASA Astrophysics Data System (ADS)
Labak, Peter; Lindblom, Pasi; Malich, Gregor
2017-04-01
The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) during which the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI) were tested in an integrated manner. Many of the inspection techniques permitted by the CTBT were applied during IFE14, including a range of geophysical techniques; however, one of the techniques foreseen by the CTBT but not yet developed is resonance seismometry. During August and September 2016, seismic field measurements were conducted in the region of Kylylahti, Finland, in support of the further development of geophysical seismic techniques for OSIs. 45 seismic stations were used to continuously acquire seismic signals. During that period, data from local, regional and teleseismic natural and man-made events were acquired, including from a devastating earthquake in Italy and the nuclear explosion announced by the Democratic People's Republic of Korea on 9 September 2016. Also, data were acquired following the small-scale use of man-made chemical explosives in the area and the use of vibratory sources. This presentation will show examples from the data set and will discuss its use for the development of resonance seismometry for OSIs.
Atmospheric transport modelling in support of CTBT verification—overview and basic concepts
NASA Astrophysics Data System (ADS)
Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi
Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
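A minimal sketch of the source-receptor matrix relation the abstract lays out, c = M s, with a non-negative least-squares inversion to find source scenarios compatible with the measurements; the dimensions and values are illustrative placeholders, not the operational system:

```python
import numpy as np
from scipy.optimize import nnls

n_samples, n_source_cells = 80, 50    # RN samples vs. candidate source cells
M = np.abs(np.random.default_rng(1).normal(size=(n_samples, n_source_cells)))
true_s = np.zeros(n_source_cells)
true_s[17] = 5.0                      # a single emitting grid cell
c = M @ true_s                        # noise-free synthetic measurements

s_est, resid = nnls(M, c)             # non-negative source estimate
print(int(np.argmax(s_est)), round(resid, 6))   # recovers cell 17, residual ~0
```

In practice the rows of M come from the adjoint transport runs described above, and the enormous dilution ratios the authors mention make regularization and uncertainty treatment the hard part of the inversion.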
Nuclear Weapons: Comprehensive Test Ban Treaty
2007-11-30
itself, which has been done. Critics raised concerns about the implications of these policies for testing and new weapons. At present, Congress...CTBT in lieu of the current treaty.1 On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in H.R. 1585, the FY2008...to do so.’”6 Critics expressed concern about the implications of these policies for testing and new weapons. A statement by Physicians for Social
Characterization of the Infrasound Field in the Central Pacific
2006-06-01
Treaty (CTBT) in December 2001. The array site is in a tropical rainforest on the slopes of Hualalai Volcano, Hawaii Island, Hawaii. Per IMS...general direction of Kilauea Volcano. These signals are tentatively assigned to the "iv" phase. To date the majority of these events have featured
NASA Astrophysics Data System (ADS)
Achim, Pascal; Generoso, Sylvia; Morin, Mireille; Gross, Philippe; Le Petit, Gilbert; Moulin, Christophe
2016-05-01
Monitoring atmospheric concentrations of radioxenon is relevant to providing evidence of atmospheric or underground nuclear weapon tests. However, when the design of the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was set up, the impact of industrial releases was not perceived. It is now well known that the industrial radioxenon signature can interfere with that of nuclear tests. Therefore, there is a crucial need to characterize the atmospheric distribution of radioxenon from industrial sources—the so-called atmospheric background—in the frame of the CTBT. Two years of Xe-133 atmospheric background have been simulated using 2013 and 2014 meteorological data together with the most comprehensive emission inventory of radiopharmaceutical facilities and nuclear power plants to date. Annual average simulated activity concentrations vary from 0.01 mBq/m3 up to above 5 mBq/m3 near major sources. Average measured and simulated concentrations agree at most of the IMS stations, which indicates that the main sources during the time frame are properly captured. The Xe-133 atmospheric background simulated at IMS stations turns out to be a complex combination of sources. The stations most impacted are in Europe and North America and can potentially detect Xe-133 every day. Predicted occurrences of detections of atmospheric Xe-133 show seasonal variations, more accentuated in the Northern Hemisphere, where the maximum occurs in winter. To our knowledge, this study presents the first global maps of the Xe-133 atmospheric background from industrial sources based on two years of simulation and is a first attempt to analyze its composition in terms of origin at IMS stations.
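A minimal sketch of how a simulated concentration time series translates into the "detection occurrence" statistic discussed above: the fraction of samples exceeding the system's minimum detectable concentration (MDC). The series and the MDC value are synthetic assumptions, not results from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
conc_mbq_m3 = rng.lognormal(mean=-1.0, sigma=1.2, size=730)  # two years, daily
mdc_mbq_m3 = 0.2   # assumed order of magnitude for IMS noble-gas systems

occurrence = float(np.mean(conc_mbq_m3 > mdc_mbq_m3))
print(f"predicted detection occurrence: {occurrence:.0%} of samples")
```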
Hydroacoustic propagation grids for the CTBT knowledge database: BBN technical memorandum W1303
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Angell
1998-05-01
The Hydroacoustic Coverage Assessment Model (HydroCAM) has been used to develop components of the hydroacoustic knowledge database required by operational monitoring systems, particularly the US National Data Center (NDC). The database, which consists of travel time, amplitude correction and travel time standard deviation grids, is planned to support source location, discrimination and estimation functions of the monitoring network. The grids will also be used under the current BBN subcontract to support an analysis of the performance of the International Monitoring System (IMS) and national sensor systems. This report describes the format and contents of the hydroacoustic knowledge base grids, and the procedures and model parameters used to generate these grids. Comparisons between the knowledge grids, measured data and other modeled results are presented to illustrate the strengths and weaknesses of the current approach. A recommended approach for augmenting the knowledge database with a database of expected spectral/waveform characteristics is provided in the final section of the report.
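A minimal sketch of how a locator might consume such grids, looking up a travel time for a trial epicenter by bilinear interpolation; the grid here is a random stand-in for the HydroCAM products (travel time, amplitude correction, travel-time standard deviation):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

lats = np.linspace(-90.0, 90.0, 181)
lons = np.linspace(-180.0, 180.0, 361)
travel_time_s = np.random.default_rng(3).uniform(100.0, 4000.0, (181, 361))

lookup = RegularGridInterpolator((lats, lons), travel_time_s)
print(lookup([[-12.4, 96.8]]))   # interpolated travel time for a trial source
```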
The LANL/LLNL/AFTAC Black Thunder Coal Mine regional mine monitoring experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pearson, D.C.; Stump, B.W.; Baker, D.F.
Cast blasting operations associated with near-surface coal recovery provide relatively large explosive sources that generate regional seismograms of interest in monitoring a Comprehensive Test Ban Treaty (CTBT). This paper describes preliminary results of a series of experiments currently being conducted at the Black Thunder Coal Mine in northeast Wyoming as part of the DOE CTBT Research and Development Program. These experiments are intended to provide an integrated set of near-source and regional seismic data for the purposes of quantifying the coupling and source characterization of the explosions. The focus of this paper is on the types of data being recovered, with some preliminary implications. The Black Thunder experiments are designed to assess three major questions: (1) how many mining explosions produce seismograms at regional distances that will have to be detected, located and ultimately identified by the National Data Center, and what are the waveform characteristics of these particular mining explosions; (2) can discrimination techniques based on empirical studies be placed on a firm physical basis so that they can be applied to other regions where there is little monitoring experience; (3) can large-scale chemical explosions (possibly mining explosions) be used to calibrate source and propagation path effects to regional stations, and can source depth of burial and decoupling effects be studied in such a controlled environment? With these key questions in mind, and given the cooperation of the Black Thunder Mine, a suite of experiments has been and is currently being conducted. This paper will describe the experiments and their relevance to CTBT issues.
NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios
NASA Astrophysics Data System (ADS)
Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.
2012-04-01
For verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signatures. The IMS data are collected, processed into analysis products, and distributed to the States Signatories by the International Data Centre (IDC) in Vienna. The States Signatories themselves may operate National Data Centres (NDC) giving technical advice concerning CTBT verification to their governments. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of the NPE 2010 was on the component of radionuclide detections and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections which were calculated beforehand, in secret, by forward ATM for a hypothetical xenon release scenario starting at the location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and then to analyze promising candidate events using their waveform signals. This study shows one possible solution path for NPE 2010, as performed at the German NDC by a team without prior knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. Of the considered seismic events in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event at Black Thunder Mine, Wyoming, on 23 Oct at 21:15 UTC showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. An infrasonic one-station localization algorithm led to event localization results comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, an additional confirmation that the event was correctly identified. The event analysis of the NPE 2010 shown here serves as a successful example of data fusion between the technology of radionuclide detection supported by ATM, seismological methodology, and infrasound signal processing.
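A minimal sketch of the event-timing step mentioned above: if two radioxenon isotopes originate from the same prompt release, their activity ratio decays with the difference of the decay constants, so the elapsed time can be back-calculated. The half-lives are physical data; the initial and measured ratios below are assumed values for illustration, not numbers from the exercise:

```python
import math

HL_133M_D, HL_133_D = 2.19, 5.243          # 133mXe and 133Xe half-lives, days
lam_m = math.log(2) / HL_133M_D
lam_g = math.log(2) / HL_133_D

def elapsed_days(ratio_measured, ratio_initial):
    # R(t) = R0 * exp(-(lam_m - lam_g) * t), solved for t.
    return math.log(ratio_initial / ratio_measured) / (lam_m - lam_g)

print(elapsed_days(ratio_measured=0.05, ratio_initial=0.18))  # ~7 days
```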
Scenario design and basic analysis of the National Data Centre Preparedness Exercise 2013
NASA Astrophysics Data System (ADS)
Ross, Ole; Ceranna, Lars; Hartmann, Gernot; Gestermann, Nicolai; Bönneman, Christian
2014-05-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data are collected, processed and technically analyzed at the International Data Centre (IDC) of the CTBT-Organization, National Data Centres (NDC) provide interpretation and advice to their governments concerning suspicious detections occurring in IMS data. NDC Preparedness Exercises (NPE) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies and for the mutual exchange of information between NDCs and also with the IDC. The NPE2010 and NPE2012 trigger scenarios were based on selected seismic events from the Reviewed Event Bulletin (REB) serving as starting points for fictitious radionuclide dispersion. The main task was the identification of the original REB event and the discrimination between earthquakes and explosions as sources. The scenario design of NPE2013 differs from those of previous NPEs. The waveform event selection is not constrained to events in the REB. The exercise trigger is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward Atmospheric Transport Modelling based on a fictitious release. For the waveform event the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" at the coast of the North Sea in Central Europe. The synthetic radionuclide detections start in Vienna (8 Sept, I-131) and Schauinsland (11 Sept, Xe-133) with rather low activity concentrations and are most prominent in Stockholm and Spitsbergen in mid-September 2013. Smaller concentrations in Asia follow later on. The potential connection between the waveform and radionuclide evidence remains unclear. The verification task is to identify the waveform event and to investigate potential sources of the radionuclide findings. Finally, the potential connection between the sources and the CTBT relevance of the whole picture has to be evaluated. The overall question is whether requesting an On-Site Inspection in "Frisia" would be justified. The poster presents the NPE2013 scenario and gives a basic analysis of the initial situation concerning both waveform detections and atmospheric dispersion conditions in Central Europe in early September 2013. The full NPE2013 scenario will be presented at the NDC Workshop in mid-May 2014.
NASA Astrophysics Data System (ADS)
Khrustalev, K.
2016-12-01
The current process for the calibration of the beta-gamma detectors used for radioxenon isotope measurements for CTBT purposes is laborious and time consuming. It uses a combination of point sources and gaseous sources, resulting in differences between energy and resolution calibrations. The emergence of high-resolution SiPIN-based electron detectors allows improvements in the calibration and analysis process to be made. Thanks to the high electron resolution of SiPIN detectors (~8-9 keV at 129 keV) compared to plastic scintillators (~35 keV at 129 keV), many more CE peaks (from radioxenon and radon progenies) can be resolved and used for energy and resolution calibration in the energy range of the CTBT-relevant radioxenon isotopes. The long-term stability of the SiPIN energy calibration allows one to significantly reduce the time of the QC measurements needed for checking the stability of the E/R calibration. The second-order polynomials currently used for E/R calibration fitting are unphysical and should be replaced by a linear energy calibration for NaI and SiPIN, owing to the high linearity and dynamic range of modern digital DAQ systems, and the resolution calibration functions should be modified to reflect the underlying physical processes. Alternatively, one can completely abandon the use of fitting functions and use only point values of E/R (similar to the efficiency calibration currently used) at the energies relevant for the isotopes of interest (ROI - Regions Of Interest). The current analysis considers the detector as a set of single-channel analysers, with an established set of coefficients relating the positions of the ROIs to the positions of the QC peaks. The analysis of the spectra can be made more robust using peak and background fitting in the ROIs, with a single free parameter (the peak area) for the potential peaks from the known isotopes and a fixed set of E/R calibration values.
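A minimal sketch of the linear energy calibration argued for above, fitted from resolved CE peak positions; the channel centroids are hypothetical, and the reference energies are nominal CE lines of the CTBT-relevant radioxenons (roughly 45 keV for 133Xe, 129 keV for 131mXe, 199 keV for 133mXe):

```python
import numpy as np

channels = np.array([312.0, 1105.0, 1840.0])   # hypothetical fitted CE centroids
energies = np.array([45.0, 129.0, 199.0])      # nominal CE lines, keV

gain, offset = np.polyfit(channels, energies, deg=1)   # strictly linear model
print(f"E(ch) = {gain:.4f} * ch + {offset:.2f} keV")
```

With a highly linear digital DAQ, the residuals of such a fit directly expose any non-linearity, which a second-order polynomial would instead silently absorb.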
Getting to Zero Yield: The Evolution of the U.S. Position on the CTBT
NASA Astrophysics Data System (ADS)
Zimmerman, Peter D.
1998-03-01
In 1994 the United States favored a Comprehensive Test Ban Treaty (CTBT) which permitted tiny "hydronuclear" experiments with a nuclear energy release of four pounds or less. Other nuclear powers supported yield limits as high as large fractions of a kiloton, while most non-nuclear nations participating in the discussions at the United Nations Conference on Disarmament wanted to prohibit all nuclear explosions -- some even favoring an end to computer simulations. On the other hand, China wanted an exception to permit high-yield "peaceful" nuclear explosions. For the United States to adopt a new position favoring a "true zero", several pieces had to fall into place: 1) the President had to be assured that the U.S. could preserve the safety and reliability of the enduring stockpile without yield testing; 2) the U.S. needed to be sure that the marginal utility of zero-yield experiments was at least as great for this country as for any other; 3) it had to be recognized that tests with any nuclear yield might have more marginal utility for nuclear proliferators than for the United States, thus marginally eroding this country's position; 4) the United States required a treaty which would permit maintenance of the capacity to return to testing should a national emergency requiring a nuclear test arise; and 5) all five of the nuclear weapons states had to realize that only a true-zero CTBT would have the desired political effects. This paper will outline the physics near zero yield and show why President Clinton was persuaded by arguments from many viewpoints to endorse a true test ban in August 1995 and to sign the CTBT in September 1996.
Precipitation Nowcast using Deep Recurrent Neural Network
NASA Astrophysics Data System (ADS)
Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.
2016-12-01
An accurate precipitation nowcast (0-6 hours) with a fine temporal and spatial resolution has always been an important prerequisite for flood warning, streamflow prediction and risk management. Most of the popular approaches used for forecasting precipitation can be categorized into two groups: one type of precipitation forecast relies on numerical modeling of the physical dynamics of the atmosphere, and the other is based on empirical and statistical regression models derived by local hydrologists or meteorologists. Given the recent advances in artificial intelligence, in this study a powerful Deep Recurrent Neural Network, termed the Long Short-Term Memory (LSTM) model, is used to extract the patterns and forecast the spatial and temporal variability of Cloud Top Brightness Temperature (CTBT) observed from the GOES satellite. Then, a 0-6 hour precipitation nowcast is produced using the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) algorithm, in which the CTBT nowcast is used as the PERSIANN algorithm's raw input. Two case studies over the continental U.S. have been conducted that demonstrate the improvement of the proposed approach as compared to a classical Feed Forward Neural Network and a couple of simple regression models. The advantages and disadvantages of the proposed method are summarized with regard to its capability of pattern recognition through time, handling of vanishing gradients during model learning, and working with sparse data. The studies show that the LSTM model performs better than the other methods and is able to learn the temporal evolution of precipitation events through over 1000 time lags. The uniqueness of PERSIANN's algorithm enables an alternative precipitation nowcast approach, as demonstrated in this study, in which the CTBT prediction is produced and used as the input for generating the precipitation nowcast.
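A minimal sketch of an LSTM sequence model of the kind described: past CTBT (cloud-top brightness temperature) frames in, next frame out. This is a generic PyTorch stand-in, not the authors' PERSIANN pipeline; the class name, shapes, and hyperparameters are placeholders:

```python
import torch
import torch.nn as nn

class CTBTNowcaster(nn.Module):
    def __init__(self, n_pixels=64, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_pixels, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_pixels)   # hidden state -> next frame

    def forward(self, x):              # x: (batch, time, n_pixels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # predict the next frame from last state

model = CTBTNowcaster()
past = torch.randn(8, 12, 64)          # 8 samples, 12 past frames, 64 pixels
print(model(past).shape)               # torch.Size([8, 64])
```

The LSTM's gated cell state is what lets such a model carry information across the long lag ranges the abstract highlights, where a plain feed-forward network would suffer from vanishing gradients.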
The National Data Center Preparedness Exercise 2009 - First Results
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Bönnemann, Christian; Ceranna, Lars; Wotawa, Gerhard
2010-05-01
The NDC preparedness initiative was initiated by 8 signatory states. It now has a history of more than 2 years, with two successful exercises and subsequent fruitful discussions during the NDC Evaluation Workshops of the CTBTO. The first exercise was carried out in 2007 (NPE07). The objectives of, and the idea behind, this exercise have been described in working paper CTBT/WGB-28/DE-IT/1 of the CTBTO. The exercise simulates a fictitious violation of the CTBT, and all NDCs are invited to clarify the nature of the selected event. This exercise should help to evaluate the effectiveness of analysis procedures applied at NDCs, as well as the quality, completeness, and usefulness of IDC products. Moreover, the NPE is a measure of the readiness of the NDCs to fulfil their duties with regard to CTBT verification: the treaty-compliance judgments about the nature of events as natural or artificial and chemical or nuclear, respectively. NPE09 started on 1 October 2009, 00:00 UTC. In addition to the previous exercises, three technologies (seismology, infrasound, and radionuclide) have been taken into account, leading to tentative mock events generated by strong explosions in open-pit mines. Consequently, the first event which fulfilled all previously defined criteria was close to the Kara-Zhyra mine in Eastern Kazakhstan and occurred on 28 November 2009 at 07:20:31 UTC. It generated seismic signals as well as infrasound signals at the closest IMS stations. The forward atmospheric transport modelling indicated that a sufficient number of radionuclide stations were also affected to enable the application of a negative testing scenario. First results of the seismo-acoustic analysis of the NPE09 event are presented along with details of the event selection process.
NASA Astrophysics Data System (ADS)
Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter
2016-04-01
The International Monitoring System (IMS) developed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by current model predictions; however, an accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, generating infrasound partial reflections and modifications of the infrasound waveguide, (ii) convection from thunderstorms and mountain waves, generating gravity waves, (iii) stratospheric warming events, which yield wind inversions in the stratosphere, and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.
Reviews of the Comprehensive Nuclear-Test-Ban Treaty and U.S. security
NASA Astrophysics Data System (ADS)
Jeanloz, Raymond
2017-11-01
Reviews of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) by the National Academy of Sciences concluded that the United States has the technical expertise and physical means to i) maintain a safe, secure and reliable nuclear-weapons stockpile without nuclear-explosion testing, and ii) effectively monitor global compliance once the Treaty enters into force. Moreover, the CTBT is judged to help constrain the proliferation of nuclear-weapons technology, so it is considered favorable to U.S. security. A review of developments since the studies were published, in 2002 and 2012, shows that the study conclusions remain valid and that technical capabilities are better than anticipated.
Geologic constraints on clandestine nuclear testing in South Asia
Davis, Dan M.; Sykes, Lynn R.
1999-01-01
Cavity decoupling in salt is the most plausible means by which a nation could conduct clandestine testing of militarily significant nuclear weapons. The conditions under which solution-mined salt can be used for this purpose are quite restrictive. The salt must be thick and reasonably pure. Containment of explosions sets a shallow limit on depth, and cavity stability sets a deep limit. These constraints are met in considerably less than 1% of the total land area of India and Pakistan. Most of that area is too dry for cavity construction by solution mining; disposal of brine in rivers can be detected easily. Salt domes, the most favorable structures for constructing large cavities, are not present in India and Pakistan. Confidence that India and Pakistan are adhering to the Comprehensive Test Ban Treaty (CTBT) is enhanced by their geological conditions, which are quite favorable to verification, not evasion. Thus, their participation in the CTBT is constrained overwhelmingly by political, not scientific, issues. Confidence in the verification of the CTBT could be enhanced if India and Pakistan permitted stations of the various monitoring technologies that are now widely deployed elsewhere to be operated on their territories.
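An illustrative sketch of the constraints described above, assuming the commonly cited cube-root scaling of the fully-decoupling cavity radius with yield; the ~25 m per kt^(1/3) constant for salt and the depth bounds are rough rule-of-thumb assumptions, not figures from the paper:

```python
def min_cavity_radius_m(yield_kt, k_m_per_cuberoot_kt=25.0):
    # Cube-root scaling of the fully-decoupling cavity radius with yield;
    # the constant for salt is an assumed rule of thumb.
    return k_m_per_cuberoot_kt * yield_kt ** (1.0 / 3.0)

def depth_window_ok(depth_m, min_depth_m=500.0, max_depth_m=1000.0):
    # Containment sets the shallow limit, cavity stability the deep limit;
    # both bounds here are illustrative placeholders.
    return min_depth_m <= depth_m <= max_depth_m

print(min_cavity_radius_m(1.0), depth_window_ok(800.0))   # ~25.0 m, True
```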
NASA Astrophysics Data System (ADS)
Becker, A.; Ceranna, L.; Ross, O.; Schneider, U.; Meyer-Christoffer, A.; Ziese, M.; Lehner, K.; Rudolf, B.
2012-04-01
As a contribution to the World Climate Research Programme (WCRP) and in support of the Global Climate Observing System (GCOS) of the World Meteorological Organization (WMO), the Deutscher Wetterdienst (DWD) operates the Global Precipitation Climatology Centre (GPCC). The GPCC re-analysis and near-real-time monitoring products are recognized worldwide as the most reliable global data sets of rain-gauge-based (in-situ) precipitation measurements. The GPCC Monitoring Product (Schneider et al., 2011; Becker et al., 2012; Ziese et al., EGU2012-5442) is available two months after the fact, based on the SYNOP and CLIMAT messages gathered from the GTS. This product also serves as the reference data for calibrating satellite-based precipitation measurements, yielding the Global Precipitation Climatology Project (GPCP) data set (Huffmann et al., 2009). The quickest GPCC product is the First Guess version of the GPCC Monitoring Product, available as soon as 3-5 days after the month in question. Both the GPCC and the GPCP products can serve as the data base for a computationally lightweight post-processing of the wet-deposition impact on the radionuclide (RN) monitoring capability of the CTBT network (Wotawa et al., 2009), on the regional and global scale, respectively. This is of major importance whenever a reliable quantitative assessment of the source-receptor sensitivity is needed, e.g. for the analysis of isotopic ratios. Indeed, wet-deposition recognition is a prerequisite whenever ratios of particulate and noble-gas measurements come into play. This is a so far largely unexplored field of investigation, but it would help clear as bogus several apparently CTBT-relevant detections encountered in the past, and it would provide an assessment of the hitherto overestimated RN detection capability of the CTBT network. Besides this climatological kind of wet-deposition assessment for threshold monitoring purposes, there are also singular release events, such as the Fukushima accident, that a properly working RN verification regime needs to classify correctly; for these kinds of events a higher temporal resolution of the precipitation data sets is needed. In the course of the research project 'Global DAily Precipitation Analysis for the validation of medium-range CLImate Predictions' (DAPACLIP), within the framework research programme MiKlip (Mittelfristige Klimaprognose) funded by the German ministry for research (BMBF), a new quality-controlled and globally gridded daily precipitation data set is being built, for which GPCC will serve the land-surface compartment. The data set is primarily constructed to study the decadal behaviour of the essential climate variable precipitation, but as a collateral benefit it will also serve the needs of the RN verification regime. The Fukushima accident has also provided impetus to construct even hourly in-situ precipitation data sets, as will be presented in the same session by Yatagai (2012). A comprehensive overview of available precipitation data sets based on in-situ (rain gauge) and satellite measurements, or a combination of both, is available from the International Precipitation Working Group (IPWG) web pages (http://www.isac.cnr.it/~ipwg/data/datasets.html).
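The post-processing mentioned above hinges on quantifying how precipitation scrubs radionuclides out of the air between source and receptor. Below is a minimal sketch of such a wet-deposition correction, assuming a power-law below-cloud scavenging coefficient; the constants and function names are illustrative, not taken from the GPCC/GPCP products.

import numpy as np

# A sketch of a wet-deposition correction, assuming a power-law below-cloud
# scavenging coefficient Lambda = a * R**b; a and b are illustrative values
# from common literature parameterizations, not from the GPCC/GPCP products.

def scavenging_coefficient(rain_rate_mm_h, a=8.4e-5, b=0.79):
    """Below-cloud scavenging coefficient [1/s] for rain rate R [mm/h]."""
    return a * rain_rate_mm_h ** b

def airborne_fraction_remaining(rain_rate_mm_h, duration_s):
    """Fraction of airborne activity left after rain of the given duration."""
    return np.exp(-scavenging_coefficient(rain_rate_mm_h) * duration_s)

# Example: 5 mm/h of rain over 6 hours depletes the plume to well under 1%
# of its airborne activity, so the sample would be flagged deposition-biased.
print(airborne_fraction_remaining(5.0, 6 * 3600.0))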
NASA Astrophysics Data System (ADS)
Sussman, A. J.; Macleod, G.; Labak, P.; Malich, G.; Rowlands, A. P.; Craven, J.; Sweeney, J. J.; Chiappini, M.; Tuckwell, G.; Sankey, P.
2015-12-01
The Integrated Field Exercise of 2014 (IFE14) was an event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of an on-site inspection (OSI) within the CTBT verification regime. During an OSI, up to 40 international inspectors will search an area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of a real OSI to date) and worked in a number of different capacities, from the Exercise Management and Control Teams (which executed the scenario in which the exercise was played) to the participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test and integrate Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, suites of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force (STF) for the IT to detect by applying the geophysical and remote sensing inspection technologies, in addition to other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection using other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials, and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 met those goals.
An Improved Method for Seismic Event Depth and Moment Tensor Determination: CTBT Related Application
NASA Astrophysics Data System (ADS)
Stachnik, J.; Rozhkov, M.; Baker, B.
2016-12-01
According to the Protocol of the CTBT, the International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and to assist States Parties in identifying the source of specific events. Determining a seismic event's source mechanism and depth is part of these tasks. This is typically done through a linearized inversion of the waveforms for a complete or partial set of source parameters, or through a similarly defined grid search over Green's functions precomputed for particular source models. We show preliminary results using the latter approach, based on an improved software design and run on a moderately powered computer. In this development we aimed to be compliant with the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body- and surface-wave recordings, be fast enough for both on-demand studies and automatic processing, properly incorporate the observed waveforms and any a priori uncertainties, and accurately estimate a posteriori uncertainties. HDF5-based pre-packaging of the Green's functions allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will allow rapid use of Instaseis/AxiSEM full-waveform synthetics added to the precomputed GF archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates were determined for the DPRK 2009, 2013 and 2016 events and for shallow earthquakes, using a new implementation of waveform fitting of teleseismic P waves. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions, discretizing the complete moment tensor space from a geometric perspective with the recent method of Tape & Tape (2012). Moment tensors for the DPRK events show isotropic percentages greater than 50%. Depth estimates for the DPRK events range from 1.0 to 1.4 km. Probabilistic uncertainty estimates on the moment tensor parameters establish the robustness of the solutions.
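To make the grid-search idea concrete, here is a minimal sketch of depth and moment tensor estimation against precomputed Green's functions, assuming per-station elementary seismograms (one per moment tensor component); the container names (gf, mt_samples) are illustrative, not the IDC software's API.

import numpy as np

# A sketch of grid-search source estimation: gf[depth][station] holds six
# elementary seismograms (one per moment tensor component, shape (6, nt));
# the synthetic is their weighted sum. Names are illustrative assumptions.

def misfit(observed, gf_station, m):
    """L2 waveform misfit for one station; m is a 6-vector (Mxx, ..., Mxz)."""
    residual = observed - m @ gf_station
    return float(np.sum(residual * residual))

def grid_search(observed_by_station, gf, depths, mt_samples):
    """Best (depth, moment tensor) over a discretized source space, e.g. the
    uniform full moment tensor parameterization of Tape & Tape (2012)."""
    best_d, best_m, best_val = None, None, np.inf
    for d in depths:
        for m in mt_samples:
            total = sum(misfit(obs, gf[d][sta], m)
                        for sta, obs in observed_by_station.items())
            if total < best_val:
                best_d, best_m, best_val = d, m, total
    return best_d, best_m

In a probabilistic treatment, the misfit values over the whole grid would be kept and mapped to a posterior, rather than retaining only the single best point.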
Technology Advancement and the CTBT: Taking One Step Back from the Nuclear Brink
NASA Astrophysics Data System (ADS)
Perry, W. J.
2016-12-01
Technology plays a pivotal role in international nuclear security, and technological advancement continues to support a path toward stability. One near-term and readily obtainable step back from the nuclear brink is the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The technology to independently verify adherence to the CTBT has matured in the 20 years since the Treaty was opened for signature. Technology has also improved the safety and reliability of the US nuclear stockpile in the absence of testing. Owing to these advances over the past two decades, neither verification nor stockpile effectiveness should be an impediment to the Treaty's entry into force. Other technical and geo-political evolution in this same period has changed the perceived benefit of nuclear weapons as instruments of security. Recognizing the change technology has brought to deliberations on nuclear security, nations are encouraged to take this one step away from instability. This presentation will reflect on the history and assumptions that have been used to justify the build-up and configuration of nuclear stockpiles, the changes in technology and conditions that alter the basis of these original assumptions, and the re-analysis of security using current and future assumptions that points to the need for revised nuclear policies. The author has a unique and well-informed perspective as both the most senior US Defense official and a technologist.
Silicon PIN diode based electron-gamma coincidence detector system for Noble Gases monitoring.
Khrustalev, K; Popov, V Yu; Popov, Yu S
2017-08-01
We present a new second-generation SiPIN-based electron-photon coincidence detector system developed by Lares Ltd. for use in the noble gas measurement systems of the International Monitoring System and the on-site inspection verification regimes of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The SiPIN diodes provide superior energy resolution for electrons. Our work describes the improvements made in the second-generation detector cells and the potential use of such detector systems for other applications, such as in-situ Kr-85 measurements for non-proliferation purposes.
Fifty Years of Seismic Monitoring in Davao, Philippines
NASA Astrophysics Data System (ADS)
McNamara, D. J.
2016-12-01
The Manila Observatory turned 150 years old in 2015. Fifty years ago it began operating a seismic monitoring station on the island of Mindanao, outside the city of Davao, at approximately 7 deg. N and 121 deg. E. This station was chosen not only for its position on the Ring of Fire but also because the dip angle of the Earth's magnetic field is zero at that location. When the CTBT was established and the Republic of the Philippines (RP) became a signatory, the Davao station, by agreement with the RP, began to send its seismic data to the CTBT database in Vienna. This has continued to the present day, with support from the CTBTO for updates in equipment and maintenance. We discuss whether such a private-plus-government model is the way forward for more comprehensive monitoring in the future.
Merging Infrasound and Electromagnetic Signals as a Means for Nuclear Explosion Detection
NASA Astrophysics Data System (ADS)
Ashkenazy, Joseph; Lipshtat, Azi; Kesar, Amit S.; Pistinner, Shlomo; Ben Horin, Yochai
2016-04-01
The infrasound monitoring network of the CTBT consists of 60 stations. These stations are capable of detecting atmospheric events and may provide an approximate location within a time scale of a few hours. However, the nature of these events cannot be deduced from the infrasound signal alone. More than two decades ago it was proposed to use the electromagnetic pulse (EMP) as a means of discriminating nuclear explosions from other atmospheric events. An EMP is a unique signature of a nuclear explosion and is not produced by chemical ones. Nevertheless, it was decided to exclude the EMP technology from the official CTBT verification regime, mainly because of the risk of a high false alarm rate due to lightning electromagnetic pulses [1]. Here we present a method of integrating the information retrieved from the infrasound system with the EMP signal, which enables us to discriminate between lightning discharges and nuclear explosions. Furthermore, we show how spectral and other characteristics of the electromagnetic signal emitted by a nuclear explosion are distinguished from those of a lightning discharge. We estimate the probability of detecting a lightning discharge originating from the source area of a given infrasound event and falsely identifying it as the signature of a nuclear explosion. We show that this probability is very low and conclude that the combination of infrasound monitoring and EMP spectral analysis may produce a reliable method for identifying nuclear explosions. [1] R. Johnson, Unfinished Business: The Negotiation of the CTBT and the End of Nuclear Testing, United Nations Institute for Disarmament Research, 2009.
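The false-alarm argument can be made concrete with a Poisson model: given a lightning flash rate, the infrasound-derived source area, and the EMP-infrasound association time window, the probability of a chance coincidence follows directly. A minimal sketch, with rate, area, and window values that are illustrative assumptions rather than the paper's figures:

import math

# A sketch of the Poisson chance-coincidence estimate: the probability that
# at least one lightning discharge falls inside the infrasound-derived
# source area during the association window. All values are illustrative.

def false_alarm_probability(flash_rate_per_km2_s, area_km2, window_s):
    expected_flashes = flash_rate_per_km2_s * area_km2 * window_s
    return 1.0 - math.exp(-expected_flashes)

# Example: a low-activity region (1e-9 flashes/km^2/s), a 100 km^2 source
# area, and a 10 s window give a chance-coincidence probability of ~1e-6.
print(false_alarm_probability(1e-9, 100.0, 10.0))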
Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis
NASA Astrophysics Data System (ADS)
Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.
2014-03-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted Ms) is a measure of the later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
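As a concrete illustration of such a screening test, the sketch below rejects H0 (explosion characteristics) when the observed Ms is improbably large relative to an explosion-population Ms:mb line, with the total standard error combining measurement scatter and correction-model inadequacy. The slope, intercept, and error magnitudes are illustrative placeholders, not the calibrated values of the paper.

import math
from statistics import NormalDist

# A sketch of an Ms:mb screening test. H0: the event has explosion
# characteristics. The slope, intercept, and error components below are
# illustrative placeholders; the key point is that the standard error
# combines measurement scatter with correction-model inadequacy.

def screening_p_value(ms, mb, slope=1.25, intercept=-2.2,
                      sigma_meas=0.15, sigma_model=0.20):
    sigma = math.hypot(sigma_meas, sigma_model)     # total standard error
    t = (ms - (slope * mb + intercept)) / sigma     # excess of Ms over line
    return 1.0 - NormalDist().cdf(t)                # small p => reject H0

# An event with Ms low relative to mb yields a large p-value, so H0
# (explosion characteristics) is not rejected and the event is not screened out.
print(screening_p_value(ms=3.6, mb=4.5))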
NASA Astrophysics Data System (ADS)
Hawkins, W.; Sussman, A. J.; Kelley, R. E.; Wohletz, K. H.; Schultz-Fellenz, E. S.
2013-12-01
On-site inspection (OSI) is the final verification measure of the Comprehensive Nuclear Test Ban Treaty (CTBT). OSIs rely heavily on geologic and geophysical investigations, and the objective is to apply methods that are effective, efficient, and minimally intrusive. We present a general overview of the OSI as provisioned in the CTBT, specifying the allowed techniques and the timeline for their application. A CTBT OSI relies on many geological, geophysical, and radiological methods. The search area for an OSI is mostly defined by the uncertainty in the location of a suspect event detected by the International Monitoring System (IMS) and reported through the International Data Centre, and it can be as large as 1000 km2. Thus, OSI methods are fundamentally divided into general survey methods that narrow the search area and more focused, detailed survey methods that look for evidence of a potential underground explosion and try to locate it within an area of several km2. The purpose and goal of a CTBT OSI, as specified in Article IV of the Treaty, is 'to clarify whether a nuclear explosion has been carried out in violation of the Treaty' and to 'gather any facts which might assist in identifying any possible violator.' Through the use of visual, geophysical, and radiological techniques, OSIs can detect and characterize anomalies and artifacts related to the event that triggered the inspection. In the context of an OSI, an 'observable' is a physical property that is important to recognize and document because of its relevance to the purpose of the inspection. Potential observables include: (1) visual observables such as ground/environmental disturbances and manmade features, (2) geophysical measurements of altered and damaged ground and buried artifacts, and (3) radiological measurements on samples. Information provided in this presentation comes from observations associated with historical testing activities that were not intended to go undetected. Every CTBT OSI will be different, and the observables present and detectable within an Inspection Area (IA) will depend on many factors, such as location, geology, emplacement configuration, climate, and the time elapsed between the event and the deployment of the Inspection Team (IT). A successful OSI is contingent on familiarity with potential observables, the suitability of the equipment to detect and characterize relevant observables, and the team's ability to document and integrate all the information into comprehensive, logical, and factual reports. In preparation for an OSI, a variety of types, scales, and generations of open-source digital imagery can be compared using geographic information systems (GIS) to focus on areas of interest. Simple image comparison from various open sources within GIS affords the opportunity to view anthropogenic and natural changes to locations of interest over time, thus remotely elucidating information about a site's use and level of activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Rock, D; Rodgers, A J
1999-07-23
Calibration of hydroacoustic and T-phase stations for Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring will be an important element in establishing new operational stations and upgrading existing stations. Calibration of hydroacoustic stations is herein defined as precision location of the hydrophones and determination of the amplitude response from a known source energy. T-phase station calibration is herein defined as a determination of station site attenuation as a function of frequency, bearing, and distance for known impulsive energy sources in the ocean. To understand how best to conduct calibration experiments for both hydroacoustic and T-phase stations, an experiment was conducted in May 1999 at Ascension Island in the South Atlantic Ocean. The experiment made use of a British oceanographic research vessel and collected data that will be used for CTBT issues and for fundamental understanding of the Ascension Island volcanic edifice.
National data centre preparedness exercise 2015 (NPE2015): MY-NDC progress result and experience
NASA Astrophysics Data System (ADS)
Rashid, Faisal Izwan Abdul; Zolkaffly, Muhammed Zulfakar
2017-01-01
Malaysia established the National Data Centre (MY-NDC) in December 2005. MY-NDC is tasked with performing Comprehensive Nuclear-Test-Ban Treaty (CTBT) data management as well as providing relevant information on Treaty-related events to the Malaysian Nuclear Agency (Nuclear Malaysia), the CTBT National Authority. In late 2015, MY-NDC participated in the National Data Centre Preparedness Exercise 2015 (NPE 2015), which aims to assess the level of readiness at MY-NDC. This paper presents the progress results of NPE 2015 and highlights MY-NDC's experience in NPE 2015 compared with its previous participation in NPE 2013. MY-NDC utilised available resources for NPE 2015 and performed five types of analyses, compared with only two in NPE 2013. Participation in NPE 2015 has enabled MY-NDC to assess its capability and identify room for improvement.
Three years of operational experience from Schauinsland CTBT monitoring station.
Zähringer, M; Bieringer, J; Schlosser, C
2008-04-01
Data from three years of operation of a low-level aerosol sampler and analyzer (RASA) at the Schauinsland monitoring station are reported. The system is part of the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The fully automatic system is capable of measuring aerosol-borne gamma emitters with high sensitivity and routinely quantifies 7Be and 212Pb. The system showed a high level of data availability of 90% within the reporting period. Daily screening yielded 66 tentative identifications of verification-relevant radionuclides since the system entered IMS operation in February 2004. Two of these were real events attributable to a plausible source; the remaining 64 cases can be explained consistently by detector background and statistical phenomena. Inter-comparison with data from a weekly sampler operated at the same station shows instabilities of the calibration during the test phase and good agreement since certification of the system.
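The split between the two real events and the 64 statistically explainable cases rests on decision-level statistics of the counting process. A minimal sketch of a Currie-style decision threshold, which separates genuine peaks from background fluctuations; the count values are illustrative, not from the station.

import math

# A sketch of a Currie-style decision threshold: the net-count level below
# which an apparent peak is consistent with detector-background fluctuation
# at ~95% confidence (Currie, 1968). Count values are illustrative.

def decision_threshold(background_counts):
    """Critical level L_C in counts for a paired background measurement."""
    return 2.33 * math.sqrt(background_counts)

# Example: with 400 background counts in the peak region, net signals below
# ~47 counts count as tentative identifications explainable by statistics.
print(decision_threshold(400.0))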
Listening to sounds from an exploding meteor and oceanic waves
NASA Astrophysics Data System (ADS)
Evers, L. G.; Haak, H. W.
Low-frequency sound (infrasound) measurement has been selected within the Comprehensive Nuclear-Test-Ban Treaty (CTBT) as a technique to detect and identify possible nuclear explosions. The Seismology Division of the Royal Netherlands Meteorological Institute (KNMI) has operated an experimental infrasound array of 16 micro-barometers since 1999. Here we show the rare detection and identification of an exploding meteor above Northern Germany on November 8, 1999, with data from the Deelen Infrasound Array (DIA). At the same time, sound was radiated from the Atlantic Ocean south of Iceland due to the atmospheric coupling of standing ocean waves, the so-called microbaroms. Although the two signals differed by only 0.04 Hz in dominant frequency, DIA proved able to discriminate between these physically different sources of infrasound through its unique layout and instruments. The explosive power of the meteor, 1.5 kT TNT equivalent, is in the range of nuclear explosions and therefore relevant to the CTBT.
NASA Astrophysics Data System (ADS)
Becker, A.; Wotawa, G.; Zähringer, M.
2009-04-01
Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), airborne radioactivity is measured by means of high-purity germanium gamma-ray detectors deployed in a global monitoring network. Almost 60 of the scheduled 80 stations had been put into provisional operation by the end of 2008. Each station sends the daily 24-hour samples' spectroscopic data to the Vienna-based Provisional Technical Secretariat (PTS) of the CTBT Organization (CTBTO) for review for treaty-relevant nuclides. Cs-137 is one of these relevant isotopes. Its typical minimum detectable concentration is on the order of a few Bq/m3. However, this isotope is also known to occur in atmospheric trace concentrations due to known non-CTBT-relevant processes and sources related to, for example, the re-suspension of cesium from historic nuclear tests and/or the Chernobyl reactor disaster, temporarily enhanced by biomass burning (Wotawa et al., 2006). Properly attributed cesium detections can be used as a proxy to detect aeolian dust events (Igarashi et al., 2001) that potentially carry cesium from all the aforementioned sources, but that are also known to play an important role in the radiative forcing of the atmosphere (shadow effect), at the surface (albedo), and in the carbon dioxide cycle when interacting with oceanic phytoplankton (Mikami and Shi, 2005). In this context, this paper provides a systematic attribution of recent Cs-137 detections in the PTS monitoring network in order to (i) characterize those stations which are regularly affected by Cs-137, (ii) provide input for procedures that distinguish CTBT-relevant detections from other sources (event screening), and (iii) explore the capability of certain stations to use their Cs-137 detections as a proxy for aeolian dust events and to flag the corresponding filters as relevant for further investigations in this field. References: Igarashi, Y., M. Aoyama, K. Hirose, M. Takashi and S. Yabuki, 2001: Is It Possible to Use 90Sr and 137Cs As Tracers for the Aeolian Dust Transport? Water, Air, & Soil Pollution 130, 349-354. Mikami, M. and G. Shi, 2005: Preliminary summary of aeolian dust experiment on climate impact - Japan-Sino joint project ADEC. Geophysical Research Abstracts, 7, 05985. Wotawa, G., L.-E. De Geer, A. Becker, R. D'Amours, M. Jean, R. Servranckx and K. Ungar, 2006: Inter- and intra-continental transport of radioactive cesium released by boreal forest fires, Geophys. Res. Lett. 33, L12806, doi:10.1029/2006GL026206. Disclaimer: The views expressed in this publication are those of the authors and do not necessarily reflect the views of the CTBTO Preparatory Commission.
Using the DOE Knowledge Base for Special Event Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, H.M.; Harris, J.M.; Young, C.J.
1998-10-20
The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA) and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, two basic types of data must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we used a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g. crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saey, P. R.J.; Ringbom, Anders; Bowyer, Ted W.
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) specifies that radioxenon measurements should be performed at 40 or more stations worldwide within the International Monitoring System (IMS). Measuring radioxenon is one of the principal techniques for detecting underground nuclear explosions. Specifically, the presence and ratios of different radioxenon isotopes allow one to determine whether a detection event under consideration originated from a nuclear explosion or a civilian source. However, radioxenon monitoring on a global scale is a novel technology, and the global civil background must be characterized sufficiently. This paper lays out a study, based on several unique measurement campaigns, of the worldwide concentrations and sources of verification-relevant xenon isotopes. It complements the experience already gathered with radioxenon measurements within the CTBT IMS programme and focuses on locations in Belgium, Germany, Kuwait, Thailand and South Africa, where very little information was available on ambient xenon levels or where interesting sites offered opportunities to learn more about emissions from known sources. The findings corroborate the hypothesis that a few major radioxenon sources contribute in great part to the global radioxenon background. Additionally, the existence of independent sources of 131mXe (the daughter of 131I) has been demonstrated, which has some potential to bias the isotopic signature of signals from nuclear explosions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, James W., LTC
2000-09-15
These proceedings contain papers prepared for the 22nd Annual DoD/DOE Seismic Research Symposium: Planning for Verification of and Compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), held 13-15 September 2000 in New Orleans, Louisiana. These papers represent the combined research related to ground-based nuclear explosion monitoring funded by the National Nuclear Security Administration (NNSA), Defense Threat Reduction Agency (DTRA), Air Force Technical Applications Center (AFTAC), Department of Defense (DoD), US Army Space and Missile Defense Command, Defense Special Weapons Agency (DSWA), and other invited sponsors. The scientific objectives of the research are to improve the United States capability to detect, locate, and identify nuclear explosions. The purpose of the meeting is to provide the sponsoring agencies, as well as potential users, an opportunity to review research accomplished during the preceding year and to discuss areas of investigation for the coming year. For the researchers, it provides a forum for the exchange of scientific information toward achieving program goals, and an opportunity to discuss results and future plans. Paper topics include: seismic regionalization and calibration; detection and location of sources; wave propagation from source to receiver; the nature of seismic sources, including mining practices; hydroacoustic, infrasound, and radionuclide methods; on-site inspection; and data processing.
NASA Astrophysics Data System (ADS)
Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.
2013-12-01
The aerosol samples taken from the CTBT International Monitoring System stations are measured in the field with a minimum detectable concentration (MDC) of ~30 microBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) of a detection has led to a desire for multi-station or multi-time-period detections. These would be connected through the concept of 'event formation', analogous to event formation in seismic event studies. However, to form such events, samples from the nearest neighbors of the detecting station would require re-analysis by a more sensitive laboratory to gain a substantially lower MDC and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost-effective means of enhancing laboratory sensitivity.
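How much a laboratory re-analysis can lower the MDC follows from Currie-style counting statistics: the detection limit scales with the square root of the background, and the concentration scales inversely with efficiency, branching ratio, count time, and sampled air volume. A minimal sketch, with all numbers illustrative and decay corrections omitted:

import math

# A sketch of a Currie-style minimum detectable concentration (MDC) and how
# laboratory re-analysis lowers it. Decay and ingrowth corrections are
# omitted and all numbers are illustrative, not station or lab values.

def mdc_microbq_per_m3(background_counts, efficiency, gamma_intensity,
                       count_time_s, air_volume_m3):
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit [counts]
    activity_bq = l_d / (efficiency * gamma_intensity * count_time_s)
    return activity_bq / air_volume_m3 * 1e6

# Longer counting, higher efficiency and lower background shrink the MDC:
print(mdc_microbq_per_m3(200, 0.25, 0.80, 86400, 20000))     # field-like
print(mdc_microbq_per_m3(50, 0.35, 0.80, 4 * 86400, 20000))  # laboratory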
CTBT infrasound network performance to detect the 2013 Russian fireball event
Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; ...
2015-03-18
The explosive fragmentation of the 2013 Chelyabinsk meteorite generated a large airburst with an equivalent yield of 500 kT TNT. It is the most energetic event recorded by the infrasound component of the Comprehensive Nuclear-Test-Ban Treaty International Monitoring System (CTBT-IMS), globally detected by 20 out of 42 operational stations. This study performs a station-by-station estimation of the IMS detection capability to explain infrasound detections and non-detections from short to long distances, using the Chelyabinsk meteorite as a global reference event. The parameters investigated for their influence on the detection capability are the directivity of the line-source signal, the ducting of acoustic energy, and the individual noise conditions at each station. Findings include a clear detection preference for stations perpendicular to the meteorite trajectory, even over large distances. Only a weak influence of stratospheric ducting is observed for this low-frequency case. As a result, a strong dependence on the diurnal variability of background noise levels at each station is observed, favoring nocturnal detections.
NASA Astrophysics Data System (ADS)
Young, John; Peacock, Sheila
2016-04-01
The year 1996 has particular significance for forensic seismologists. This was the year when the Comprehensive Test Ban Treaty (CTBT) was signed in September at the United Nations, setting an international norm against nuclear testing. Blacknest, as a long-standing seismic centre for research into detecting and identifying underground explosions using seismology, provided significant technical advice during the CTBT negotiations. Since 1962, seismic recordings of both presumed nuclear explosions and earthquakes from the four seismometer arrays at Eskdalemuir, Scotland (EKA), Yellowknife, Canada (YKA), Gauribidanur, India (GBA), and Warramunga, Australia (WRA) have been copied, digitised, and saved. There was a possibility this archive would be lost, so it was decided to process the records and catalogue them for distribution to other groups and institutions. This work continues at Blacknest, but the archive is no longer under threat. In addition, much of the archive of analogue tape recordings has been re-digitised with modern equipment, allowing sampling rates of 100 Hz rather than the original 20 Hz.
Nuclear Explosion Monitoring History and Research and Development
NASA Astrophysics Data System (ADS)
Hawkins, W. L.; Zucca, J. J.
2008-12-01
Within a year after the nuclear detonations over Hiroshima and Nagasaki, the Baruch Plan was presented to the newly formed United Nations Atomic Energy Commission (June 14, 1946) to establish nuclear disarmament and international control over all nuclear activities. These controls would allow only the peaceful use of atomic energy. The plan was rejected through a Security Council veto, primarily because of resistance to unlimited inspections. Since that time there have been many multilateral and bilateral agreements, and unilateral declarations, to limit or eliminate nuclear detonations. Almost all of these agreements (i.e. treaties) call for some type of monitoring. We will review a timeline showing the history of nuclear testing and the more important treaties. We will also describe testing operations, containment, phenomenology, and observations. The Comprehensive Nuclear Test Ban Treaty (CTBT), which has been signed by 179 countries (and ratified by 144), established the International Monitoring System global verification regime, which employs seismic, infrasound, hydroacoustic and radionuclide monitoring techniques. The CTBT also includes on-site inspection to clarify whether a nuclear explosion has been carried out in violation of the Treaty. The US Department of Energy (DOE), through its National Nuclear Security Administration's Ground-Based Nuclear Explosion Monitoring R&D Program, supports research by US National Laboratories, universities and industry internationally to detect, locate, and identify nuclear detonations. This research program builds on the broad base of monitoring expertise developed over several decades. Annually, the DOE and the US Department of Defense jointly solicit monitoring research proposals. Areas of research include: seismic regional characterization and wave propagation, seismic event detection and location, seismic identification and source characterization, hydroacoustic monitoring, radionuclide monitoring, infrasound monitoring, and data processing and analysis. Reports from the selected research projects are published in the proceedings of the annual Monitoring Research Review conference.
Detection capability of the IMS seismic network based on ambient seismic noise measurements
NASA Astrophysics Data System (ADS)
Gaebler, Peter J.; Ceranna, Lars
2016-04-01
All nuclear explosions, whether on the Earth's surface, underground, underwater or in the atmosphere, are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion testing at any time, by anyone, and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit for a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body-wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold is observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary IMS network results in a slight improvement of the global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in the average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
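The threshold monitoring idea can be sketched as follows: each station's minimum detectable body-wave magnitude at a source point follows from its ambient noise level plus a distance correction, and the network threshold at that point is the n-th smallest station value (here n = 3 detecting stations). The correction term below is a crude placeholder, not the reflectivity-based corrections computed in the study.

import numpy as np

# A sketch of noise-based threshold monitoring. Each station's smallest
# detectable magnitude at a source point is mb = log10(SNR * noise / T) + Q;
# Q below is a crude placeholder for a body-wave distance correction.

def station_threshold_mb(noise_amp_nm, period_s, distance_deg, snr=3.0):
    q = 1.0 + 1.66 * np.log10(distance_deg)          # placeholder correction
    return np.log10(snr * noise_amp_nm / period_s) + q

def network_threshold_mb(noise_amps, periods, distances, n_required=3):
    """Network threshold at a grid point: n-th smallest station threshold."""
    thresholds = np.sort([station_threshold_mb(a, t, d)
                          for a, t, d in zip(noise_amps, periods, distances)])
    return thresholds[n_required - 1]

# Three quiet and two noisy stations, requiring three detecting stations:
print(network_threshold_mb([0.5, 0.8, 1.0, 5.0, 8.0],
                           [1.0] * 5, [30, 40, 55, 70, 85]))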
OSI Passive Seismic Experiment at the Former Nevada Test Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweeney, J J; Harben, P
On-site inspection (OSI) is one of the four verification provisions of the Comprehensive Nuclear Test Ban Treaty (CTBT). Under the provisions of the CTBT, once the Treaty has entered into force, any signatory party can request an on-site inspection, which can then be carried out after approval (by majority voting) of the Executive Council. Once an OSI is approved, a team of 40 inspectors will be assembled to carry out an inspection to 'clarify whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of Article I'. One challenging aspect of carrying out an on-site inspection in the case of a purported underground nuclear explosion is to detect and locate the underground effects of an explosion, which may include an explosion cavity, a zone of damaged rock, and/or a rubble zone associated with an underground collapsed cavity. The CTBT (Protocol, Section II part D, paragraph 69) prescribes several types of geophysical investigations that can be carried out for this purpose. One of the methods allowed by the CTBT for geophysical investigation is referred to in the Treaty Protocol as 'resonance seismometry'. This method, which was proposed and strongly promoted by Russia during the Treaty negotiations, is not described in the Treaty. Some clarification about the nature of the resonance method can be gained from OSI workshop presentations by Russian experts in the late 1990s. Our understanding is that resonance seismometry is a passive method that relies on seismic reverberations set up in an underground cavity by the passage of waves from regional and teleseismic sources. Only a few examples of the use of this method for detection of underground cavities have been presented, and those were cases where the existence and precise location of an underground cavity were known. As with many of the geophysical methods allowed during an OSI under the Treaty, how resonance seismometry really works and its effectiveness for OSI purposes have yet to be determined. For this experiment, we took a broad approach to the definition of 'resonance seismometry', stretching it to include any means that employs passive seismic methods to infer the character of underground materials. In recent years there have been a number of advances in the use of correlation and noise analysis methods in seismology to obtain information about the subsurface. Our objective in this experiment was to use noise analysis and correlation analysis to evaluate these techniques for detecting and characterizing the underground damage zone from a nuclear explosion. The site chosen for the experiment was the Mackerel test in Area 4 of the former Nevada Test Site (now named the Nevada National Security Site, or NNSS). Mackerel was an underground nuclear test of less than 20 kT conducted in February 1964 (DOE/NV-209-REV 15). We chose this site because there was a known apical cavity at about 50 m depth above a rubble zone, and because the site had been investigated by the US Geological Survey with active seismic methods in 1965 (Watkins et al., 1967). Note that the time delay between detonation of the explosion (1964) and the present survey (2010) is nearly 46 years; this would not be typical of an expected OSI under the CTBT.
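As an illustration of the correlation-based passive methods referred to above, here is a minimal sketch of stacked noise cross-correlation between two sensors; the window length and amplitude normalization are illustrative choices, not those used in the experiment.

import numpy as np

# A sketch of stacked ambient-noise cross-correlation between two sensors:
# coherent inter-station energy (e.g. reverberations tied to a cavity or
# damage zone) accumulates in the stack while incoherent noise averages out.

def stacked_cross_correlation(trace_a, trace_b, fs, window_s=60.0):
    n = int(window_s * fs)
    stack = np.zeros(2 * n - 1)
    for i in range(0, min(len(trace_a), len(trace_b)) - n + 1, n):
        wa = trace_a[i:i + n] - np.mean(trace_a[i:i + n])
        wb = trace_b[i:i + n] - np.mean(trace_b[i:i + n])
        wa /= np.std(wa) + 1e-12        # crude amplitude normalization
        wb /= np.std(wb) + 1e-12
        stack += np.correlate(wa, wb, mode="full")
    lags_s = np.arange(-(n - 1), n) / fs
    return lags_s, stack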
Definition of Exclusion Zones Using Seismic Data
NASA Astrophysics Data System (ADS)
Bartal, Y.; Villagran, M.; Ben Horin, Y.; Leonard, G.; Joswig, M.
In verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), there is a motivation to be effective, efficient and economical, and to prevent abuse of the right to conduct an on-site inspection (OSI) in the territory of a challenged State Party. In particular, it is in the interest of a State Party to avoid irrelevant searches in specific areas. In this study we propose several techniques to determine 'exclusion zones', defined as areas where an event could not possibly have occurred. All techniques are based on simple considerations of arrival-time differences between seismic stations and thus are less prone to modeling errors than standard event location methods. The techniques proposed are: angular sector exclusion based on a tripartite micro-array, half-space exclusion based on a station pair, and closed-area exclusion based on circumferential networks.
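A minimal sketch of the station-pair half-space exclusion, assuming only that travel time increases with distance: if one station's arrival clearly precedes the other's, every grid point closer to the later station is excluded. The margin parameter, absorbing picking and model errors, is an illustrative choice.

import numpy as np

# A sketch of station-pair half-space exclusion. Only monotonicity of travel
# time with distance is assumed, which is what makes the technique robust to
# velocity-model errors.

def half_space_exclusion(grid_xy, sta_a, sta_b, t_a, t_b, margin_s=2.0):
    """Boolean mask of grid points excluded by one station pair."""
    d_a = np.linalg.norm(grid_xy - sta_a, axis=1)
    d_b = np.linalg.norm(grid_xy - sta_b, axis=1)
    if t_a < t_b - margin_s:            # A clearly first: event is nearer A
        return d_b < d_a                # exclude the half-space nearer to B
    if t_b < t_a - margin_s:
        return d_a < d_b
    return np.zeros(len(grid_xy), dtype=bool)   # order ambiguous: keep all

# Masks from several station pairs can be OR-ed to shrink the admissible area.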
WOSMIP II- Workshop on Signatures of Medical and Industrial Isotope Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Murray; Achim, Pascal; Auer, M.
2011-11-01
Medical and industrial radioisotopes are fundamental tools used in science, medicine and industry, with an ever-expanding usage in medical practice, where their availability is vital. Very sensitive environmental radionuclide monitoring networks have been developed for nuclear-security-related monitoring [particularly Comprehensive Test-Ban-Treaty (CTBT) compliance verification] and are now operational.
NASA Astrophysics Data System (ADS)
Le Pichon, Alexis; Ceranna, Lars; Taillepied, Doriane
2015-04-01
To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a dedicated network is being deployed. Multi-year observations recorded by the International Monitoring System (IMS) infrasound network confirm that its detection capability is highly variable in space and time. Today, numerical modeling techniques provide a basis for better understanding the role of the different factors describing the source and the atmosphere that influence propagation predictions. Previous studies estimated the radiated source energy from remote observations using a frequency-dependent attenuation relation and state-of-the-art specifications of the stratospheric wind. In order to account for a realistic description of the dynamic structure of the atmosphere, model predictions are further enhanced by wind and temperature error distributions as measured in the framework of the ARISE project (http://arise-project.eu/). In the context of the future verification of the CTBT, these predictions quantify uncertainties in the spatial and temporal variability of the IMS infrasound network performance at higher resolution, and will be helpful for designing and prioritizing the maintenance of any arbitrary infrasound monitoring network.
NASA Astrophysics Data System (ADS)
Le Pichon, Alexis; Blanc, Elisabeth; Rüfenacht, Rolf; Kämpfer, Niklaus; Keckhut, Philippe; Hauchecorne, Alain; Ceranna, Lars; Pilger, Christoph; Ross, Ole
2014-05-01
To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a dedicated network is being deployed. Multi-year observations recorded by the International Monitoring System (IMS) infrasound network confirm that its detection capability is highly variable in space and time. Today, numerical modeling techniques provide a basis for better understanding the role of the different factors describing the source and the atmosphere that influence propagation predictions. Previous studies estimated the radiated source energy from remote observations using a frequency-dependent attenuation relation and state-of-the-art specifications of the stratospheric wind. In order to account for a realistic description of the dynamic structure of the atmosphere, model predictions are further enhanced by wind and temperature error distributions as measured in the framework of the ARISE project (http://arise-project.eu/). In the context of the future verification of the CTBT, these predictions quantify uncertainties in the spatial and temporal variability of the IMS infrasound network performance at higher resolution, and will be helpful for designing and prioritizing the maintenance of any arbitrary infrasound monitoring network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, J.R.; Marshall, M.E.; Barker, B.W.
In situations where cavity decoupling of underground nuclear explosions is a plausible evasion scenario, comprehensive seismic monitoring of any eventual CTBT will require the routine identification of many small seismic events with magnitudes in the range 2.0 < mb < 3.5. However, since such events are not expected to be detected teleseismically, their magnitudes will have to be estimated from regional recordings using seismic phases and frequency bands which are different from those employed in the teleseismic mb scale that is generally used to specify monitoring capability. Therefore, it is necessary to establish the mb equivalences of any selected regional magnitude measures in order to estimate the expected detection statistics and thresholds of proposed CTBT seismic monitoring networks. In the investigations summarized in this report, this has been accomplished through analyses of synthetic data obtained by theoretically scaling observed regional seismic data recorded in Scandinavia and Central Asia from various tamped nuclear tests to obtain estimates of the corresponding seismic signals to be expected from small cavity-decoupled nuclear tests at those same source locations.
The Nuclear Non-Proliferation Treaty and the Comprehensive Nuclear-Test-Ban Treaty, the relationship
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Thomas Jr.
The Nuclear Non-Proliferation Treaty (NPT) is the most important international security arrangement we have protecting the world community, and this has been true for many years. But it did not happen by accident: it is a strategic bargain in which 184 states gave up the right forever to acquire the most powerful weapon ever created, in exchange for a commitment from the five states allowed to keep nuclear weapons under the NPT (U.S., U.K., Russia, France and China) to share peaceful nuclear technology and to engage in disarmament negotiations aimed at the ultimate elimination of their nuclear stockpiles. The most important part of this is the comprehensive nuclear test ban (CTBT); the thinking of the 184 NPT non-nuclear weapon states was and is that, while the elimination of nuclear weapon stockpiles is a long way off, at least the NPT nuclear weapon states could stop testing the weapons. The CTBT has been ratified by 161 states, but by its terms it can only come into force if 44 nuclear-potential states ratify it; 36 of the 44 have ratified, and the remaining eight include the United States and seven others, most of whom are in effect waiting for the United States. No state except complete outlier North Korea has tested a nuclear weapon in 15 years. There appears to be no chance that the U.S. Senate will approve the CTBT for ratification in the foreseeable future, but the NPT may not survive without it. Perhaps it is time to consider an interim measure: for the UN Security Council to declare that any future nuclear weapon test, any time, anywhere, is a 'threat to peace and security', in effect a violation of international law, which in today's world it clearly would be.
NASA Astrophysics Data System (ADS)
Labak, Peter; Sussman, Aviva; Rowlands, Aled; Chiappini, Massimo; Malich, Gregor; MacLeod, Gordon; Sankey, Peter; Sweeney, Jerry; Tuckwell, George
2016-04-01
The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) that tested the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI). During an OSI, up to 40 inspectors search a 1000 km2 inspection area for evidence of a nuclear explosion. Over 250 experts from ~50 countries were involved in IFE14 (the largest simulation of an OSI to date) and worked in a number of different capacities, from the Exercise Management and Control Teams, which executed the scenario in which the exercise was played, to the participants performing as members of the Inspection Team (IT). One of the main objectives of IFE14 was to test Treaty-allowed inspection techniques, including a number of geophysical and remote sensing methods. In order to develop a scenario in which the simulated exercise could be carried out, a number of physical features in the IFE14 inspection area were designed and engineered by the Scenario Task Force Group (STF) for the IT to detect by applying the geophysical and remote sensing inspection technologies, as well as other techniques allowed by the CTBT. For example, in preparation for IFE14, the STF modeled a seismic triggering event that was provided to the IT to prompt them to detect and localize aftershocks in the vicinity of a possible explosion. Similarly, the STF planted shallow targets such as borehole casings and pipes for detection by other geophysical methods. In addition, airborne technologies, which included multi-spectral imaging, were deployed such that the IT could identify freshly exposed surfaces, imported materials and other areas that had been subject to modification. This presentation will introduce the CTBT and OSI, explain IFE14 in terms of the goals specific to geophysical and remote sensing methods, and show how both the preparation for and execution of IFE14 met those goals.
Broadband seismology and the detection and verification of underground nuclear explosions
NASA Astrophysics Data System (ADS)
Tinker, Mark Andrew
1997-10-01
On September 24, 1996, President Clinton signed the Comprehensive Test Ban Treaty (CTBT), which bans the testing of all nuclear weapons, thereby limiting their future development. Seismology is the primary tool used for the detection and identification of underground explosions and thus will play a key role in monitoring a CTBT. The detection and identification of low-yield explosions require seismic stations at regional distances (<1500 km). However, because the regional wavefield propagates within the extremely heterogeneous crustal waveguide, the seismic waveforms are also very complicated. Therefore, it is necessary to have a solid understanding of how the phases used in regional discriminants develop within different tectonic regimes. Thus, the development of the seismic phases Pn and Lg, which compose the seismic discriminant Pn/Lg, within the western U.S. is evaluated using the Non-Proliferation Experiment. The most fundamental discriminant is event location, as 90% of all seismic sources occur too deep within the earth to be unnatural. France resumed its nuclear testing program after a four-year moratorium and conducted six tests during a five-month period starting in September 1995. Using teleseismic data, a joint hypocenter determination algorithm was used to determine the hypocenters of these six explosions. One of the most important problems in monitoring a CTBT is the detection and location of small seismic events. Although seismic arrays have become the central tool for event detection, in the context of a global monitoring treaty there will be some dependence on sparse regional networks of three-component broadband seismic stations to detect low-yield explosions. However, the full power of the data has not been utilized, namely using phases other than P and S. Therefore, the information in the surface wavetrain is used to improve the locations of small seismic events recorded on a sparse network in Bolivia. Finally, as a discrimination example in a complex region, P to S ratios are used to determine source parameters of the Mw 8.3 deep Bolivia earthquake.
The Nuclear Non-Proliferation Treaty and the Comprehensive Nuclear-Test-Ban Treaty, the relationship
NASA Astrophysics Data System (ADS)
Graham, Thomas, Jr.
2014-05-01
The Nuclear Non-Proliferation Treaty (NPT) is the most important international security arrangement we have protecting the world community, and this has been true for many years. But it did not happen by accident: it is a strategic bargain in which 184 states gave up the right forever to acquire the most powerful weapon ever created, in exchange for a commitment from the five states allowed to keep nuclear weapons under the NPT (U.S., U.K., Russia, France and China) to share peaceful nuclear technology and to engage in disarmament negotiations aimed at the ultimate elimination of their nuclear stockpiles. The most important part of this is the comprehensive nuclear test ban (CTBT); the thinking of the 184 NPT non-nuclear weapon states was and is that, while the elimination of nuclear weapon stockpiles is a long way off, at least the NPT nuclear weapon states could stop testing the weapons. The CTBT has been ratified by 161 states, but by its terms it can only come into force if 44 nuclear-potential states ratify it; 36 of the 44 have ratified, and the remaining eight include the United States and seven others, most of whom are in effect waiting for the United States. No state except complete outlier North Korea has tested a nuclear weapon in 15 years. There appears to be no chance that the U.S. Senate will approve the CTBT for ratification in the foreseeable future, but the NPT may not survive without it. Perhaps it is time to consider an interim measure: for the UN Security Council to declare that any future nuclear weapon test, any time, anywhere, is a "threat to peace and security", in effect a violation of international law, which in today's world it clearly would be.
The European Infrasound Bulletin
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Vergoz, Julien; Le Pichon, Alexis; Brachet, Nicolas; Blanc, Elisabeth; Kero, Johan; Liszka, Ludwik; Gibbons, Steven; Kvaerna, Tormod; Näsholm, Sven Peter; Marchetti, Emanuele; Ripepe, Maurizio; Smets, Pieter; Evers, Laslo; Ghica, Daniela; Ionescu, Constantin; Sindelarova, Tereza; Ben Horin, Yochai; Mialle, Pierrick
2018-05-01
The European Infrasound Bulletin highlights infrasound activity produced mostly by anthropogenic sources, recorded all over Europe and collected in the course of the ARISE and ARISE2 projects (Atmospheric dynamics Research InfraStructure in Europe). The data include high-frequency (> 0.7 Hz) infrasound detections at 24 European infrasound arrays from nine different national institutions, complemented with infrasound stations of the International Monitoring System for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Data were acquired during 16 years of operation (from 2000 to 2015) and processed to identify and locate ~48,000 infrasound events within Europe. The source locations of these events were derived by combining at least two corresponding station detections per event. Comparisons with ground-truth sources, e.g., Scandinavian mining activity, are provided, as well as comparisons with the CTBT Late Event Bulletin (LEB). Relocation is performed using ray-tracing methods to estimate celerity and back-azimuth corrections for source location, based on meteorological wind and temperature values for each event derived from European Centre for Medium-Range Weather Forecasts (ECMWF) data. This study focuses on the analysis of repeating, man-made infrasound events (e.g., mining blasts and supersonic flights) and on the seasonal, weekly and diurnal variation of the infrasonic activity of sources in Europe. Comparison with previous studies shows that the increased station density improves detection, association and location, and thus the number of events and determined source regions. This improves the capability of the infrasound station network in Europe to more comprehensively estimate the activity of anthropogenic infrasound sources in Europe.
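As a toy illustration of deriving a source location from "at least two corresponding station detections per event", the sketch below intersects the back-azimuths measured at two arrays on a flat local grid. The array coordinates and azimuths are invented for the example, and no celerity or wind corrections are applied.

# Minimal cross-bearing location sketch; flat-earth geometry and the station
# coordinates are assumptions, not values from the bulletin.
import numpy as np

def cross_bearing(p1, az1, p2, az2):
    """Intersect two bearings (degrees clockwise from north) in local
    x=east, y=north coordinates (km). Returns the intersection point."""
    d1 = np.array([np.sin(np.radians(az1)), np.cos(np.radians(az1))])
    d2 = np.array([np.sin(np.radians(az2)), np.cos(np.radians(az2))])
    # Solve p1 + s*d1 = p2 + t*d2 for (s, t).
    A = np.column_stack([d1, -d2])
    s, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + s * d1

array_a = (0.0, 0.0)       # hypothetical array A (km)
array_b = (300.0, 0.0)     # hypothetical array B (km)
print(cross_bearing(array_a, 45.0, array_b, 315.0))  # -> [150. 150.]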
On the exploitation of seismic resonances for cavity detection
NASA Astrophysics Data System (ADS)
Schneider, Felix M.; Esterhazy, Sofi; Perugia, Ilaria; Bokelmann, Götz
2017-04-01
We study the interaction of a seismic wavefield with a spherical acoustic gas- or fluid-filled cavity. The intention of this study is to clarify whether seismic resonances can be expected; such a characteristic feature may help detect cavities in the subsurface. This is important for many applications, in particular the detection of underground nuclear explosions, which are prohibited by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). On-Site Inspections (OSI) are intended to confirm a possible violation of the CTBT after the International Monitoring System (IMS) has detected a suspicious event. One primary structural target for the field team during an OSI is the detection of cavities created by underground nuclear explosions. The use of seismic cavity resonances for this purpose is anticipated in the CTBT itself, which mentions "resonance seismometry" as a possible technique during OSIs. In order to calculate the full seismic wavefield from an incident plane wave that interacts with the cavity, we considered an analytic formulation of the problem. The wavefield interaction consists of elastic scattering and the coupling between the acoustic and elastic media. Acoustic resonant modes, caused by internal reflections in the acoustic cavity, show up as spectral peaks in the frequency domain. The resonant peaks correspond closely to the eigenfrequencies of the undamped system described by the particular acoustic medium bounded in a sphere with stiff walls. The filling of the cavity could thus be determined by the observation of spectral peaks from acoustic resonances. Through energy transmission from the internal oscillations back into the elastic domain and through intrinsic attenuation, the oscillations experience damping, resulting in a frequency shift and a limitation of the resonance amplitudes. In the case of a gas-filled cavity, the impedance contrast is high, resulting in very narrow, high-amplitude resonances. In synthetic seismograms calculated in the surrounding elastic domain, the acoustic resonances of gas-filled cavities show up as persisting oscillations. However, due to the weak acoustic-elastic coupling in this case, the amplitudes of the oscillations are very low. Due to a lower impedance contrast, a fluid-filled cavity has a stronger acoustic-elastic coupling, which results in wider spectral peaks of lower amplitude. In the synthetic seismograms derived in the surrounding medium of fluid-filled cavities, acoustic resonances show up as strong but fast-decaying reverberations. Based on the analytical modeling, methods for exploiting these resonance features are developed and discussed.
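The undamped reference system mentioned above (an acoustic sphere with stiff walls) has eigenfrequencies that are straightforward to compute: for the radially symmetric (l = 0) modes the rigid-wall condition reduces to tan(x) = x with x = k*a. The sketch below evaluates the first few roots; the cavity radius and sound speed are assumptions chosen for illustration.

# Illustrative lowest radial eigenfrequencies of an acoustic sphere with
# stiff walls; one root of tan(x) = x lies in each interval (n*pi, n*pi + pi/2).
import numpy as np
from scipy.optimize import brentq

def rigid_sphere_l0_frequencies(radius_m, sound_speed, n_modes=3):
    roots = []
    for n in range(1, n_modes + 1):
        x = brentq(lambda x: np.tan(x) - x,
                   n * np.pi + 1e-6, n * np.pi + np.pi / 2 - 1e-6)
        roots.append(x)
    return [x * sound_speed / (2 * np.pi * radius_m) for x in roots]

# Hypothetical 10 m radius air-filled cavity (c ~ 343 m/s):
for f in rigid_sphere_l0_frequencies(10.0, 343.0):
    print(f"{f:.1f} Hz")   # ~24.5, 42.2, 59.5 Hz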
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foxall, W; Vincent, P; Walter, W
1999-07-23
We have previously presented simple elastic deformation modeling results for three classes of seismic events of concern in monitoring the CTBT--underground explosions, mine collapses and earthquakes. Those results explored the theoretical detectability of each event type using synthetic aperture radar interferometry (InSAR) based on commercially available satellite data. In those studies we identified and compared the characteristics of synthetic interferograms that distinguish each event type, as well as the ability of the interferograms to constrain source parameters. These idealized modeling results, together with preliminary analysis of InSAR data for the 1995 mb 5.2 Solvay mine collapse in southwestern Wyoming, suggested that InSAR data used in conjunction with regional seismic monitoring holds great potential for CTBT discrimination and seismic source analysis, as well as for providing accurate ground-truth parameters for regional calibration events. In this paper we further examine the detectability and "discriminating" power of InSAR by presenting results from InSAR data processing, analysis and modeling of the surface deformation signals associated with underground explosions. Specifically, we present results of a detailed study of coseismic and postseismic surface deformation signals associated with underground nuclear and chemical explosion tests at the Nevada Test Site (NTS). Several interferograms were formed from raw ERS-1/2 radar data covering different time spans and epochs, beginning just prior to the last U.S. nuclear tests in 1992 and ending in 1996. These interferograms have yielded information about the nature and duration of the source processes that produced the surface deformations associated with these events. A critical result of this study is that significant post-event surface deformation associated with underground nuclear explosions detonated at depths in excess of 600 meters can be detected using differential radar interferometry. An immediate implication of this finding is that underground nuclear explosions may not need to be captured coseismically by radar images acquired before and after an event in order to be detectable. This has obvious advantages in CTBT monitoring, since suspect seismic events--which usually can be located within a 100 km by 100 km area of an ERS-1/2 satellite frame by established seismic methods--can be imaged after the event has been identified and located by existing regional seismic networks. Key Words: InSAR, SLC images, interferogram, synthetic interferogram, ERS-1/2 frame, phase unwrapping, DEM, coseismic, postseismic, source parameters.
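To make the notion of a synthetic interferogram concrete, here is a hedged sketch: vertical surface displacement from a Mogi point source (a common idealization for a collapsing cavity, not necessarily the source model used in the paper), wrapped into C-band radar fringes. The source depth, volume change and radar wavelength below are assumptions.

# Synthetic wrapped-phase interferogram from a Mogi source; all parameters invented.
import numpy as np

def mogi_uz(x_km, y_km, depth_km, dV_km3, nu=0.25):
    """Vertical surface displacement (m) above a Mogi point source."""
    r2 = x_km**2 + y_km**2 + depth_km**2
    return (1 - nu) / np.pi * dV_km3 * depth_km / r2**1.5 * 1e3  # km -> m

x = np.linspace(-5, 5, 400)
X, Y = np.meshgrid(x, x)
uz = mogi_uz(X, Y, depth_km=0.6, dV_km3=-1e-4)   # subsidence over a 600 m deep source
wavelength = 0.056                                # ERS C-band, ~5.6 cm
phase = np.angle(np.exp(1j * 4 * np.pi * uz / wavelength))  # wrapped fringes
print(f"peak subsidence: {uz.min()*100:.1f} cm, "
      f"fringes: {abs(uz.min())/(wavelength/2):.1f}")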
Comprehensive Nuclear-Test-Ban Treaty: Background and Current Developments
2012-08-03
Academy of Sciences Study and Its Critics ... Chronology ... criticisms of that report. On February 13, the Administration rolled out its FY2013 budget request, which included funds for the CTBT Organization ... "So he has not ruled out testing in the future, but there are no plans to do so." Critics expressed concern about the implications of these policies
Machine Learning and Data Mining for Comprehensive Test Ban Treaty Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell, S; Vaidya, S
2009-07-30
The Comprehensive Test Ban Treaty (CTBT) is gaining renewed attention in light of growing worldwide interest in mitigating risks of nuclear weapons proliferation and testing. Since the International Monitoring System (IMS) installed the first suite of sensors in the late 1990s, the IMS network has steadily progressed, providing valuable support for event diagnostics. This progress was highlighted at the recent International Scientific Studies (ISS) Conference in Vienna in June 2009, where scientists and domain experts met with policy makers to assess the current status of the CTBT Verification System. A strategic theme within the ISS Conference centered on exploring opportunities for further enhancing the detection and localization accuracy of low-magnitude events by drawing upon modern tools and techniques for machine learning and large-scale data analysis. Several promising approaches for data exploitation were presented at the Conference and are summarized in a companion report. In this paper, we introduce essential concepts in machine learning and assess techniques which could provide both incremental and comprehensive value for event discrimination by increasing the accuracy of the final data product, refining On-Site-Inspection (OSI) conclusions, and potentially reducing the cost of future network operations.
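For readers unfamiliar with how such techniques slot into event screening, a toy supervised-discrimination example follows. The features (mb, Ms, depth), the labels, and the data are synthetic placeholders rather than IMS products, and the classifier choice is merely illustrative.

# Toy event-screening classifier on synthetic mb/Ms/depth features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
mb = rng.uniform(3.5, 5.5, n)                      # body-wave magnitude
is_explosion = rng.random(n) < 0.2
# Explosions assumed deficient in surface-wave magnitude and shallow.
ms = np.where(is_explosion, mb - 1.2, mb - 0.2) + 0.2 * rng.standard_normal(n)
depth = np.where(is_explosion, rng.uniform(0, 2, n), rng.uniform(0, 60, n))

X = np.column_stack([mb, ms, mb - ms, depth])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, is_explosion, cv=5).mean())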
Teraishi, Toshiya; Hori, Hiroaki; Sasayama, Daimei; Matsuo, Junko; Ogawa, Shintaro; Ota, Miho; Hattori, Kotaro; Kajiwara, Masahiro; Higuchi, Teruhiko; Kunugi, Hiroshi
2015-01-01
Altered tryptophan–kynurenine (KYN) metabolism has been implicated in major depressive disorder (MDD). The l-[1-13C]tryptophan breath test (13C-TBT) is a noninvasive, stable-isotope tracer method in which exhaled 13CO2 is attributable to tryptophan catabolism via the KYN pathway. We included 18 patients with MDD (DSM-IV) and 24 age- and sex-matched controls. 13C-tryptophan (150 mg) was orally administered and the 13CO2/12CO2 ratio in the breath was monitored for 180 min. The cumulative recovery rate during the 180-min test (CRR0–180; %), area under the Δ13CO2-time curve (AUC; %*min), and the maximal Δ13CO2 (Cmax; %) were significantly higher in patients with MDD than in the controls (p = 0.004, p = 0.008, and p = 0.002, respectively). Plasma tryptophan concentrations correlated negatively with Cmax in both the patients and controls (p = 0.020 and p = 0.034, respectively). Our results suggest that the 13C-TBT could be a novel biomarker for detecting a subgroup of MDD with increased tryptophan–KYN metabolism. PMID:26524975
Extreme Scale Computing to Secure the Nation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L; McGraw, J R; Johnson, J R
2009-11-10
Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate approval of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding, together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies, as well as a rapid forensic capability for determining a nuclear weapon's design from post-detonation evidence (nuclear counterterrorism).
NASA Astrophysics Data System (ADS)
Becker, Andreas; Krysta, Monika; Auer, Matthias; Brachet, Nicolas; Ceranna, Lars; Gestermann, Nicolai; Nikkinen, Mika; Zähringer, Matthias
2010-05-01
The so-called National Data Centres (NDCs) of the Provisional Technical Secretariat (PTS) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization are responsible for providing the final judgement on the CTBT relevance of explosion events encountered in the PTS International Monitoring System (IMS). The latter is a 321-station network set up by the PTS (completion level to date: 80%) in order to globally monitor for the occurrence of CTBT-relevant seismo-acoustic and radionuclide signals. NDCs learn about any seismo-acoustic or radionuclide event by active retrieval of, or subscription to, the corresponding event lists and products provided by the International Data Centre (IDC) of the PTS. To prepare for their instrumental role in case of a CTBT-relevant event, the NDCs jointly conduct annual NDC Preparedness Exercises. In 2009, NDC Germany was in charge of leading the exercise and of choosing a seismo-acoustic event from the list of events provided by the PTS (Gestermann et al., EGU2010-13067). The novelty in this procedure was that the infrasound readings and the monitoring coverage of existing (certified) radionuclide stations in the area of consideration were also taken into account during the event selection process (Coyne et al., EGU2010-12660). The event finally chosen and examined took place near the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20:31 UTC (Event-ID 5727516). NDC Austria performed forward atmospheric transport modelling in order to predict the radionuclide (RN) measurements that would have occurred in the radionuclide IMS in the fictitious case of a release of radionuclides at the same location (Wotawa and Schraik, 2010; EGU2010-4907) with a strength typical of a non-contained nuclear explosion. The stations indicated should then be analysed for their actual radionuclide readings in order to confirm the non-nuclear character of the event (negative-testing scenario). Obviously, only stations already set up and certified as capable of full operations could be recruited for this. In doing so, an issue was encountered with regard to the availability of RN data at certified RN stations. Besides supporting the event selection, the PTS also supplied so-called data fusion bulletins that apply a method to collocate the RN and seismo-acoustic source location results (Krysta and Becker, EGU2010-10218). In this paper we demonstrate the impact on source location capabilities of the gaps in network coverage that arise from the aforementioned reduced RN data availability. Network coverage assessments for the set of certified stations and for the reduced set of stations actually sending data are therefore discussed. Furthermore, the capabilities and constraints of the data fusion method in compensating for the RN source location accuracy losses related to reduced RN data availability at certified stations are presented.
Construction of 3-D Earth Models for Station Specific Path Corrections by Dynamic Ray Tracing
2001-10-01
the numerical eikonal solution method of Vidale (1988) being used by the MIT-led consortium. The model construction described in this report relies ... assembled.
Monitoring Research in the Context of CTBT Negotiations and Networks,
1995-08-14
1995) estimates, using infrasound and satellite data, that these sources generate explosion-like signals worldwide at a rate of approximately 1/yr at...coupling and the waveform appearance of atmospheric explosions. In infrasound there is the development of new array designs and of new automatic detection ...sensors. The principal daily use of the hydroacoustic network is for purposes of simple discrimination of those oceanic earthquakes detected by the seismic
Xenon monitoring and the Comprehensive Nuclear-Test-Ban Treaty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Theodore W.
How do you monitor (verify) a CTBT? It is a difficult challenge to monitor the entire world for nuclear tests, regardless of size. Nuclear tests 'normally' occur underground, above ground or underwater. Setting aside very small tests (let's limit our thinking to 1 kiloton or more), nuclear tests shake the ground, emit large amounts of radioactivity, and make loud noises if in the atmosphere (or generate hydroacoustic waves if underwater).
Comprehensive test ban negotiations
NASA Astrophysics Data System (ADS)
Grab, G. Allen; Heckrotte, Warren
1983-10-01
Although it has been a stated policy goal of American and Soviet leaders since 1958 (with the exception of Ronald Reagan), the world today is still without a Comprehensive Test Ban Treaty. Throughout their history, test ban negotiations have been plagued by a number of persistent problems. Chief among these is East-West differences on the verification question, with the United States concerned about the problem of possible Soviet cheating and the USSR concerned about the protection of its national sovereignty. In addition, internal bureaucratic politics have played a major role in preventing the successful conclusion of an agreement. Despite these problems, the superpowers have concluded several significant partial measures: a brief (1958-1961) total moratorium on nuclear weapons tests; the Limited Test Ban Treaty of 1963, banning tests in the air, water and outer space; the Threshold Test Ban Treaty of 1974 (150 KT limit on underground explosions); and the Peaceful Nuclear Explosions Treaty of 1976 (150 KT limit on individual PNEs). Today, the main U.S. objections to a CTBT center in the nuclear weapons laboratories, the Department of Energy, and the Pentagon, which all stress the issues of stockpile reliability and verification. Those who remain committed to a CTBT emphasize the potential political leverage it offers in checking both horizontal and vertical proliferation.
NASA Astrophysics Data System (ADS)
Vivas Veloso, J. A.; Christie, D. R.; Hoffmann, T. L.; Campus, P.; Bell, M.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.; Wu, Sean F.
2002-11-01
The provisional operation and maintenance of IMS infrasound stations after installation and subsequent certification has the objective of preparing the infrasound network for entry into force of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The goal is to maintain and fine-tune the technical capabilities of the network, to repair faulty equipment, and to ensure that stations continue to meet the minimum specifications through evaluation of data quality and station recalibration. Due to the globally dispersed nature of the network, this program constitutes a significant undertaking that requires careful consideration of possible logistic approaches and their financial implications. Currently, 11 of the 60 IMS infrasound stations are transmitting data in the post-installation Testing & Evaluation mode. Another 5 stations are under provisional operation and are maintained in post-certification mode. It is expected that 20% of the infrasound network will be certified by the end of 2002. This presentation will focus on the different phases of post-installation activities of the IMS infrasound program and the logistical challenges to be tackled to ensure cost-efficient management of the network. Specific topics will include Testing & Evaluation and Certification of Infrasound Stations, as well as Configuration Management and Network Sustainment.
Nuclear Weapons: Comprehensive Test Ban Treaty
2007-10-29
which has been done. Critics raised concerns about the implications of these policies for testing and new weapons. At present, Congress addresses ... Comprehensive Test Ban Treaty Most Recent Developments: On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in ... "future, but there are no plans to do so." Critics expressed concern about the implications of these policies for testing and new weapons. A statement
Comprehensive Nuclear-Test-Ban Treaty: Updated ’Safeguards’ and Net Assessments
2009-06-03
measures that this nation can take unilaterally within the treaty to protect its nuclear security. To compensate for "disadvantages and risk" they ... and strategic forces, and could be augmented with implementation measures. While Safeguards may be part of a future CTBT debate, both supporters and ... A second path involves efforts to alter the net assessment through measures intended to mitigate perceived risks of the treaty. This path has been
China’s Foreign Policy Toward North Korea: The Nuclear Issue
2012-12-01
Comprehensive National Power; CTBT, Comprehensive Test Ban Treaty; CVID, Complete, Verifiable, and Irreversible Dismantlement; EAS, East Asia Summit; ETIM ... realized that it had to take some measures to stop North Korea's nuclear testing. Hong-seo Park analyzes China's policy change from a perspective of ... community have failed to find a consensus, and North Korea conducted a nuclear test in 2006. China had shown different responses between the first and
North Korea’s 2009 Nuclear Test: Containment, Monitoring, Implications
2010-04-02
inspections as prima facie evidence of a violation. One generally accepted means of evading detection of nuclear tests, especially low-yield tests ... In an attempt to extend these bans to cover all nuclear tests, negotiations on the CTBT were completed in 1996. The treaty's basic obligation is to ... Verification refers to determining whether a nation is in compliance with its treaty obligations, which in this case means determining whether a suspicious
Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise
NASA Astrophysics Data System (ADS)
Rozhkov, Mikhail; Kitov, Ivan
2015-04-01
Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters, in accordance with the Protocol to the CTBT, may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem to be considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. The importance of resolving the first problem is connected with correct explosion yield estimation. The second case is a well-known scenario for conducting clandestine nuclear tests. While the first case can be approached to some degree with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method we apply to these problems. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (the April 2012 Mw=8.6 Sumatra and March 2011 Mw=9.0 Tohoku earthquakes). The data were recorded by seismic arrays of the International Monitoring System of the CTBTO and by the small-aperture seismic array Mikhnevo (MHVAR) operated by the Institute of Geosphere Dynamics, Russian Academy of Sciences. Our approach demonstrated a good ability to separate seismic sources with very close origin times and locations (hundreds of meters), and/or close arrival times (fractions of a second), and to recover their waveforms from the mixture. Perspectives and limitations of the method are discussed.
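As a sanity check on the idea, the following minimal sketch separates two synthetic overlapping transients mixed onto three channels using scikit-learn's FastICA. The wavelets and the mixing matrix are invented stand-ins for real array data.

# Blind source separation of two synthetic transients with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 20, 4000)
s1 = np.exp(-0.5 * (t - 8) ** 2) * np.sin(2 * np.pi * 5 * t)    # short "explosion" wavelet
s2 = np.exp(-0.1 * (t - 9) ** 2) * np.sin(2 * np.pi * 1.5 * t)  # longer "earthquake" coda
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.6], [0.7, 1.0], [0.9, 0.4]])              # 3-channel mixing matrix
X = S @ A.T + 0.01 * rng.standard_normal((t.size, 3))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)   # columns approximate s1, s2 up to scale and order
print(recovered.shape)             # (4000, 2)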
Comprehensive Nuclear-Test-Ban Treaty: Background and Current Developments
2008-04-30
resumed testing, and has no plans to test. It has reduced the time needed to conduct a nuclear test. Critics raised concerns about the implications of ... lieu of the current treaty. On October 24, Senator Jon Kyl delivered a speech critical of the CTBT and of Section 3122 in H.R. 1585, the FY2008 ... 2007. Critics expressed concern about the implications of these policies for testing and new weapons. A statement by Physicians for Social
The Crustal Structure And CTBT Monitoring Of India: New Insights From Deep Seismic Profiling
2000-09-01
transitional type crust as a major source of Deccan trap flows. The Narmada-Son lineament is the most conspicuous linear geological feature in the... Deccan proto-continents) buckling of the upper and middle crustal layers of the proto-continents took place, resulting in the western block’s lower...crustal column subducting below the Deccan proto-continents. Thus, the collision process was of such severe magnitude that the impact was seen in both
Topin, Sylvain; Greau, Claire; Deliere, Ludovic; Hovesepian, Alexandre; Taffary, Thomas; Le Petit, Gilbert; Douysset, Guilhem; Moulin, Christophe
2015-11-01
The SPALAX (Système de Prélèvement Automatique en Ligne avec l'Analyse du Xénon) is one of the systems used in the International Monitoring System of the Comprehensive Nuclear Test Ban Treaty (CTBT) to detect radioactive xenon releases following a nuclear explosion. Approximately 10 years after the industrialization of the first system, the CEA has developed the SPALAX New Generation, SPALAX-NG, with the aim of increasing the global sensitivity and reducing the overall size of the system. A major breakthrough has been obtained by improving the sampling stage and the purification/concentration stage. The sampling stage evolution consists of increasing the sampling capacity and improving the gas treatment efficiency across new permeation membranes, leading to an increase in the xenon production capacity by a factor of 2-3. The purification/concentration stage evolution consists of using a new adsorbent, Ag@ZSM-5 (or Ag-PZ2-25), with a much larger xenon retention capacity than activated charcoal, enabling a significant reduction in the overall size of this stage. The energy consumption of the system is similar to that of the current SPALAX system. The SPALAX-NG process is able to produce samples of almost 7 cm³ of xenon every 12 h, making it the most productive xenon process among the IMS systems. Copyright © 2015 Elsevier Ltd. All rights reserved.
Short-range quantitative precipitation forecasting using Deep Learning approaches
NASA Astrophysics Data System (ADS)
Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.
2017-12-01
Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve precipitation forecasting skill using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of the Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of the merged LSTM with PERSIANN are compared to the results of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and to the Final Analysis of the Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during three storm events, each located over one of the study regions. The results indicate that the merged LSTM forecasts outperform the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient, especially in convective systems. The proposed method shows superior capabilities in short-term forecasting over the compared methods.
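For orientation, a schematic LSTM forecaster of this general kind (not the authors' architecture) can be written in a few lines; a real nowcasting model would operate on full image sequences rather than the flattened feature vectors assumed here, and the layer sizes are arbitrary.

# Schematic next-step forecaster for a brightness-temperature feature sequence.
import torch
import torch.nn as nn

class SequenceForecaster(nn.Module):
    def __init__(self, n_features=64, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):             # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # predict the next time step

model = SequenceForecaster()
seq = torch.randn(8, 12, 64)          # 8 samples, 12 past frames, 64 features each
pred = model(seq)                     # (8, 64)
loss = nn.functional.mse_loss(pred, torch.randn(8, 64))
loss.backward()                       # standard training step would follow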
DTRA's Nuclear Explosion Monitoring Research and Development Program
NASA Astrophysics Data System (ADS)
Nichols, J.; Dainty, A.; Phillips, J.
2001-05-01
The Defense Threat Reduction Agency (DTRA) has a Program in Basic Research and Development for Nuclear Explosion Technology within the Nuclear Treaties Branch of the Arms Control Technology Division. While the funding justification is Arms Control Treaties (i.e., the Comprehensive Nuclear-Test-Ban Treaty, CTBT), the results are made available for any user. Funding for the Program has averaged around $10 million per year recently. By Congressional mandate, the program has disbursed money through competitive, peer-reviewed Program Research and Development Announcements (PRDAs); there is usually (but not always) a PRDA each year. Typical awards have been for about three years at ~$100,000 per year; currently there are over 60 contracts in place. In addition to the "typical" awards, there was an initiative in 2000 to fund seismic location calibration of the International Monitoring System (IMS) of the CTBT; there are three three-year contracts of ~$1,000,000 per year to perform such calibration for Eurasia, and for North Africa and the Middle East. Scientifically, four technological areas have been funded, corresponding to the four technologies in the IMS: seismic, infrasound, hydroacoustic, and radionuclide, with the lion's share of the funding going to the seismic area. The scientific focus of the Program for all four technologies is detection of signals, locating their origin, and trying to determine if they are unambiguously natural in origin ("event screening"). Location has been a particular and continuing focus within the Program.
Spalax™ new generation: A sensitive and selective noble gas system for nuclear explosion monitoring.
Le Petit, G; Cagniant, A; Gross, P; Douysset, G; Topin, S; Fontaine, J P; Taffary, T; Moulin, C
2015-09-01
In the context of the verification regime of the Comprehensive nuclear Test ban Treaty (CTBT), CEA is developing a new generation (NG) of the SPALAX™ system for atmospheric radioxenon monitoring. These systems are able to extract more than 6 cm³ of pure xenon from air samples every 12 h and to measure the four relevant xenon radioactive isotopes using a high-resolution detection system operating in electron-photon coincidence mode. This paper presents the performance of the SPALAX™ NG prototype in operation at the Bruyères-le-Châtel CEA centre, integrating the most recent CEA developments. It especially focuses on an innovative detection system made up of a gas cell equipped with two face-to-face silicon detectors associated with one or two germanium detectors. Minimum Detectable activity Concentrations (MDCs) of environmental samples were calculated to be approximately 0.1 mBq/m³ for the isotopes 131mXe, 133mXe and 133Xe, and 0.4 mBq/m³ for 135Xe (single germanium configuration). The detection system might be used to simultaneously measure particulate and noble gas samples from the CTBT International Monitoring System (IMS). That possibility could lead to new capacities for particulate measurements by allowing electron-photon coincidence detection of certain fission products. Copyright © 2015 Elsevier Ltd. All rights reserved.
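An MDC of the kind quoted above is conventionally estimated with a Currie-style detection limit. The sketch below shows the arithmetic; every input value (background counts, efficiency, branching ratio, live time, sampled air volume) is invented for the example and does not reproduce the SPALAX-NG figures.

# Illustrative Currie-style minimum detectable activity concentration.
import math

def mdc_bq_per_m3(background_counts, efficiency, branching, live_time_s, air_volume_m3):
    ld = 2.71 + 4.65 * math.sqrt(background_counts)        # Currie detection limit (counts)
    activity = ld / (efficiency * branching * live_time_s)  # Bq present in the sample
    return activity / air_volume_m3                         # Bq per m3 of sampled air

# Hypothetical inputs: 50 background counts, 30% efficiency, 0.8 branching,
# 12 h counting, 40 m3 of air sampled.
print(f"{mdc_bq_per_m3(50, 0.3, 0.8, 12 * 3600, 40) * 1e3:.2f} mBq/m3")  # ~0.09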
Completing and sustaining IMS network for the CTBT Verification Regime
NASA Astrophysics Data System (ADS)
Meral Ozel, N.
2015-12-01
The CTBT International Monitoring System is to comprise 337 facilities located all over the world for the purpose of detecting and locating nuclear test explosions. Major challenges remain, namely the completion of the network, where most of the remaining stations have environmental, logistical and/or political issues to surmount (89% of the stations have already been built), and the sustainment of a reliable, state-of-the-art network covering four technologies: seismic, infrasound, hydroacoustic and radionuclide. To have a credible and trustworthy verification system ready for entry into force of the Treaty, the CTBTO is protecting and enhancing its investment in its global network of stations and is providing effective data to the International Data Centre (IDC) and Member States. Regarding the protection of the CTBTO's investment and enhanced sustainment of IMS station operations, the IMS Division is enhancing the capabilities of the monitoring system by applying advances in instrumentation and introducing new software applications that are fit for purpose. Some examples are the development of noble gas laboratory systems to process and analyse subsoil samples, development of a mobile noble gas system for on-site inspection purposes, optimization of beta-gamma detectors for xenon detection, assessing and improving the efficiency of wind noise reduction systems for infrasound stations, development and testing of infrasound stations with a self-calibrating capability, and research into the use of modular designs for the hydroacoustic network.
Prevalence and Clinical Relevance of Exon 2 Deletion of COMMD1 in Bedlington Terriers in Korea.
Kim, Y G; Kim, S Y; Kim, J H; Lee, K K; Yun, Y M
2016-11-01
Deletion of exon 2 of copper metabolism domain containing 1 (COMMD1) results in copper toxicosis in Bedlington terriers (CT-BT). This study was conducted to identify the prevalence and clinical relevance of the COMMD1 mutation in Bedlington terriers in Korea. A total of 105 purebred Bedlington terriers (50 males, 55 females) from the kennels and pet dog clubs in Korea were examined during the period 2008-2013. A multiplex PCR was carried out to detect exon 2 deletion of COMMD1. Clinical analysis was performed on each genetic group, and clinical status of the dogs was followed up to estimate survival probability. Of the 105 samples, 52 (49%) were wild-type homozygote, 47 (45%) were heterozygote, and 6 (6%) were mutant-type homozygote. Plasma alanine aminotransferase (ALT) activity was increased in the mutant-type homozygous group >2 years of age (P < .0001). The survival probability of 6 mutant-type homozygotes surviving 2.5 years was 0.67, and 4 years was 0.5. Results show the prevalence and clinical relevance of exon 2 deletion of COMMD1 and could help establish a structured selective breeding program to prevent CT-BT in Korea. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, R.L.; Gross, D.; Pearson, D.C.
In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international, worldwide Comprehensive Test Ban Treaty (CTBT, no nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments to produce results with mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out-of-sequence, simultaneous detonation in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship between local and regional seismic amplitude and explosive yield for overburden cast, coal bulking and single-fired explosions (a toy fit of this kind is sketched below); and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.
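The amplitude-yield relationship in item (5) is typically expressed as a log-linear fit of magnitude against yield. The sample points and the fitted constants below are invented for illustration and carry no physical authority.

# Hedged log-linear amplitude-yield fit: mb = a + b * log10(yield).
import numpy as np

yields_tons = np.array([10, 30, 100, 300, 1000])     # hypothetical shot sizes
mb_obs = np.array([2.1, 2.5, 2.9, 3.3, 3.7])         # hypothetical magnitudes
b, a = np.polyfit(np.log10(yields_tons), mb_obs, 1)  # slope, intercept
print(f"mb = {a:.2f} + {b:.2f} * log10(Y[tons])")
print(f"predicted mb for a 500-ton cast blast: {a + b * np.log10(500):.2f}")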
NASA Astrophysics Data System (ADS)
Koper, K. D.; Pechmann, J. C.; Burlacu, R.; Pankow, K. L.; Stein, J. R.; Hale, J. M.; Roberson, P.; McCarter, M. K.
2016-12-01
We investigate the feasibility of using the difference between local (ML) and coda duration (MC) magnitude as a means of discriminating man-made seismic events from naturally occurring tectonic earthquakes in and around Utah. Using a dataset of nearly 7,000 well-located earthquakes in the Utah region, we find that ML-MC is on average 0.44 magnitude units smaller for mining-induced seismicity (MIS) than for tectonic seismicity (TS). MIS occurs within near-surface low-velocity layers that act as a waveguide and preferentially increase coda duration relative to peak amplitude, while the vast majority of TS occurs beneath the near-surface waveguide. A second dataset of more than 3,700 probable explosions in the Utah region also has significantly lower ML-MC values than TS, likely for the same reason as the MIS. These observations suggest that ML-MC, or related measures of peak amplitude versus signal duration, may be useful for discriminating small explosions from earthquakes at local-to-regional distances. ML and MC can be determined for small events with relatively few observations; hence an ML-MC discriminant can be effective in cases where moment tensor inversion is not possible because of low data quality or poorly known Green's functions. Furthermore, an ML-MC discriminant does not rely on the existence of the fast-attenuating Rg phase at regional distances. ML-MC may provide a local-to-regional distance extension of the mb-MS discriminant that has traditionally been effective at identifying large nuclear explosions with teleseismic data. This topic is of growing interest in forensic seismology, in part because the Comprehensive Nuclear Test Ban Treaty (CTBT) is a zero-tolerance treaty that prohibits all nuclear explosions, no matter how small. If the CTBT were to come into force, source discrimination at local distances would be required to verify compliance.
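A toy numerical version of the ML-MC discriminant follows. The 0.44-unit offset comes from the abstract, but the scatter, class proportions and decision threshold are assumptions invented for the example.

# Toy ML - MC threshold discriminant on synthetic event populations.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
is_mining = rng.random(n) < 0.3
# ML - MC: ~0.44 units lower for mining-induced events (per the abstract),
# with an assumed 0.15-unit measurement scatter.
dm = np.where(is_mining, -0.44, 0.0) + 0.15 * rng.standard_normal(n)

threshold = -0.22                      # assumed midpoint decision rule
flagged = dm < threshold
accuracy = np.mean(flagged == is_mining)
print(f"simple threshold classifies {accuracy:.0%} of events correctly")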
Implications from Meteoric and Volcanic Infrasound Measured in the Netherlands
NASA Astrophysics Data System (ADS)
Evers, L.
2003-12-01
Infrasound observations started in the Netherlands in 1986. Since then, several array configurations and instruments have been developed, tested and made operational. Currently, three infrasound arrays are continuously measuring infrasound with in-house developed microbarometers. The array apertures vary from 30 to 1500 meters and the number of instruments from 6 to 16 microbarometers. The inter-array distance ranges from 50 up to 150 km. This dense network of infrasound arrays is used to distinguish between earthquakes and sources in the atmosphere. Sonic booms, for example, can be experienced in the same manner as small (gas-induced) earthquakes. Furthermore, research related to the Comprehensive Nuclear-Test-Ban Treaty (CTBT) is carried out. Meteors are one of the few natural impulsive sources generating energy in the kiloton-TNT-equivalent range. Therefore, the study of meteors is essential to the CTBT, where infrasound is applied as a monitoring technique. Studies of meteors in the Netherlands have shown the capability of infrasound to trace a meteor through the stratosphere. The propagation of infrasound is to first order dependent on the wind and temperature structure of the atmosphere. The meteor's path could be reconstructed by using ECMWF atmospheric models for wind and temperature. The results were compared to visual observations, confirming the location, direction and reported origin time. The accuracy of the localization mainly depends on the applied atmospheric model and array resolution. Successfully applying infrasound depends on an array configuration that is based on the frequency-dependent spatial coherence of the signals of interest. The array aperture and inter-element distance play a decisive role in detecting signals with low signal-to-noise ratios. This is shown by results from studies on volcanic infrasound from Mt. Etna (Italy) detected in the Netherlands. Sub-array processing on the 16-element array revealed an increased detectability of infrasound for small-aperture (800 m) arrays compared to large-aperture (1500 m) arrays.
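The back-azimuth estimates that underpin such array studies are commonly obtained by delay-and-sum beamforming. The sketch below grid-searches the azimuth that maximizes beam power for a synthetic plane wave; the array geometry, sound speed and waveform are all assumptions.

# Simplified delay-and-sum beamforming for a small 4-element array.
import numpy as np

c = 0.34                                   # km/s, nominal sound speed (assumed)
dt = 0.05
coords = np.array([[0, 0], [0.4, 0], [0, 0.4], [0.4, 0.4]])   # sensor positions (km)
t = np.arange(0, 60, dt)

def plane_wave_delays(az_deg, coords):
    # Back-azimuth points toward the source; sensors nearer the source record earlier.
    s = -np.array([np.sin(np.radians(az_deg)), np.cos(np.radians(az_deg))]) / c
    return coords @ s                       # arrival-time offsets in seconds

def wavelet(tt):
    return np.exp(-((tt - 30) / 1.5) ** 2) * np.sin(2 * np.pi * 1.0 * tt)

true_delays = plane_wave_delays(60.0, coords)   # signal arriving from 60 degrees
data = np.array([wavelet(t - d) for d in true_delays])

def beam_power(az_deg):
    delays = plane_wave_delays(az_deg, coords)
    shifted = [np.interp(t, t - d, ch) for d, ch in zip(delays, data)]
    return np.sum(np.mean(shifted, axis=0) ** 2)

azimuths = np.arange(0, 360, 2)
print("best back-azimuth:", azimuths[np.argmax([beam_power(a) for a in azimuths])])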
NASA Astrophysics Data System (ADS)
Gaebler, P. J.; Ceranna, L.
2016-12-01
All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone, anywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit for a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body-wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the seismic stations of the auxiliary IMS network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
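Conceptually, threshold monitoring turns each station's noise level into a minimum detectable magnitude for a given source point, and the network threshold at that point is set by the k-th best station. The sketch below shows the bookkeeping; the noise amplitudes, the required signal-to-noise ratio, and the toy distance-correction term are assumptions, not the reflectivity-based corrections used in the study.

# Conceptual network detection threshold at one source grid point.
import numpy as np

def station_mb_threshold(noise_amp_nm, period_s, dist_deg, snr=3.0):
    # mb ~ log10(A/T) + Q(delta); the Q term below is a crude placeholder.
    q = 3.0 + 1.2 * np.log10(np.maximum(dist_deg, 1.0))
    return np.log10(snr * noise_amp_nm / period_s) + q

rng = np.random.default_rng(3)
n_sta = 50
noise = rng.lognormal(mean=-2.3, sigma=0.5, size=n_sta)   # ~0.1 nm synthetic noise
dists = rng.uniform(20, 95, size=n_sta)                   # degrees to the source point

thresholds = station_mb_threshold(noise, 1.0, dists)
k = 3                                                     # require 3 detecting stations
network_threshold = np.sort(thresholds)[k - 1]
print(f"network mb threshold at this grid point: {network_threshold:.2f}")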
Trends in Nuclear Explosion Monitoring Research & Development - A Physics Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maceira, Monica; Blom, Philip Stephen; MacCarthy, Jonathan K.
This document, entitled "Trends in Nuclear Explosion Monitoring Research and Development - A Physics Perspective", reviews the accessible literature as it relates to nuclear explosion monitoring and the Comprehensive Nuclear-Test-Ban Treaty (CTBT, 1996) for four research areas: source physics (understanding signal generation), signal propagation (accounting for changes through physical media), sensors (recording the signals), and signal analysis (processing the signal). Over 40 trends are addressed, such as moving from 1D to 3D earth models, from pick-based seismic event processing to full waveform processing, and from separate treatment of mechanical waves in different media to combined analyses. Highlighted in the document for each trend are the value and benefit to the monitoring mission, key papers that advanced the science, and promising research and development for the future.
2007-07-02
Final Report, 26-Sep-01 to 26-Jun-07: OBTAINING UNIQUE, COMPREHENSIVE DEEP SEISMIC ... seismic records from 12 major Deep Seismic Sounding (DSS) projects acquired in the 1970s-1980s in the former Soviet Union. The data include 3-component ... records from 22 Peaceful Nuclear Explosions (PNEs) and over 500 chemical explosions recorded by a grid of linear, reversed seismic profiles covering a
Seismological Investigations of the National Data Centre Preparedness Exercise 2015 (NPE2015)
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Hartmann, Gernot; Ross, Jens-Ole
2017-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data are collected, processed and technically analyzed at the International Data Centre (IDC) of the CTBT Organization, National Data Centres (NDCs) provide interpretation and advice to their governments concerning suspicious detections occurring in IMS data. The National Data Centre Preparedness Exercises (NPEs) are regularly performed, dealing with fictitious treaty violations, to practice the combined analysis of CTBT verification technologies and national technical means. These exercises help to evaluate the effectiveness of analysis procedures applied at NDCs and the quality, completeness and usefulness of IDC products. NPE2015 is a combined radionuclide-waveform scenario. Fictitious particulate radionuclide and radioxenon measurements at stations of the IMS were reported to the international community. The types of isotopes and the concentrations could arise from an underground nuclear explosion (UNE). The task of the exercise is to identify the scenario behind the provided data. The source region and time window of a possible treaty-violating activity were determined from atmospheric transport modelling (ATM) in backtracking mode, with the fictitious measurements as input. A time slot in October and a region around the mining area of Lubin could be identified as the possible source of the fictitious measurements. The seismicity of the determined source region was investigated in detail to identify events within the relevant time interval that cannot be classified as natural or induced. A comparison of spectral characteristics and a cluster analysis were applied to search for a non-characteristic event among a number of known induced events in the area. The results reveal that none of the candidate events had an explosion-like characteristic. All candidate events belong to event clusters of at least seven events with comparable signatures. The possibility of a treaty violation would therefore be very low in a real scenario. If the nature of a suspicious event cannot be clarified with data of the IMS or national technical means, an on-site inspection (OSI) can be requested by the member states. Taking into account the results of the seismological investigations, it could be decided that an OSI of the possible source region is not necessary to exclude the possibility of a clandestine underground nuclear explosion.
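A schematic version of the cluster analysis described above: represent each candidate event by its normalized amplitude spectrum and look for an event that fails to join any cluster of known induced events. The waveforms, dominant frequencies and clustering threshold below are synthetic placeholders.

# Spectral clustering of candidate events; the odd event should isolate itself.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(4)
t = np.arange(0, 40, 0.02)

def event(f0):   # toy waveform with dominant frequency f0 and exponential decay
    return np.sin(2 * np.pi * f0 * t) * np.exp(-t / 8) + 0.05 * rng.standard_normal(t.size)

waveforms = [event(1.5 + 0.05 * rng.standard_normal()) for _ in range(7)]  # induced cluster
waveforms.append(event(4.0))                                               # odd one out

spectra = np.array([np.abs(np.fft.rfft(w))[:200] for w in waveforms])
spectra /= np.linalg.norm(spectra, axis=1, keepdims=True)

links = linkage(spectra, method="average", metric="cosine")
labels = fcluster(links, t=0.2, criterion="distance")
print("cluster labels:", labels)   # the last event should form its own cluster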
Multi-Detection Events, Probability Density Functions, and Reduced Location Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eslinger, Paul W.; Schrom, Brian T.
2016-03-01
Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach, rather than a simple dilution field-of-regard approach, to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
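A minimal sketch of the Bayesian idea: a prior over candidate source cells is multiplied by the likelihood of each detection or non-detection, where the per-cell detection probability would in practice come from ATM backtrack output. The station-to-cell sensitivities below are random placeholders.

# Bayesian grid update combining detections and non-detections.
import numpy as np

rng = np.random.default_rng(5)
n_cells = 1000                          # candidate source-region grid cells
prior = np.full(n_cells, 1.0 / n_cells)

# p_detect[s, c]: probability that station s would see the release if it came
# from cell c (here random; in practice from atmospheric transport modeling).
p_detect = rng.uniform(0.0, 0.9, size=(4, n_cells))
observed = [True, True, False, False]   # two detections, two non-detections

posterior = prior.copy()
for s, seen in enumerate(observed):
    posterior *= p_detect[s] if seen else (1.0 - p_detect[s])
posterior /= posterior.sum()

print("most probable cell:", posterior.argmax(), "posterior:", posterior.max())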
Use of IMS data and its potential for research through global noble gases concentration maps
NASA Astrophysics Data System (ADS)
Terzi, Lucrezia; Kalinowski, Martin; Gueibe, Christophe; Camps, Johan; Gheddou, Abdelhakim; Kusmierczyk-Michulec, Jolanta; Schoeppner, Michael
2017-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) established, for verification purposes, a global monitoring system for atmospheric radioisotopes and noble gas radioactivity. Daily activity concentrations have been collected worldwide for over 15 years, providing unique data sets with long-term time series that can be used for analysis of atmospheric circulation dynamics. In this study, we emphasize the value of worldwide noble gas data by reconstructing global xenon concentration maps and comparing these observations with ATM simulations. By creating a residual plot, we can improve our understanding of the source estimation level for each region.
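A bare-bones numeric sketch of the observation-minus-simulation residual idea follows; the station labels and all concentration values are invented placeholders.

# Observation-minus-simulation residuals for a handful of stations.
import numpy as np

stations = ["station A", "station B", "station C", "station D"]  # hypothetical
observed = np.array([0.8, 1.5, 0.3, 2.1])     # mBq/m3, invented measurements
simulated = np.array([0.6, 1.9, 0.4, 1.2])    # mBq/m3, invented ATM values

residuals = observed - simulated
for name, r in zip(stations, residuals):
    print(f"{name}: residual {r:+.2f} mBq/m3")
print(f"rms residual: {np.sqrt(np.mean(residuals**2)):.2f} mBq/m3")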
Ongoing research experiments at the former Soviet nuclear test site in eastern Kazakhstan
Leith, William S.; Kluchko, Luke J.; Konovalov, Vladimir; Vouille, Gerard
2002-01-01
Degelen mountain, located in Eastern Kazakhstan near the city of Semipalatinsk, was once the Soviets' most active underground nuclear test site. Two hundred fifteen nuclear tests were conducted in 181 tunnels driven horizontally into its many ridges--almost twice the number of tests as at any other Soviet underground nuclear test site. It was also the site of the first Soviet underground nuclear test--a 1-kiloton device detonated on October 11, 1961. Until recently, the details of testing at Degelen were kept secret and have been the subject of considerable speculation. However, in 1991, the Semipalatinsk test site became part of the newly independent Republic of Kazakhstan; and in 1995, the Kazakhstani government concluded an agreement with the U.S. Department of Defense to eliminate the nuclear testing infrastructure in Kazakhstan. This agreement, which calls for the "demilitarization of the infrastructure directly associated with the nuclear weapons test tunnels," has been implemented as the "Degelen Mountain Tunnel Closure Program." The U.S. Defense Threat Reduction Agency, in partnership with the Department of Energy, has permitted the use of the tunnel closure project at the former nuclear test site as a foundation on which to support cost-effective, research-and-development-funded experiments. These experiments are principally designed to improve U.S. capabilities to monitor and verify the Comprehensive Test Ban Treaty (CTBT), but have provided a new source of information on the effects of nuclear and chemical explosions on hard, fractured rock environments. These new data extend and confirm the results of recent Russian publications on the rock environment at the site and the mechanical effects of large-scale chemical and nuclear testing. In 1998, a large-scale tunnel closure experiment, Omega-1, was conducted in Tunnel 214 at Degelen mountain. In this experiment, a 100-ton chemical explosive blast was used to test technologies for monitoring the Comprehensive Nuclear Test Ban Treaty, and to calibrate a portion of the CTBT's International Monitoring System. This experiment has also provided important benchmark data on the mechanical behavior of hard, dense, fractured rock, and has demonstrated the feasibility of fielding large-scale calibration explosions, which are specified as a "confidence-building measure" in the CTBT Protocol. Two other large-scale explosion experiments, Omega-2 and Omega-3, are planned for the summers of 1999 and 2000. Like the Tunnel 214 test, the 1999 experiment will include close-in monitoring of near-source effects, as well as contributing to the calibration of key seismic stations for the Comprehensive Test Ban Treaty. The Omega-3 test will examine the effect of multiple blasts on the fractured rock environment.
Challenges in Regional CTBT Monitoring: The Experience So Far From Vienna
NASA Astrophysics Data System (ADS)
Bratt, S. R.
2001-05-01
The verification system being established to monitor the CTBT will include an International Monitoring System (IMS) network of 321 seismic, hydroacoustic, infrasound and radionuclide stations, transmitting digital data to the International Data Centre (IDC) in Vienna, Austria over a Global Communications Infrastructure (GCI). The IDC started in February 2000 to disseminate a wide range of products based on automatic processing and interactive analysis of data from about 90 stations from the four IMS technologies. The number of events in the seismo-acoustic Reviewed Event Bulletins (REB) was 18,218 for the year 2000, with the daily number ranging from 30 to 360. Over 300 users from almost 50 Member States are now receiving an average of 18,000 data and product deliveries per month from the IDC. As the IMS network expands (40-60 new stations are scheduled to start transmitting data this year) and as GCI communications links bring increasing volumes of new data into Vienna (70 new GCI sites are currently in preparation), the monitoring capability of the IMS and IDC has the potential to improve significantly. To realize this potential, the IDC must continue to improve its capacity to exploit regional seismic data from events defined by few stations with large azimuthal gaps. During 2000, 25% of the events in the REB were defined by five or fewer stations. 48% were defined by at least one regional phase, and 24% were defined by at least three. 34% had gaps in azimuthal coverage of more than 180 degrees. The fraction of regional, sparsely detected events will only increase as new, sensitive stations come on-line and the detection threshold drops. This will be offset, to some extent, because stations within the denser network that detect near-threshold events will be at closer distances, on average. Thus, to address the challenges of regional monitoring, the IDC must integrate "tuned" station and network processing parameters for new stations; enhanced and/or new methods for estimating location, depth and uncertainty bounds; and validated, regionally-calibrated travel times, event characterization parameters and screening criteria. A new IDC program to fund research to calibrate regional seismic travel paths seeks to address, in cooperation with other national efforts, one item on this list. More effective use of the full waveform data and cross-technology synergies must be explored. All of this work must be integrated into modular software systems that can be maintained and improved over time. To motivate these regional monitoring challenges and possible improvements, the experience from the IDC will be presented via a series of illustrative sample events. Challenges in the technical and policy arenas must be addressed as well. IMS data must first be available at the IDC before they can be analyzed. The encouraging experience to date is that the availability of data arriving via the GCI is significantly higher (~95%) than the availability (~70%) from the same stations prior to GCI installation, when they were transmitting data via other routes. Within the IDC, trade-offs must be considered between the desired levels of product quality and timeliness, and the investment in personnel and system development to support the levels sought. Another high-priority objective is to develop a policy for providing data and products to scientific and disaster alert organizations.
It is clear that broader exploitation of these rich and unique assets could be of great mutual benefit and is, perhaps, a necessity for the CTBT verification system to achieve its potential.
Isotopic signature of atmospheric xenon released from light water reactors.
Kalinowski, Martin B; Pistner, Christoph
2006-01-01
A global monitoring system for atmospheric xenon radioactivity is being established as part of the International Monitoring System to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The isotopic activity ratios of (135)Xe, (133m)Xe, (133)Xe and (131m)Xe are of interest for distinguishing nuclear explosion sources from civilian releases. Simulations of light water reactor (LWR) fuel burn-up through three operational reactor power cycles are conducted to explore the possible xenon isotopic signature of nuclear reactor releases under different operational conditions. The study examines how ratio changes relate to various parameters, including the neutron flux, uranium enrichment and fuel burn-up. Further, the impact of diffusion and mixing on the isotopic activity ratio variability is explored. The simulations are validated against reported reactor emissions. In addition, activity ratios are calculated for xenon isotopes released from nuclear explosions, and these are compared to the reactor ratios in order to determine whether the discrimination of explosion releases from reactor effluents is possible based on isotopic activity ratios.
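Because each of the four isotopes decays at a different rate, any pair of activity ratios traces a characteristic curve in time. A minimal sketch of that evolution, assuming pure decay of some initial inventory; ingrowth from iodine precursors is neglected, and the initial activities below are illustrative, not reactor or explosion source terms.

```python
import numpy as np

# Approximate half-lives in days (rounded from standard decay data).
T_HALF = {"Xe135": 9.14 / 24.0, "Xe133m": 2.20, "Xe133": 5.25, "Xe131m": 11.9}
LAM = {k: np.log(2.0) / v for k, v in T_HALF.items()}

def ratio_trajectories(a0, t_days):
    """Decay initial activities a0 (dict, any consistent unit) and return
    the two discrimination ratios Xe-133m/Xe-131m and Xe-135/Xe-133.
    Ingrowth from iodine precursors is neglected -- a simplification."""
    t = np.asarray(t_days, dtype=float)
    a = {k: a0[k] * np.exp(-LAM[k] * t) for k in a0}
    return a["Xe133m"] / a["Xe131m"], a["Xe135"] / a["Xe133"]

# Illustrative initial activities only (not a reactor or explosion term):
a0 = {"Xe135": 100.0, "Xe133m": 10.0, "Xe133": 100.0, "Xe131m": 1.0}
r_meta, r_135_133 = ratio_trajectories(a0, np.linspace(0.0, 10.0, 6))
print(r_meta)
print(r_135_133)
```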
NASA Astrophysics Data System (ADS)
Ross, J. Ole; Ceranna, Lars
2016-04-01
The radionuclide component of the International Monitoring System (IMS) to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) is in place to detect tiny traces of fission products from nuclear explosions in the atmosphere. The challenge in interpreting IMS radionuclide data is to discriminate radionuclide sources of CTBT relevance from emissions of nuclear facilities. Remarkable activity concentrations of Ba/La-140 occurred at the IMS radionuclide stations RN 37 (Okinawa) and RN 58 (Ussurysk) in mid-May 2010. In the same period, an elevated Xe-133 level was also measured at RN 38 (Takasaki). Additional regional measurements of radioxenon were reported in the press and further analyzed in various publications. The radionuclide analysis gives evidence for the presence of a nuclear fission source between 10 and 12 May 2010. Backward Atmospheric Transport Modelling (ATM) with HYSPLIT, driven by 0.2° ECMWF meteorological data for the IMS samples, indicates that, assuming a single source, a wide range of source regions is possible, including the Korean Peninsula, the Sea of Japan (East Sea), and parts of China and Russia. Further confinement of the possible source location can be provided by atmospheric backtracking for the assumed sampling periods of the reported regional xenon measurements. New studies indicate a very weak seismic event at the DPRK test site early on 12 May 2010. Forward ATM for a pulse release caused by this event shows fairly good agreement with the observed radionuclide signature. Nevertheless, the underlying nuclear fission scenario remains quite unclear and speculative, even assuming a connection between the waveform and the radionuclide event.
NASA Astrophysics Data System (ADS)
Baker, Ben; Stachnik, Joshua; Rozhkov, Mikhail
2017-04-01
The International Data Centre is required to conduct expert technical analysis and special studies to improve event parameters and assist States Parties in identifying the source of specific events, according to the Protocol to the Comprehensive Nuclear Test Ban Treaty. Determination of a seismic event's source mechanism and depth is closely related to these tasks. It is typically done through a linearized inversion of the waveforms for a complete set or subset of source parameters, or a similarly defined grid search through precomputed Green's functions created for particular source models. In this presentation we demonstrate preliminary results obtained with the latter approach from an improved software design. In this development we aimed to support the different modes of the CTBT monitoring regime: to cover a wide range of source-receiver distances (regional to teleseismic), resolve shallow source depths, provide full moment tensor solutions based on body and surface wave recordings, be fast enough for both on-demand studies and automatic processing, properly incorporate observed waveforms and a priori uncertainties, and accurately estimate posterior uncertainties. Posterior distributions of moment tensor parameters show narrow peaks where a significant number of reliable surface wave observations are available. For earthquake examples, fault orientation (strike, dip, and rake) posterior distributions also provide results consistent with published catalogues. Inclusion of observations on horizontal components will provide further constraints. In addition, the calculation of teleseismic P-wave Green's functions is improved through prior analysis to determine an appropriate attenuation parameter for each source-receiver path. The implemented HDF5-based Green's function pre-packaging allows much greater flexibility in utilizing different software packages and methods for computation. Further additions will allow rapid use of Instaseis/AXISEM full-waveform synthetics in a pre-computed GF archive. Along with traditional post-processing analysis of waveform misfits through several objective functions and variance reduction, we follow a probabilistic approach to assess the robustness of the moment tensor solution. In the course of this project, full moment tensor and depth estimates are determined for DPRK events and shallow earthquakes using a new implementation of teleseismic P-wave waveform fitting. A full grid search over the entire moment tensor space is used to appropriately sample all possible solutions. A recent method by Tape & Tape (2012) to discretize the complete moment tensor space from a geometric perspective is used. Probabilistic uncertainty estimates on the moment tensor parameters provide robustness to the solution.
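The grid search with precomputed Green's functions described above is, at its core, a linear forward model plus a misfit evaluated over candidate mechanisms. A minimal sketch under a Gaussian likelihood and flat prior; the array shapes, names, and synthetic example are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def mt_grid_search(d_obs, G, mts, sigma):
    """Grid search over candidate moment tensors.
    d_obs : (nsta, nt) observed waveforms
    G     : (nsta, 6, nt) precomputed Green's functions per station
    mts   : (nmod, 6) candidate tensors (Mxx, Myy, Mzz, Mxy, Mxz, Myz)
    sigma : data standard deviation in the Gaussian likelihood
    Returns normalized posterior probabilities over the grid (flat prior)."""
    syn = np.einsum("mk,skt->mst", mts, G)        # synthetics per model
    misfit = np.sum((syn - d_obs[None]) ** 2, axis=(1, 2))
    logp = -0.5 * misfit / sigma**2
    p = np.exp(logp - logp.max())                 # guard against underflow
    return p / p.sum()

# Synthetic check: the appended true mechanism should win the search.
rng = np.random.default_rng(0)
G = rng.normal(size=(3, 6, 100))
mt_true = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])   # isotropic source
d = np.einsum("k,skt->st", mt_true, G)
candidates = np.vstack([rng.normal(size=(50, 6)), mt_true])
print(mt_grid_search(d, G, candidates, sigma=1.0).argmax())  # -> 50
```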
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgardt, D.
1998-03-31
The International Monitoring System (IMS) for the Comprehensive Test Ban Treaty (CTBT) faces the serious challenge of accurately and reliably identifying seismic events in any region of the world. Extensive research has been performed in recent years on developing discrimination techniques which appear to classify seismic events into broad categories of source types, such as nuclear explosion, earthquake, and mine blast. This report examines in detail the effectiveness of regional discrimination procedures, the application of waveform discriminants to Special Event identification, and the issue of discriminant transportability.
NASA Astrophysics Data System (ADS)
Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry
2016-04-01
Possibilities for associating information from the four pillars of the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have long been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Based on the concept of overlaying waveform events on the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate the various time scales, a dynamic product is envisioned that evolves while the results of the different technologies are being processed and compiled. The product would be available through the Secure Web Portal (SWP). In this presentation we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.
Global Monitoring of the CTBT: Progress, Capabilities and Plans (Invited)
NASA Astrophysics Data System (ADS)
Zerbo, L.
2013-12-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), established in 1996, is tasked with building up the verification regime of the CTBT. The regime includes a global system for monitoring the earth, the oceans and the atmosphere for nuclear tests, and an on-site inspection (OSI) capability. More than 80% of the 337 facilities of the International Monitoring System (IMS) have been installed and are sending data to the International Data Centre (IDC) in Vienna, Austria for processing. These IMS data, along with IDC processed and reviewed products, are available to all States that have signed the Treaty. Concurrent with the build-up of the global monitoring networks, near-field geophysical methods are being developed and tested for OSIs. The monitoring system is currently operating in a provisional mode, as the Treaty has not yet entered into force. Progress in installing and operating the IMS and the IDC and in building up an OSI capability will be described. The capabilities of the monitoring networks have progressively improved as stations are added to the IMS and IDC processing techniques are refined. Detection thresholds for seismic, hydroacoustic, infrasound and radionuclide events have been measured and in general are equal to or lower than the predictions used during the Treaty negotiations. The measurements have led to improved models and tools that allow more accurate predictions of future capabilities and network performance under any configuration. Unplanned tests of the monitoring network occurred when the DPRK announced nuclear tests in 2006, 2009, and 2013. All three tests were well above the detection threshold and easily detected and located by the seismic monitoring network. In addition, noble gas consistent with the nuclear tests of 2006 and 2013 (according to atmospheric transport models) was detected by stations in the network. On-site inspections of these tests were not conducted, as the Treaty has not entered into force. In order to achieve a credible and trustworthy verification system, increased focus is being put on the development of OSI operational capabilities while operating and sustaining the existing monitoring system, increasing data availability and quality, and completing the remaining facilities of the IMS. Furthermore, as mandated by the Treaty, the CTBTO also seeks to continuously improve its technologies and methods through interaction with the scientific community. Workshops and scientific conferences such as the CTBT Science and Technology Conference series provide venues for exchanging ideas, and mechanisms have been developed for sharing IMS data with researchers who are developing and testing new and innovative methods pertinent to the verification regime. While progress is steady on building up the verification regime, there is also progress toward entry into force of the Treaty, which requires the signatures and ratifications of the DPRK, India and Pakistan, as well as the ratifications of China, Egypt, Iran, Israel and the United States. The 36 other States whose signatures and ratifications are required for entry into force have already provided them.
Zhang, Weihua; Bean, Marc; Benotto, Mike; Cheung, Jeff; Ungar, Kurt; Ahier, Brian
2011-12-01
A high-volume aerosol sampler ("Grey Owl") has been designed and developed at the Radiation Protection Bureau, Health Canada. Its design is guided by the need for a reliable, low-operational-cost sampler providing daily aerosol monitoring samples that can be used as reference samples for radiological studies. It has been developed to provide a constant air flow rate at low pressure drops (∼3 kPa for a day of sampling), with variations of less than ±1% of the full-scale flow rate. Its energy consumption is only about 1.5 kW for a filter sampling over 22,000 standard cubic meters of air. This study of aerosol radioactivity monitoring related to the Fukushima nuclear accident, conducted at the Sidney station, B.C., demonstrated that the sampler is robust and reliable. The results provided by the new monitoring system were used to support decision-making in Canada during an emergency response.
Make the World Safer from Nuclear Weapons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Ted
Senior Nuclear Scientist Ted Bowyer knows firsthand the challenges associated with protecting our nation. Ted and his colleagues help detect the proliferation of nuclear weapons. They developed award-winning technologies that give international treaty verification authorities “eyes and ears” around the globe. The instruments, located in 80 countries, help ensure compliance with the Comprehensive Nuclear Test-Ban Treaty, or CTBT. They are completely automated radionuclide monitoring systems that would detect airborne radioactive particles if a nuclear detonation occurred in the air, underground or at sea. Some samples collected through these technologies are sent to PNNL’s Shallow Underground Laboratory—the only certified U.S. radionuclide laboratory for the CTBT’s International Monitoring System.
Verification System: First System-Wide Performance Test
NASA Astrophysics Data System (ADS)
Chernobay, I.; Zerbo, L.
2006-05-01
System-wide performance tests are essential for the development, testing and evaluation of individual components of the verification system. In addition to evaluating global readiness, such testing helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004, a performance testing phase in April-June 2005, and an evaluation phase in the last half of 2005. The preparatory phase was developmental in nature. The main objectives of the performance testing phase included establishing a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examining established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used to implement 21 test case scenarios examining either particular operational procedures or the response of system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase, representing about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communication Infrastructure (GCI) communication links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. The performance of all system elements was documented and analysed. IDC products were compared with the results of data processing at the NDCs. On the basis of the statistics and information collected during SPT1, a system-wide performance baseline under current guidelines for provisional Operation and Maintenance was established. The test provided feedback for further development of the draft IMS and IDC Operational Manuals and identified priority areas for further system development.
Matthews, K M; Bowyer, T W; Saey, P R J; Payne, R F
2012-08-01
Radiopharmaceuticals make contributions of inestimable value to medical practice. With growing demand, new technologies are being developed and applied worldwide. Most diagnostic procedures rely on (99m)Tc, and the use of uranium targets in reactors is currently the favored method of production, with 95% of the necessary (99)Mo parent currently being produced by four major global suppliers. Coincidentally, there are growing concerns about nuclear security and proliferation. New disarmament treaties such as the Comprehensive Nuclear-Test-Ban Treaty (CTBT) are coming into effect, and treaty compliance-verification monitoring is gaining momentum. Radioxenon emissions (isotopes Xe-131m, 133, 133m and 135) from radiopharmaceutical production facilities are of concern in this context because radioxenon is a highly sensitive tracer for detecting nuclear explosions. There exists, therefore, a potential for confusion in source attribution, with emissions from radiopharmaceutical-production facilities regularly being detected in treaty compliance-verification networks. The CTBT radioxenon network currently under installation is highly sensitive, with detection limits approaching 0.1 mBq/m³, and, depending on transport conditions and background, able to detect industrial release signatures from sites thousands of kilometers away. The method currently employed to distinguish between industrial and military radioxenon sources involves plots of the isotope ratios (133m)Xe/(131m)Xe versus (135)Xe/(133)Xe, but source attribution can be ambiguous. Through the WOSMIP Workshop, the environmental monitoring community is gaining a better understanding of the complexities of the processes at production facilities, and the production community is recognizing the impact their operations have on monitoring systems and their goal of nuclear non-proliferation. Further collaboration and discussion are needed, together with advances in Xe trapping technology and monitoring systems. Such initiatives will help in addressing the dichotomy between expanding production and improving monitoring sensitivity, with the ultimate aim of enabling unambiguous distinction between different nuclide signatures.
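The ratio-ratio plot mentioned above can be reduced to a point-versus-line test in log space. A minimal sketch; the slope and intercept are placeholders, not the published operational screening-line values.

```python
import numpy as np

def screening_point(a135, a133, a133m, a131m):
    """Coordinates of a sample on the ratio-ratio plot described above:
    x = Xe-135/Xe-133 and y = Xe-133m/Xe-131m activity ratios."""
    return a135 / a133, a133m / a131m

def flagged_as_explosion_like(x, y, slope, intercept):
    """Test a sample against a straight line in log10-log10 space.
    slope and intercept are placeholders, NOT the published
    operational screening-line values."""
    return np.log10(y) > slope * np.log10(x) + intercept

# Illustrative activities in arbitrary consistent units:
x, y = screening_point(a135=5.0, a133=100.0, a133m=2.0, a131m=1.0)
print(flagged_as_explosion_like(x, y, slope=1.0, intercept=0.0))
```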
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aalseth, Craig E.; Day, Anthony R.; Haas, Derek A.
On-Site Inspection (OSI) is a key component of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Measurements of radionuclide isotopes created by an underground nuclear explosion are a valuable signature of a Treaty violation. Argon-37 is produced by neutron interaction with calcium in soil, 40Ca(n,α)37Ar. For OSI, the 35-day half-life of 37Ar provides both high specific activity and sufficient time for completion of an inspection before decay limits sensitivity. This paper presents a low-background internal-source gas proportional counter with a 37Ar measurement sensitivity equivalent to 45.1 mBq/SCM in whole air.
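The trade-off stated above between specific activity and inspection duration follows directly from exponential decay. A minimal sketch using the numbers quoted in the abstract; the initial activity in the example is an arbitrary assumption.

```python
import numpy as np

HALF_LIFE_AR37_DAYS = 35.0   # as quoted in the abstract

def ar37_activity(a0_mbq_scm, t_days):
    """Decay an initial 37Ar activity concentration (mBq per standard
    cubic meter of whole air) over t_days."""
    return a0_mbq_scm * 0.5 ** (np.asarray(t_days, float) / HALF_LIFE_AR37_DAYS)

def days_above(a0_mbq_scm, threshold_mbq_scm=45.1):
    """Days until the signal decays to the stated measurement
    sensitivity (45.1 mBq/SCM from the abstract)."""
    return HALF_LIFE_AR37_DAYS * np.log2(a0_mbq_scm / threshold_mbq_scm)

# Illustrative initial soil-gas activity, not a measured value:
print(ar37_activity(1000.0, 60.0))   # remaining activity two months on
print(days_above(1000.0))            # ~156 days of usable window
```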
NG09 And CTBT On-Site Inspection Noble Gas Sampling and Analysis Requirements
NASA Astrophysics Data System (ADS)
Carrigan, Charles R.; Tanaka, Junichi
2010-05-01
A provision of the Comprehensive Test Ban Treaty (CTBT) allows on-site inspections (OSIs) of suspect nuclear sites to determine whether a detected event was nuclear in origin. For an underground nuclear explosion (UNE), the potential success of an OSI depends significantly on the containment scenario of the alleged event, as well as on applying air and soil-gas radionuclide sampling techniques in a manner that takes into account both the suspect site geology and the gas transport physics. UNE scenarios may be broadly divided into categories according to the level of containment. The simplest to detect is a UNE that vents a significant portion of its radionuclide inventory and is readily detectable at distance by the International Monitoring System (IMS). The most well-contained subsurface events will only be detectable during an OSI. In such cases, 37Ar and radioactive xenon cavity gases may reach the surface through either "micro-seepage" or the barometric pumping process, and only the careful siting of sampling locations, timing of sampling, and application of the most site-appropriate atmospheric and soil-gas capturing methods will result in a confirmatory signal. The OSI noble gas field test NG09 was recently held in Stupava, Slovakia, to consider, in addition to other field sampling and analysis techniques, drilling and subsurface noble gas extraction methods that might be applied during an OSI. One of the experiments focused on challenges to soil-gas sampling near the soil-atmosphere interface. During withdrawal of soil gas from shallow, subsurface sample points, atmospheric dilution of the sample and the potential for introduction of unwanted atmospheric gases were considered. Tests were designed to evaluate surface infiltration and the ability of inflatable well-packers to seal out atmospheric gases during sample acquisition. We discuss these tests along with some model-based predictions regarding infiltration under different near-surface hydrologic conditions. We also consider how naturally occurring as well as introduced (e.g., SF6) soil-gas tracers might be used to guard against the possibility of atmospheric contamination of soil gases while sampling during an actual OSI. The views expressed here do not necessarily reflect the opinion of the United States Government, the United States Department of Energy, or Lawrence Livermore National Laboratory. This work has been performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-418791
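A tracer such as SF6 supports a simple two-end-member mixing check on each sample: the tracer is present in surface air but absent at depth, so its concentration in the sample bounds the atmospheric fraction. A minimal sketch under a conservative-mixing assumption; the function name and numbers are illustrative.

```python
def atmospheric_fraction(tracer_sample, tracer_air, tracer_deep=0.0):
    """Two-end-member mixing estimate of the atmospheric fraction in a
    soil-gas sample, using a tracer (e.g., SF6 released at the surface)
    present in surface air but absent at depth. Assumes conservative
    mixing and a well-characterized surface-air concentration."""
    return (tracer_sample - tracer_deep) / (tracer_air - tracer_deep)

# Illustrative concentrations in arbitrary units: the sample reads 5%
# of the surface-air tracer level, suggesting ~5% atmospheric dilution.
print(atmospheric_fraction(tracer_sample=0.05, tracer_air=1.0))
```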
Characterization of infrasound from lightning
NASA Astrophysics Data System (ADS)
Assink, J. D.; Evers, L. G.; Holleman, I.; Paulssen, H.
2008-08-01
During thunderstorm activity in the Netherlands, electromagnetic and infrasonic signals are emitted by the processes of lightning and thunder. It is shown that correlating infrasound detections with results from an electromagnetic lightning detection network is successful up to distances of 50 km from the infrasound array. Infrasound recordings clearly show blast-wave characteristics which can be related to cloud-ground (CG) discharges, with a dominant frequency between 1-5 Hz. Amplitude measurements of CG discharges can partly be explained by the beam pattern of a line source with a dominant frequency of 3.9 Hz, up to a distance of 20 km. The ability to measure lightning activity with infrasound arrays has both positive and negative implications for CTBT verification purposes. As a scientific application, lightning studies can benefit from the worldwide infrasound verification system.
Monitoring the Earth's Atmosphere with the Global IMS Infrasound Network
NASA Astrophysics Data System (ADS)
Brachet, Nicolas; Brown, David; Mialle, Pierrick; Le Bras, Ronan; Coyne, John; Given, Jeffrey
2010-05-01
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is tasked with monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans nuclear weapon explosions underground, in the oceans, and in the atmosphere. The verification regime includes a globally distributed network of seismic, hydroacoustic, infrasound and radionuclide stations which collect and transmit data to the International Data Centre (IDC) in Vienna, Austria shortly after the data are recorded at each station. The infrasound network defined in the Protocol of the CTBT comprises 60 infrasound array stations. Each array is built according to the same technical specifications; it is typically composed of 4 to 9 sensors with an aperture of 1 to 3 km. At the end of 2000 only one infrasound station was transmitting data to the IDC. Since then, 41 additional stations have been installed, and 70% of the infrasound network is currently certified and contributing data to the IDC. This constitutes the first global infrasound network ever built with such a large and uniform distribution of stations. Infrasound data at the IDC are processed at the station level using the Progressive Multi-Channel Correlation (PMCC) method for the detection and measurement of infrasound signals. The algorithm calculates the signal correlation between sensors at an infrasound array. If the signal is sufficiently correlated and consistent over an extended period of time and frequency range, a detection is created. Groups of detections are then categorized according to their propagation and waveform features, and a phase name is assigned for infrasound, seismic or noise detections. The categorization complements the PMCC algorithm to avoid overwhelming the IDC automatic association algorithm with false-alarm infrasound events. Currently, 80 to 90% of the detections are identified as noise by the system. Although the noise detections are not used to build events in the context of CTBT monitoring, they represent valuable data for other civil applications, such as the monitoring of natural hazards (volcanic activity, storm tracking) and climate change. Non-noise detections are used in network processing at the IDC along with the seismic and hydroacoustic technologies. The arrival phases detected across the three waveform technologies may be combined and used for locating events in an automatically generated bulletin of events. This automatic event bulletin is routinely reviewed by analysts during the interactive review process. However, the fusion of infrasound data with the other waveform technologies only recently (in early 2010) became part of the IDC operational system, after a software development and testing period that began in 2004. The build-up of the IMS infrasound network, the recent developments of the IDC infrasound software, and the progress accomplished during the last decade in the domain of real-time atmospheric modelling have allowed better understanding of infrasound signals and identification of a growing data set of ground-truth sources. These sources of infrasound are natural or man-made. Some of the detected signals are emitted by local or regional phenomena recorded by a single IMS infrasound station: man-made cultural activity, wind farms, aircraft, artillery exercises, ocean surf, thunderstorms, rumbling volcanoes, iceberg calving, aurora, avalanches. Other signals may be recorded by several IMS infrasound stations at larger distances: ocean swell, sonic booms, and mountain-associated waves.
Only a small fraction of events meet the event definition criteria considering the Treaty verification mission of the Organization. Candidate event types for the IDC Reviewed Event Bulletin include atmospheric or surface explosions, meteor explosions, rocket launches, signals from large earthquakes and explosive volcanic eruptions.
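At its core, the PMCC consistency test described above checks that pairwise cross-correlation delays within small sensor sub-arrays close to zero around each triangle, as a plane wave requires. A minimal sketch of that closure test; the sampling rate, synthetic data, and any acceptance threshold are assumptions, and the operational PMCC implementation works progressively across frequency bands and sub-array sizes.

```python
import numpy as np

def xcorr_delay(x, y, fs):
    """Delay (s) of y relative to x, from the peak of the full
    cross-correlation."""
    c = np.correlate(y, x, mode="full")
    return (np.argmax(c) - (len(x) - 1)) / fs

def closure_residual(s1, s2, s3, fs):
    """PMCC-style consistency check for one sensor triplet: for a plane
    wave the pairwise delays must sum to ~0 around the triangle. A small
    residual marks a coherent detection; the acceptance threshold would
    be a tuning parameter."""
    return (xcorr_delay(s1, s2, fs) + xcorr_delay(s2, s3, fs)
            + xcorr_delay(s3, s1, fs))

# Synthetic test: one pulse recorded with different delays on 3 sensors.
fs = 20.0
t = np.arange(0.0, 10.0, 1.0 / fs)
pulse = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)
print(closure_residual(pulse, np.roll(pulse, 3), np.roll(pulse, 7), fs))
```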
An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI
NASA Astrophysics Data System (ADS)
Necmioglu, O.; Meral Ozel, N.; Semin, K.
2009-12-01
Bogazici University's Kandilli Observatory and Earthquake Research Institute (KOERI) has been acting as the Turkish National Data Center (NDC), responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station PS-43 under the Belbasi Nuclear Tests Monitoring Center for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): a medium-period array with a ~40 km radius located in Ankara and a short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of its circular geometry. Short-period instruments are installed at a depth of 30 meters, while medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People's Republic of Korea (DPRK) claimed that it had conducted a nuclear test. The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N, 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 GMT at the BRTR short-period array. F-k analysis performed in the Geotool software, installed at the NDC facilities in 2008 and in full use since then, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC Reviewed Event Bulletin (REB), however, we noticed that our calculation, and therefore the corresponding residuals (calculated with reference to the REB residuals), compared favorably with the REB. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT has an important role in implementing the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study, we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.
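F-k analysis of the kind described above scans a grid of horizontal slowness vectors, delays and sums the array traces for each, and picks the slowness that maximizes beam power; back-azimuth follows from the direction of the winning slowness vector. A brute-force time-domain sketch (operational f-k codes typically work in the frequency domain); the shapes, units, and the wrap-around alignment via np.roll are simplifying assumptions.

```python
import numpy as np

def fk_power(waveforms, coords, fs, smax=0.4, ns=41):
    """Brute-force beam power over a horizontal slowness grid (s/km).
    waveforms : (nsta, nt) traces; coords : (nsta, 2) positions in km.
    Returns the best slowness vector and its beam power."""
    nsta, nt = waveforms.shape
    sgrid = np.linspace(-smax, smax, ns)
    best, s_best = -np.inf, (0.0, 0.0)
    for sx in sgrid:
        for sy in sgrid:
            delays = coords @ np.array([sx, sy])      # seconds per station
            shifts = np.rint(delays * fs).astype(int)
            beam = np.zeros(nt)
            for w, k in zip(waveforms, shifts):
                beam += np.roll(w, -k)                # undo each delay
            p = np.sum((beam / nsta) ** 2)
            if p > best:
                best, s_best = p, (sx, sy)
    return s_best, best

# Synthetic plane wave crossing a 4-element array:
fs, nt = 20.0, 400
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
t = np.arange(nt) / fs
pulse = np.exp(-0.5 * ((t - 10.0) / 0.5) ** 2)
s_true = np.array([0.2, -0.1])                        # s/km
waves = np.array([np.roll(pulse, int(round((c @ s_true) * fs)))
                  for c in coords])
print(fk_power(waves, coords, fs))                    # recovers ~(0.2, -0.1)
```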
An Overview of the Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS)
NASA Astrophysics Data System (ADS)
Snelson, C. M.; Barker, D. L.; White, R. L.; Emmitt, R. F.; Townsend, M. J.; Graves, T. E.; Becker, S. A.; Teel, M. G.; Lee, P.; Antoun, T. H.; Rodgers, A.; Walter, W. R.; Mellors, R. J.; Brunish, W. M.; Bradley, C. R.; Patton, H. J.; Hawkins, W. L.; Corbell, B. H.; Abbott, R. E.; SPE Working Group
2011-12-01
Modeling of explosion phenomenology for seismic, infrasound, and acoustic signals has been primarily empirically based. In order to detect low-yield nuclear explosions under the Comprehensive Nuclear Test-Ban Treaty (CTBT), we must be able to understand and model the explosive source in settings beyond those where we have empirical data. The Source Physics Experiments (SPE) at the Nevada National Security Site are the first step in this endeavor to link empirically based and physics-based modeling and so develop this predictive capability. The current series of tests is being conducted in a granite body called the Climax Stock. This location was chosen for several reasons, including the site's expected "simple geology": the granite is a fairly homogeneous body. In addition, data are available from underground nuclear tests conducted in the same rock body, and the nature of the geology has been well documented. Among the project goals for the SPE is to provide fully coupled seismic energy to the seismic and acoustic arrays so that the transition between near- and far-field data can be modeled and our scientists can begin to understand how non-linear effects and anisotropy control seismic energy transmission and partitioning. The first shot of the SPE was conducted in May 2011 as a calibration shot (SPE1), with 220 lb (100 kg) of chemical explosives set at a depth of 180 ft (55 m). An array of sensors and diagnostics recorded the shot data, including accelerometers, geophones, rotational sensors, short-period and broadband seismic sensors, Continuous Reflectometry for Radius vs. Time Experiment (CORRTEX), Time of Arrival (TOA), and Velocity of Detonation (VOD), as well as infrasound sensors. The three-component accelerometer packages were set at depths of 180 ft (55 m), 150 ft (46 m), and 50 ft (15 m) in two rings around ground zero (GZ); the inner ring was at 10 m and the outer ring at 20 m from GZ. Six sets of surface accelerometers (100 and 500 g) were placed every 10 m along an azimuth SW of GZ. Seven infrasound sensors were placed in an array around GZ, extending from tens of meters to kilometers. Over 100 seismic stations were positioned, most of them along five radial lines from GZ out to 2 km. Over 400 data channels were recorded for SPE1, and data recovery was about 95%, with high signal-to-noise ratio. Future tests will be conducted in the same shot hole as SPE1. The SPE2 experiment will consist of 2200 lb (1000 kg) of chemical explosives shot at 150 ft (46 m) depth, utilizing the above-described instrumentation. Subsequent SPE shots will be the same size, within the same shot hole, and within the damage zone. The ultimate goal of the SPE Project is to develop a predictive capability for using seismic energy as a tool for CTBT issues. This work was done by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.
Seismic resonances of acoustic cavities
NASA Astrophysics Data System (ADS)
Schneider, F. M.; Esterhazy, S.; Perugia, I.; Bokelmann, G.
2016-12-01
The goal of an On-Site Inspection (OSI) is to clarify at a possible test site whether a member state of the Comprehensive Nuclear Test Ban Treaty (CTBT) has violated its rules by conducting an underground nuclear test. Compared to atmospheric and underwater tests, underground nuclear explosions are the most difficult to detect. One primary structural target for the field team during an OSI is the detection of an underground cavity created by an underground nuclear explosion. The application of seismic resonances of the cavity for its detection has been proposed in the CTBT, which mentions "resonance seismometry" as a possible technique during OSIs. We modeled the interaction of a seismic wave field with an underground cavity as a sphere filled with an acoustic medium surrounded by an elastic full space. For this setting the solution of the seismic wave field can be computed analytically. Using this approach the appearance of acoustic resonances can be predicted in the theoretical calculations. Resonance peaks appear in the spectrum derived for the elastic domain surrounding the acoustic cavity, and they scale in width with the density of the acoustic medium. For low densities in the acoustic medium, as for a gas-filled cavity, the spectral peaks become very narrow and therefore hard to resolve. The resonance frequencies, however, can be correlated to the discrete set of eigenmodes of the acoustic cavity and can thus be predicted if the dimension of the cavity is known. The origin of the resonance peaks is internal reverberation of waves coupling into the acoustic domain and causing an echoing signal that couples out to the elastic domain again. In the gas-filled case the amplitudes in the time domain are very low. Besides theoretical considerations, we seek real-data examples from similar settings. As an example we analyze a 3D active seismic data set from Felsőpetény, Hungary, acquired between 2012 and 2014 on behalf of the CTBTO. In the subsurface of this area a former clay mine is situated, which is connected to a karst cave of 30 m diameter at 70 m depth. Our aim is to investigate whether resonances predicted from theoretical models can also be observed in data from such real experiments. Observation of spectral resonance peaks could serve as the foundation of a cavity detection method that could be utilized for nuclear verification.
Seismic wave interaction with underground cavities
NASA Astrophysics Data System (ADS)
Schneider, Felix M.; Esterhazy, Sofi; Perugia, Ilaria; Bokelmann, Götz
2016-04-01
Realization of the future Comprehensive Nuclear Test Ban Treaty (CTBT) will require ensuring compliance with it, making the CTBT a prime example of forensic seismology. Following indications of a nuclear explosion obtained on the basis of the International Monitoring System (IMS) network, further evidence needs to be sought at the location of the suspicious event. For such an On-Site Inspection (OSI) at a possible nuclear test site, the treaty lists several techniques that can be carried out by the inspection team, including aftershock monitoring and the conducting of active seismic surveys. While those techniques are already well established, a third group of methods, labeled "resonance seismometry", is less well defined and needs further elaboration. A prime structural target that is expected to be present as a remnant of an underground nuclear explosion is a cavity at the location and depth at which the bomb was fired. Originally, "resonance seismometry" referred to resonant seismic emission of the cavity within the medium, which could be stimulated by an incident seismic wave of the right frequency and observed as peaks in the spectra of seismic stations in the vicinity of the cavity. However, it is not yet clear under which conditions resonant emissions of the cavity could be observed. In order to define the distance, frequency and amplitude ranges at which resonant emissions could be observed, we study the interaction of seismic waves with underground cavities. As a generic model for possible resonances we use a spherical acoustic cavity in an elastic full space. To solve the forward problem for the full elastic wave field around acoustic spherical inclusions, we implemented an analytical solution (Korneev, 1993). This yields the possibility of generating scattering cross-sections, amplitude spectra and synthetic seismograms for plane incident waves. Here, we focus on the question of whether we can expect resonant responses in the wave field scattered from the cavity. We show results for varying input parameters such as dimensions, densities, and seismic velocities in and around the cavity, in order to discuss the applicability of such observations during an OSI.
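The eigenfrequencies that the two abstracts above associate with a cavity of known size can be approximated in the simplest limit by treating the surrounding rock as rigid, so that the radial derivative of the interior pressure field vanishes at the wall; the full elastic coupling of Korneev (1993) shifts and broadens these modes. A hedged sketch under that rigid-wall assumption, with illustrative cavity parameters:

```python
import numpy as np
from scipy.special import spherical_jn
from scipy.optimize import brentq

def rigid_wall_modes(radius_m, c_ms, n_max=2, k_max=2):
    """Approximate eigenfrequencies (Hz) of acoustic modes of a sphere
    with a rigid wall, i.e. roots of d/dx j_n(x) = 0 at x = k*a.
    This is the simplest limit of the cavity problem; the full elastic
    coupling shifts and broadens these resonances."""
    modes = {}
    for n in range(n_max + 1):
        f = lambda x: spherical_jn(n, x, derivative=True)
        roots, x0 = [], 0.5          # start past the trivial root at 0
        while len(roots) < k_max and x0 < 40.0:
            x1 = x0 + 0.05
            if f(x0) * f(x1) < 0.0:
                roots.append(brentq(f, x0, x1))
            x0 = x1
        modes[n] = [x * c_ms / (2.0 * np.pi * radius_m) for x in roots]
    return modes

# 15 m radius gas-filled cavity with c ~ 340 m/s (illustrative values):
print(rigid_wall_modes(15.0, 340.0))
```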
Monitoring and Reporting Tools of the International Data Centre and International Monitoring System
NASA Astrophysics Data System (ADS)
Lastowka, L.; Anichenko, A.; Galindo, M.; Villagran Herrera, M.; Mori, S.; Malakhova, M.; Daly, T.; Otsuka, R.; Stangel, H.
2007-05-01
The Comprehensive Test-Ban Treaty (CTBT), which prohibits all nuclear explosions, was opened for signature in 1996. Since then, the Preparatory Commission for the CTBT Organization has been working towards the establishment of a global verification regime to monitor compliance with the ban on nuclear testing. The International Monitoring System (IMS) comprises facilities for seismic, hydroacoustic, infrasound and radionuclide monitoring, and the means of communication. This system is supported by the International Data Centre (IDC), which provides the objective products and services necessary for effective global monitoring. Upon completion of the IMS, 321 stations will be contributing to both near-real-time and reviewed data products. Currently there are 194 facilities in IDC operations. This number is expected to increase by about 40% over the next few years, necessitating methods and tools to handle the expansion effectively. The requirements of high data availability and operational transparency are fundamental principles of IMS network operations; therefore, a suite of tools for monitoring and reporting has been developed. These include applications for monitoring Global Communication Infrastructure (GCI) links, detecting outages in continuous and segmented data, monitoring the status of data processing and forwarding to member states, and systematic electronic communication and problem ticketing. The operation of the IMS network requires the help of local specialists whose cooperation is in some cases ensured by contracts or other agreements. The PTS (Provisional Technical Secretariat) strives to make the monitoring of the IMS as standardized and efficient as possible, and has therefore created the Operations Centre, in which the use of most of the tools is centralized. Recently, the tasks of operations across all technologies, including the GCI, have been centralized within a single section of the organization. To harmonize the operations, an ongoing State of Health monitoring project will provide an integrated view of network, station and GCI performance and will provide system metrics. Comprehensive procedures will be developed to utilize this tool. However, as the IMS network expands, easier access to more information will create additional challenges, mainly in the human resources needed to analyze and manage these metrics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friese, Judah I.; Kephart, Rosara F.; Lucas, Dawn D.
2013-05-01
The Comprehensive Nuclear Test Ban Treaty (CTBT) provides for remote radionuclide monitoring followed by an On-Site Inspection (OSI) to clarify the nature of a suspect event. An important aspect of radionuclide measurements on site is discrimination against other potential sources of similar radionuclides, such as reactor accidents or medical isotope production. The Chernobyl and Fukushima nuclear reactor disasters offer two different reactor source-term environmental inputs that can be compared against historical measurements of nuclear explosions. The comparison of whole-sample gamma spectrometry measurements from these three events and an analysis of their similarities and differences are presented. This analysis is a step toward confirming what is needed for measurements during an OSI under the auspices of the Comprehensive Test Ban Treaty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wogman, Ned A.; Milbrath, Brian D.; Payne, Rosara F.
This paper is intended to serve as a scientific basis for starting discussions of the available environmental sampling techniques and equipment, used in the past, that could be considered for use in the context of Comprehensive Nuclear-Test-Ban Treaty (CTBT) on-site inspections (OSIs). This work contains information on the techniques, equipment, costs, and some operational procedures associated with environmental sampling that have actually been used by the United States for the detection of nuclear explosions. This paper also includes a discussion of issues, recommendations, and questions needing further study within the context of the sampling and analysis of aquatic materials, atmospheric gases, atmospheric particulates, vegetation, sediments and soils, fauna, and drill-back materials.
Crustal Seismic Attenuation in Germany Measured with Acoustic Radiative Transfer Theory
NASA Astrophysics Data System (ADS)
Gaebler, Peter J.; Eulenfeld, Tom; Wegler, Ulrich
2017-04-01
This work is carried out in the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was introduced to detect, locate and characterize nuclear explosion tests. Seismology can provide essential information, in the form of broadband waveform recordings, for the identification and verification of these critical events. A profound knowledge of the Earth's subsurface between source and receiver is required for a detailed description of the seismic wave field. In addition to underground parameters such as seismic velocity or anisotropy, information about the seismic attenuation of the medium is required. The goal of this study is the creation of a comprehensive model of crustal seismic attenuation in Germany and adjacent areas. Over 20 years of earthquake data from the German Central Seismological Observatory data archive are used to estimate the spatially dependent distribution of intrinsic and scattering attenuation of S waves for frequencies between 0.5 and 20 Hz. The attenuation models are estimated by fitting synthetic seismogram envelopes, calculated with acoustic radiative transfer theory, to observed seismogram envelopes. This theory describes the propagation of seismic S energy under the assumption of multiple isotropic scattering; the crustal structure of the scattering medium is hereby represented by a half-space model. We present preliminary results on the spatial distribution of intrinsic attenuation, represented by the absorption path length, as well as of scattering attenuation in terms of the mean free path, and compare the outcomes to results from previous studies. Furthermore, catalog magnitudes are compared to moment magnitudes estimated during the inversion process. Additionally, site amplification factors of the stations are presented.
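The two fitted quantities named above map onto dimensionless quality factors through the standard radiative-transfer relation Q = 2*pi*f*l/v, applied with the absorption path length for intrinsic Q and the mean free path for scattering Q. A small sketch with illustrative crustal numbers, not results from this study:

```python
import numpy as np

def quality_factor(v_kms, freq_hz, path_length_km):
    """Q from a characteristic attenuation length via Q = 2*pi*f*l/v.
    Use the absorption path length for intrinsic Q and the mean free
    path for scattering Q."""
    return 2.0 * np.pi * freq_hz * path_length_km / v_kms

# Illustrative values only: S-wave speed 3.5 km/s at 3 Hz, absorption
# path length 400 km, mean free path 200 km.
print(quality_factor(3.5, 3.0, 400.0))   # intrinsic Q
print(quality_factor(3.5, 3.0, 200.0))   # scattering Q
```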
Alcoverro, Benoit; Le Pichon, Alexis
2005-04-01
The implementation of the infrasound network of the International Monitoring System (IMS) for the enforcement of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) has increased efforts to design suitable noise-reducing systems. In this paper we present a new design consisting of low-impedance elements. The dimensioning and optimization of this discrete mechanical system are based on numerical simulations, including a complete electroacoustical model and a realistic wind-noise model. The frequency response and the noise reduction obtained for a given wind speed are compared to statistical noise measurements in the [0.02-4] Hz frequency band. The effects of the design parameters (the length of the pipes, inner diameters, summing volume, and number of air inlets) are investigated through a parametric study. The studied system consists of 32 air inlets distributed along an overall diameter of 16 m. Its frequency response is flat up to 4 Hz. For a 2 m/s wind speed, the maximal noise reduction obtained is 15 dB between 0.5 and 4 Hz. At lower frequencies, the noise reduction is improved by the use of a system of larger diameter. The main drawback is the high-frequency limitation introduced by acoustical resonances inside the pipes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellors, R J
The Comprehensive Nuclear Test Ban Treaty (CTBT) includes provisions for an on-site inspection (OSI), which allows the use of specific techniques to detect underground anomalies, including cavities and rubble zones. One permitted technique is active seismic surveys, such as seismic refraction or reflection. The purpose of this report is to conduct some simple modeling to evaluate the potential use of seismic reflection in detecting cavities and to test the use of open-source software in modeling possible scenarios. It should be noted that OSI inspections are conducted under specific constraints regarding duration and logistics. These constraints are likely to significantly impact active seismic surveying, as a seismic survey typically requires considerable equipment, effort, and expertise. For the purposes of this study, which is a first-order feasibility study, these issues are not considered. This report provides a brief description of the seismic reflection method along with some commonly used software packages. This is followed by an outline of a simple processing stream based on a synthetic model, along with results from a set of models representing underground cavities. A set of scripts used to generate the models is presented in an appendix. We do not consider the detection of underground facilities in this work, and the geologic setting used in these tests is an extremely simple one.
NASA Astrophysics Data System (ADS)
Aviv, O.; Lipshtat, A.
2018-05-01
On-Site Inspection (OSI) activities under the Comprehensive Nuclear-Test-Ban Treaty (CTBT) permit limitations on measurement equipment. Thus, certain detectors require modifications to be operated in a restricted mode. The accuracy and reliability of results obtained by a restricted device may be impaired. We present here a method for limiting data acquisition during an OSI. Limitations are applied to a high-resolution high-purity germanium detector system, in which the vast majority of the acquired data that is not relevant to the inspection is filtered out. The limited spectrum is displayed to the user and allows analysis using standard gamma spectrometry procedures. The proposed method can be incorporated into commercial gamma-ray spectrometers, including both stationary and mobile-based systems. By applying this procedure to more than 1000 spectra, representing various scenarios, we show that partial data are sufficient for reaching reliable conclusions. A comprehensive survey of potential false-positive identifications of various radionuclides is presented as well. It is evident from the results that the analysis of a limited spectrum is practically identical to that of a standard spectrum in terms of detection and quantification of OSI-relevant radionuclides. A future limited system can be developed making use of the principles outlined by the suggested method.
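Spectrum limitation of the kind described can be pictured as zeroing every channel outside narrow windows around a whitelist of gamma lines. A minimal sketch; the line list, window width, and synthetic spectrum are illustrative assumptions, not the authors' filtering rules.

```python
import numpy as np

def limit_spectrum(energies_kev, counts, lines_kev, half_window_kev=3.0):
    """Blank out all channels except narrow windows around a list of
    inspection-relevant gamma lines; everything else is filtered from
    view. Window width and line list are illustrative choices."""
    e = np.asarray(energies_kev, dtype=float)
    keep = np.zeros(e.shape, dtype=bool)
    for line in lines_kev:
        keep |= np.abs(e - line) <= half_window_kev
    return np.where(keep, counts, 0)

# Example: keep only windows around the Cs-137 (661.7 keV) and
# Co-60 (1173.2, 1332.5 keV) lines in a synthetic flat spectrum.
e = np.arange(0.0, 3000.0, 0.5)
c = np.random.poisson(10, size=e.size)
print(limit_spectrum(e, c, [661.7, 1173.2, 1332.5]).sum())
```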
An IMS Station life cycle from a sustainment point of view
NASA Astrophysics Data System (ADS)
Brely, Natalie; Gautier, Jean-Pierre; Foster, Daniel
2014-05-01
The International Monitoring System (IMS) is to consist of 321 monitoring facilities, composed of four different technologies with a variety of designs and equipment types, deployed in a range of environments around the globe. The International Monitoring System is conceived to operate in perpetuity through maintenance, replacement and recapitalization of IMS facilities' infrastructure and equipment when the end of service life is reached [CTBT/PTS/INF.1163]. Life-cycle techniques and modelling are being used by the PTS to plan and forecast the life-cycle sustainment requirements of IMS facilities. Through historical data analysis, engineering inputs and feedback from experienced station operators, the PTS is currently working to increase the level of confidence in these forecasts and in sustainment requirements planning. Continued validation, feedback and improvement of source data from the scientific community and experienced users are sought, and are essential to limit effects on data availability and to keep human and financial costs optimal.
New evaluated radioxenon decay data and its implications in nuclear explosion monitoring.
Galan, Monica; Kalinowski, Martin; Gheddou, Abdelhakim; Yamba, Kassoum
2018-03-07
This work presents the latest evaluations of the nuclear and decay data of the four radioxenon isotopes of interest for the Comprehensive Nuclear-Test-Ban Treaty (CTBT): Xe-131m, Xe-133, Xe-133m and Xe-135. This includes the most recent measured values of the half-lives, gamma emission probabilities (Pγ) and internal conversion coefficients (ICC). The evaluation procedure has been carried out within the Decay Data Evaluation Project (DDEP) framework, using the latest available versions of nuclear and atomic data evaluation software tools and compilations. The consistency of the evaluations was confirmed by the close agreement between the total available energy calculated from the present evaluated data and the tabulated Q-value. The article also analyzes how activity ratio calculations for radioxenon monitoring facilities vary depending on the nuclear database used as reference.
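The database dependence noted above enters wherever a measured activity is decay-corrected to a reference time: a small shift in the adopted half-life rescales the correction exponentially with elapsed time. A minimal sketch; both half-life values below are literature values quoted for illustration, not the DDEP results of this paper.

```python
def decay_corrected(a_measured, elapsed_days, half_life_days):
    """Back-correct a measured activity to the reference time using the
    adopted half-life value."""
    return a_measured * 2.0 ** (elapsed_days / half_life_days)

# Fractional shift of a Xe-133 decay correction when the adopted
# half-life moves between two literature values (illustrative only).
t = 10.0                                    # days since reference time
a_old = decay_corrected(1.0, t, 5.243)
a_new = decay_corrected(1.0, t, 5.2475)
print((a_new - a_old) / a_old)              # ~ -0.1% over 10 days
```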
NASA Astrophysics Data System (ADS)
Ross, J. Ole; Ceranna, Lars
2016-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. The International Monitoring System (IMS) is in place, and about 90% complete, to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydroacoustic and infrasonic signals for the detection, localization, and characterization of explosions. The seismic signals of the DPRK event on 6 January 2016 were detected by many seismic stations around the globe and allow for localization of the event and its identification as an explosion (see poster by G. Hartmann et al.). However, direct evidence for a nuclear explosion is only possible through the detection of nuclear fission products, which may be released. For that purpose, 80 radionuclide (RN) stations are part of the designed IMS; about 60 are already operational. All RN stations use large-volume air samplers that are highly sensitive to tiny traces of particulate radionuclides. Forty of the RN stations are designated to be equipped with noble gas systems detecting traces of radioactive xenon isotopes, which are more likely than particulates to escape from an underground test cavity; 30 of these noble gas systems are already operational. Atmospheric Transport Modelling (ATM) supports the interpretation of radionuclide detections (and, as appropriate, non-detections) by connecting the activity concentration measurements with potential source locations and release times. In our study, forecasts with the Lagrangian particle dispersion model HYSPLIT (NOAA) and GFS (NCEP) meteorological data are used to assess the plume propagation patterns for hypothetical releases at the known DPRK nuclear test site. The results show a considerable sensitivity of the IMS station RN 38 Takasaki (Japan) to a potential radionuclide release at the test site in the days and weeks following the explosion in January 2016. In addition, backtracking simulations with ECMWF analysis data at 0.2° horizontal resolution are performed for selected samples to obtain a complementary estimate of the sensitivities and the connected thresholds for detectable releases. The meteorological situation is compared to the aftermath of the nuclear explosion on 12 February 2013, after which an unusual 131mXe signature at RN 38, eight weeks after the test, could very likely be attributed to a late release from the DPRK event.
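For interpretation purposes, both the forward and the backtracking calculations above reduce to a source-receptor sensitivity (SRS) matrix linking hypothetical releases to predicted concentrations at the samplers. A minimal sketch of how such a matrix is used once ATM has produced it; the shapes, units, and MDC threshold are assumptions.

```python
import numpy as np

def predicted_concentrations(srs, releases):
    """Concentrations predicted at each sample from a source-receptor
    sensitivity (SRS) matrix produced by ATM.
    srs      : (n_samples, n_release_times), e.g. (mBq/m^3) per Bq
    releases : (n_release_times,) hypothesized release amounts in Bq."""
    return srs @ np.asarray(releases, dtype=float)

def minimum_detectable_release(srs_column, mdc):
    """Smallest pulse release (Bq) at one release time that would push
    at least one sample above the station's minimum detectable
    concentration (MDC)."""
    s = float(np.max(srs_column))
    return np.inf if s <= 0.0 else mdc / s

# Illustrative SRS values only:
srs = np.array([[2e-15, 0.0], [5e-15, 1e-15]])
print(predicted_concentrations(srs, [1e13, 0.0]))
print(minimum_detectable_release(srs[:, 0], mdc=0.1))  # mBq/m^3 threshold
```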
Sloan, Jamison; Sun, Yunwei; Carrigan, Charles
2016-05-01
Enforcement of the Comprehensive Nuclear Test Ban Treaty (CTBT) will involve monitoring for radiologic indicators of underground nuclear explosions (UNEs). A UNE produces a variety of radioisotopes, which then decay through connected radionuclide chains. A particular species of interest is xenon, namely the four isotopes (131m)Xe, (133m)Xe, (133)Xe, and (135)Xe. Owing to their half-lives, some of these isotopes can persist in the subsurface for more than 100 days. This convenient timescale, combined with modern detection capabilities, makes the xenon family a desirable candidate for UNE detection. Ratios of these isotopes as a function of time have been studied in the past for distinguishing nuclear explosions from civilian nuclear applications. However, the initial yields from UNEs have been treated as fixed values, when in reality these independent yields are highly uncertain. This study quantifies the uncertainty in xenon ratios resulting from these uncertain initial conditions to better bound the values that xenon ratios can assume. We have successfully used a combination of analytical and sampling-based statistical methods to reliably bound xenon isotopic ratios. We have also conducted a sensitivity analysis and found that xenon isotopic ratios are primarily sensitive to only a few of many uncertain initial conditions. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
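A minimal sketch of the sampling-based side of such an analysis follows (Python). The yield distributions and the simple exponential-decay treatment, which ignores ingrowth from iodine precursors along the connected chains, are assumptions for illustration, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(0)

LAM_133 = np.log(2) / 5.243    # Xe-133 decay constant (1/d)
LAM_135 = np.log(2) / 0.381    # Xe-135 decay constant (1/d), T1/2 ~ 9.14 h

# Uncertain initial activities (placeholder distributions, not study values)
n = 100_000
a133_0 = rng.normal(1.0, 0.2, n).clip(min=1e-6)
a135_0 = rng.normal(1.0, 0.3, n).clip(min=1e-6)

t = 3.0  # days after detonation
ratio = (a135_0 * np.exp(-LAM_135 * t)) / (a133_0 * np.exp(-LAM_133 * t))
lo, hi = np.percentile(ratio, [2.5, 97.5])
print(f"95% bounds on A(135Xe)/A(133Xe) at t = {t} d: [{lo:.4f}, {hi:.4f}]")
```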
NASA Astrophysics Data System (ADS)
Carrigan, Charles R.; Sun, Yunwei
2014-03-01
The development of a technically sound approach to detecting the subsurface release of noble gas radionuclides is a critical component of the on-site inspection (OSI) protocol under the Comprehensive Nuclear Test Ban Treaty. In this context, we are investigating a variety of technical challenges that have a significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments, a lesson we learned previously from the Non-Proliferation Experiment (NPE). Most of our evaluations of sampling or transport issues necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied by field experiments, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated and conclude with a description of a preliminary design for subsurface sampling that addresses some of the sampling challenges discussed here.
Entering the New Millennium: Dilemmas in Arms Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, James
The end of the Cold War finds the international community no longer divided into two opposing blocs. The concerns that the community now faces are becoming more fluid, less focused, and, in many ways, much less predictable. Issues of religion, ethnicity, and nationalism; the possible proliferation of Weapons of Mass Destruction; and the diffusion of technology and information processing throughout the world community have greatly changed the international security landscape in the last decade. Although our challenges appear formidable, the United Nations, States Parties, nongovernmental organizations, and the arms control community are moving to address and lessen these concerns through both formal and informal efforts. Many of the multilateral agreements (e.g., NPT, BWC, CWC, CTBT, MTCR), as well as the bilateral efforts taking place between Washington and Moscow, employ confidence-building and transparency measures. These measures, along with on-site inspection and other verification procedures, lessen suspicion and distrust and reduce uncertainty, thus enhancing stability, confidence, and cooperation.
NASA Astrophysics Data System (ADS)
Sambell, K.; Evers, L. G.; Snellen, M.
2017-12-01
Deriving the deep-ocean temperature is a challenge: in-situ observations and satellite observations are hardly applicable at depth. However, knowledge about changes in deep-ocean temperature is important in relation to climate change. The oceans are filled with low-frequency sound waves created by sources such as underwater volcanoes, earthquakes and seismic surveys. The propagation of these sound waves is temperature dependent and therefore carries valuable information that can be used for temperature monitoring. This phenomenon is investigated by applying interferometry to hydroacoustic data recorded in the South Pacific Ocean at hydrophone station H03, which is part of the International Monitoring System (IMS). This network consists of several stations around the world and is in place for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The station consists of two arrays located north and south of Robinson Crusoe Island, separated by 50 km. Both arrays consist of three hydrophones with an inter-sensor distance of 2 km, located at a depth of 1200 m, within the SOFAR channel. Hydroacoustic data measured at the south station are cross-correlated for the time period 2014-2017. The results are improved by applying one-bit normalization as a preprocessing step. Furthermore, beamforming is applied to the hydroacoustic data in order to characterize ambient noise sources around the array. This shows the presence of a continuous source at a backazimuth between 180 and 200 degrees throughout the whole time period, in agreement with the results obtained by cross-correlation. Studies of source strength show a seasonal dependence, an indication that the sound is related to acoustic activity in Antarctica. These results are supported by acoustic propagation modeling; the normal mode technique is used to study sound propagation from possible source locations towards station H03.
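A minimal sketch of the preprocessing and correlation step described above (Python/NumPy). The sampling rate, lag window, and synthetic data are assumptions for illustration; real processing would stack correlations over months to years of recordings:

```python
import numpy as np

def one_bit(x):
    """One-bit normalization: keep only the sign of the waveform."""
    return np.sign(x)

def noise_cross_correlation(x, y, max_lag):
    """Cross-correlate two records after one-bit normalization."""
    x, y = one_bit(x), one_bit(y)
    full = np.correlate(x, y, mode="full")
    mid = len(full) // 2
    return full[mid - max_lag : mid + max_lag + 1]

# Toy usage with a short synthetic segment:
fs = 250                                  # Hz, assumed sampling rate
rng = np.random.default_rng(0)
x = rng.standard_normal(fs * 60)          # one minute of "ambient noise"
y = np.roll(x, 120) + 0.5 * rng.standard_normal(fs * 60)
ccf = noise_cross_correlation(x, y, max_lag=fs * 2)
print(np.argmax(ccf) - fs * 2)            # recovered lag in samples (~ -120)
```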
Bayesian Monitoring Systems for the CTBT: Historical Development and New Results
NASA Astrophysics Data System (ADS)
Russell, S.; Arora, N. S.; Moore, D.
2016-12-01
A project at Berkeley, begun in 2009 in collaboration with CTBTO and more recently with LLNL, has reformulated the global seismic monitoring problem in a Bayesian framework. A first-generation system, NETVISA, has been built comprising a spatial event prior and generative models of event transmission and detection, as well as a Monte Carlo inference algorithm. The probabilistic model allows for seamless integration of various disparate sources of information, including negative information (the absence of detections). Working from arrivals extracted by traditional station processing from International Monitoring System (IMS) data, NETVISA achieves a reduction of around 60% in the number of missed events compared with the currently deployed network processing system. It also finds many events that are missed by the human analysts who postprocess the IMS output. Recent improvements include the integration of models for infrasound and hydroacoustic detections and a global depth model for natural seismicity trained from ISC data. NETVISA is now fully compatible with the CTBTO operating environment. A second-generation model called SIGVISA extends NETVISA's generative model all the way from events to raw signal data, avoiding the error-prone bottom-up detection phase of station processing. SIGVISA's model automatically captures the phenomena underlying existing detection and location techniques such as multilateration, waveform correlation matching, and double-differencing, and integrates them into a global inference process that also (like NETVISA) handles de novo events. Initial results for the Western US in early 2008 (when the transportable US Array was operating) show that SIGVISA finds, from IMS data only, more than twice the number of events recorded in the CTBTO Late Event Bulletin (LEB). For mb 1.0-2.5, the ratio is more than 10; put another way, for this data set, SIGVISA lowers the detection threshold by roughly one magnitude compared to LEB. The broader message of this work is that probabilistic inference based on a vertically integrated generative model that directly expresses geophysical knowledge can be a much more effective approach for interpreting scientific data than the traditional bottom-up processing pipeline.
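As a toy illustration of how a Bayesian formulation integrates negative information, the following sketch (my own simplified formulation, not the actual NETVISA model) computes the posterior probability of an event from per-station detection and false-alarm probabilities, with non-detections at sensitive stations counting as evidence against the event:

```python
import numpy as np

p_det_event = np.array([0.9, 0.8, 0.7, 0.6])      # P(detect | event), assumed
p_det_noise = np.array([0.05, 0.05, 0.05, 0.05])  # false-alarm prob., assumed
detected = np.array([True, True, False, True])    # station observations

def log_lik(p):
    """Log-likelihood of the detection pattern under detection probs p."""
    return np.sum(np.where(detected, np.log(p), np.log(1.0 - p)))

prior_event = 1e-3                                # assumed prior probability
log_odds = (np.log(prior_event / (1.0 - prior_event))
            + log_lik(p_det_event) - log_lik(p_det_noise))
posterior = 1.0 / (1.0 + np.exp(-log_odds))
print(f"P(event | detections) = {posterior:.3f}")
```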
The Seismic Aftershock Monitoring System (SAMS) for OSI - Experiences from IFE14
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Sick, Benjamin; Häge, Martin; Blake, Thomas; Labak, Peter; Joswig, Manfred
2016-04-01
An on-site inspection (OSI) is the third of four elements of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The sole purpose of an OSI is to confirm whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of the treaty, and to gather any facts which might assist in identifying a possible violator. It thus constitutes the final verification measure under the CTBT if all other available measures are unable to confirm the nature of a suspicious event. The Provisional Technical Secretariat (PTS) carried out the Integrated Field Exercise 2014 (IFE14) in the Dead Sea area of Jordan from 3 November to 9 December 2014. It was a fictitious OSI whose aim was to test inspection capabilities in an integrated manner. The technologies allowed during an OSI are listed in the Treaty. The aim of the Seismic Aftershock Monitoring System (SAMS) is to detect and localize low-magnitude aftershocks of the triggering event, or collapses of underground cavities. The locations of these events are expected in the vicinity of a possible previous explosion and help to narrow down the search area within an inspection area (IA) of an OSI. The success of SAMS depends on its main elements: hardware, software, deployment strategy, search logic and, not least, the effective use of personnel. All elements of SAMS were tested and improved during the Build-Up Exercises (BUE), which took place in Austria and Hungary. IFE14 provided more realistic climatic and hazardous terrain conditions with limited resources. Significant variations in the topography of the IFE14 IA in the mountainous Dead Sea area of Jordan led to considerable challenges that were not anticipated from the BUE experience. SAMS uses mini-arrays with an aperture of about 100 meters and a total of four elements each. The station network deployed during IFE14 and the results of the data analysis will be presented. Possible aftershocks of the triggering event are expected in a very low magnitude range; the detection threshold of the network is therefore one of the key parameters of SAMS and crucial to the success of the monitoring. One of the objectives was to record magnitudes down to -2.0 ML. The threshold values have been compared with historical seismicity in the region and with that monitored during IFE14. Results of the detection threshold estimation and experiences from the exercise will be presented.
Participation of the NDC Austria at the NDC Preparedness Exercise 2012
NASA Astrophysics Data System (ADS)
Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene
2013-04-01
NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to practice detecting a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (a mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the iodine isotopes I-131 and I-133 (both particulates) and the radioxenon isotopes Xe-133, Xe-133m, Xe-131m and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), the concentrations of all six isotopes that would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentrations of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • atmospheric backtracking and data fusion to identify seismic candidate events; • seismic analysis of candidate events within the possible source region; • atmospheric transport modelling (forward mode) from identified candidate events, comparing "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by the German NDC for the radionuclide scenario calculation, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.
Zhang, Weihua; Sadi, Baki; Rinaldo, Christopher; Chen, Jing; Spencer, Norman; Ungar, Kurt
2018-08-01
In this study, the activity concentrations of 210Pb and 210Po in 22 daily air filter samples, collected at the CTBT Yellowknife station from September 2015 to April 2016, were analysed. To estimate the time scale of atmospheric long-range transport of aerosol-bound 210Pb in the Arctic during winter, the mean transit time of 210Pb-bearing aerosols from their origin was determined from the 210Po/210Pb activity ratios and the parent-progeny decay/ingrowth equation. The 210Po/210Pb activity ratios varied between 0.06 and 0.21, with a median value of 0.11. The aerosol mean transit times based on the 210Po/210Pb activity ratio suggest a longer mean transit time of 210Pb aerosols in winter (12 d) than in autumn (3.7 d) and spring (2.9 d). Four years of 210Pb and 212Pb monitoring results and meteorological conditions at the Yellowknife station indicate that the 212Pb activity is mostly of local origin, and that 210Pb aerosols in wintertime come mainly from outside the Arctic, in common with other pollutants and sources contributing to the Arctic. The 210Pb/212Pb activity concentration ratio has a relatively constant value in summer, with a significant peak observed in winter, centred in the month of February. Comparison of the 210Pb/212Pb activity ratios with the estimated mean 210Pb transit times shows that the mean aerosol transit times faithfully reflect the atmospheric transport characteristics and can be used as a radio-chronometer for the transport of air masses into the Arctic region. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
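A hedged sketch of the ratio-to-time inversion (Python/SciPy). It assumes aerosols start as pure 210Pb, with no initial 210Bi or 210Po, an idealized source term the study may not share, so the resulting number is illustrative and is not expected to reproduce the reported transit times. Under that assumption the 210Po/210Pb activity ratio follows the Bateman solution for the 210Pb → 210Bi → 210Po chain and can be inverted numerically:

```python
import numpy as np
from scipy.optimize import brentq

LN2 = np.log(2.0)
lam_pb = LN2 / (22.3 * 365.25)   # 210Pb decay constant (1/d)
lam_bi = LN2 / 5.01              # 210Bi
lam_po = LN2 / 138.4             # 210Po

def po_pb_ratio(t):
    """A(210Po)/A(210Pb) after t days, Bateman solution for Pb -> Bi -> Po,
    assuming pure 210Pb at t = 0 (an idealized source term)."""
    n_po = lam_pb * lam_bi * (
        np.exp(-lam_pb * t) / ((lam_bi - lam_pb) * (lam_po - lam_pb))
        + np.exp(-lam_bi * t) / ((lam_pb - lam_bi) * (lam_po - lam_bi))
        + np.exp(-lam_po * t) / ((lam_pb - lam_po) * (lam_bi - lam_po)))
    return lam_po * n_po / (lam_pb * np.exp(-lam_pb * t))

measured = 0.11  # median ratio reported in the abstract
t_est = brentq(lambda t: po_pb_ratio(t) - measured, 0.01, 2000.0)
print(f"apparent transit time ~ {t_est:.0f} d under these assumptions")
```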
Infrasound data inversion for atmospheric sounding
NASA Astrophysics Data System (ADS)
Lalande, J.-M.; Sèbe, O.; Landès, M.; Blanc-Benon, Ph.; Matoza, R. S.; Le Pichon, A.; Blanc, E.
2012-07-01
The International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) continuously records acoustic waves in the 0.01-10 Hz frequency band, known as infrasound. These waves propagate through the layered structure of the atmosphere. Coherent infrasonic waves are produced by a variety of anthropogenic and natural sources, and their propagation is controlled by spatiotemporal variations of temperature and wind velocity. Natural stratification of atmospheric properties (e.g. temperature, density and winds) forms waveguides, allowing long-range propagation of infrasound waves. However, atmospheric specifications used in infrasound propagation modelling suffer from the scarcity and sparsity of available data above an altitude of 50 km. As infrasound can propagate in the upper atmosphere up to 120 km, we propose that infrasonic data can be used for sounding the atmosphere, analogous to the use of seismic data to infer solid Earth structure and of hydroacoustic data to infer oceanic structure. We therefore develop an inversion scheme for vertical atmospheric wind profiles in the framework of an iterative linear inversion. The forward problem is treated in the high-frequency approximation using a Hamiltonian formulation, and complete first-order ray perturbation theory is developed to construct the Fréchet derivative matrix. We introduce a specific parametrization of the unknown model parameters based on Principal Component Analysis. Finally, our algorithm is tested on synthetic data cases spanning different seasonal periods and network configurations. The results show that our approach is suitable for infrasound atmospheric sounding on a regional scale.
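One common way to write the update step of such an iterative linear inversion is the regularized Gauss-Newton form below (a generic formulation; the paper's exact regularization and notation may differ). Here G_k is the Fréchet derivative matrix at iterate m_k, C_d and C_m are data and model covariances, and d - g(m_k) is the travel-time residual; with the PCA parametrization, m holds the coefficients of the leading principal components of the wind profile:

```latex
\mathbf{m}_{k+1} = \mathbf{m}_k
  + \left( \mathbf{G}_k^{\mathsf{T}} \mathbf{C}_d^{-1} \mathbf{G}_k
         + \mathbf{C}_m^{-1} \right)^{-1}
    \mathbf{G}_k^{\mathsf{T}} \mathbf{C}_d^{-1}
    \left( \mathbf{d} - g(\mathbf{m}_k) \right)
```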
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, P.; Schultz, C.; Larsen, S.
1997-07-15
Monitoring of a CTBT will require transportable seismic identification techniques, especially in regions where there are limited data. Unfortunately, most existing techniques are empirical and cannot be used reliably in new regions. Our goal is to help develop transportable regional identification techniques by improving our ability to predict the behavior of regional phases and discriminants in diverse geologic regions and in regions with little or no data. Our approach is to use numerical modeling to understand the physical basis for regional wave propagation phenomena and to use this understanding to help explain the observed behavior of regional phases and discriminants. In this paper, we focus on results from simulations of data in selected regions and investigate the sensitivity of these regional simulations to various features of the crustal structure. Our initial models use teleseismically estimated source locations, mechanisms, and durations, and seismological structures that have been determined by others. We model the mb 5.9 earthquake of October 1992 near Cairo, Egypt, at station ANTO (Ankara, Turkey) using a two-dimensional crustal model consisting of a water layer over a deep sedimentary basin with a thinning crust beneath the basin. Despite the complex tectonics of the Eastern Mediterranean region, we find surprisingly good agreement between the observed data and synthetics based on this relatively smooth two-dimensional model.
NASA Astrophysics Data System (ADS)
Wotawa, Gerhard; Schraick, Irene
2010-05-01
An explosion in the Kara-Zhyra mine in Eastern Kazakhstan on 28 November 2009 around 07:20 UTC was recorded by both the CTBTO seismic and infrasound networks. This event triggered a world-wide preparedness exercise among the CTBTO National Data Centres. Within an hour after the event was selected by the German NDC, a computer program developed by the Austrian NDC, based on weather forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) and the U.S. National Centers for Environmental Prediction (NCEP), was started to analyse which radionuclide stations of the CTBTO International Monitoring System (IMS) would potentially be affected by the release from a nuclear explosion at this location over the course of the following 3-10 days. These calculations were updated daily to take into account the observed state of the atmosphere instead of the predicted one. Based on these calculations, automated and reviewed radionuclide reports from the potentially affected stations, as produced by the CTBTO International Data Centre (IDC), were examined. An additional analysis of interesting spectra was provided by the Seibersdorf Laboratories. Based on all the results, no evidence whatsoever was found that the explosion in Kazakhstan was nuclear. This is in accordance with ground-truth information indicating that the event was caused by the detonation of more than 53 tons of explosives as part of mining operations. A number of conclusions can be drawn from this exercise. First, the international, bilateral and national mechanisms and procedures in place for such an event worked smoothly. Second, the products and services of the CTBTO IDC proved very useful in assisting the member states in their verification efforts. Last but not least, issues with the availability of data from IMS radionuclide stations remain.
Stockpile stewardship past, present, and future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Marvin L., E-mail: mladams@tamu.edu
2014-05-09
The U.S. National Academies released a report in 2012 on technical issues related to the Comprehensive Test Ban Treaty. One important question addressed therein is whether the U.S. could maintain a safe, secure, and reliable nuclear-weapons stockpile in the absence of nuclear-explosion testing. Here we discuss two main conclusions from the 2012 Academies report, which we paraphrase as follows: 1) Provided that sufficient resources and a national commitment to stockpile stewardship are in place, the U.S. has the technical capabilities to maintain a safe, secure, and reliable stockpile of nuclear weapons into the foreseeable future without nuclear-explosion testing. 2) Doing this would require: a) a strong weapons science and engineering program that addresses gaps in understanding; b) an outstanding workforce that applies deep and broad weapons expertise to deliver solutions to stockpile problems; c) a vigorous, stable surveillance program that delivers the requisite data; d) production facilities that meet stewardship needs. We emphasize that these conclusions are independent of CTBT ratification; they apply provided only that the U.S. continues its nuclear-explosion moratorium.
Characterization of a Commercial Silicon Beta Cell
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foxe, Michael P.; Hayes, James C.; Mayer, Michael F.
Silicon detectors are of interest for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) due to their enhanced energy resolution compared to plastic-scintillator beta cells. Previous work developing a figure of merit (FOM) for comparing beta cells suggests that the minimum detectable activity (MDA) could be reduced by a factor of two to three with the use of silicon detectors. Silicon beta cells have been developed by CEA (France) and Lares Ltd. (Russia); the PIPSBox developed by CEA is commercially available from Canberra for approximately $35k, but there is still uncertainty about the reproducibility of these capabilities in the field. PNNL is developing a high-resolution beta-gamma detector system in its shallow underground laboratory, which will utilize and characterize the operation of the PIPSBox detector. Throughout this report, we examine the capabilities of the PIPSBox as developed by CEA. The lessons learned through the testing and use of the PIPSBox will allow PNNL to strategically develop a silicon detector optimized to better suit the community's needs in the future.
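For context, a Currie-style minimum detectable activity calculation (a standard textbook formulation, not necessarily the FOM used in the report) shows how a reduced background from better energy resolution translates into a lower MDA:

```python
import math

def currie_mda(background_counts, efficiency, branching, live_time_s):
    """Currie-style minimum detectable activity in Bq (a standard textbook
    formulation, not necessarily the report's FOM)."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
    return l_d / (efficiency * branching * live_time_s)

# Halving the background (e.g. through better energy resolution and tighter
# beta-gamma coincidence gates) lowers the MDA by roughly sqrt(2):
print(currie_mda(100, 0.6, 0.37, 86400))   # illustrative parameter values
print(currie_mda(50, 0.6, 0.37, 86400))
```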
Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G
2014-02-01
Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios, as well as analysis using atmospheric transport modeling, indicates that the xenon measured was likely created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source term for each release was calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory of a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material that was used in the test. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
New Horizons and New Strategies in Arms Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, J. editor
In the last ten years, since the break-up of the Soviet Union, remarkable progress in arms control and disarmament has occurred. The Nuclear Non-Proliferation Treaty (NPT), the completion of the Comprehensive Test Ban Treaty (CTBT), and the Chemical Weapons Convention (CWC) are indicative of the great strides made in the non-proliferation arena. Simultaneously, the Intermediate-Range Nuclear Forces Treaty (INF), the Treaty on Conventional Armed Forces in Europe (CFE), and the Strategic Arms Reduction Treaties (START), all associated with US-Soviet Union (now Russia) relations, have assisted in redefining European relations and the security landscape. Finally, it now appears that progress is in the offing in developing enhanced compliance measures for the Biological and Toxin Weapons Convention (BTWC). In sum, all of these achievements have set the stage for the next round of arms control activities, which may lead to a much broader, and perhaps more diffuse, multilateral agenda. In this new and somewhat unpredictable international setting, arms control and disarmament issues will require solutions that are more creative and innovative than heretofore.
NASA Astrophysics Data System (ADS)
Liebsch, Mattes; Altmann, Jürgen
2015-04-01
For the verification of the Comprehensive Nuclear Test Ban Treaty (CTBT), the precise localisation of possible underground nuclear explosion sites is important. During an on-site inspection (OSI), sensitive seismic measurements of aftershocks can be performed, which, however, can be disturbed by other signals. To improve the quality and effectiveness of these measurements it is essential to understand those disturbances so that they can be reduced or prevented. In our work we focus on disturbing signals caused by airborne sources: when the sound of aircraft (as often used by the inspectors themselves) hits the ground, it propagates through pores in the soil; its energy is transferred to the ground and soil vibrations are created which can mask weak aftershock signals. The understanding of the coupling of acoustic waves to the ground is still incomplete, yet it is necessary for improving the performance of an OSI, e.g. to address potential consequences for sensor placement, helicopter trajectories, etc. We present our recent advances in this field. We performed several measurements to record the sound pressure and soil velocity produced by various sources, e.g. broadband excitation by jet aircraft passing overhead and signals artificially produced by a speaker. In our experimental set-up, microphones were placed close to the ground and geophones were buried at different depths in the soil. Several sensors were shielded from the directly incident acoustic signals by a box coated with acoustic damping material. While the sound pressure under the box was strongly reduced, the soil velocity measured under the box was only slightly smaller than outside it. Thus these soil vibrations were mostly created outside the box and travelled through the soil to the sensors. This information is used to estimate characteristic propagation lengths of the acoustically induced signals in the soil. In the seismic data we observed interference patterns which are likely caused by the superposition of acoustically induced seismic waves with reflections at a layer boundary. The frequencies of increased or decreased amplitude depend on the angle of incidence of the acoustic signal, so these patterns can be used to estimate the propagation path(s) of acoustically induced soil vibrations. The frequency-dependent phase offset between different sensors is used to estimate the propagation velocity in the soil. The research aims to deliver a better understanding of the interaction of acoustic waves with the ground: the transfer of energy from sound waves into the soil and the possible excitation of seismic surface waves. The goal is to develop recommendations for sensitive seismic measurements during CTBTO on-site inspections that reduce disturbing vibrations caused by airborne sources.
NASA Astrophysics Data System (ADS)
Saey, P. R. J.; Auer, M.; Becker, A.; Colmanet, S.; Hoffmann, E.; Nikkinen, M.; Schlosser, C.; Sonck, M.
2009-04-01
Atmospheric radioxenon monitoring is a key component of the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Radiopharmaceutical production facilities (RPFs) have recently been identified as emitting the major part of the environmental radioxenon measured at the globally distributed monitoring sites deployed to strengthen the radionuclide part of the CTBT verification regime. Efforts to compile a global radioxenon emission inventory revealed that the total global emission from RPFs is 2-3 orders of magnitude higher than the respective emissions from the operation of all nuclear power plants (NPPs). Against this background, two peculiar hemisphere-specific situations were observed in 2008. 1) In the northern hemisphere, a joint shutdown of the four largest radiopharmaceutical facilities worldwide revealed the contribution of the normally 'masked' NPP-related emissions. Due to an incident, molybdenum production at the Institut des Radioéléments (IRE) in Fleurus, Belgium, the third largest global producer of medical isotopes, was shut down between Monday 25 August and 2 December 2008. In the same period, but for different reasons, the other three largest producers worldwide (CRL in Canada, HFR in The Netherlands and NTP in South Africa) also had scheduled and unscheduled shutdowns. The activity concentrations of 133Xe measured at the Schauinsland Mountain station near Freiburg, Germany (situated 380 km SW of Fleurus), which had a mean of 4.8 mBq/m3 for the period February 2004 - August 2008, dropped to 0.87 mBq/m3 for the period September - November 2008. 2) In the southern hemisphere, after a long break, the only radiopharmaceutical facility in Australia started test production in late November 2008. In the period before the start-up, the radioxenon background in Australia (Melbourne and Darwin) was below measurable quantities. During six test runs of the renewed RPF at ANSTO in Lucas Heights, up to 6 mBq/m3 of 133Xe was measured at the Melbourne station, 700 km SW of the facility. This paper confirms the hypothesis that radiopharmaceutical production facilities are the dominant emitters of radioxenon. Moreover, it demonstrates how the temporary shutdown of these facilities indicates the scale of their contribution to the European radioxenon background, which decreased six-fold. Finally, we have studied the contribution of the start-up of a renewed RPF to the build-up of a radioxenon background across Australia and the southern hemisphere. Disclaimer: The views expressed in this publication are those of the authors and do not necessarily reflect the views of the CTBTO Preparatory Commission or any of the participating institutions.
Low Noise Results From IMS Site Surveys: A Preliminary New High-Frequency Low Noise Model
NASA Astrophysics Data System (ADS)
Ebeling, C.; Astiz, L.; Starovoit, Y.; Tavener, N.; Perez, G.; Given, H. K.; Barrientos, S.; Yamamoto, M.; Hfaiedh, M.; Stewart, R.; Estabrook, C.
2002-12-01
Since the establishment of the Provisional Technical Secretariat (PTS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization, a vigorous seismic site survey program has been carried out to identify locations for the International Monitoring System (IMS) primary and auxiliary seismic stations listed in Annex 1 to the Protocol to the CTBT. For this purpose the IMS Seismic Section maintains a small pool of seismic equipment comprising Guralp CMG-3T and CMG-3ESP and Streckeisen STS-2 broadband seismometers, and Reftek and Guralp acquisition systems. Seismic site surveys are carried out by recording continuous measurements of ground motion at temporary installations for approximately five to seven days. Seismometer installation methods, which depend on instrument type and on local conditions, range from placement within small cement-floored subsurface vaults to near-surface burial. Data are sampled at 40 Hz. Seismic noise levels are evaluated through the analysis of power spectral density distributions. Eleven 10.5-minute-long representative de-trended and mean-removed segments each of daytime and night-time data are chosen randomly, but reviewed to avoid event contamination. Fast Fourier Transforms are calculated for the five windows in each of these segments, generated using a 50% overlap of Hanning-tapered sections ~200 s long. Instrument responses are removed. To date, 20 site surveys for primary and auxiliary stations have been carried out by the IMS. The sites surveyed represent a variety of physical and geological environments on most continents. The lowest high-frequency (>1.4 Hz) noise levels, at five sites with igneous or metamorphic geologies, were as much as 6 dB below the USGS New Low Noise Model (NLNM) developed by Peterson (1993). These sites were in Oman (local geology consisting of Ordovician metasediments), Egypt (Precambrian granite), Niger (early Proterozoic tonalite and granodiorite), Saudi Arabia (Precambrian metasediments), and Zimbabwe (Archaean granite). Based on a composite of the results from these five surveys, we propose a preliminary IMS Low-Noise Model (pIMS-LNM) consisting of a downward revision of Peterson's NLNM in the passband from 0.1 to about 0.7 s and an extension of Peterson's NLNM above 0.1 to 0.07 s. As these low-noise results are derived from data recorded at temporary installations, improved resolution of this model will be possible when data from final installations become available. Preliminary International Monitoring System Low Noise Model (pIMS-LNM) for periods from 0.07 to 0.70 s; decibels are relative to ground acceleration ((m/s^2)^2/Hz), and values are given as (period in s, dB) pairs, with the figure shown in bold in the original taken from Peterson's NLNM: (0.07, -167.0), (0.08, -168.0), (0.09, -169.0), (0.10, -169.5), (0.11, -170.5), (0.13, -171.0), (0.14, -171.5), (0.17, -172.0), (0.20, -172.5), (0.25, -173.0), (0.30, -173.5), (0.40, -173.0), (0.50, -172.0), (0.60, -171.0), (0.70, -170.0), (0.80, -169.2). Reference: Peterson, J., 1993. Observations and Modeling of Seismic Background Noise, U.S. Geological Survey Open-File Report 93-322, 47 p.
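A sketch of the described power spectral density estimation (Python/SciPy). The parameters follow the abstract (40 Hz data, ~200 s Hanning windows, 50% overlap, five windows per 10.5-minute segment); random noise stands in for a real recording, and instrument response removal is omitted and would precede this step:

```python
import numpy as np
from scipy.signal import welch

fs = 40.0                                  # sampling rate, Hz
segment = np.random.default_rng(0).standard_normal(int(10.5 * 60 * fs))
f, pxx = welch(segment, fs=fs, window="hann",
               nperseg=int(200 * fs),      # ~200 s Hanning-tapered sections
               noverlap=int(100 * fs),     # 50% overlap -> five windows
               detrend="linear")           # de-trending / mean removal
pxx_db = 10.0 * np.log10(pxx)              # dB relative to (input units)^2/Hz
```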
McNamara, D.E.; Walter, W.R.
2001-01-01
In this paper we describe a technique for mapping the lateral variation of Lg characteristics, such as Lg blockage, efficient Lg propagation, and regions of very high attenuation, in the Middle East, North Africa, Europe and the Mediterranean region. Lg is used in a variety of seismological applications, from magnitude estimation to identification of nuclear explosions for monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These applications can give significantly biased results if the Lg phase is reduced or blocked by discontinuous structure or thin crust. Mapping these structures using quantitative techniques for determining Lg amplitude attenuation can break down when the phase is below background noise; in such cases Lg blockage and inefficient propagation zones are often mapped out by hand. With our approach, we attempt to visually simplify this information by imaging crustal structure anomalies that significantly diminish the amplitude of Lg. The visualization of such anomalies is achieved by defining a grid of cells that covers the entire region of interest. We trace Lg rays for each event/station pair, which is simply the great circle path, and attribute to each cell a value equal to the maximum value of the Lg/P-coda amplitude ratio over all paths traversing that particular cell. The resulting map, from this empirical approach, is easily interpreted in terms of crustal structure and can successfully image small blockage features often missed by analysis of raypaths alone. This map can then be used to screen out events with blocked Lg prior to performing Q tomography, and to avoid using Lg-based methods of event identification for the CTBT in regions where they cannot work. For this study we applied our technique to one of the most tectonically complex regions on Earth. Nearly 9000 earthquake/station raypaths, traversing the vast region comprising the Middle East, Mediterranean, Southern Europe and Northern Africa, have been analyzed. We measured the amplitude of Lg relative to the P-coda and mapped the lateral variation of Lg propagation efficiency. With the relatively dense coverage provided by the numerous crossing paths, we are able to map out the pattern of crustal heterogeneity that gives rise to the observed character of Lg propagation. We observe that the propagation characteristics of Lg within the region of interest are very complicated but are readily correlated with the different tectonic environments within the region. For example, clear strong Lg arrivals are observed for paths crossing the stable continental interiors of Northern Africa and the Arabian Shield. In contrast, weakened to absent Lg is observed for paths crossing much of the Middle East, and Lg is absent for paths traversing the Mediterranean. Regions that block Lg transmission within the Middle East are very localized and include the Caspian Sea, the Iranian Plateau and the Red Sea. Resolution is variable throughout the region and depends strongly on the distribution of seismicity and recording stations. Lg propagation is best resolved within the Middle East, where regions of crustal heterogeneity on the order of 100 km are imaged (e.g., the South Caspian Sea and Red Sea). Crustal heterogeneity is resolvable, but resolution is poorest in seismically quiescent Northern Africa.
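A toy version of the cell-mapping step (Python). The grid size, path geometry, and ratio values are illustrative, and a straight line in latitude/longitude stands in for the true great-circle trace:

```python
import numpy as np

# Each 2-degree cell stores the maximum Lg/P-coda ratio over all paths
# crossing it; a persistently low maximum flags candidate blockage cells.
nlat, nlon = 90, 180
cell_max = np.full((nlat, nlon), np.nan)

def add_path(lat1, lon1, lat2, lon2, ratio, npts=500):
    """Sample a path densely and update the max ratio of every cell it hits."""
    lats = np.linspace(lat1, lat2, npts)
    lons = np.linspace(lon1, lon2, npts)
    i = ((lats + 90.0) // 2).astype(int).clip(0, nlat - 1)
    j = ((lons + 180.0) // 2).astype(int).clip(0, nlon - 1)
    for ii, jj in zip(i, j):
        if np.isnan(cell_max[ii, jj]) or ratio > cell_max[ii, jj]:
            cell_max[ii, jj] = ratio

add_path(30.0, 31.2, 39.9, 32.8, 0.4)   # hypothetical low-ratio (blocked) path
add_path(25.0, 35.0, 45.0, 30.0, 1.5)   # hypothetical efficient crossing path
```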
NASA Astrophysics Data System (ADS)
Annewandter, R.; Kalinowski, M. B.
2009-04-01
An underground nuclear explosion injects radionuclides into the surrounding host rock, creating an initial radionuclide distribution. In fractured permeable media, cyclical changes in atmospheric pressure can draw gaseous species upwards to the surface, establishing a ratcheting pump effect. The resulting advective transport is orders of magnitude more significant than transport by molecular diffusion. In the 1990s the US Department of Energy funded the so-called Non-Proliferation Experiment, conducted by the Lawrence Livermore National Laboratory, to investigate this barometric pumping effect for verifying compliance with the Comprehensive Nuclear Test Ban Treaty. A chemical explosive of approximately 1 kt TNT-equivalent was detonated in a cavity located 390 m deep in Rainier Mesa (Nevada Test Site), in which two tracer gases were emplaced. Within this experiment, SF6 was first detected in soil gas samples taken near fault zones after 50 days, and 3He after 325 days. For this paper a locally one-dimensional dual-porosity model for flow along the fracture and within the permeable matrix was used, after Nilson and Lie (1990). Seepage of gases and diffusion of tracers between fracture and matrix are accounted for. The advective flow along the fracture and within the matrix block is based on the FRAM filtering remedy and methodology of Chapman. The resulting system of equations is solved by an implicit non-iterative algorithm. Results on time of arrival and subsurface concentration levels for the CTBT-relevant xenon isotopes will be presented.
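A deliberately crude single-fracture sketch of the barometric pumping mechanism (Python; this explicit scheme with assumed parameters is far simpler than the paper's implicit dual-porosity model, and all values are invented for illustration). It shows a tracer carried up a fracture by an oscillatory velocity with a small net upward bias, plus diffusion:

```python
import numpy as np

nz, L = 200, 390.0            # grid points, fracture length (m), assumed depth
dz = L / nz
D = 1e-5                      # effective diffusivity, m^2/s (assumed)
v0, v_bias = 5e-4, 2e-5       # oscillation amplitude and net bias, m/s (assumed)
period = 86400.0              # diurnal pressure cycle, s
dt = min(0.2 * dz**2 / D, 0.5 * dz / (v0 + v_bias))   # stability limit

c = np.zeros(nz)
c[0] = 1.0                    # constant tracer source at the cavity end
t = 0.0
while t < 50 * 86400.0:       # simulate 50 days
    v = v_bias + v0 * np.sin(2.0 * np.pi * t / period)
    dc = np.zeros(nz)
    if v > 0:                 # upwind advection
        dc[1:] -= v * (c[1:] - c[:-1]) / dz
    else:
        dc[:-1] -= v * (c[1:] - c[:-1]) / dz
    dc[1:-1] += D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dz**2
    c += dt * dc
    c[0] = 1.0
    t += dt
front = dz * np.argmax(c < 1e-3)   # how far the tracer front has ratcheted
print(f"tracer front after 50 d: ~{front:.0f} m up the fracture")
```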
Knowledge information management toolkit and method
Hempstead, Antoinette R.; Brown, Kenneth L.
2006-08-15
A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.
NASA Astrophysics Data System (ADS)
Vivas Veloso, J. A.; Christie, D. R.; Campus, P.; Bell, M.; Hoffmann, T. L.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.
2002-11-01
The infrasound component of the International Monitoring System (IMS) for Comprehensive Nuclear-Test-Ban Treaty verification aims for global detection and localization of low-frequency sound waves originating from atmospheric nuclear explosions. The infrasound network will consist of 60 array stations, distributed as evenly as possible over the globe to assure at least two-station detection capability for 1-kton explosions at any point on earth. This network will be larger and more sensitive than any other previously operated infrasound network. As of today, 85% of the site surveys for IMS infrasound stations have been completed, 25% of the stations have been installed, and 8% of the installations have been certified and are transmitting high-quality continuous data to the International Data Center in Vienna. By the end of 2002, 20% of the infrasound network is expected to be certified and operating in post-certification mode. This presentation will discuss the current status and progress made in the site survey, installation, and certification programs for IMS infrasound stations. A review will be presented of the challenges and difficulties encountered in these programs, together with practical solutions to these problems.
High-pressure swing system for measurements of radioactive fission gases in air samples
NASA Astrophysics Data System (ADS)
Schell, W. R.; Vives-Battle, J.; Yoon, S. R.; Tobin, M. J.
1999-01-01
Radionuclides emitted from nuclear reactors, fuel reprocessing facilities and nuclear weapons tests are distributed widely in the atmosphere but have very low concentrations. As part of the Comprehensive Test Ban Treaty (CTBT), identification and verification of the emission of radionuclides from such sources are fundamental in maintaining nuclear security. To detect underground and underwater nuclear weapons tests, only the gaseous components need to be analyzed. Equipment has now been developed that can be used to collect large volumes of air, separate and concentrate the radioactive gas constituents, such as xenon and krypton, and measure them quantitatively. By measuring xenon isotopes with different half-lives, the time since the fission event can be determined. Developments in high-pressure (3500 kPa) swing chromatography using molecular sieve adsorbents have provided the means to collect and purify trace quantities of the gases from large volumes of air automatically. New scintillation detectors, together with timing and pulse shaping electronics, have provided the low-background levels essential in identifying the gamma ray, X-ray, and electron energy spectra of specific radionuclides. System miniaturization and portability with remote control could be designed for a field-deployable production model.
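The ratio-dating idea in the last point can be written compactly: if two fission gases are produced with a known initial activity ratio R0, the measured ratio R after decay fixes the elapsed time. A hedged sketch (Python; the production ratio and measured value are illustrative assumptions):

```python
import math

def time_since_fission(r_measured, r0, t_half_a, t_half_b):
    """Solve R = R0 * exp(-(lam_a - lam_b) * t) for t, in days."""
    lam_a = math.log(2) / t_half_a
    lam_b = math.log(2) / t_half_b
    return math.log(r0 / r_measured) / (lam_a - lam_b)

# e.g. Xe-133 (5.24 d) over Xe-131m (11.84 d) with an assumed production
# ratio R0 = 30 and a measured ratio of 5:
print(time_since_fission(5.0, 30.0, 5.243, 11.84))   # ~24 days
```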
Infrasonic waves from volcanic eruptions on the Kamchatka peninsula
NASA Astrophysics Data System (ADS)
Gordeev, E. I.; Firstov, P. P.; Kulichkov, S. N.; Makhmudov, E. R.
2013-07-01
The IS44 station operates at the observation point of Nachiki on the Kamchatka peninsula; it is part of the International Monitoring System (IMS) and helps verify compliance with the Comprehensive Nuclear Test-Ban Treaty (CTBT). The Kamchatka Branch, Geophysical Service, Russian Academy of Sciences (KB GS RAS), has a station operating in the village of Paratunka. Both of these stations allow one to monitor strong explosive eruptions of andesitic volcanoes. Both kinematic and dynamic parameters of acoustic signals accompanying the eruptions of the Bezymyannyi volcano (at a distance of 361 km from Nachiki) in 2009-2010 and the Kizimen volcano (at a distance of 275 km) on December 31, 2011, are considered. A low-frequency rarefaction phase 60 s in length has been revealed in the initial portion of the record of acoustic signals accompanying such strong eruptions. It is shown that the rarefaction phase occurs due to the rapid condensation of superheated juvenile vapor that enters the atmosphere during such explosions. The amount of volcanic ash emitted into the atmosphere has been estimated at (3.2-7.3) × 10^6 m^3 on the basis of acoustic signals recorded during the eruptions under consideration.
Radionuclide observables for the Platte underground nuclear explosive test on 14 April 1962
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burnett, Jonathan L.; Milbrath, Brian D.
2016-11-01
Past nuclear weapons tests provide invaluable information for understanding the radionuclide observables and data quality objectives expected during an On-Site Inspection (OSI) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These radioactive signatures are complex and subject to spatial and temporal variability. The Platte underground nuclear test on 14 April 1962 provides extensive environmental monitoring data that can be modelled and used to assess an OSI. The 1.6 kT test is especially useful as it released the highest amounts of recorded activity during Operation Nougat at the Nevada Test Site, now known as the Nevada National Security Site (NNSS). It has been estimated that 0.36% of the activity was released and dispersed in a northerly direction. The deposition ranged from 1 × 10^-11 to 1 × 10^-9 of the atmospheric release (per m^2), and has been used to evaluate a hypothetical OSI at 1 week to 2 years post-detonation. Radioactive decay reduces the activity of the 17 OSI-relevant radionuclides by 99.7%, such that detection throughout the inspection is only achievable close to the explosion, where deposition was highest.
Using continuous microbarom recordings for probing peri-Antarctica's atmosphere
NASA Astrophysics Data System (ADS)
Ceranna, Lars; Le Pichon, Alexis; Blanc, Elisabeth
2010-05-01
Germany operates one of the four Antarctic infrasound stations supporting verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). IS27 is a nine-element array that has been in continuous operation since its deployment in January 2003. Using the PMCC detection algorithm, coherent signals are observed in the frequency range from 0.0002 to 4.0 Hz, covering a large variety of infrasound sources such as low-frequency mountain-associated waves and high-frequency ice-quakes. The most prominent signals are related to microbaroms (mb) generated by the strong peri-Antarctic ocean swells. These continuous signals, with a dominant period of 5 s, show a clear trend in detection direction that is well correlated with the prevailing stratospheric winds. For mb-signals, a strong increase in trace velocity along with a decrease in the number of detections was observed during the Austral summer 2006, indicating strong variations in the troposphere and the stratospheric wave duct. However, ECMWF wind speed profiles at the station give no evidence for such an anomaly. Nevertheless, a smaller El Niño event during Austral winter 2006, together with cooling in the upper stratosphere caused by the eruption of the Manam volcano in Indonesia, provides a potential explanation for the abnormal ducting conditions. This will be demonstrated with a statistical approach for the dominant ray parameter launched from the estimated source regions towards IS27 (based on NOAA Wave Watch III). Increased gravity wave activity is considered for Austral summer 2006, since a comparison of ECMWF profiles with measured radiosonde data revealed that the numerical profiles are smoothed with respect to turbulence in the troposphere and lower stratosphere.
The ISC Contribution to Monitoring Research
NASA Astrophysics Data System (ADS)
Storchak, D. A.; Bondar, I.; Harris, J.; Gaspà Rebull, O.
2010-12-01
The International Seismological Centre (ISC) is a non-governmental organization charged with production of the ISC Bulletin, the definitive global summary of seismicity, based on reports from more than 4,500 seismic stations worldwide. The ISC data were used extensively in the preparation of the Comprehensive Test Ban Treaty (CTBT). They are now used by the CTBTO Provisional Technical Secretariat (PTS) and the States Parties as an important benchmark for assessing and monitoring the detection capabilities of the International Monitoring System (IMS). The ISC also provides a valuable collection of reviewed waveform readings from academic and operational sites co-located with the IMS stations. To improve the timeliness of its Bulletin, the ISC is making a special effort to collect preliminary bulletins from a growing number of networks worldwide that become available soon after seismic events occur. Preliminary bulletins are later replaced with the final analysis data once these become available to the ISC from each network. The ISC also collects and maintains data sets that are useful for monitoring research: the IASPEI Reference Event List of globally distributed GT0-5 events, the groomed ISC bulletin (EHB), the IDC REB, and USArray phase-picking data. In cooperation with the World Data Center for Seismology, Denver (USGS), the ISC also maintains the International Seismographic Station Registry, which holds the parameters of seismic stations used in international data exchange. The UK Foreign and Commonwealth Office, along with partners from several Nordic countries, is currently funding a project to link the ISC database securely with the computer facilities at the PTS and National Data Centres. The ISC Bulletin data are made available via a dedicated software link designed to offer the ISC data in a way convenient to the monitoring community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharyya, J.; Rodgers, A.; Swenson, J.
2000-07-14
Long-range seismic profiles from Peaceful Nuclear Explosions (PNEs) in the Former Soviet Union (FSU) provide a unique data set for investigating several important issues in regional Comprehensive Nuclear-Test-Ban Treaty (CTBT) monitoring. The recording station spacing (~15 km) allows for extremely dense sampling of the propagation from the source to ~3300 km, letting us analyze the waveforms at local, near- and far-regional and teleseismic distances. These data are used to: (1) study the evolution of regional phases and phase amplitude ratios along the profile; (2) infer one-dimensional velocity structure along the profile; and (3) evaluate the spatial correlation of regional and teleseismic travel times and regional phase amplitude ratios. We analyzed waveform data from four PNEs (mb = 5.1-5.6) recorded along profile KRATON, an east-west trending profile located in northern Siberia. Short-period regional discriminants, such as P/S amplitude ratios, will be essential for seismic monitoring of the CTBT at small magnitudes (mb < 4.0). However, P/S amplitude ratios in the short-period band, 0.5-5.0 Hz, show some scatter, primarily due to propagation and site effects, which arise from variability in the elastic and anelastic structure of the crustal waveguide. Preliminary results show that Pg and Lg propagate efficiently in northern Siberia at regional distances. The amplitude ratios show some variability between adjacent stations that is modeled by simple distance trends. The effect of topography, sediment and crustal thickness, and upper mantle discontinuities on these ratios, after removal of the distance trends, will be investigated. The travel times of the body wave phases recorded on KRATON have been used to compute the one-dimensional structure of the crust and upper mantle in this region. The path-averaged one-dimensional velocity model was computed by minimizing the first-arriving P-phase travel-time residuals for all distances (Δ = 300-2300 km), using a grid search approach. The most significant features of this model are the negative lid gradient and a low-velocity zone in the upper mantle between depths of 100-200 km; the precise location of the LVZ is poorly constrained by the travel time data. We will extend our investigation to additional PNE lines to further investigate the amplitude and travel-time variations in eastern and central Eurasia. Finally, the dense station spacing of the PNE profiles allows us to model the spatial correlation of travel times and amplitude ratios through variogram modeling. The statistical analysis suggests that the correlation lengths of the travel-time and amplitude measurements are 12° and 10°, respectively.
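A sketch of the variogram side of this analysis (Python; synthetic spatially correlated residuals along a 1-D profile stand in for the real measurements, and the binning choices are illustrative):

```python
import numpy as np

def semivariogram(positions, values, bins):
    """Empirical semivariogram of residuals along a 1-D profile; the
    correlation length is read off where the curve approaches its sill."""
    d = np.abs(positions[:, None] - positions[None, :])
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # unique station pairs
    d, sq = d[iu], sq[iu]
    idx = np.digitize(d, bins)
    return np.array([sq[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

# Toy usage with synthetic, spatially correlated residuals:
rng = np.random.default_rng(1)
x = np.linspace(0.0, 30.0, 200)              # station positions, degrees
kernel = np.exp(-np.arange(80) / 30.0)       # imposes a several-degree correlation
res = np.convolve(rng.standard_normal(300), kernel, mode="same")[:200]
gamma = semivariogram(x, res, bins=np.arange(0.0, 16.0, 1.0))
```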
Comparison of clinical knowledge bases for summarization of electronic health records.
McCoy, Allison B; Sittig, Dean F; Wright, Adam
2013-01-01
Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that are difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The overlap of pairs across knowledge bases was low: one knowledge base had half of its pairs overlapping with another knowledge base, and most had less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods for integrating the knowledge bases.
Knowledge repositories for multiple uses
NASA Technical Reports Server (NTRS)
Williamson, Keith; Riddle, Patricia
1991-01-01
In the life cycle of a complex physical device or part, for example the docking bay door of the Space Station, there are many uses for knowledge about the device or part, and the same piece of knowledge might serve several uses. Given the quantity and complexity of the knowledge that must be stored, it is critical to maintain the knowledge in one repository, in one form. At the same time, because of the quantity and complexity of the knowledge that must be used in life cycle applications such as cost estimation, re-design, and diagnosis, it is critical to automate such knowledge uses. For each specific use, a knowledge base must be available in a form that promotes the efficient performance of that knowledge base. However, without a single-source knowledge repository, the cost of maintaining consistency between multiple knowledge bases increases dramatically; as facts and descriptions change, they must be updated in each individual knowledge base. A use-neutral representation of a hydraulic system for the F-111 aircraft was developed, and the ability to derive portions of four different knowledge bases from this use-neutral representation is demonstrated: one knowledge base for re-design of the device using a model-based reasoning problem solver; two knowledge bases, at different levels of abstraction, for diagnosis using a model-based reasoning solver; and one knowledge base for diagnosis using an associational reasoning problem solver. It was shown how updates issued against the single-source use-neutral knowledge repository can be propagated to the underlying knowledge bases.
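A minimal sketch of the idea (Python; the component names, attributes, and rule formats are invented for illustration, not taken from the F-111 work): a single use-neutral repository from which task-specific knowledge bases are derived programmatically, so one update propagates everywhere:

```python
# One use-neutral repository of device knowledge (invented example data).
use_neutral = {
    "pump":     {"feeds": ["actuator"], "rated_kpa": 20700},
    "actuator": {"feeds": [],           "rated_kpa": 20700},
}

def derive_diagnosis_kb(repo):
    """Associational rules: a symptom implicates the component and its feeders."""
    feeders = {n: [m for m, s in repo.items() if n in s["feeds"]] for n in repo}
    return {f"low_pressure_at_{n}": [n] + feeders[n] for n in repo}

def derive_redesign_kb(repo):
    """Design constraints for a model-based re-design solver."""
    return [(n, "max_pressure_kpa", s["rated_kpa"]) for n, s in repo.items()]

diagnosis_kb = derive_diagnosis_kb(use_neutral)   # regenerate after any update
redesign_kb = derive_redesign_kb(use_neutral)
```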
A Discussion of Knowledge Based Design
NASA Technical Reports Server (NTRS)
Wood, Richard M.; Bauer, Steven X. S.
1999-01-01
A discussion of knowledge and Knowledge-Based design as related to the design of aircraft is presented. The paper discusses the perceived problem with existing design studies and introduces the concepts of design and knowledge for a Knowledge-Based design system. A review of several Knowledge-Based design activities is provided. A Virtual Reality, Knowledge-Based system is proposed and reviewed. The feasibility of Virtual Reality to improve the efficiency and effectiveness of aerodynamic and multidisciplinary design, evaluation, and analysis of aircraft through the coupling of virtual reality technology and a Knowledge-Based design system is also reviewed. The final section of the paper discusses future directions for design and the role of Knowledge-Based design.
Terminological reference of a knowledge-based system: the data dictionary.
Stausberg, J; Wormek, A; Kraut, U
1995-01-01
The development of open and integrated knowledge bases makes new demands on the definition of the terminology used. The definitions should be realized in a data dictionary separate from the knowledge base. Within the work done on a reference model of medical knowledge, a data dictionary has been developed and used in different applications: a term definition shell, a documentation tool and a knowledge base. The data dictionary includes that part of the terminology which is largely independent of a particular knowledge model. For that reason, the data dictionary can be used as a basis for integrating knowledge bases into information systems, for knowledge sharing and reuse, and for modular development of knowledge-based systems.
Machine learning research 1989-90
NASA Technical Reports Server (NTRS)
Porter, Bruce W.; Souther, Arthur
1990-01-01
Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.
Health Care Leadership: Managing Knowledge Bases as Stakeholders.
Rotarius, Timothy
Communities are composed of many organizations. These organizations naturally form clusters based on common patterns of knowledge, skills, and abilities of the individual organizations. Each of these spontaneous clusters represents a distinct knowledge base. The health care knowledge base is shown to be the natural leader of any community. Using the Central Florida region's 5 knowledge bases as an example, each knowledge base is categorized as a distinct type of stakeholder, and then a specific stakeholder management strategy is discussed to facilitate managing both the cooperative potential and the threatening potential of each "knowledge base" stakeholder.
NASA Technical Reports Server (NTRS)
Rahimian, Eric N.; Graves, Sara J.
1988-01-01
A new approach used in constructing a relational data knowledge base system is described. The relational database is well suited for distribution due to its property of allowing data fragmentation and fragmentation transparency. An example is formulated of a simple relational data knowledge base which may be generalized for use in developing a relational distributed data knowledge base system. The efficiency and ease of application of such a data knowledge base management system is briefly discussed. Also discussed are the potentials of the developed model for sharing the data knowledge base as well as the possible areas of difficulty in implementing the relational data knowledge base management system.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Case-based tutoring from a medical knowledge base.
Chin, H L; Cooper, G F
1989-01-01
The past decade has seen the emergence of programs that make use of large knowledge bases to assist physicians in diagnosis within the general field of internal medicine. One such program, Internist-I, contains knowledge about over 600 diseases, covering a significant proportion of internal medicine. This paper describes the process of converting a subset of this knowledge base--in the area of cardiovascular diseases--into a probabilistic format, and the use of this resulting knowledge base to teach medical diagnostic knowledge. The system (called KBSimulator--for Knowledge-Based patient Simulator) generates simulated patient cases and uses these cases as a focal point from which to teach medical knowledge. This project demonstrates the feasibility of building an intelligent, flexible instructional system that uses a knowledge base constructed primarily for medical diagnosis.
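A toy illustration, in the spirit of KBSimulator, of sampling a simulated patient case from a probabilistic disease-finding knowledge base; the diseases, findings and probabilities below are invented, not taken from Internist-I:

import random

# Hypothetical probabilistic knowledge base: P(finding | disease).
KB = {
    "aortic_stenosis": {"syncope": 0.4, "systolic_murmur": 0.9, "dyspnea": 0.5},
    "pericarditis":    {"chest_pain": 0.85, "friction_rub": 0.7, "fever": 0.4},
}

def simulate_case(disease, rng=random.Random(42)):
    """Sample the set of findings a simulated patient presents with."""
    return [f for f, p in KB[disease].items() if rng.random() < p]

print("Simulated patient presents with:", simulate_case("aortic_stenosis"))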
Automated knowledge base development from CAD/CAE databases
NASA Technical Reports Server (NTRS)
Wright, R. Glenn; Blanchard, Mary
1988-01-01
Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge-based systems that are inherently free of error.
Cole-Lewis, Heather J; Smaldone, Arlene M; Davidson, Patricia R; Kukafka, Rita; Tobin, Jonathan N; Cassells, Andrea; Mynatt, Elizabeth D; Hripcsak, George; Mamykina, Lena
2016-01-01
To develop an expandable knowledge base of reusable knowledge related to self-management of diabetes that can be used as a foundation for patient-centric decision support tools. The structure and components of the knowledge base were created in participatory design with academic diabetes educators using knowledge acquisition methods. The knowledge base was validated using a scenario-based approach with practicing diabetes educators and individuals with diabetes recruited from Community Health Centers (CHCs) serving economically disadvantaged communities and ethnic minorities in New York. The knowledge base includes eight glycemic control problems, over 150 behaviors known to contribute to these problems coupled with contextual explanations, and over 200 specific action-oriented self-management goals for correcting problematic behaviors, with corresponding motivational messages. The validation of the knowledge base suggested a high level of completeness and accuracy, and identified needed improvements in cultural appropriateness. These were addressed in new iterations of the knowledge base. The resulting knowledge base is theoretically grounded, incorporates practical and evidence-based knowledge used by diabetes educators in practice settings, and allows for personally meaningful choices by individuals with diabetes. The participatory design approach helped researchers to capture the implicit knowledge of practicing diabetes educators and make it explicit and reusable. The knowledge base proposed here is an important step towards the development of a new generation of patient-centric decision support tools for facilitating chronic disease self-management. While this knowledge base specifically targets diabetes, its overall structure and composition can be generalized to other chronic conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
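A sketch of one plausible data structure for such a problems-behaviors-goals knowledge base; the entries are invented examples, not content from the actual knowledge base:

from dataclasses import dataclass, field

@dataclass
class Behavior:
    description: str      # behavior contributing to a glycemic problem
    explanation: str      # contextual explanation shown to the user
    goals: list = field(default_factory=list)  # action-oriented goals

# Hypothetical fragment: one glycemic control problem with one behavior.
knowledge_base = {
    "fasting_hyperglycemia": [
        Behavior(
            description="late-evening high-carbohydrate snacking",
            explanation="Carbohydrates eaten late raise overnight glucose.",
            goals=["Replace the evening snack with a low-carb option "
                   "three nights this week."],
        )
    ]
}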
NetWeaver for EMDS user guide (version 1.1): a knowledge base development system.
Keith M. Reynolds
1999-01-01
The guide describes use of the NetWeaver knowledge base development system. Knowledge representation in NetWeaver is based on object-oriented fuzzy-logic networks that offer several significant advantages over the more traditional rule-based representation. Compared to rule-based knowledge bases, NetWeaver knowledge bases are easier to build, test, and maintain because...
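A minimal sketch of the object-oriented fuzzy-logic network idea, assuming truth values in [0, 1] with fuzzy AND as min and fuzzy OR as max (one common convention; NetWeaver's actual operators may differ):

# Fuzzy-logic network nodes: leaves hold data-derived truth values,
# interior nodes combine children with fuzzy AND (min) or OR (max).
class Node:
    def __init__(self, op=None, children=(), value=None):
        self.op, self.children, self.value = op, children, value

    def eval(self):
        if self.value is not None:          # leaf: observed truth value
            return self.value
        vals = [c.eval() for c in self.children]
        return min(vals) if self.op == "and" else max(vals)

# "Suitable habitat" is true to the degree that both conditions hold.
habitat = Node("and", [Node(value=0.8),    # slope acceptable: 0.8
                       Node(value=0.6)])   # canopy cover adequate: 0.6
print(habitat.eval())  # 0.6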
NASA Astrophysics Data System (ADS)
Zampolli, Mario; Haralabus, Georgios; Prior, Mark K.; Heaney, Kevin D.; Campbell, Richard
2014-05-01
Hydrophone stations of the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) International Monitoring System (IMS), with the exception of one in Australia, comprise two triplets of submerged moored hydrophones, one North and one South of the island from which the respective system is deployed. Triplet distances vary between approximately 50 and 100 km from the island, with each triplet connected to the receiving shore equipment by fibre-optic submarine data cables. Once deployed, the systems relay underwater acoustic waveforms in the band 1 - 100 Hz in real time to Vienna via a shore-based satellite link. The design life of hydroacoustic stations is at least 20 years, without need for any maintenance of the underwater system. The re-establishment of hydrophone monitoring station HA04 at Crozet (French Southern and Antarctic Territories) in the South-Western Indian Ocean is currently being investigated. In order to determine appropriate locations and depths for the installation of the hydrophones, a number of constraints need to be taken into account and balanced against each other. The most important of these are (i) hydrophone depth in a region where the sound-speed profile is mostly upward refracting and the Sound Fixing and Ranging (SOFAR) channel is not well defined, (ii) a safe distance from the surface currents which occupy the first few hundred meters of the water column, (iii) seabed slopes that enable the safe deployment of the hydrophone mooring bases, (iv) avoidance of regions of high internal tide activity, (v) choice of locations to optimize basin and cross-basin scale acoustic coverage of each triplet and (vi) redundancy considerations so that one triplet can partially cover for the other in case of necessity. A state-of-the-art three-dimensional (3-D) parabolic equation acoustic propagation model was used to model the propagation for a number of potential triplet locations. Criteria for short-listing candidate triplet locations were based on acoustic coverage towards the North and South, as well as overall acoustic coverage, taking into account different scales of source strength. An increase in the predicted area coverage compared to predictions based on 2-D modelling was observed and attributed to diffraction around sharp localized features such as islands or sea-mounts.
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
In order to realize the application of knowledge in machining process design, and from the perspective of knowledge use in computer-aided process planning (CAPP), a hierarchical structure of knowledge classification is established according to the characteristics of the mechanical engineering field. Machining process knowledge is expressed in structured form by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to the representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge models, and the application flow of knowledge-base-driven process design are given; as an application, the main steps of the machine tool design decision are carried out using the knowledge base.
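A minimal sketch of the production-rule style of machining knowledge representation the abstract describes; the rules and attribute names are invented for illustration:

# Production rules: IF condition(s) on the part THEN process decision.
RULES = [
    (lambda p: p["feature"] == "hole" and p["diameter_mm"] < 13,
     "drill on a vertical machining center"),
    (lambda p: p["feature"] == "hole" and p["tolerance_grade"] <= 7,
     "finish by reaming after drilling"),
]

def infer(part):
    """Fire every rule whose condition matches the part description."""
    return [action for cond, action in RULES if cond(part)]

part = {"feature": "hole", "diameter_mm": 10, "tolerance_grade": 7}
print(infer(part))  # both rules fire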
Refining Automatically Extracted Knowledge Bases Using Crowdsourcing.
Li, Chunhua; Zhao, Pengpeng; Sheng, Victor S; Xian, Xuefeng; Wu, Jian; Cui, Zhiming
2017-01-01
Machine-constructed knowledge bases often contain noisy and inaccurate facts. There exists significant work in developing automated algorithms for knowledge base refinement. Automated approaches improve the quality of knowledge bases but are far from perfect. In this paper, we leverage crowdsourcing to improve the quality of automatically extracted knowledge bases. As human labelling is costly, an important research challenge is how we can use limited human resources to maximize the quality improvement for a knowledge base. To address this problem, we first introduce a concept of semantic constraints that can be used to detect potential errors and do inference among candidate facts. Then, based on semantic constraints, we propose rank-based and graph-based algorithms for crowdsourced knowledge refining, which judiciously select the most beneficial candidate facts to conduct crowdsourcing and prune unnecessary questions. Our experiments show that our method improves the quality of knowledge bases significantly and outperforms state-of-the-art automatic methods under a reasonable crowdsourcing cost.
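A toy sketch of rank-based question selection under a crowdsourcing budget, assuming each candidate fact already carries a score for how much verifying it is expected to improve the knowledge base (the scoring itself, derived in the paper from semantic constraints, is out of scope here):

def select_for_crowdsourcing(candidates, budget):
    """Pick the most beneficial candidate facts to ask the crowd about.

    candidates: list of (fact, expected_benefit) pairs.
    budget: number of questions we can afford to ask.
    """
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    return [fact for fact, _ in ranked[:budget]]

# Hypothetical candidate facts with expected-benefit scores.
candidates = [("bornIn(Einstein, Ulm)", 0.9),
              ("capitalOf(Sydney, Australia)", 0.7),
              ("bornIn(Einstein, Berlin)", 0.2)]
print(select_for_crowdsourcing(candidates, budget=2))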
Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis
NASA Astrophysics Data System (ADS)
Gluhih, I. N.; Akhmadulin, R. K.
2017-07-01
One of the more urgent directions for enhancing the efficiency of production processes and enterprise management is the creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and represents a tool for making and implementing decisions in such situations. For knowledge representation in PO CKB, a case-based reasoning approach is proposed. Under this approach, the content of a case as a knowledge base component has been defined; based on the situation tree, a PO CKB knowledge model has been developed in which knowledge about typical situations, as well as specific examples of situations and solutions, is represented. A generalized problem-oriented corporate knowledge base structural chart and possible modes of its operation have been suggested. The obtained models allow creating and using corporate knowledge bases for support of decision making and implementation, training, staff skill upgrading and analysis of the decisions taken. The universal interpretation of the terms "situation" and "solution" adopted in the work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested to use the developed models for building corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
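A minimal case-based reasoning sketch in the PO CKB spirit: retrieve the stored problem situation most similar to the current one and reuse its solution. The similarity measure and the case contents are illustrative assumptions:

# Each case: a problem situation (attribute-value pairs) and the decision taken.
CASES = [
    ({"symptom": "pressure_drop", "subsystem": "cooling"},
     "flush heat exchanger and inspect pump seals"),
    ({"symptom": "voltage_sag", "subsystem": "power"},
     "switch to backup feeder and schedule transformer check"),
]

def retrieve(situation):
    """Return the solution of the most similar stored case."""
    def similarity(case_situation):
        return len(set(case_situation.items()) & set(situation.items()))
    best_situation, solution = max(CASES, key=lambda c: similarity(c[0]))
    return solution

print(retrieve({"symptom": "pressure_drop", "subsystem": "cooling"}))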
The influence of periodic wind turbine noise on infrasound array measurements
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Ceranna, Lars
2017-02-01
Aerodynamic noise emissions from the continuously growing number of wind turbines in Germany are creating increasing problems for infrasound recording systems. These systems are equipped with highly sensitive micro-pressure sensors that accurately measure acoustic signals in a frequency range inaudible to the human ear. Ten years of data (2006-2015) from the infrasound array IGADE in Northern Germany are analysed to quantify the influence of wind turbine noise on infrasound recordings. Furthermore, a theoretical model is derived and validated by a field experiment with mobile micro-barometer stations. Fieldwork was carried out in 2004 to measure the infrasonic pressure level of a single horizontal-axis wind turbine and to extrapolate the sound effect for a larger number of nearby wind turbines. The model estimates the sound pressure level generated by wind turbines and thus makes it possible to specify the minimum allowable distance between wind turbines and infrasound stations for undisturbed recording. This aspect is particularly important to guarantee the monitoring performance of the German infrasound stations I26DE in the Bavarian Forest and I27DE in Antarctica. These stations are part of the International Monitoring System (IMS) verifying compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), and thus have to meet stringent specifications with respect to infrasonic background noise.
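A rough back-of-the-envelope sketch of the kind of distance estimate such a model enables, assuming simple spherical spreading (20 log10 r) and incoherent summation over N identical turbines (+10 log10 N); the actual model also treats source characteristics and atmospheric effects, which are omitted here, and all numbers are hypothetical:

import math

def sound_pressure_level(spl_ref_db, r_ref_m, r_m, n_turbines=1):
    """SPL at distance r from n incoherently adding identical turbines."""
    spreading = 20 * math.log10(r_m / r_ref_m)      # spherical spreading
    return spl_ref_db - spreading + 10 * math.log10(n_turbines)

def min_distance(spl_ref_db, r_ref_m, noise_floor_db, n_turbines=1):
    """Distance beyond which turbine noise drops below a station's floor."""
    excess = spl_ref_db + 10 * math.log10(n_turbines) - noise_floor_db
    return r_ref_m * 10 ** (excess / 20)

# Hypothetical numbers: 75 dB at 100 m per turbine, a 10-turbine wind park,
# and a station noise floor of 20 dB.
print(f"{min_distance(75, 100, 20, 10):.0f} m")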
NASA Astrophysics Data System (ADS)
Suyehiro, K.; Sugioka, H.; Watanabe, T.
2008-12-01
The hydroacoustic monitoring component of the International Monitoring System for CTBT (Comprehensive Nuclear-Test-Ban Treaty) verification utilizes six hydrophone stations and five seismic stations (called T-phase stations) for worldwide detection. Conspicuous signals of natural origin include those from earthquakes, volcanic eruptions, and whale calls. Among artificial sources are non-nuclear explosions and airgun shots. It is important for the IMS system to detect and locate hydroacoustic events with sufficient accuracy and to correctly characterize the signals and identify the source. As there are a number of seafloor cable networks operated offshore the Japanese islands, mostly facing the Pacific Ocean, for monitoring regional seismicity, the data from these stations (pressure and seismic sensors) may be utilized to increase the capability of the IMS. We use these data to compare selected event parameters with those determined by the IMS. In particular, there have been several unconventional acoustic signals in the western Pacific, which were also captured by IMS hydrophones across the Pacific in the period from 2007 to the present. These anomalous examples, together with dynamite shots used for seismic crustal structure studies and other natural sources, will be presented in order to help improve the IMS verification capabilities for detection, location and characterization of anomalous signals.
Data mining on long-term barometric data within the ARISE2 project
NASA Astrophysics Data System (ADS)
Hupe, Patrick; Ceranna, Lars; Pilger, Christoph
2016-04-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) led to the implementation of an international infrasound array network. The International Monitoring System (IMS) network includes 48 certified stations, each providing data for up to 15 years. As part of work package 3 of the ARISE2 project (Atmospheric dynamics Research InfraStructure in Europe, phase 2), the data sets will be statistically evaluated with regard to atmospheric dynamics. The current study focusses on fluctuations of absolute air pressure. Time series have been analysed for 17 monitoring stations located all over the world between Greenland and Antarctica, spanning the latitudes so as to represent different climate zones and characteristic atmospheric conditions; this enables quantitative comparisons between those regions. Analyses shown include wavelet power spectra, multi-annual time series of average variances on long-wave scales, and spectral densities used to derive characteristics and special events. The evaluations reveal periodicities in average variances on the 2- to 20-day scale, with a maximum in the winter months and a minimum in summer of the respective hemisphere. This applies chiefly to time series of IMS stations beyond the tropics, where the dominance of cyclones and anticyclones changes with the seasons. Furthermore, spectral density analyses reveal striking signals from several dynamic activities within one day, e.g., the semidiurnal tide.
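A small sketch of one such spectral-density analysis, assuming an evenly sampled barometric series at 48 samples per day (30-minute data, an assumption); the semidiurnal tide should then appear as a peak near 2 cycles per day:

import numpy as np
from scipy.signal import welch

fs = 48.0                          # samples per day (30-minute data)
t = np.arange(0, 365 * fs) / fs    # one year, time in days

# Synthetic pressure: semidiurnal tide (2 cycles/day) plus white noise.
rng = np.random.default_rng(0)
pressure = 0.5 * np.sin(2 * np.pi * 2.0 * t) + rng.standard_normal(t.size)

freq, psd = welch(pressure, fs=fs, nperseg=int(30 * fs))  # 30-day segments
print("Peak near (cycles/day):", freq[np.argmax(psd)])    # ~2.0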
Natural ³⁷Ar concentrations in soil air: implications for monitoring underground nuclear explosions.
Riedmann, Robin A; Purtschert, Roland
2011-10-15
For on-site inspections (OSI) under the Comprehensive Nuclear-Test-Ban Treaty (CTBT), measurement of the noble gas ³⁷Ar is considered an important technique. ³⁷Ar is produced underground by neutron activation of calcium through the reaction ⁴⁰Ca(n,α)³⁷Ar. The naturally occurring equilibrium ³⁷Ar concentration in soil air is set by the balance between an exponentially decreasing production rate from cosmic-ray neutrons with increasing soil depth, diffusive transport in the soil air, and radioactive decay (T(1/2): 35 days). In this paper, measurements of natural ³⁷Ar activities in soil air are presented for the first time. The highest activities of ~100 mBq m⁻³ air are 2 orders of magnitude larger than in the atmosphere and are found at 1.5-2.5 m depth. At depths > 8 m, ³⁷Ar activities are < 20 mBq m⁻³ air. After identifying the main ³⁷Ar production and gas transport factors, the expected global range of ³⁷Ar activities in shallow subsoil (0.7 m below the surface) was estimated. In high-altitude soils with large amounts of calcium and low gas permeability, ³⁷Ar activities may reach values up to 1 Bq m⁻³.
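The stated balance can be written, under simplifying assumptions of steady state and one-dimensional diffusion (our notation, not the paper's), as

D \frac{d^2 C}{dz^2} + P_0 \, e^{-z/\Lambda} - \lambda C = 0,

where C(z) is the ³⁷Ar concentration at depth z, D the effective diffusion coefficient in soil air, P_0 the surface production rate from cosmic-ray neutrons, \Lambda the neutron attenuation length, and \lambda = \ln 2 / T(1/2) the decay constant. The competition between the e^{-z/\Lambda} production term and diffusive loss to the atmosphere at z = 0 is what produces a concentration maximum at intermediate depth, consistent with the observed peak near 1.5-2.5 m.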
Detection and interpretation of seismoacoustic events at German infrasound stations
NASA Astrophysics Data System (ADS)
Pilger, Christoph; Koch, Karl; Ceranna, Lars
2016-04-01
Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.
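As a concrete illustration of one benefit of combined analysis (our simplification, not a method claimed in the abstract): for a near-surface source recorded at a collocated seismometer and microbarometer, the seismic phase travels at roughly v_s while the infrasound arrives at acoustic celerity c_a, so the arrival-time difference \Delta t constrains the source distance as

d = \frac{\Delta t}{1/c_a - 1/v_s}.

With c_a ≈ 0.33 km/s and v_s ≈ 6 km/s, a delay of \Delta t = 100 s corresponds to d ≈ 35 km.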
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P.E.; Glenn, L.A.
This report presents a preliminary summary of the data recorded at three regional seismic stations from surface blasting at the Black Thunder Coal Mine in northeast Wyoming. The regional stations are part of a larger effort that includes many more seismic stations in the immediate vicinity of the mine. The overall purpose of this effort is to characterize the source function and propagation characteristics of large, typical surface mine blasts. A detailed study of the source and propagation features of conventional surface blasts is a prerequisite to attempts at discriminating this type of blasting activity from other sources of seismic events. The Black Thunder seismic experiment is a joint verification effort to determine the seismic source and path effects that result from very large, but routine, ripple-fired surface mining blasts. Studies of the collected data will aim to understand how the near-field and regional seismic waveforms from these surface mining blasts are similar to, and different from, those of point-shot explosions and explosions at greater depth. The Black Hills Station is a Designated Seismic Station that was constructed for temporary occupancy by Former Soviet Union seismic verification scientists in accordance with the Threshold Test Ban Treaty protocol.
NASA Astrophysics Data System (ADS)
Labak, P.; Rowlands, A.; Malich, G.; Charlton, A.; Schultz-Fellenz, E. S.; Craven, J.
2016-12-01
The availability of data and the ability to effectively interpret those data in the context of an alleged Treaty violation are critical to operations during the launch phase of an inspection. The launch phase encompasses the time when the initial inspection plan is being developed and finalised; this document will set the scene for the inspection and will propose mission activities for the critical first three days of an inspection. While authenticated data products from the CTBT International Data Centre form the basis of the initial inspection plan, other data types, provided as national technical means, can also be used to inform its development. In this context, remotely sensed data and derived products acquired from sensors on satellites feature prominently. Given the environmental setting, optical and/or radar sensors have the potential to provide valuable information to guide mission activities. Such data could provide more than mere backdrops to mapping products. While recognising time constraints and the difficulties associated with integrating data from disparate optical and radar sensors, this abstract uses case studies to illustrate the types of derived data products from spaceborne sensors that have the potential to inform inspectors during the preparation of the initial inspection plan.
Improvements of low-level radioxenon detection sensitivity by a state-of-the art coincidence setup.
Cagniant, A; Le Petit, G; Gross, P; Douysset, G; Richard-Bressand, H; Fontaine, J-P
2014-05-01
The ability to quantify isotopic ratios of the radioxenon isotopes (135)Xe, (133m)Xe, (133)Xe and (131m)Xe is essential for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). In order to improve detection limits, CEA has developed a new on-site setup using photon/electron coincidence (Le Petit et al., 2013. J. Radioanal. Nucl. Chem., DOI: 10.1007/s10697-013-2525-8). Alternatively, the electron detection cell equipped with large silicon chips (PIPS) can be used with an HPGe detector for laboratory analysis purposes. This setup allows the measurement of β/γ coincidences for the detection of (133)Xe and (135)Xe, and of K-shell conversion electron (K-CE)/X-ray coincidences for the detection of (131m)Xe, (133m)Xe and (133)Xe as well. A good energy resolution of 11 keV at 130 keV and a low energy threshold of 29 keV for the electron detection were obtained. This provides direct discrimination between the K-CE from (133)Xe, (133m)Xe and (131m)Xe. The estimated Minimum Detectable Activity (MDA) for (131m)Xe is on the order of 1 mBq over a 4-day measurement. An analysis of an environmental radioxenon sample using this method is shown. © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
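For context, detection limits of this kind are commonly estimated with a Currie-type formula; one standard form (our addition, not necessarily the exact expression the authors use for the coincidence setup) is

\mathrm{MDA} = \frac{2.71 + 4.65\sqrt{B}}{\varepsilon \; p \; t},

where B is the background count in the region of interest, \varepsilon the (coincidence) detection efficiency, p the emission probability of the measured signature, and t the counting time. The formula makes the trade-off in the abstract explicit: lowering the background B (coincidence gating) and lengthening t (the 4-day measurement) both drive the MDA down.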
NASA Astrophysics Data System (ADS)
Hupe, Patrick; Ceranna, Lars; Pilger, Christoph; Le Pichon, Alexis
2017-04-01
The infrasound network of the International Monitoring System (IMS) has been established for monitoring the atmosphere to detect violations of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The IMS comprises 49 certified infrasound stations which are globally distributed. Each station provides data for up to 16 years. Owing to the near-uniform distribution of the stations, the IMS infrasound network can be used to derive global information on features of atmospheric dynamics. This study focuses on mountain-associated waves (MAWs), i.e. acoustic waves in the frequency range between approximately 0.01 Hz and 0.05 Hz. MAWs can be detected in infrasound data by applying the Progressive Multi-Channel Correlation (PMCC) algorithm. Through triangulation, global hotspots of MAWs can be identified. Previous studies indicate that global hotspots of gravity waves are similar to those found for MAWs using the PMCC algorithm. The objective of our study is an enhanced understanding of the excitation sources and of possible interactions between MAWs and gravity waves. Therefore, spatial and temporal correlation analyses will be performed. As a preceding step, we will present (seasonal) hotspots of MAWs as well as hotspots of gravity waves derived from the IMS infrasound network.
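A simplified cross-bearing triangulation sketch on a local flat-Earth plane, with two arrays and their detected back-azimuths; real IMS processing works on the sphere and typically combines more than two arrays:

import math

def triangulate(p1, az1_deg, p2, az2_deg):
    """Intersect two bearings (degrees clockwise from north).

    p1, p2: (x_east_km, y_north_km) array positions.
    Returns the (x, y) source estimate, or None if the bearings are parallel.
    """
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]   # solve p1 + t*d1 = p2 + s*d2
    if abs(det) < 1e-12:
        return None
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two arrays 200 km apart both detect the same mountain-associated waves.
print(triangulate((0, 0), 45.0, (200, 0), 315.0))  # ~(100, 100)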
Influence of atmospheric transport patterns on xenon detections at the CTBTO radionuclide network
NASA Astrophysics Data System (ADS)
Krysta, Monika; Kusmierczyk-Michulec, Jolanta
2016-04-01
In order to fulfil its task of monitoring for signals emanating from nuclear explosions, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) operates the global International Monitoring System (IMS), comprising seismic, infrasound, hydroacoustic and radionuclide measurement networks. At present, 24 of the 80 radionuclide stations foreseen by the Comprehensive Nuclear-Test-Ban Treaty (CTBT) are equipped with certified noble gas measurement systems. Over the past couple of years these systems have collected a rich set of measurements of radioactive isotopes of xenon. Atmospheric transport modelling simulations are crucial to assessing the origin of xenon detected at the IMS stations. Numerous studies undertaken in the past enabled linking these detections to non-Treaty-relevant activities and identifying the main contributors. The presence and quantity of xenon isotopes at the stations is hence the result of an interplay between emission patterns and atmospheric circulation. In this presentation we analyse the presence or absence of radioactive xenon at selected stations from the angle of such an interplay. We attempt to classify the stations according to the similarity of their detection patterns, examine seasonality in those patterns, and link them to large-scale or local meteorological phenomena. The studies are undertaken using crude hypotheses on emission patterns from known sources and atmospheric transport modelling simulations prepared with the FLEXPART model.
An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process
NASA Astrophysics Data System (ADS)
Nguyen, ThanhDat; Kifor, Claudiu Vasile
2015-09-01
DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes on the basis of knowledge. However, DMAIC knowledge is difficult to access. Conventional approaches face a problem in structuring and reusing DMAIC knowledge, mainly because DMAIC knowledge is not represented and organized systematically. In this article, we overcome the problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each of the DMAIC phases. We build five different knowledge bases for storing all knowledge of the DMAIC phases, with the support of necessary tools and appropriate techniques from the Information Technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution, in order to share and reuse existing knowledge.
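A minimal sketch of the one-knowledge-base-per-phase idea, assuming the rdflib Python library; the namespace, class and property names are invented for illustration:

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/dmaic#")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# One lesson captured during the Analyze phase, stored as triples.
g.add((EX.lesson42, RDF.type, EX.AnalyzeKnowledge))
g.add((EX.lesson42, EX.rootCause, Literal("worn fixture locating pin")))
g.add((EX.lesson42, EX.metric, Literal("bore concentricity")))

# Later retrieval: everything learned in the Analyze phase.
for lesson in g.subjects(RDF.type, EX.AnalyzeKnowledge):
    print(lesson, list(g.predicate_objects(lesson)))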
An Object-Oriented Software Architecture for the Explorer-2 Knowledge Management Environment
Tarabar, David B.; Greenes, Robert A.; Slosser, Eric T.
1989-01-01
Explorer-2 is a workstation based environment to facilitate knowledge management. It provides consistent access to a broad range of knowledge on the basis of purpose, not type. We have developed a software architecture based on Object-Oriented programming for Explorer-2. We have defined three classes of program objects: Knowledge ViewFrames, Knowledge Resources, and Knowledge Bases. This results in knowledge management at three levels: the screen level, the disk level and the meta-knowledge level. We have applied this design to several knowledge bases, and believe that there is a broad applicability of this design.
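A skeletal rendering of the three-level design, with hypothetical method names; the abstract names the three classes but does not specify their interfaces:

class KnowledgeBase:
    """Meta-knowledge level: what a knowledge source contains."""
    def __init__(self, name, topics):
        self.name, self.topics = name, topics

class KnowledgeResource:
    """Disk level: how a knowledge base is stored and fetched."""
    def __init__(self, knowledge_base, path):
        self.knowledge_base, self.path = knowledge_base, path

    def load(self, topic):
        return f"<content of {topic} from {self.path}>"

class KnowledgeViewFrame:
    """Screen level: how loaded knowledge is presented."""
    def __init__(self, resource):
        self.resource = resource

    def display(self, topic):
        print(self.resource.load(topic))

kb = KnowledgeBase("cardiology", ["angina", "arrhythmia"])
KnowledgeViewFrame(KnowledgeResource(kb, "/data/cardio.kb")).display("angina")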
Systems, methods and apparatus for verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.
Foundation: Transforming data bases into knowledge bases
NASA Technical Reports Server (NTRS)
Purves, R. B.; Carnes, James R.; Cutts, Dannie E.
1987-01-01
One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.
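A toy sketch of the database-to-knowledge-base direction, turning relational rows into frame-like objects (here plain dicts keyed by slot name); Foundation's actual representation is not specified in the abstract:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (id TEXT, type TEXT, weight_kg REAL)")
conn.execute("INSERT INTO parts VALUES ('P1', 'bracket', 0.4)")

def rows_to_frames(conn, table):
    """Turn each relational row into a frame: slot -> value."""
    cur = conn.execute(f"SELECT * FROM {table}")
    slots = [col[0] for col in cur.description]
    return [dict(zip(slots, row)) for row in cur.fetchall()]

print(rows_to_frames(conn, "parts"))
# [{'id': 'P1', 'type': 'bracket', 'weight_kg': 0.4}]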
A knowledge-base generating hierarchical fuzzy-neural controller.
Kandadai, R M; Tien, J M
1997-01-01
We present an innovative fuzzy-neural architecture that is able to automatically generate a knowledge base, in an extractable form, for use in hierarchical knowledge-based controllers. The knowledge base is in the form of a linguistic rule base appropriate for a fuzzy inference system. First, we modify Berenji and Khedkar's (1992) GARIC architecture to enable it to automatically generate a knowledge base; a pseudosupervised learning scheme using reinforcement learning and error backpropagation is employed. Next, we further extend this architecture to a hierarchical controller that is able to generate its own knowledge base. Example applications are provided to underscore its viability.
Developing a geoscience knowledge framework for a national geological survey organisation
NASA Astrophysics Data System (ADS)
Howard, Andrew S.; Hatton, Bill; Reitsma, Femke; Lawrie, Ken I. G.
2009-04-01
Geological survey organisations (GSOs) are established by most nations to provide a geoscience knowledge base for effective decision-making on mitigating the impacts of natural hazards and global change, and on sustainable management of natural resources. The value of the knowledge base as a national asset is continually enhanced by the exchange of knowledge between GSOs as data and information providers and the stakeholder community as knowledge 'users and exploiters'. Geological maps and associated narrative texts typically form the core of national geoscience knowledge bases, but have some inherent limitations as methods of capturing and articulating knowledge. Much knowledge about the three-dimensional (3D) spatial interpretation and its derivation and uncertainty, and the wider contextual value of the knowledge, remains intangible in the minds of the mapping geologist in implicit and tacit form. To realise the value of these knowledge assets, the British Geological Survey (BGS) has established a workflow-based cyber-infrastructure to enhance its knowledge management and exchange capability. Future geoscience surveys in the BGS will contribute to a national, 3D digital knowledge base on UK geology, with the associated implicit and tacit information captured as metadata, qualitative assessments of uncertainty, and documented workflows and best practice. Knowledge-based decision-making at all levels of society requires both the accessibility and reliability of knowledge to be enhanced in the grid-based world. Establishment of collaborative cyber-infrastructures and ontologies for geoscience knowledge management and exchange will ensure that GSOs, as knowledge-based organisations, can make their contribution to this wider goal.
A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public
Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin
2016-01-01
The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314
A Knowledge-Base for a Personalized Infectious Disease Risk Prediction System.
Vinarti, Retno; Hederman, Lucy
2018-01-01
We present a knowledge-base to represent collated infectious disease risk (IDR) knowledge. The knowledge concerns personal and contextual risk of contracting an infectious disease, obtained from declarative sources (e.g. the Atlas of Human Infectious Diseases). Automated prediction requires encoding this knowledge in a form that can produce risk probabilities (e.g. a Bayesian Network - BN). The knowledge-base presented in this paper feeds an algorithm that can auto-generate the BN. Knowledge from 234 infectious diseases was compiled. From this compilation, we designed an ontology and five rule types for modelling IDR knowledge in general. The evaluation aims to assess whether the knowledge-base structure, and its application to three disease-country contexts, meets the needs of a personalized IDR prediction system. The evaluation results show that the knowledge-base conforms to the system's purpose: personalization of infectious disease risk.
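A compressed illustration of turning matched risk rules into a probability using a noisy-OR combination, a common choice when auto-generating Bayesian networks (whether this particular system uses noisy-OR is an assumption):

# Each matched risk rule contributes an independent chance of infection.
def noisy_or(matched_rule_probs):
    """P(disease) when any matched risk factor can cause it independently."""
    p_none = 1.0
    for p in matched_rule_probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Hypothetical rules matched for one person in one country context:
# travel to an endemic area (0.10) and rainy season (0.05).
print(round(noisy_or([0.10, 0.05]), 4))  # 0.145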
NASA Astrophysics Data System (ADS)
Salloum, Sara
2017-06-01
This conceptual paper aims to characterize science teachers' practical knowledge utilizing a virtue-based theory of knowledge and the Aristotelian notion of phronesis/practical wisdom. The article argues that a greater understanding of the concept of phronesis and its relevance to science education would enrich our understandings of teacher knowledge, its development, and consequently models of teacher education. Views of teacher knowledge presented in this paper are informed by philosophical literature that questions normative views of knowledge and argues for a virtue-based epistemology rather than a belief-based one. The paper first outlines general features of phronesis/practical wisdom. Later, a virtue-based view of knowledge is described. A virtue-based view binds knowledge with moral concepts and suggests that knowledge development is motivated by intellectual virtues such as intellectual sobriety, perseverance, fairness, and humility. A virtue-based theory of knowledge gives prominence to the virtue of phronesis/practical wisdom, whose primary function is to mediate among virtues and theoretical knowledge into a line of action that serves human goods. The role of phronesis and its relevance to teaching science are explained accordingly. I also discuss differences among various characterizations of practical knowledge in science education and a virtue-based characterization. Finally, implications and further questions for teacher education are presented.
Design of Composite Structures Using Knowledge-Based and Case Based Reasoning
NASA Technical Reports Server (NTRS)
Lambright, Jonathan Paul
1996-01-01
A method of using knowledge-based and case-based reasoning to assist designers during conceptual design tasks for composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process for composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that including downstream life-cycle issues in the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A knowledge-based reasoning system was developed using the CLIPS (C Language Integrated Production System) environment, and a case-based reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge-based and case-based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well-defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs, along with warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem-solving capability beyond the existence of limited well-defined rules. The findings indicated that the technique is most effective when used as a design aid and not as a tool to totally automate the composites design process. Other areas of application and implications for future research are discussed.
NASA Technical Reports Server (NTRS)
Genuardi, Michael T.
1993-01-01
One strategy for machine-aided indexing (MAI) is to provide a concept-level analysis of the textual elements of documents or document abstracts. In such systems, natural-language phrases are analyzed in order to identify and classify concepts related to a particular subject domain. The overall performance of these MAI systems is largely dependent on the quality and comprehensiveness of their knowledge bases. These knowledge bases function to (1) define the relations between a controlled indexing vocabulary and natural language expressions; (2) provide a simple mechanism for disambiguation and the determination of relevancy; and (3) allow the extension of concept-hierarchical structure to all elements of the knowledge file. After a brief description of the NASA Machine-Aided Indexing system, concerns related to the development and maintenance of MAI knowledge bases are discussed. Particular emphasis is given to statistically-based text analysis tools designed to aid the knowledge base developer. One such tool, the Knowledge Base Building (KBB) program, presents the domain expert with a well-filtered list of synonyms and conceptually-related phrases for each thesaurus concept. Another tool, the Knowledge Base Maintenance (KBM) program, functions to identify areas of the knowledge base affected by changes in the conceptual domain (for example, the addition of a new thesaurus term). An alternate use of the KBM as an aid in thesaurus construction is also discussed.
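One statistically based filter of the KBB kind might rank candidate phrases by how strongly they co-occur with a thesaurus concept. A minimal sketch using pointwise mutual information over a document collection; this is our illustration, not NASA's actual tool:

import math

def pmi(phrase, concept, docs):
    """Pointwise mutual information between phrase and concept mentions."""
    n = len(docs)
    p = sum(phrase in d for d in docs) / n
    c = sum(concept in d for d in docs) / n
    both = sum(phrase in d and concept in d for d in docs) / n
    if 0 in (p, c, both):
        return float("-inf")
    return math.log(both / (p * c))

docs = ["ablative shield thermal protection reentry",
        "thermal protection ablative shield test",
        "budget meeting notes", "launch schedule"]
print(pmi("ablative shield", "thermal protection", docs))  # ~0.69 (= ln 2)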
Ontologies, Knowledge Bases and Knowledge Management
2002-07-01
AFRL-IF-RS-TR-2002-163, Final Technical Report, July 2002. USC Information ...and layer additional information necessary to make specific uses of the knowledge in this core. Finally, while we were able to find adequate solutions... knowledge base and inference engine. Although the SDA has access to information about the situation, we wanted the user...
New knowledge-based genetic algorithm for excavator boom structural optimization
NASA Astrophysics Data System (ADS)
Hua, Haiyan; Lin, Shuwen
2014-03-01
Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to solve the excavator boom structural optimization problem effectively. To improve optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with the genetic algorithm is established to extract, handle and utilize shallow and deep implicit constraint knowledge to cyclically guide the optimal search. Based on this dual evolution mechanism, knowledge evolution and population evolution can be connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. New knowledge-based selection, crossover and mutation operators are then proposed to integrate optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight kinds of testing algorithms, which include different genetic operators, are taken as examples to solve the structural optimization of a medium-sized excavator boom. Comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and search ability far more than the other testing algorithms, which demonstrates the effectiveness of knowledge for guiding the optimal search. The proposed knowledge-based genetic algorithm, combining multi-level knowledge evolution with numerical optimization, provides a new effective method for solving complex engineering optimization problems.
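A toy sketch of one way knowledge can steer a genetic operator, letting a domain rule repair infeasible offspring after mutation; this is a generic illustration, not the paper's specific operators, and the bounds and rule are invented:

import random

BOUNDS = [(2.0, 6.0), (0.01, 0.05)]  # e.g. boom length (m), plate thickness (m)

def knowledge_biased_mutation(genes, rule, rate=0.3, rng=random.Random(1)):
    """Mutate genes within bounds, then let a domain rule repair the result."""
    child = list(genes)
    for i, (lo, hi) in enumerate(BOUNDS):
        if rng.random() < rate:
            child[i] = rng.uniform(lo, hi)
    return rule(child)

# Hypothetical constraint knowledge: long booms need a thicker plate.
def boom_rule(genes):
    length, thickness = genes
    if length > 5.0 and thickness < 0.02:
        thickness = 0.02
    return [length, thickness]

print(knowledge_biased_mutation([5.5, 0.015], boom_rule))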
Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie
2009-08-01
We present a novel knowledge-based system to automatically convert real-life engineering drawings into content-oriented high-level descriptions. The proposed method essentially divides the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus-Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
Placement Mentors Making Sense of Research-Based Knowledge
ERIC Educational Resources Information Center
Raaen, Finn Daniel
2017-01-01
Placement mentors' role increasingly implies demonstrating to student teachers how research-based knowledge in combination with experience-based knowledge may be relevant in teachers' professional work. This is a challenge. Placement mentors are often unsure how to make sense of research-based knowledge. Frequently there is a mismatch between what…
The Knowledge-Based Economy and E-Learning: Critical Considerations for Workplace Democracy
ERIC Educational Resources Information Center
Remtulla, Karim A.
2007-01-01
The ideological shift by nation-states to "a knowledge-based economy" (also referred to as "knowledge-based society") is causing changes in the workplace. Brought about by the forces of globalisation and technological innovation, the ideologies of the "knowledge-based economy" are not limited to influencing the…
Teacher Education: Considerations for a Knowledge Base Framework.
ERIC Educational Resources Information Center
Tumposky, Nancy
Traditionally, the knowledge base has been defined more as product than process and has encompassed definitions, principles, values, and facts. Recent reforms in teaching and teacher education have brought about efforts to redefine the knowledge base. The reconceptualized knowledge base builds upon the earlier model but gives higher priority to…
Case-based reasoning: The marriage of knowledge base and data base
NASA Technical Reports Server (NTRS)
Pulaski, Kirt; Casadaban, Cyprian
1988-01-01
The coupling of data and knowledge has a synergistic effect when building an intelligent data base. The goal is to integrate the data and knowledge almost to the point of indistinguishability, permitting them to be used interchangeably. Examples given in this paper suggest that Case-Based Reasoning is a more integrated way to link data and knowledge than pure rule-based reasoning.
Collective intelligence in medical diagnosis systems: A case study.
Hernández-Chan, Gandhi S; Ceh-Varela, Edgar Eduardo; Sanchez-Cervantes, Jose L; Villanueva-Escalante, Marisol; Rodríguez-González, Alejandro; Pérez-Gallardo, Yuliana
2016-07-01
Diagnosing a patient's condition is one of the most important and challenging tasks in medicine. We present a study of the application of collective intelligence to medical diagnosis by applying consensus methods. We compared the accuracy obtained with this method against the diagnostic accuracy reached through the knowledge of a single expert. We used the ontological structures of ten diseases. Two knowledge bases were created by placing five diseases into each. We conducted two experiments, one with an empty knowledge base and the other with a populated knowledge base. For both experiments, five experts added and/or eliminated signs/symptoms and diagnostic tests for each disease. After this process, the individual knowledge bases were built based on the output of the consensus methods. For the evaluation, we compared the number of items for each disease in the agreed knowledge bases against the number of items in the GS (Gold Standard). We found that the higher the number of items in a knowledge base, the lower the consensus level. In all cases, even at the lowest level of agreement (20%), the number of agreed signs exceeded the number of signs in the GS. In addition, when all experts agreed, the number of items decreased. Collective intelligence can thus be used to increase the consensus of physicians: by using consensus, physicians can gather more information and knowledge than they would obtain from knowledge bases populated solely from the literature, while keeping up to date and collaborating dynamically. Copyright © 2016 Elsevier Ltd. All rights reserved.
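A minimal consensus sketch of the kind described: keep a sign/symptom in the agreed knowledge base once the fraction of experts proposing it reaches a threshold. The disease content and the 60% threshold are invented for the example:

from collections import Counter

def consensus_kb(expert_proposals, threshold=0.6):
    """Keep items proposed by at least `threshold` of the experts."""
    n = len(expert_proposals)
    votes = Counter(item for proposal in expert_proposals for item in proposal)
    return {item for item, count in votes.items() if count / n >= threshold}

# Five experts propose signs/symptoms for one disease.
experts = [{"fever", "cough"}, {"fever", "cough", "myalgia"},
           {"fever"}, {"fever", "cough"}, {"cough", "anosmia"}]
print(consensus_kb(experts))  # {'fever', 'cough'} at 60% agreement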
NASA Astrophysics Data System (ADS)
Xuan, Albert L.; Shinghal, Rajjan
1989-03-01
As the need for knowledge-based systems increases, an increasing number of domain experts are becoming interested in taking a more active part in the building of knowledge-based systems. However, such a domain expert often must deal with a large number of unfamiliar terms, concepts, facts, procedures and principles based on different approaches and schools of thought. He (for brevity, we shall use masculine pronouns for both genders) may need the help of a knowledge engineer (KE) in building the knowledge-based system but may encounter a number of problems. For instance, much of the early interaction between him and the knowledge engineer may be spent in educating each other about their separate kinds of expertise. Since the knowledge engineer will usually be ignorant of the knowledge domain while the domain expert (DE) will have little knowledge about knowledge-based systems, a great deal of time will be wasted on these issues as the DE and the KE train each other to the point where a fruitful interaction can occur. In some situations, it may not even be possible for the DE to find a suitable KE to work with because he has no time to train the latter in his domain. This engenders the need for the DE to be more knowledgeable about knowledge-based systems and for KEs to find methods and techniques which allow them to learn new domains as fast as they can. In any event, it is likely that the process of building knowledge-based systems will be smoother and more efficient if the domain expert is knowledgeable about the methods and techniques of knowledge-based systems building.
McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F
2015-01-01
Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care over a six-month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair, then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate whether incorporating the knowledge into electronic health records improves patient outcomes.
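The link-frequency/link-ratio computation lends itself to a compact sketch. The definitions below are assumptions for illustration (link frequency as a co-occurrence count, link ratio as co-occurrence normalized by the medication's total occurrences); the visit data and threshold are hypothetical, not the study's.

```python
from collections import Counter
from itertools import product

# Hypothetical patient records: each visit lists active problems and medications.
visits = [
    {"problems": ["hypertension"], "medications": ["lisinopril", "aspirin"]},
    {"problems": ["hypertension", "diabetes"], "medications": ["lisinopril", "metformin"]},
    {"problems": ["diabetes"], "medications": ["metformin"]},
]

pair_counts = Counter()  # link frequency: co-occurrences of a problem-medication pair
med_counts = Counter()   # total occurrences of each medication

for visit in visits:
    for med in visit["medications"]:
        med_counts[med] += 1
    for prob, med in product(visit["problems"], visit["medications"]):
        pair_counts[(prob, med)] += 1

THRESHOLD = 0.6  # hypothetical cutoff; the study chose it via clinician review

knowledge_base = {
    pair: pair_counts[pair] / med_counts[pair[1]]  # link ratio (assumed definition)
    for pair in pair_counts
    if pair_counts[pair] / med_counts[pair[1]] >= THRESHOLD
}
print(knowledge_base)  # pairs with weak links (e.g. incidental co-occurrence) drop out
```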
Development of a Knowledge Base for Enduser Consultation of AAL-Systems.
Röll, Natalie; Stork, Wilhelm; Rosales, Bruno; Stephan, René; Knaup, Petra
2016-01-01
Manufacturer information, user experiences, and product availability of assistive living technologies are usually not known to citizens or consultation centers. The differing knowledge levels concerning the availability of technology show the need to build up a knowledge base. The aim of this contribution is to define requirements for the development of knowledge bases for AAL consultations. The major requirements, such as a maintainable and easy-to-use structure, were implemented in a web-based knowledge base, which went into productive use in ~3700 consulting interviews at municipal technology information centers. Within this field phase, the implementation of the requirements for a knowledge base in the field of AAL consulting was evaluated and further developed.
A Knowledge-Based System Developer for aerospace applications
NASA Technical Reports Server (NTRS)
Shi, George Z.; Wu, Kewei; Fensky, Connie S.; Lo, Ching F.
1993-01-01
A prototype Knowledge-Based System Developer (KBSD) has been developed for aerospace applications by utilizing artificial intelligence technology. The KBSD directly acquires knowledge from domain experts through a graphical interface and then builds expert systems from that knowledge. This raises the state of the art of knowledge acquisition/expert system technology to a new level by lessening the need for skilled knowledge engineers. The feasibility, applicability, and efficiency of the proposed concept were established, justifying continued development of the prototype into a full-scale, general-purpose knowledge-based system developer. The KBSD has great commercial potential. It will provide a marketable software shell which alleviates the need for knowledge engineers and increases productivity in the workplace. The KBSD will therefore make knowledge-based systems available to a large portion of industry.
Combining Open-domain and Biomedical Knowledge for Topic Recognition in Consumer Health Questions.
Mrabet, Yassine; Kilicoglu, Halil; Roberts, Kirk; Demner-Fushman, Dina
2016-01-01
Determining the main topics in consumer health questions is a crucial step in their processing as it allows narrowing the search space to a specific semantic context. In this paper we propose a topic recognition approach based on biomedical and open-domain knowledge bases. In the first step of our method, we recognize named entities in consumer health questions using an unsupervised method that relies on a biomedical knowledge base, UMLS, and an open-domain knowledge base, DBpedia. In the next step, we cast topic recognition as a binary classification problem of deciding whether a named entity is the question topic or not. We evaluated our approach on a dataset from the National Library of Medicine (NLM), introduced in this paper, and another from the Genetic and Rare Disease Information Center (GARD). The combination of knowledge bases outperformed the results obtained by individual knowledge bases by up to 16.5% F1 and achieved state-of-the-art performance. Our results demonstrate that combining open-domain knowledge bases with biomedical knowledge bases can lead to a substantial improvement in understanding user-generated health content.
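The second step, casting topic recognition as binary classification over recognized entities, can be sketched as follows; the feature set, training data, and classifier here are illustrative assumptions, not the authors' actual design.

```python
from sklearn.linear_model import LogisticRegression

# Each candidate entity is described by illustrative features (all assumed):
# [found_in_UMLS, found_in_DBpedia, position_in_question (0..1), is_disorder_type]
X_train = [
    [1, 0, 0.1, 1],   # early UMLS disorder mention -> often the topic
    [0, 1, 0.8, 0],   # late open-domain entity -> usually not the topic
    [1, 1, 0.2, 1],
    [0, 1, 0.9, 0],
]
y_train = [1, 0, 1, 0]  # 1 = entity is the question topic

clf = LogisticRegression().fit(X_train, y_train)

candidate = [1, 1, 0.15, 1]  # a new entity found in both knowledge bases
print(clf.predict([candidate]))  # -> [1], classified as the question topic
```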
Knowledge Representation and Ontologies
NASA Astrophysics Data System (ADS)
Grimm, Stephan
Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.
Adaptive Knowledge Management of Project-Based Learning
ERIC Educational Resources Information Center
Tilchin, Oleg; Kittany, Mohamed
2016-01-01
The goal of an approach to Adaptive Knowledge Management (AKM) of project-based learning (PBL) is to intensify subject study through guiding, inducing, and facilitating the development of knowledge, accountability skills, and collaborative skills of students. Knowledge development is attained by knowledge acquisition, knowledge sharing, and knowledge…
Semantics driven approach for knowledge acquisition from EMRs.
Perera, Sujan; Henson, Cory; Thirunarayan, Krishnaprasad; Sheth, Amit; Nair, Suhas
2014-03-01
Semantic computing technologies have matured to be applicable to many critical domains such as national security, life sciences, and health care. However, the key to their success is the availability of a rich domain knowledge base. The creation and refinement of domain knowledge bases pose difficult challenges. The existing knowledge bases in the health care domain are rich in taxonomic relationships, but they lack nontaxonomic (domain) relationships. In this paper, we describe a semiautomatic technique for enriching existing domain knowledge bases with causal relationships gleaned from Electronic Medical Records (EMR) data. We determine missing causal relationships between domain concepts by validating domain knowledge against EMR data sources and leveraging semantic-based techniques to derive plausible relationships that can rectify knowledge gaps. Our evaluation demonstrates that semantic techniques can be employed to improve the efficiency of knowledge acquisition.
Applying knowledge compilation techniques to model-based reasoning
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.
Drug knowledge bases and their applications in biomedical informatics research.
Zhu, Yongjun; Elemento, Olivier; Pathak, Jyotishman; Wang, Fei
2018-01-03
Recent advances in biomedical research have generated a large volume of drug-related data. To effectively handle this flood of data, many initiatives have been taken to help researchers make good use of them. As a result of these initiatives, many drug knowledge bases have been constructed. They range from simple ones with specific focuses to comprehensive ones that contain information on almost every aspect of a drug. These curated drug knowledge bases have made significant contributions to the development of efficient and effective health information technologies for better health-care service delivery. Understanding and comparing existing drug knowledge bases and how they are applied in various biomedical studies will help us recognize the state of the art and design better knowledge bases in the future. In addition, researchers can get insights on novel applications of the drug knowledge bases through a review of successful use cases. In this study, we provide a review of existing popular drug knowledge bases and their applications in drug-related studies. We discuss challenges in constructing and using drug knowledge bases as well as future research directions toward a better ecosystem of drug knowledge bases.
ERIC Educational Resources Information Center
Islam, Chhanda
A study was conducted to determine and compare the literacy beliefs, knowledge bases, and practices of early childhood educators who espouse emergent literacy and reading readiness philosophies; to explore the relationship among beliefs, knowledge bases, and practices; and to examine the degree to which beliefs, knowledge bases, and practices were…
Knowledge Acquisition Using Linguistic-Based Knowledge Analysis
Daniel L. Schmoldt
1998-01-01
Most knowledge-based system development efforts include acquiring knowledge from one or more sources. The difficulties associated with this knowledge acquisition task are readily acknowledged by most researchers. While a variety of knowledge acquisition methods have been reported, little has been done to organize those different methods and to suggest how to apply them...
The Meta-Ontology Model of the Fish Disease Diagnostic Knowledge Based on OWL
NASA Astrophysics Data System (ADS)
Shi, Yongchang; Gao, Wen; Hu, Liang; Fu, Zetian
To improve the availability and reusability of knowledge in the fish disease diagnosis (FDD) domain and to facilitate knowledge acquisition, an ontology model of FDD knowledge was developed in OWL according to the FDD knowledge model. It includes the terminology of FDD knowledge and the class hierarchies of its terms.
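As a rough illustration of what an OWL-based FDD ontology model looks like, here is a minimal sketch using Python's rdflib; the namespace, class names, and property are all hypothetical, not taken from the paper.

```python
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

FDD = Namespace("http://example.org/fdd#")  # hypothetical namespace
g = Graph()
g.bind("fdd", FDD)

# Class hierarchy: a specific disease is a subclass of FishDisease.
g.add((FDD.FishDisease, RDF.type, OWL.Class))
g.add((FDD.BacterialGillDisease, RDF.type, OWL.Class))
g.add((FDD.BacterialGillDisease, RDFS.subClassOf, FDD.FishDisease))

# A property linking diseases to observed symptoms (class-level for brevity).
g.add((FDD.hasSymptom, RDF.type, OWL.ObjectProperty))
g.add((FDD.SwollenGills, RDF.type, OWL.Class))
g.add((FDD.BacterialGillDisease, FDD.hasSymptom, FDD.SwollenGills))

print(g.serialize(format="turtle"))
```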
Jones, Loretta; Bazargan, Mohsen; Lucas-Wright, Anna; Vadgama, Jaydutt V; Vargas, Roberto; Smith, James; Otoukesh, Salman; Maxwell, Annette E
2013-01-01
Most theoretical formulations acknowledge that knowledge and awareness of cancer screening and prevention recommendations significantly influence health behaviors. This study compares perceived knowledge of cancer prevention and screening with test-based knowledge in a community sample. We also examine demographic variables and self-reported cancer screening and prevention behaviors as correlates of both knowledge scores, and consider whether cancer-related knowledge can be accurately assessed using just a few simple questions in a short and easy-to-complete survey. We used a community-partnered participatory research approach to develop our study aims and a survey. The study sample was composed of 180 predominantly African American and Hispanic community individuals who participated in a full-day cancer prevention and screening promotion conference in South Los Angeles, California, in July 2011. Participants completed a self-administered survey in English or Spanish at the beginning of the conference. Our data indicate that perceived and test-based knowledge scores are only moderately correlated. The perceived knowledge score shows a stronger association with demographic characteristics and other cancer-related variables than the test-based score. Thirteen of the twenty variables that were examined in our study showed a statistically significant correlation with the perceived knowledge score; however, only four variables demonstrated a statistically significant correlation with the test-based knowledge score. Perceived knowledge of cancer prevention and screening was assessed with fewer items than test-based knowledge. Thus, using this assessment could potentially reduce respondent burden. However, our data demonstrate that perceived and test-based knowledge are separate constructs.
The development of a classification schema for arts-based approaches to knowledge translation.
Archibald, Mandy M; Caine, Vera; Scott, Shannon D
2014-10-01
Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts.
From Regional Hazard Assessment to Nuclear-Test-Ban Treaty Support - InSAR Ground Motion Services
NASA Astrophysics Data System (ADS)
Lege, T.; Kalia, A.; Gruenberg, I.; Frei, M.
2016-12-01
There are numerous scientific applications of InSAR methods in tectonics, earthquake analysis, and other geologic and geophysical fields. Ground motion on local and regional scales, measured and monitored via InSAR techniques, provides scientists and engineers with many new insights and a further understanding of subsurface processes. However, the operational use of InSAR is not yet widespread. To foster the operational utilization of the Copernicus Sentinel satellites in the day-to-day business and planning of federal, state, and municipal bodies, BGR (Federal Institute for Geosciences and Natural Resources) initiated workshops with potential user groups. Through extensive reconciliation of interests and demands with scientific, technical, economic, and governmental stakeholders (e.g., ministries, mining authorities, geological surveys, geodetic surveys, and environmental agencies at the federal and state level, SMEs, the German Aerospace Center), BGR developed the concept of the InSAR-based German National Ground Motion Service. One important backbone of the nationwide ground motion service is the so-called Persistent Scatterer Interferometry Wide Area Product (WAP) approach, developed with grants from European research funds. The presentation shows the implementation of the ground motion service and examples of product developments for the operational supervision of mining, water resources management, and spatial planning. Furthermore, the contributions of Copernicus Sentinel-1 radar data in the context of the CTBT are discussed. The DInSAR processing of Sentinel-1 IW (Interferometric Wide Swath) SAR acquisitions from 1 and 13 January 2016 allowed, for the first time, a near-real-time ground motion measurement of the North Korean nuclear test site. The measured ground displacements show a strong spatio-temporal correlation with the epicenter calculated from teleseismic stations. We are convinced that in this way another space technique will soon contribute further to meeting societal information needs.
Issues Involving The OSI Concept of Operation For Noble Gas Radionuclide Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrigan, C R; Sun, Y
2011-01-21
The development of a technically sound protocol for detecting the subsurface release of noble gas radionuclides is critical to the successful operation of an on-site inspection (OSI) under the CTBT and has broad ramifications for all aspects of the OSI regime, including the setting of specifications for both sampling and analysis equipment used during an OSI. With NA-24 support, we are investigating a variety of issues and concerns that have significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work at LLNL focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments, a lesson we learned previously from the LLNL Non-Proliferation Experiment (NPE). Evaluation of a number of important noble gas detection issues, potentially affecting OSI policy, has awaited the US re-engagement with the OSI technical community. Thus, there have been numerous issues to address during the past 18 months. Most of our evaluations of a sampling or transport issue necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of LLNL computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied in the field, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated during the past year. We complete the discussion of these issues with a description of a preliminary design for subsurface sampling that is intended to be a practical solution to most if not all of the challenges addressed here.
A knowledge base of the chemical compounds of intermediary metabolism.
Karp, P D
1992-08-01
This paper describes a publicly available knowledge base of the chemical compounds involved in intermediary metabolism. We consider the motivations for constructing a knowledge base of metabolic compounds, the methodology by which it was constructed, and the information that it currently contains. Currently the knowledge base describes 981 compounds, listing for each: synonyms for its name, a systematic name, CAS registry number, chemical formula, molecular weight, chemical structure, and two-dimensional display coordinates for the structure. The Compound Knowledge Base (CompoundKB) illustrates several methodological principles that should guide the development of biological knowledge bases. I argue that biological datasets should be made available in multiple representations to increase their accessibility to end users, and I present multiple representations of the CompoundKB (knowledge base, relational database, and ASN.1 representations). I also analyze the general characteristics of these representations to provide an understanding of their relative advantages and disadvantages. Another principle is that the error rate of biological databases should be estimated and documented; this analysis is performed for the CompoundKB.
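A minimal sketch of one CompoundKB-style entry, plus a second representation of the same record, which is the kind of multi-representation access the paper argues for; the field names and Python schema are illustrative assumptions, not the paper's actual format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Compound:
    """One compound entry (field names are illustrative, not the KB's schema)."""
    name: str
    synonyms: list
    systematic_name: str
    cas_number: str
    formula: str
    molecular_weight: float
    structure_coords: list  # 2-D display coordinates for the structure

atp = Compound(
    name="ATP",
    synonyms=["adenosine triphosphate"],
    systematic_name="adenosine 5'-triphosphate",
    cas_number="56-65-5",
    formula="C10H16N5O13P3",
    molecular_weight=507.18,
    structure_coords=[(0.0, 0.0), (1.2, 0.5)],  # truncated for brevity
)

# The same record in a second, exchange-friendly representation.
print(json.dumps(asdict(atp), indent=2))
```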
Automated knowledge generation
NASA Technical Reports Server (NTRS)
Myler, Harley R.; Gonzalez, Avelino J.
1988-01-01
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).
Case-based medical informatics
Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R
2004-01-01
Background The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning) and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered health medicine. Discussion We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view on the notion of "meaning" and advocate the need for case-based reasoning research and natural language processing. In the context of memory based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition and solving ethical issues. Summary Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables a continuous individual knowledge processing and could be applied providing that challenges and ethical issues arising are addressed appropriately. PMID:15533257
Reusing Design Knowledge Based on Design Cases and Knowledge Map
ERIC Educational Resources Information Center
Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi
2013-01-01
Design knowledge is reused in innovative design work to support designers with product design knowledge and to help designers who lack rich experience improve their design capacity and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…
Radionuclide observables for the Platte underground nuclear explosive test on 14 April 1962.
Burnett, Jonathan L; Milbrath, Brian D
2016-11-01
Past nuclear weapon explosive tests provide invaluable information for understanding the radionuclide observables expected during an On-site Inspection (OSI) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). These radioactive signatures are complex and subject to spatial and temporal variability. The Platte underground nuclear explosive test on 14 April 1962 provides extensive environmental monitoring data that can be modelled and used to calculate the maximum time available for detection of the OSI-relevant radionuclides. The 1.6 kT test is especially useful as it released the highest amounts of recorded activity during Operation Nougat at the Nevada Test Site, now known as the Nevada National Security Site (NNSS). It has been estimated that 0.36% of the activity was released and dispersed in a northerly direction. The deposition ranged from 1 × 10⁻¹¹ to 1 × 10⁻⁹ of the atmospheric release (per m²), and has been used in this paper to evaluate an OSI and the OSI-relevant radionuclides at 1 week to 2 years post-detonation. Radioactive decay reduces the activity of the OSI-relevant radionuclides by 99.7% within 2 years of detonation, such that detection throughout the hypothesized inspection is only achievable close to the explosion where deposition was highest.
Calibration of an Ultra-Low-Background Proportional Counter for Measuring 37Ar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifert, Allen; Aalseth, Craig E.; Bonicalzi, Ricco
An ultra-low-background proportional counter (ULBPC) design has been developed at Pacific Northwest National Laboratory (PNNL) using clean materials, primarily electrochemically-purified copper. This detector, along with an ultra-low-background counting system (ULBCS), was developed to complement a new shallow underground laboratory (30 meters water-equivalent) constructed at PNNL. The ULBCS design includes passive neutron and gamma shielding, along with an active cosmic-veto system. This system provides a capability for making ultra-sensitive measurements to support applications like age-dating soil hydrocarbons with 14C/3H, age-dating of groundwater with 39Ar, and soil-gas assay for 37Ar to support On-Site Inspection (OSI). On-Site Inspection is a key component of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Measurements of radionuclides created by an underground nuclear explosion are valuable signatures of a Treaty violation. For OSI, the 35-day half-life of 37Ar, produced from neutron interactions with calcium in soil, provides both high specific activity and sufficient time for inspection before decay limits sensitivity. This work describes the calibration techniques and analysis methods developed to enable quantitative measurements of 37Ar samples over a broad range of pressures. These efforts, along with parallel work in progress on gas chemistry separation, are expected to provide a significant new capability for 37Ar soil gas background studies.
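The role of the 35-day half-life can be made concrete with the standard decay formula, remaining fraction = 0.5^(t / t_half); the time points below are arbitrary illustration, not inspection parameters from the abstract.

```python
# Fraction of 37Ar activity remaining after t days (half-life: 35 days).
HALF_LIFE_DAYS = 35.0

def remaining_fraction(t_days: float) -> float:
    return 0.5 ** (t_days / HALF_LIFE_DAYS)

for days in (7, 35, 70, 105):
    print(f"{days:3d} days: {remaining_fraction(days):.2f}")
# One half-life leaves 50% of the activity, three leave 12.5%: high enough
# specific activity to detect, yet a window that bounds inspection timing.
```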
Incorporating linguistic knowledge for learning distributed word representations.
Wang, Yan; Liu, Zhiyuan; Sun, Maosong
2015-01-01
Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge into consideration. Linguistic knowledge can be represented as either link-based knowledge or preference-based knowledge, and we propose knowledge regularized word representation models (KRWR) to incorporate this prior knowledge for learning distributed word representations. Experimental results demonstrate that our estimated word representation achieves better performance in the task of semantic relatedness ranking. This indicates that our methods can efficiently encode both prior knowledge from knowledge bases and statistical knowledge from large-scale text corpora into a unified word representation model, which will benefit many tasks in text mining.
Intrusion Detection Systems with Live Knowledge System
2016-05-31
We propose a novel approach that uses Ripple-Down Rules (RDR) to maintain knowledge from human experts together with a knowledge base generated by Induct RDR, a machine-learning-based RDR method. We build a detection model by applying the Induct RDR approach. The proposed Induct RDR (Ripple-Down Rules) approach allows the system to acquire phishing detection…
Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif
2008-03-01
High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an XML-based language for modeling biological images and representing spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.
1988-02-01
…intelligent knowledge bases. The present state of our system for concurrent evaluation of a knowledge base of logic clauses using static allocation…
Knowledge-base browsing: an application of hybrid distributed/local connectionist networks
NASA Astrophysics Data System (ADS)
Samad, Tariq; Israel, Peggy
1990-08-01
We describe a knowledge base browser based on a connectionist (or neural network) architecture that employs both distributed and local representations. The distributed representations are used for input and output, thereby enabling associative, noise-tolerant interaction with the environment. Internally, all representations are fully local. This simplifies weight assignment and facilitates network configuration for specific applications. In our browser, concepts and relations in a knowledge base are represented using "microfeatures." The microfeatures can encode semantic attributes, structural features, contextual information, etc. Desired portions of the knowledge base can then be associatively retrieved based on a structured cue. An ordered list of partial matches is presented to the user for selection. Microfeatures can also be used as "bookmarks": they can be placed dynamically at appropriate points in the knowledge base and subsequently used as retrieval cues. A proof-of-concept system has been implemented for an internally developed Honeywell-proprietary knowledge acquisition tool.
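A toy version of microfeature-based associative retrieval, using set overlap as the match score in place of the paper's connectionist machinery; the concept names and features are invented for illustration.

```python
# Concepts in the knowledge base, each indexed by a set of "microfeatures"
# (semantic, structural, and contextual attributes).
knowledge_base = {
    "valve_controller": {"hardware", "control", "realtime"},
    "alarm_rules":      {"control", "safety", "rules"},
    "sensor_log":       {"hardware", "history"},
}

def retrieve(cue, k=3):
    """Rank concepts by overlap with the cue's microfeatures (partial match)."""
    scores = {
        name: len(features & cue) / len(features | cue)  # Jaccard overlap
        for name, features in knowledge_base.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])[:k]

print(retrieve({"control", "safety"}))
# -> alarm_rules first, then weaker partial matches for the user to choose from
```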
NASA Astrophysics Data System (ADS)
Pritykin, F. N.; Nebritov, V. I.
2018-01-01
The paper presents the configuration of a knowledge base for the intelligent control of an android arm mechanism's motion, taking into account different positions of known forbidden regions. The structure of the knowledge base captures past experience of synthesizing arm motions in the space of velocities with due regard for known obstacles, and also specifies the mechanism's intrinsic properties. Knowledge base generation is based on studying implementations of the arm mechanism's instantaneous states. Computational experiments on the virtual control of android arm motion with known forbidden regions, using the developed knowledge base, are presented. Using the knowledge base for virtual control of arm motion reduces the time needed to compute test assignments. The results of the research can be used in developing control systems for autonomous android robots in environments that are known in advance.
Knowledge service decision making in business incubators based on the supernetwork model
NASA Astrophysics Data System (ADS)
Zhao, Liming; Zhang, Haihong; Wu, Wenqing
2017-08-01
As valuable resources for incubating firms, knowledge resources have received gradually increasing attention from all types of business incubators, and business incubators use a variety of knowledge services to stimulate rapid growth in incubating firms. Based on previous research, we generalize the knowledge transfer and knowledge networking services of two main forms of knowledge services and further divide knowledge transfer services into knowledge depth services and knowledge breadth services. Then, we construct the business incubators' knowledge supernetwork model, describe the evolution mechanism among heterogeneous agents and utilize a simulation to explore the performance variance of different business incubators' knowledge services. The simulation results show that knowledge stock increases faster when business incubators are able to provide knowledge services to more incubating firms and that the degree of discrepancy in the knowledge stock increases during the process of knowledge growth. Further, knowledge transfer services lead to greater differences in the knowledge structure, while knowledge networking services lead to smaller differences. Regarding the two types of knowledge transfer services, knowledge depth services are more conducive to knowledge growth than knowledge breadth services, but knowledge depth services lead to greater gaps in knowledge stocks and greater differences in knowledge structures. Overall, it is optimal for business incubators to select a single knowledge service or portfolio strategy based on the amount of time and energy expended on the two types of knowledge services.
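The reported dynamics can be illustrated with a deliberately crude simulation; the gains, rounds, and firm counts below are invented stand-ins, not the paper's calibrated supernetwork model.

```python
# Toy reading of the simulation: each incubating firm holds a knowledge
# stock, and the incubator's service policy shapes growth.
firms = [1.0] * 10  # initial knowledge stocks

def step(stocks, served, gain):
    """One round: the first `served` firms receive a proportional gain."""
    return [s * (1 + gain) if i < served else s for i, s in enumerate(stocks)]

depth, breadth = firms[:], firms[:]
for _ in range(20):
    depth = step(depth, served=3, gain=0.20)      # depth: big gain, few firms
    breadth = step(breadth, served=8, gain=0.08)  # breadth: small gain, many firms

print(max(depth) - min(depth))      # depth services widen the stock gap
print(max(breadth) - min(breadth))  # breadth services keep firms more even
```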
Sustaining Knowledge Building as a Principle-Based Innovation at an Elementary School
ERIC Educational Resources Information Center
Zhang, Jianwei; Hong, Huang-Yao; Scardamalia, Marlene; Teo, Chew Lee; Morley, Elizabeth A.
2011-01-01
This study explores Knowledge Building as a principle-based innovation at an elementary school and makes a case for a principle- versus procedure-based approach to educational innovation, supported by new knowledge media. Thirty-nine Knowledge Building initiatives, each focused on a curriculum theme and facilitated by nine teachers over eight…
1990-05-01
Sanders Associates, Inc. A demonstration of knowledge-based support for the evolutionary development of software system requirements using … text… An Overview of RADC's Knowledge Based Software Assistant Program, Donald M. Elefante, Rome Air Development Center. The Knowledge-Based Software Assistant is a formally based, computer-mediated paradigm for the specification, development, evolution, and long-term…
Knowledge Base Editor (SharpKBE)
NASA Technical Reports Server (NTRS)
Tikidjian, Raffi; James, Mark; Mackey, Ryan
2007-01-01
The SharpKBE software provides a graphical user interface environment for domain experts to build and manage knowledge base systems. Knowledge bases can be exported/translated to various target languages automatically, including customizable target languages.
Creating illusions of knowledge: learning errors that contradict prior knowledge.
Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J
2013-02-01
Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors.
Nurses' experience of using scientific knowledge in clinical practice: a grounded theory study.
Renolen, Åste; Hjälmhult, Esther
2015-12-01
Guidelines recommend the use of evidence-based practice in nursing. Nurses are expected to give patients care and treatment based on the best knowledge available. They may have knowledge and positive attitudes, but this does not mean that they are basing their work on evidence-based practice. Knowledge is still lacking about what is needed to successfully implement evidence-based practice. The aim of this study was to gain more knowledge about what nurses perceive as the most important challenge in implementing evidence-based practice and to explain how they act to face and overcome this challenge. We used classical grounded theory methodology and collected data through four focus groups and one individual interview in different geographical locations in one large hospital trust in Norway. Fourteen registered clinical practice nurses participated. We analysed the data in accordance with grounded theory, using the constant comparative method. Contextual balancing of knowledge emerged as the core category and explains how the nurses dealt with their main concern: how to determine what types of knowledge they could trust. The nurses' main strategies were an inquiring approach, examining knowledge and maintaining control while taking care of patients. They combined their own experience-based knowledge and the guidelines of evidence-based practice with a sense of control in the actual situation. The grounded theory of contextual balancing of knowledge may help us to understand how nurses detect what types of knowledge they can trust in clinical practice. The nurses needed to rely on what they did, and they seemed to rely on their own experience rather than on research.
An architecture for rule based system explanation
NASA Technical Reports Server (NTRS)
Fennel, T. R.; Johannes, James D.
1990-01-01
A system architecture is presented which incorporates both graphics and text into explanations provided by rule-based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of the user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.
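One common way to support such explanations is to record each rule firing during inference so the trace can later be rendered as text or hypermedia links; a minimal forward-chaining sketch with invented rules follows, not the paper's actual architecture.

```python
# Each rule: (name, condition over the fact base, (fact_key, fact_value) to assert).
rules = [
    ("r1", lambda f: f.get("temp", 0) > 38, ("fever", True)),
    ("r2", lambda f: f.get("fever"), ("advice", "order blood culture")),
]

def infer(facts):
    trace = []
    changed = True
    while changed:  # forward-chain until no rule adds anything new
        changed = False
        for name, cond, (key, value) in rules:
            if cond(facts) and facts.get(key) != value:
                facts[key] = value
                trace.append(f"{name}: set {key} = {value!r}")
                changed = True
    return facts, trace

facts, trace = infer({"temp": 38.6})
print(facts["advice"])
print("\n".join(trace))  # the explanation component would render this trace
```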
Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur
2012-01-01
This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.
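The claimed flexibility, meeting new monitoring demands by updating knowledge rules rather than code, can be sketched as rules-as-data; the vitals, thresholds, and messages below are invented, not the system's actual rule format.

```python
# Knowledge rules stored as data, separate from the inference code.
knowledge_rules = [
    {"if": {"vital": "heart_rate", "op": ">", "value": 120},
     "then": "notify caregiver: tachycardia"},
    {"if": {"vital": "spo2", "op": "<", "value": 90},
     "then": "notify caregiver: low oxygen saturation"},
]

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

def evaluate(reading):
    """Return the actions of every rule whose condition matches the reading."""
    fires = lambda r: OPS[r["if"]["op"]](reading[r["if"]["vital"]], r["if"]["value"])
    return [r["then"] for r in knowledge_rules if fires(r)]

# A new monitoring demand is met by appending a rule, with no code changes:
knowledge_rules.append(
    {"if": {"vital": "temp", "op": ">", "value": 38.5},
     "then": "notify caregiver: fever"})

print(evaluate({"heart_rate": 130, "spo2": 95, "temp": 39.0}))
```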
Distributed, cooperating knowledge-based systems
NASA Technical Reports Server (NTRS)
Truszkowski, Walt
1991-01-01
Some current research in the development and application of distributed, cooperating knowledge-based systems technology is addressed. The focus of the current research is the spacecraft ground operations environment. The underlying hypothesis is that, because of the increasing size, complexity, and cost of planned systems, conventional procedural approaches to the architecture of automated systems will give way to a more comprehensive knowledge-based approach. A hallmark of these future systems will be the integration of multiple knowledge-based agents which understand the operational goals of the system and cooperate with each other and the humans in the loop to attain the goals. The current work includes the development of a reference model for knowledge-base management, the development of a formal model of cooperating knowledge-based agents, the use of a testbed for prototyping and evaluating various knowledge-based concepts, and beginning work on the establishment of an object-oriented model of an intelligent end-to-end (spacecraft to user) system. An introductory discussion of these activities is presented, the major concepts and principles being investigated are highlighted, and their potential use in other application domains is indicated.
Towards Modeling False Memory With Computational Knowledge Bases.
Li, Justin; Kohanyi, Emma
2017-01-01
One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling.
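A bare-bones spreading-activation pass over a toy association network shows the intended mechanism, studied words jointly activating an unstudied critical lure; the words and edge weights are invented, not drawn from WordNet or DBpedia.

```python
# Studied words and their weighted associates (all weights invented).
network = {
    "bed":   {"sleep": 0.8, "rest": 0.4},
    "dream": {"sleep": 0.9},
    "tired": {"sleep": 0.7, "rest": 0.6},
}

activation = {}
for studied_word, neighbors in network.items():
    for neighbor, weight in neighbors.items():
        # Each studied word spreads activation to its associates.
        activation[neighbor] = activation.get(neighbor, 0.0) + weight

print(max(activation, key=activation.get))
# -> "sleep": the unstudied critical lure accumulates the most activation,
#    which is the signature of false memory in the DRM task.
```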
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K
2013-08-12
A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach had a higher average rank than the term-based approach for each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at the p-value of 0.05 using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
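The evaluation design, comparing paired average ranks with a Wilcoxon signed-rank test, can be reproduced in a few lines; the rank values below are invented for illustration, not the study's data.

```python
from scipy.stats import wilcoxon

# Hypothetical rank of the correct definition for each test snippet
# (lower is better), one value per snippet under each approach.
semantic_ranks = [3, 4, 2, 5, 4, 3, 6, 2, 4, 5]
term_ranks     = [5, 6, 3, 7, 5, 4, 8, 4, 6, 6]

stat, p = wilcoxon(semantic_ranks, term_ranks)  # paired signed-rank test
print(f"W = {stat}, p = {p:.4f}")  # p < 0.05 would favor the semantic approach
```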
NASA Astrophysics Data System (ADS)
Sołtysik-Piorunkiewicz, Anna
2015-02-01
How can we measure the impact of Web 2.0/3.0 Internet technology on knowledge management? How can we use Web 2.0/3.0 technologies for generating, evaluating, sharing, and organizing knowledge in a knowledge-based organization? How can we evaluate it from a user-centered perspective? This article aims to provide a method to evaluate the usability of web technologies for supporting knowledge management in knowledge-based organizations at the various stages of the knowledge management cycle, taking into account generating knowledge, evaluating knowledge, sharing knowledge, etc., for modern Internet technologies, using the example of agent technologies. The method focuses on five areas of evaluation: GUI, functional structure, the way content is published, the organizational aspect, and the technological aspect. The method is based on proposed indicators that assess the specific areas of evaluation, taking into account the individual characteristics of the scoring. Each feature identified in the evaluation is first scored point-wise; this score is then verified and refined by means of appropriate indicators for the given feature. The article proposes appropriate indicators to measure the impact of Web 2.0/3.0 technologies on knowledge management and verifies them using the example of agent technology usability in a knowledge management system.
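A sketch of the point-then-indicator scoring over the five evaluation areas; the weights and scores are invented, and the real method refines each point score with per-feature indicators rather than a single weight.

```python
# Five evaluation areas from the abstract; scores (1-5) and weights invented.
areas = {
    "GUI":                  {"score": 4, "weight": 0.15},
    "functional structure": {"score": 3, "weight": 0.25},
    "content publication":  {"score": 5, "weight": 0.20},
    "organizational":       {"score": 4, "weight": 0.20},
    "technological":        {"score": 3, "weight": 0.20},
}

# Weighted aggregate: each area's point score scaled by its indicator weight.
overall = sum(a["score"] * a["weight"] for a in areas.values())
print(f"usability score: {overall:.2f} / 5")  # -> 3.75 with these numbers
```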
Knowledge-based reasoning in the Paladin tactical decision generation system
NASA Technical Reports Server (NTRS)
Chappell, Alan R.
1993-01-01
A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.
Engineering Knowledge for Assistive Living
NASA Astrophysics Data System (ADS)
Chen, Liming; Nugent, Chris
This paper introduces a knowledge based approach to assistive living in smart homes. It proposes a system architecture that makes use of knowledge in the lifecycle of assistive living. The paper describes ontology based knowledge engineering practices and discusses mechanisms for exploiting knowledge for activity recognition and assistance. It presents system implementation and experiments, and discusses initial results.
ERIC Educational Resources Information Center
Turkan, Sultan; De Oliveira, Luciana C.; Lee, Okhee; Phelps, Geoffrey
2014-01-01
Background/Context: The current research on teacher knowledge and teacher accountability falls short on information about what teacher knowledge base could guide preparation and accountability of the mainstream teachers for meeting the academic needs of ELLs. Most recently, research on specialized knowledge for teaching has offered ways to…
Fritsche, L; Greenhalgh, T; Falck-Ytter, Y; Neumayer, H-H; Kunz, R
2002-01-01
Objective To develop and validate an instrument for measuring knowledge and skills in evidence based medicine and to investigate whether short courses in evidence based medicine lead to a meaningful increase in knowledge and skills. Design Development and validation of an assessment instrument and before and after study. Setting Various postgraduate short courses in evidence based medicine in Germany. Participants The instrument was validated with experts in evidence based medicine, postgraduate doctors, and medical students. The effect of courses was assessed by postgraduate doctors from medical and surgical backgrounds. Intervention Intensive 3 day courses in evidence based medicine delivered through tutor facilitated small groups. Main outcome measure Increase in knowledge and skills. Results The questionnaire distinguished reliably between groups with different expertise in evidence based medicine. Experts attained a threefold higher average score than students. Postgraduates who had not attended a course performed better than students but significantly worse than experts. Knowledge and skills in evidence based medicine increased after the course by 57% (mean score before course 6.3 (SD 2.9) v 9.9 (SD 2.8), P<0.001). No difference was found among experts or students in absence of an intervention. Conclusions The instrument reliably assessed knowledge and skills in evidence based medicine. An intensive 3 day course in evidence based medicine led to a significant increase in knowledge and skills. What is already known on this topic: Numerous observational studies have investigated the impact of teaching evidence based medicine to healthcare professionals, with conflicting results. Most of the studies were of poor methodological quality. What this study adds: An instrument assessing basic knowledge and skills required for practising evidence based medicine was developed and validated. An intensive 3 day course on evidence based medicine for doctors from various backgrounds and training levels led to a clinically meaningful improvement of knowledge and skills. PMID:12468485
PDA: A coupling of knowledge and memory for case-based reasoning
NASA Technical Reports Server (NTRS)
Bharwani, S.; Walls, J.; Blevins, E.
1988-01-01
Problem solving in most domains requires reference to past knowledge and experience, whether such knowledge is represented as rules, decision trees, networks or any variant of attributed graphs. Regardless of the representational form employed, designers of expert systems rarely make a distinction between the static and dynamic aspects of the system's knowledge base. The current paper clearly distinguishes between knowledge-based and memory-based reasoning, where the former in its most pure sense is characterized by a static knowledge base, resulting in a relatively brittle expert system, while the latter is dynamic and analogous to the functions of human memory, which learns from experience. The paper discusses the design of an advisory system which combines a knowledge base consisting of domain vocabulary and default dependencies between concepts with a dynamic conceptual memory which stores experiential knowledge in the form of cases. The case memory organizes past experience in the form of MOPs (memory organization packets) and sub-MOPs. Each MOP consists of a context frame and a set of indices. The context frame contains information about the features (norms) common to all the events and sub-MOPs indexed under it.
NASA Astrophysics Data System (ADS)
Nieten, Joseph L.; Burke, Roger
1993-03-01
The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
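As an illustration of the inductive step described above, the sketch below induces human-readable rules from a small set of expert-labeled snapshots. The SDB's actual learning algorithm is not specified in the abstract, so a scikit-learn decision tree stands in for it; the feature names and data are hypothetical.

```python
# Minimal sketch: induce monitoring rules from SME-classified data.
# The decision tree is a stand-in inductive learner, not the SDB itself.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row is a snapshot of the subject system; labels come from the SME.
samples = [[210.0, 3.1], [215.0, 2.9], [350.0, 7.8], [340.0, 8.2]]
labels = ["nominal", "nominal", "fault", "fault"]

tree = DecisionTreeClassifier(max_depth=2).fit(samples, labels)

# export_text renders the induced tree as if/then rules that could seed
# a rule-base for a monitoring or advisory system.
print(export_text(tree, feature_names=["temperature", "vibration"]))
```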
A New Perspective on Modeling Groundwater-Driven Health Risk With Subjective Information
NASA Astrophysics Data System (ADS)
Ozbek, M. M.
2003-12-01
Fuzzy rule-based systems provide an efficient environment for the modeling of expert information in the context of risk management for groundwater contamination problems. In general, their use in the form of conditional pieces of knowledge has been either as a tool for synthesizing control laws from data (i.e., conjunction-based models), or in a knowledge representation and reasoning perspective in Artificial Intelligence (i.e., implication-based models), where only the latter may lead to coherence problems (e.g., input data that leads to logical inconsistency when added to the knowledge base). We implement a two-fold extension to an implication-based groundwater risk model (Ozbek and Pinder, 2002) including: 1) the implementation of sufficient conditions for a coherent knowledge base, and 2) the interpolation of expert statements to supplement gaps in knowledge. The original model assumes statements of public health professionals for the characterization of the exposed individual and the relation of dose and pattern of exposure to its carcinogenic effects. We demonstrate the utility of the extended model in that it: 1) identifies inconsistent statements and establishes coherence in the knowledge base, and 2) minimizes the burden of knowledge elicitation from the experts for utilizing existing knowledge in an optimal fashion.
Relating GTE and Knowledge-Based Courseware Engineering: Some Epistemological Issues.
ERIC Educational Resources Information Center
De Diana, Italo P. F.; Ladhani, Al-Noor
1998-01-01
Discusses GTE (Generic Tutoring Environment) and knowledge-based courseware engineering from an epistemological point of view and suggests some combination of the two approaches. Topics include intelligent tutoring; courseware authoring; application versus acquisition of knowledge; and domain knowledge. (LRW)
Indonesia knowledge dissemination: a snapshot
NASA Astrophysics Data System (ADS)
Nasution, M. K. M.
2018-03-01
The educational progress of a country or educational institution is measured through the implementation of knowledge dissemination. Evidence of knowledge dissemination takes the form of the types of documents published, based on databases of scientific publication indices such as Scopus. This paper presents a simple form of knowledge dissemination based on document type. Although the growth of knowledge dissemination does not follow the same pattern across document types, the general implementation is almost the same. However, maximum effort needs to be made by PTN-bh to support Indonesian knowledge dissemination.
Integration of an OWL-DL knowledge base with an EHR prototype and providing customized information.
Jing, Xia; Kay, Stephen; Marley, Tom; Hardiker, Nicholas R
2014-09-01
When clinicians use electronic health record (EHR) systems, their ability to obtain general knowledge is often an important contribution to their ability to make more informed decisions. In this paper we describe a method by which an external, formal representation of clinical and molecular genetic knowledge can be integrated into an EHR such that customized knowledge can be delivered to clinicians in a context-appropriate manner. Web Ontology Language-Description Logic (OWL-DL) is a formal knowledge representation language that is widely used for creating, organizing and managing biomedical knowledge through the use of explicit definitions, consistent structure and a computer-processable format, particularly in biomedical fields. In this paper we describe: 1) integration of an OWL-DL knowledge base with a standards-based EHR prototype, 2) presentation of customized information from the knowledge base via the EHR interface, and 3) lessons learned via the process. The integration was achieved through a combination of manual and automatic methods. Our method has advantages for scaling up to and maintaining knowledge bases of any size, with the goal of assisting clinicians and other EHR users in making better informed health care decisions.
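The integration pattern described above can be approximated with off-the-shelf RDF tooling. The sketch below is only a rough analogue of the paper's OWL-DL integration: it pulls context-relevant annotations from a knowledge base file, where the file name, labels, and query are all hypothetical.

```python
# Rough analogue of retrieving customized knowledge for an EHR view.
# "cf_knowledge.owl" and the label filter are invented for the example.
from rdflib import Graph

g = Graph()
g.parse("cf_knowledge.owl")  # hypothetical OWL knowledge base (RDF/XML)

# Select human-readable guidance attached to concepts whose label matches
# the clinical context of the currently opened patient record.
query = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?concept ?note WHERE {
    ?concept rdfs:label ?label ;
             rdfs:comment ?note .
    FILTER (CONTAINS(LCASE(STR(?label)), "cystic fibrosis"))
}
"""
for concept, note in g.query(query):
    print(concept, note)
```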
When craft and science collide: Improving therapeutic practices through evidence-based innovations.
Justice, Laura M
2010-04-01
Evidence-based practice (EBP) is a model of clinical decision-making that is increasingly being advocated for use in the field of speech-language pathology. With the increased emphasis on scientific evidence as a form of knowledge important to EBP, clinicians may wonder whether their craft-based knowledge (i.e., knowledge derived from theory and practice) remains a legitimate form of knowledge for use in clinical decisions. This article describes forms of knowledge that may be used to address clinical questions, including both craft and science. Additionally, the steps used when engaging in EBP are described so that clinicians understand when and how craft comes into play. The major premise addressed within this article is that craft is a legitimate form of knowledge and that engagement in EBP requires one to employ craft-based knowledge.
Vakily, Masoomeh; Noroozi, Mahnaz; Yamani, Nikoo
2017-01-01
BACKGROUND: Training health personnel about domestic violence would lead them to investigate and evaluate this issue more than before. Considering the new educational approaches for transferring knowledge, the goal of this research was to compare the effect of group-based and compact disk (CD)-based training on midwives' knowledge and attitude toward domestic violence. METHODS: In this clinical experiment, seventy midwives working at health centers and hospitals of Isfahan were randomly allocated into two classes of group-based and CD-based trainings and were trained in the fields of recognition, prevention, and management of domestic violence. Data were collected by questionnaires which were completed by the midwives for evaluation of their knowledge and attitude. RESULTS: The mean score of midwives' knowledge and attitude toward domestic violence showed a meaningful increase after the training (16.1, 46.9) compared to the scores before the training (12.1, 39.1) in both of the classes (group-based training: 17.7, 45.4; CD-based training: 11.7, 38.6). No meaningful difference was observed between the two groups regarding midwives' attitude toward domestic violence after the intervention; however, regarding their knowledge level, the difference was statistically meaningful (P = 0.001), and this knowledge increase was greater in the CD-based training group. CONCLUSIONS: In spite of the effectiveness of both of the training methods in promoting midwives' knowledge and attitude about domestic violence, training with CD was more effective in increasing their knowledge; as a result, considering the benefits of CD-based training such as cost-effectiveness and the possibility of use at any time, its use in training programs for health personnel is advised. PMID:28852660
The elements of design knowledge capture
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1988-01-01
This paper will present the basic constituents of a design knowledge capture effort. This will include a discussion of the types of knowledge to be captured in such an effort and the difference between design knowledge capture and more traditional knowledge base construction. These differences include both knowledge base structure and knowledge acquisition approach. The motivation for establishing a design knowledge capture effort as an integral part of major NASA programs will be outlined, along with the current NASA position on that subject. Finally the approach taken in design knowledge capture for Space Station will be contrasted with that used in the HSTDEK project.
The biomedical disciplines and the structure of biomedical and clinical knowledge.
Nederbragt, H
2000-11-01
The relation between biomedical knowledge and clinical knowledge is discussed by comparing their respective structures. The knowledge of a disease as a biological phenomenon is constructed by the interaction of facts and theories from the main biomedical disciplines: epidemiology, diagnostics, clinical trial, therapy development and pathogenesis. Although these facts and theories are based on probabilities and extrapolations, the interaction provides a reliable and coherent structure, comparable to a Kuhnian paradigm. In the structure of clinical knowledge, i.e. knowledge of the patient with the disease, not only biomedical knowledge contributes to the structure but also economic and social relations, ethics and personal experience. However, the interaction between each of the participating "knowledges" in clinical knowledge is not based on mutual dependency and accumulation of different arguments from each, as in biomedical knowledge, but on competition and partial exclusion. Therefore, the structure of biomedical knowledge is different from that of clinical knowledge. This difference is used as the basis for a discussion in which the place of technology, evidence-based medicine and the gap between scientific and clinical knowledge are evaluated.
A Lessons Learned Knowledge Warehouse to Support the Army Knowledge Management Command-Centric
2004-03-01
Warehouse to Support the Army Knowledge Management Command-Centric ... increase the quality and availability of information in context (knowledge) to the ... information, geographical information, knowledge base, intelligence data (HUMINT, SIGINT, etc.); and Human Computer Interaction (HCI): allows ... the Data Fusion Process from the HCI point of view? Can the LL Knowledge Base provide any valuable information to achieve better estimates of the ...
Haux, R; Grothe, W; Runkel, M; Schackert, H K; Windeler, H J; Winter, A; Wirtz, R; Herfarth, C; Kunze, S
1996-04-01
We report on a prospective, prolective observational study, supplying information on how physicians and other health care professionals retrieve medical knowledge on-line within the Heidelberg University Hospital information system. Within this hospital information system, on-line access to medical knowledge has been realised by installing a medical knowledge server in the range of about 24 GB and by providing access to it via health care professional workstations in wards, physicians' rooms, etc. During the study, we observed about 96 accesses per working day. The main group of health care professionals retrieving medical knowledge were physicians and medical students. Primary reasons for its utilisation were identified as support for the users' scientific work (50%), own clinical cases (19%), general medical problems (14%) and current clinical problems (13%). Health care professionals had access to medical knowledge bases such as MEDLINE (79%), drug bases ('Rote Liste', 6%), and to electronic text books and knowledge base systems as well. Sixty-five percent of accesses to medical knowledge were judged to be successful. In our opinion, medical knowledge retrieval can serve as a first step towards knowledge processing in medicine. We point out the consequences for the management of hospital information systems in order to provide the prerequisites for such a type of knowledge retrieval.
Koutkias, Vassilis; Kilintzis, Vassilis; Stalidis, George; Lazou, Katerina; Niès, Julie; Durand-Texte, Ludovic; McNair, Peter; Beuscart, Régis; Maglaveras, Nicos
2012-06-01
The primary aim of this work was the development of a uniform, contextualized and sustainable knowledge-based framework to support adverse drug event (ADE) prevention via Clinical Decision Support Systems (CDSSs). In this regard, the employed methodology involved first the systematic analysis and formalization of the knowledge sources elaborated in the scope of this work, through which an application-specific knowledge model has been defined. The entire framework architecture has been then specified and implemented by adopting Computer Interpretable Guidelines (CIGs) as the knowledge engineering formalism for its construction. The framework integrates diverse and dynamic knowledge sources in the form of rule-based ADE signals, all under a uniform Knowledge Base (KB) structure, according to the defined knowledge model. Equally important, it employs the means to contextualize the encapsulated knowledge, in order to provide appropriate support considering the specific local environment (hospital, medical department, language, etc.), as well as the mechanisms for knowledge querying, inference, sharing, and management. In this paper, we present thoroughly the establishment of the proposed knowledge framework by presenting the employed methodology and the results obtained as regards implementation, performance and validation aspects that highlight its applicability and virtue in medication safety. Copyright © 2012 Elsevier Inc. All rights reserved.
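A central idea above is that a uniform knowledge base can be filtered by local context before any decision support fires. The toy sketch below shows that filtering step only; the rule texts and context fields are invented, and the real framework encodes its knowledge as CIGs rather than Python dictionaries.

```python
# Toy illustration of contextualized retrieval of rule-based ADE signals.
ade_signals = [
    {"rule": "IF warfarin AND aspirin THEN raise bleeding-risk alert",
     "context": {"department": "cardiology", "language": "en"}},
    {"rule": "IF metformin AND low eGFR THEN raise lactic-acidosis alert",
     "context": {"department": "any", "language": "en"}},
]

def applicable(signals, department, language):
    """Return only the signals that match the local environment."""
    return [s["rule"] for s in signals
            if s["context"]["language"] == language
            and s["context"]["department"] in (department, "any")]

print(applicable(ade_signals, "cardiology", "en"))  # both rules apply
```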
The Relationship between Agriculture Knowledge Bases for Teaching and Sources of Knowledge
ERIC Educational Resources Information Center
Rice, Amber H.; Kitchel, Tracy
2015-01-01
The purpose of this study was to describe the agriculture knowledge bases for teaching of agriculture teachers and to see if a relationship existed between years of teaching experience, sources of knowledge, and development of pedagogical content knowledge (PCK), using quantitative methods. A model of PCK from mathematics was utilized as a…
The Role of Learning Goals in Building a Knowledge Base for Elementary Mathematics Teacher Education
ERIC Educational Resources Information Center
Jansen, Amanda; Bartell, Tonya; Berk, Dawn
2009-01-01
In this article, we describe features of learning goals that enable indexing knowledge for teacher education. Learning goals are the key enabler for building a knowledge base for teacher education; they define what counts as essential knowledge for prospective teachers. We argue that 2 characteristics of learning goals support knowledge-building…
ERIC Educational Resources Information Center
Bernacki, Matthew
2010-01-01
This study examined how learners construct textbase and situation model knowledge in hypertext computer-based learning environments (CBLEs) and documented the influence of specific self-regulated learning (SRL) tactics, prior knowledge, and characteristics of the learner on posttest knowledge scores from exposure to a hypertext. A sample of 160…
Jacobi, Johanna; Mathez-Stiefel, Sarah-Lan; Gambon, Helen; Rist, Stephan; Altieri, Miguel
2017-03-01
Agroforestry often relies on local knowledge, which is gaining recognition in development projects. However, how local knowledge can articulate with external and scientific knowledge is little known. Our study explored the use and integration of local and external knowledge in agroforestry projects in Bolivia. In 42 field visits and 62 interviews with agroforestry farmers, civil society representatives, and policymakers, we found a diverse knowledge base. We examined how local and external knowledge contribute to livelihood assets and tree and crop diversity. Projects based predominantly on external knowledge tended to promote a single combination of tree and crop species and targeted mainly financial capital, whereas projects with a local or mixed knowledge base tended to focus on food security and increased natural capital (e.g., soil restoration) and used a higher diversity of trees and crops than those with an external knowledge base. The integration of different forms of knowledge can enable farmers to better cope with new challenges emerging as a result of climate change, fluctuating market prices for cash crops, and surrounding destructive land use strategies such as uncontrolled fires and aerial fumigation with herbicides. However, many projects still tended to prioritize external knowledge and undervalue local knowledge, a tendency that has long been institutionalized in the formal educational system and in extension services. More dialogue is needed between different forms of knowledge, which can be promoted by strengthening local organizations and their networks, reforming agricultural educational institutions, and working in close interaction with policymakers.
Computer Assisted Multi-Center Creation of Medical Knowledge Bases
Giuse, Nunzia Bettinsoli; Giuse, Dario A.; Miller, Randolph A.
1988-01-01
Computer programs which support different aspects of medical care have been developed in recent years. Their capabilities range from diagnosis to medical imaging, and include hospital management systems and therapy prescription. In spite of their diversity these systems have one commonality: their reliance on a large body of medical knowledge in computer-readable form. This knowledge enables such programs to draw inferences, validate hypotheses, and in general to perform their intended task. As has been clear to developers of such systems, however, the creation and maintenance of medical knowledge bases are very expensive. Practical and economical difficulties encountered during this long-term process have discouraged most attempts. This paper discusses knowledge base creation and maintenance, with special emphasis on medical applications. We first describe the methods currently used and their limitations. We then present our recent work on developing tools and methodologies which will assist in the process of creating a medical knowledge base. We focus, in particular, on the possibility of multi-center creation of the knowledge base.
Translating three states of knowledge--discovery, invention, and innovation
2010-01-01
Background: Knowledge Translation (KT) has historically focused on the proper use of knowledge in healthcare delivery. A knowledge base has been created through empirical research and resides in scholarly literature. Some knowledge is amenable to direct application by stakeholders who are engaged during or after the research process, as shown by the Knowledge to Action (KTA) model. Other knowledge requires multiple transformations before achieving utility for end users. For example, conceptual knowledge generated through science or engineering may become embodied as a technology-based invention through development methods. The invention may then be integrated within an innovative device or service through production methods. To what extent is KT relevant to these transformations? How might the KTA model accommodate these additional development and production activities while preserving the KT concepts?
Discussion: Stakeholders adopt and use knowledge that has perceived utility, such as a solution to a problem. Achieving a technology-based solution involves three methods that generate knowledge in three states, analogous to the three classic states of matter. Research activity generates discoveries that are intangible and highly malleable like a gas; development activity transforms discoveries into inventions that are moderately tangible yet still malleable like a liquid; and production activity transforms inventions into innovations that are tangible and immutable like a solid. The paper demonstrates how the KTA model can accommodate all three types of activity and address all three states of knowledge. Linking the three activities in one model also illustrates the importance of engaging the relevant stakeholders prior to initiating any knowledge-related activities.
Summary: Science and engineering focused on technology-based devices or services change the state of knowledge through three successive activities. Achieving knowledge implementation requires methods that accommodate these three activities and knowledge states. Accomplishing beneficial societal impacts from technology-based knowledge involves the successful progression through all three activities, and the effective communication of each successive knowledge state to the relevant stakeholders. The KTA model appears suitable for structuring and linking these processes. PMID:20205873
Jing, Xia; Kay, Stephen; Marley, Thomas; Hardiker, Nicholas R; Cimino, James J
2012-02-01
The current volume and complexity of genetic tests, and the molecular genetics knowledge and health knowledge related to interpretation of the results of those tests, are rapidly outstripping the ability of individual clinicians to recall, understand and convey to their patients information relevant to their care. The tailoring of molecular genetics knowledge and health knowledge in clinical settings is important both for the provision of personalized medicine and to reduce clinician information overload. In this paper we describe the incorporation, customization and demonstration of molecular genetic data (mainly sequence variants), molecular genetics knowledge and health knowledge into a standards-based electronic health record (EHR) prototype developed specifically for this study. We extended the CCR (Continuity of Care Record), an existing EHR standard for representing clinical data, to include molecular genetic data. An EHR prototype was built based on the extended CCR and designed to display relevant molecular genetics knowledge and health knowledge from an existing knowledge base for cystic fibrosis (OntoKBCF). We reconstructed test records from published case reports and represented them in the CCR schema. We then used the EHR to dynamically filter molecular genetics knowledge and health knowledge from OntoKBCF using molecular genetic data and clinical data from the test cases. The molecular genetic data were successfully incorporated in the CCR by creating a category of laboratory results called "Molecular Genetics" and specifying a particular class of test ("Gene Mutation Test") in this category. Unlike other laboratory tests reported in the CCR, results of tests in this class required additional attributes ("Molecular Structure" and "Molecular Position") to support interpretation by clinicians. These results, along with clinical data (age, sex, ethnicity, diagnostic procedures, and therapies) were used by the EHR to filter and present molecular genetics knowledge and health knowledge from OntoKBCF. This research shows a feasible model for delivering patient sequence variants and presenting tailored molecular genetics knowledge and health knowledge via a standards-based EHR system prototype. EHR standards can be extended to include the necessary patient data (as we have demonstrated in the case of the CCR), while knowledge can be obtained from external knowledge bases that are created and maintained independently from the EHR. This approach can form the basis for a personalized medicine framework, a more comprehensive standards-based EHR system and a potential platform for advancing translational research by both disseminating results and providing opportunities for new insights into phenotype-genotype relationships. Copyright © 2011 Elsevier Inc. All rights reserved.
Petzold, Anita; Korner-Bitensky, Nicol; Salbach, Nancy M; Ahmed, Sara; Menon, Anita; Ogourtsova, Tatiana
2012-02-01
The aim of this study was to investigate: (i) the feasibility of delivering a multi-modal knowledge translation intervention specific to the management of acute post-stroke unilateral spatial neglect; and (ii) the impact of the knowledge translation intervention on occupational therapists' knowledge of evidence-based unilateral spatial neglect problem identification, assessment and treatment, and self-efficacy related to evidence-based practice implementation. A 3-period (pre-post) repeated measures design. Acute care occupational therapists treating patients with post-stroke unilateral spatial neglect were recruited from two major Canadian cities. Participants completed two pre-intervention assessments, took part in a day-long interactive multi-modal knowledge translation intervention and a subsequent 8-week follow-up, and completed a post-intervention assessment. Knowledge of evidence-based problem identification, assessment and treatment of unilateral spatial neglect, and self-efficacy to perform evidence-based practice activities were measured using standard scales. The intervention was tested on 20 occupational therapists. Results indicate a significant improvement in knowledge of best practice unilateral spatial neglect management (p < 0.000) and evidence-based practice self-efficacy in carrying out evidence-based practice activities (p < 0.045) post-intervention. Use of a multi-modal knowledge translation intervention is feasible and can significantly improve occupational therapists' knowledge of unilateral spatial neglect best practices and self-efficacy. The findings should help advance best practices specific to the management of post-stroke unilateral spatial neglect as well as informing knowledge translation studies in other areas of practice.
ERIC Educational Resources Information Center
Loo, Sai Y.
2014-01-01
This paper focuses on teacher education in the English further education sector, where the teaching of disciplinary and pedagogic knowledge is an issue. Using research findings, the paper advocates an approach based on collaboration and informed research to emphasize and integrate knowledge(s) in situated teaching contexts despite working in a…
NASA Astrophysics Data System (ADS)
Tucker, Deborah L.
Purpose. The purpose of this grounded theory study was to refine, using a Delphi study process, the four categories of the theoretical model of the comprehensive knowledge base required by providers of professional development for K-12 teachers of science generated from a review of the literature. Methodology. This grounded theory study used data collected through a modified Delphi technique and interviews to refine and validate the literature-based knowledge base required by providers of professional development for K-12 teachers of science. Twenty-three participants, experts in the fields of science education, how people learn, instructional and assessment strategies, and learning contexts, responded to the study's questions. Findings. By "densifying" the four categories of the knowledge base, this study determined the causal conditions (the science subject matter knowledge), the intervening conditions (how people learn), the strategies (the effective instructional and assessment strategies), and the context (the context and culture of formal learning environments) surrounding the science professional development process. Eight sections were added to the literature-based knowledge base; the final model comprised forty-nine sections. The average length of the operational definitions increased nearly threefold and the number of citations per operational definition increased more than twofold. Conclusions. A four-category comprehensive model that can serve as the foundation for the knowledge base required by science professional developers now exists. Subject matter knowledge includes science concepts, inquiry, the nature of science, and scientific habits of mind; how people learn includes the principles of learning, active learning, andragogy, variations in learners, neuroscience and cognitive science, and change theory; effective instructional and assessment strategies include constructivist learning and inquiry-based teaching, differentiation of instruction, making knowledge and thinking accessible to learners, automatic and fluent retrieval of nonscience-specific skills, science assessment and assessment strategies, science-specific instructional strategies, and safety within a learning environment; and contextual knowledge includes curriculum selection and implementation strategies and knowledge of building program coherence. Recommendations. Further research is recommended on which specific instructional strategies identified in the refined knowledge base have positive, significant effect sizes for adult learners.
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
Knowledge and Use of Intervention Practices by Community-Based Early Intervention Service Providers
ERIC Educational Resources Information Center
Paynter, Jessica M.; Keen, Deb
2015-01-01
This study investigated staff attitudes, knowledge and use of evidence-based practices (EBP) and links to organisational culture in a community-based autism early intervention service. An EBP questionnaire was completed by 99 metropolitan and regionally-based professional and paraprofessional staff. Participants reported greater knowledge and use…
ERIC Educational Resources Information Center
Baartman, L. K. J.; Kilbrink, N.; de Bruijn, E.
2018-01-01
In vocational education, students learn in different school-based and workplace-based learning environments and engage with different types of knowledge in these environments. Students are expected to integrate these experiences and make meaning of them in relation to their own professional knowledge base. This study focuses both on…
Towards a knowledge-based system to assist the Brazilian data-collecting system operation
NASA Technical Reports Server (NTRS)
Rodrigues, Valter; Simoni, P. O.; Oliveira, P. P. B.; Oliveira, C. A.; Nogueira, C. A. M.
1988-01-01
A study is reported which was carried out to show how a knowledge-based approach would lead to a flexible tool to assist the operation task in a satellite-based environmental data collection system. Some characteristics of a hypothesized system comprised of a satellite and a network of Interrogable Data Collecting Platforms (IDCPs) are pointed out. The Knowledge-Based Planning Assistant System (KBPAS) and some aspects of how knowledge is organized in the IDCP's domain are briefly described.
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
A knowledge-based object recognition system for applications in the space station
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.
1988-01-01
A knowledge-based three-dimensional (3D) object recognition system is being developed. The system uses primitive-based hierarchical relational and structural matching for the recognition of 3D objects in the two-dimensional (2D) image for interpretation of the 3D scene. At present, the pre-processing, low-level preliminary segmentation, rule-based segmentation, and the feature extraction are completed. The data structure of the primitive viewing knowledge-base (PVKB) is also completed. Algorithms and programs based on attribute-tree matching for decomposing the segmented data into valid primitives were developed. The frame-based structural and relational descriptions of some objects were created and stored in a knowledge-base. This knowledge-base of frame-based descriptions was developed on the MICROVAX-AI microcomputer in a LISP environment. The simulated 3D scene of simple non-overlapping objects as well as real camera data of images of 3D objects of low complexity have been successfully interpreted.
The impact of innovation intermediary on knowledge transfer
NASA Astrophysics Data System (ADS)
Lin, Min; Wei, Jun
2018-07-01
Many firms have opened up their innovation process and actively transfer knowledge with external partners in the market of technology. To reduce some of the market inefficiencies, more and more firms collaborate with innovation intermediaries. In light of the increasing importance of intermediaries in the context of open innovation, in this paper we systematically investigate the effect of an innovation intermediary on knowledge transfer and the innovation process in networked systems. We find that the existence of an innovation intermediary is conducive to knowledge diffusion and facilitates knowledge growth at the system level. Interestingly, the scale of the innovation intermediary has little effect on the growth of knowledge. We further investigate the selection of intermediary members by comparing four selection strategies: random selection, initial knowledge level based selection, absorptive capability based selection, and innovative ability based selection. It is found that the selection strategy based on innovative ability outperforms all the other strategies in promoting system knowledge growth. Our study provides a theoretical understanding of the impact of innovation intermediary on knowledge transfer and sheds light on the design and selection of innovation intermediaries in open innovation.
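The four member-selection strategies compared above reduce to ranking candidate agents by different attributes. The sketch below shows that reduction under invented agent attributes; the paper's underlying network dynamics are not reproduced.

```python
# Sketch of the four intermediary-member selection strategies.
import random

random.seed(0)
agents = [{"id": i,
           "knowledge": random.random(),    # initial knowledge level
           "absorptive": random.random(),   # absorptive capability
           "innovative": random.random()}   # innovative ability
          for i in range(100)]

def select(agents, k, strategy):
    if strategy == "random":
        return random.sample(agents, k)
    # The remaining three strategies rank agents by a single attribute.
    return sorted(agents, key=lambda a: a[strategy], reverse=True)[:k]

# The study reports that innovative-ability-based selection best promotes
# system-level knowledge growth.
members = select(agents, 10, "innovative")
print(sorted(a["id"] for a in members))
```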
Ambulatory orthopaedic surgery patients' knowledge with internet-based education.
Heikkinen, Katja; Leino-Kilpi, H; Salanterä, S
2012-01-01
There is a growing need for patient education and an evaluation of its outcomes. The aim of this study was to compare ambulatory orthopaedic surgery patients' knowledge with Internet-based education and face-to-face education with a nurse. The following hypothesis was proposed: Internet-based patient education (experiment) is as effective as face-to-face education with a nurse (control) in increasing patients' level of knowledge and sufficiency of knowledge. In addition, the correlations of demographic variables were tested. The patients were randomized to either an experiment group (n = 72) or a control group (n = 75). Empirical data were collected with two instruments. Patients in both groups showed improvement in their knowledge during their care. Patients in the experiment group improved their knowledge level significantly more in total than those patients in the control group. There were no differences in patients' sufficiency of knowledge between the groups. Knowledge was correlated especially with patients' age, gender and earlier ambulatory surgeries. As a conclusion, positive results concerning patients' knowledge could be achieved with Internet-based education. The Internet is a viable method in ambulatory care.
NASA Technical Reports Server (NTRS)
Fayyad, Kristina E.; Hill, Randall W., Jr.; Wyatt, E. J.
1993-01-01
This paper presents a case study of the knowledge engineering process employed to support the Link Monitor and Control Operator Assistant (LMCOA). The LMCOA is a prototype system which automates the configuration, calibration, test, and operation (referred to as precalibration) of the communications, data processing, metric data, antenna, and other equipment used to support space-ground communications with deep space spacecraft in NASA's Deep Space Network (DSN). The primary knowledge base in the LMCOA is the Temporal Dependency Network (TDN), a directed graph which provides a procedural representation of the precalibration operation. The TDN incorporates precedence, temporal, and state constraints and uses several supporting knowledge bases and data bases. The paper provides a brief background on the DSN, and describes the evolution of the TDN and supporting knowledge bases, the process used for knowledge engineering, and an analysis of the successes and problems of the knowledge engineering effort.
Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith
2013-01-01
Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the Promoting Action on Research Implementation in Health Services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts.
NASA Astrophysics Data System (ADS)
Kapiarsa, A. B.; Sariffuddin, S.
2018-02-01
Local knowledge in disaster management should not be neglected in developing community resilience. The circular relation between humans and their living habitat, together with community social relations, has developed three forms of local knowledge: specialized knowledge, shared knowledge, and common knowledge. Its correlation with community-based disaster management has become an important discussion, especially in answering the question: can local knowledge underlie the development of a community-based disaster risk reduction concept? To answer this question, this research used a mixed-methods approach. Interviews and cross-tabulation for 73 respondents, at a 90% confidence level, were used to determine the correlation between local knowledge and community characteristics. This research found that shared knowledge dominated community local knowledge (77%), while common knowledge and specialized knowledge accounted for 8% and 15%, respectively. The high share of shared knowledge (77%) indicated that local knowledge occurred at the household level and was not yet evident at the community level. Shared knowledge was found in all 3 phases of community resilience in dealing with disaster: mitigation, emergency response, and recovery. This research has therefore opened a new scientific discussion on the self-help and community-help concepts in CBDRM concept development in Indonesia.
Good, David; Lo, Joseph; Lee, W Robert; Wu, Q Jackie; Yin, Fang-Fang; Das, Shiva K
2013-09-01
Intensity modulated radiation therapy (IMRT) treatment planning can have wide variation among different treatment centers. We propose a system to leverage the IMRT planning experience of larger institutions to automatically create high-quality plans for outside clinics. We explore feasibility by generating plans for patient datasets from an outside institution by adapting plans from our institution. A knowledge database was created from 132 IMRT treatment plans for prostate cancer at our institution. The outside institution, a community hospital, provided the datasets for 55 prostate cancer cases, including their original treatment plans. For each "query" case from the outside institution, a similar "match" case was identified in the knowledge database, and the match case's plan parameters were then adapted and optimized to the query case by use of a semiautomated approach that required no expert planning knowledge. The plans generated with this knowledge-based approach were compared with the original treatment plans at several dose cutpoints. Compared with the original plan, the knowledge-based plan had a significantly more homogeneous dose to the planning target volume and a significantly lower maximum dose. The volumes of the rectum, bladder, and femoral heads above all cutpoints were nominally lower for the knowledge-based plan; the reductions were significantly lower for the rectum. In 40% of cases, the knowledge-based plan had overall superior (lower) dose-volume histograms for rectum and bladder; in 54% of cases, the comparison was equivocal; in 6% of cases, the knowledge-based plan was inferior for both bladder and rectum. Knowledge-based planning was superior or equivalent to the original plan in 95% of cases. The knowledge-based approach shows promise for homogenizing plan quality by transferring planning expertise from more experienced to less experienced institutions. Copyright © 2013 Elsevier Inc. All rights reserved.
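The retrieval step of the knowledge-based approach above amounts to finding the stored case nearest to the query anatomy and reusing its plan parameters as a starting point for optimization. The sketch below illustrates that step only; the feature set and distance measure are invented simplifications of the matching actually used.

```python
# Minimal sketch of case retrieval for knowledge-based planning.
import math

knowledge_db = [
    {"ptv_volume": 88.0, "rectum_overlap": 0.12, "plan": "case_A_params"},
    {"ptv_volume": 61.0, "rectum_overlap": 0.05, "plan": "case_B_params"},
]

def match(query, db):
    """Return the stored case with the smallest feature distance."""
    def dist(case):
        return math.hypot(case["ptv_volume"] - query["ptv_volume"],
                          100 * (case["rectum_overlap"] - query["rectum_overlap"]))
    return min(db, key=dist)

query = {"ptv_volume": 85.0, "rectum_overlap": 0.10}
best = match(query, knowledge_db)
print(best["plan"])  # then adapted and re-optimized for the query anatomy
```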
Vertically Integrated Seismological Analysis II : Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with an acceptance probability α(x′ | x) = min(1, π(x′) q(x | x′) / (π(x) q(x′ | x))). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution π(x) = P(x | y) is the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or human analysts.
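The acceptance rule quoted above is the standard Metropolis-Hastings step. The sketch below applies it to a toy one-dimensional target; in the seismic setting each move type (birth, death, split, merge, swap) would supply its own proposal density q, and the state x would be a full event hypothesis rather than a number.

```python
# Generic M-H step: alpha = min(1, pi(x')q(x|x') / (pi(x)q(x'|x))).
import math
import random

random.seed(1)

def pi(x):
    # Toy unnormalized target: standard normal kernel.
    return math.exp(-0.5 * x * x)

def q_sample(x):
    # Random-walk proposal around the current state.
    return x + random.gauss(0.0, 1.0)

def q_density(x_new, x_old):
    # Proposal density; symmetric here, so it cancels in alpha.
    d = x_new - x_old
    return math.exp(-0.5 * d * d) / math.sqrt(2.0 * math.pi)

x = 0.0
for _ in range(1000):
    x_prop = q_sample(x)
    alpha = min(1.0, pi(x_prop) * q_density(x, x_prop)
                     / (pi(x) * q_density(x_prop, x)))
    if random.random() < alpha:
        x = x_prop
print(x)  # a (correlated) draw from the target after 1000 steps
```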
What Is the Current Level of Asthma Knowledge in Elementary, Middle, and High School Teachers?
ERIC Educational Resources Information Center
Carey, Stephen
2013-01-01
This study examined teacher asthma knowledge based on three areas including (a) the level of teacher asthma knowledge in the Maury County Public School System, (b) the level of teacher asthma knowledge based on five demographic factors, and (c) the level of teacher asthma knowledge in the Maury County Public School System compared with teacher…
Knowledge-Based Entrepreneurship in a Boundless Research System
ERIC Educational Resources Information Center
Dell'Anno, Davide
2008-01-01
International entrepreneurship and knowledge-based entrepreneurship have recently generated considerable academic and non-academic attention. This paper explores the "new" field of knowledge-based entrepreneurship in a boundless research system. Cultural barriers to the development of business opportunities by researchers persist in some academic…
Building Better Decision-Support by Using Knowledge Discovery.
ERIC Educational Resources Information Center
Jurisica, Igor
2000-01-01
Discusses knowledge-based decision-support systems that use artificial intelligence approaches. Addresses the issue of how to create an effective case-based reasoning system for complex and evolving domains, focusing on automated methods for system optimization and domain knowledge evolution that can supplement knowledge acquired from domain…
Methods and systems for detecting abnormal digital traffic
Goranson, Craig A [Kennewick, WA; Burnette, John R [Kennewick, WA
2011-03-22
Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.
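The claim above leaves the aggregation formula open; one simple reading is a weighted average of per-node anomaly characterizations. The sketch below uses that reading, with invented node names, scores, and weights.

```python
# Sketch: combine knowledge-node characterizations into a confidence value.
def confidence(characterizations, weights):
    """characterizations: node -> anomaly score in [0, 1];
    weights: node -> weighting factor."""
    total = sum(weights[n] for n in characterizations)
    return sum(characterizations[n] * weights[n]
               for n in characterizations) / total

scores = {"flow_volume": 0.9, "port_profile": 0.4, "timing_model": 0.7}
weights = {"flow_volume": 2.0, "port_profile": 1.0, "timing_model": 1.5}
print(round(confidence(scores, weights), 3))  # confidence traffic is abnormal
```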
Research and application of knowledge resources network for product innovation.
Li, Chuan; Li, Wen-qiang; Li, Yan; Na, Hui-zhen; Shi, Qian
2015-01-01
In order to enhance the capabilities of knowledge services in a product innovation design service platform, a method of acquiring knowledge resources supporting product innovation from the Internet and actively pushing knowledge to users is proposed. Through knowledge modeling for product innovation based on ontology, the integrated architecture of a knowledge resources network is put forward. The technology for the acquisition of network knowledge resources based on focused crawlers and web services is studied. Active knowledge push is provided for users through user behavior analysis and knowledge evaluation in order to improve users' enthusiasm for participation in the platform. Finally, an application example is illustrated to prove the effectiveness of the method.
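The focused-crawler component above can be pictured as a frontier search that only expands links found on pages judged relevant to the innovation topic. The skeleton below stubs out page fetching and uses a keyword test in place of the paper's ontology-based relevance model.

```python
# Skeleton of a focused crawler for harvesting innovation knowledge resources.
from collections import deque

def fetch(url):
    # Stub: a real crawler would download and parse the page here.
    pages = {"seed": ("design patent case studies", ["a", "b"]),
             "a": ("TRIZ contradiction matrix", []),
             "b": ("unrelated news", [])}
    return pages.get(url, ("", []))

def relevant(text, topic_terms):
    return any(term in text for term in topic_terms)

def crawl(seed, topic_terms, limit=10):
    seen, frontier, hits = {seed}, deque([seed]), []
    while frontier and len(hits) < limit:
        url = frontier.popleft()
        text, links = fetch(url)
        if relevant(text, topic_terms):
            hits.append(url)
            for link in links:  # expand links only from on-topic pages
                if link not in seen:
                    seen.add(link)
                    frontier.append(link)
    return hits

print(crawl("seed", ["design", "TRIZ"]))  # -> ['seed', 'a']
```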
Towards a Cognitive Organisational Framework for Knowledge Management
2001-09-01
of knowledge now exist only at the bottom of the organisation, with management uninformed on specific detail. For many knowledge-based ... the organisation's internal context - its internal management practices, learning culture and knowledge base. This has particularly been found to ... of the process is based in 'Kaizen'. Within the Western culture we may well have ignored important insights in our own tradition. Traditionally, in
Knowledge Integration in Global R&D Networks
NASA Astrophysics Data System (ADS)
Erkelens, Rose; van den Hooff, Bart; Vlaar, Paul; Huysman, Marleen
This paper reports a qualitative study conducted at multinational organizations' R&D departments about their process of knowledge integration. Taking into account the knowledge based view (KBV) of the firm and the practice-based view of knowledge, and building on the literatures concerning specialization and integration of knowledge in organizations, we explore which factors may have a significant influence on the integration process of knowledge between R&D units. The findings indicate (1) the relevant factors influencing knowledge integration processes and (2) that a thoughtful balance between engineering and emergent approaches is helpful in understanding and overcoming knowledge integration issues.
A Model of Knowledge Based Information Retrieval with Hierarchical Concept Graph.
ERIC Educational Resources Information Center
Kim, Young Whan; Kim, Jin H.
1990-01-01
Proposes a model of knowledge-based information retrieval (KBIR) that is based on a hierarchical concept graph (HCG) which shows relationships between index terms and constitutes a hierarchical thesaurus as a knowledge base. Conceptual distance between a query and an object is discussed and the use of Boolean operators is described. (25…
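Conceptual distance over an HCG is naturally read as path length between index terms in the thesaurus hierarchy. The sketch below computes that distance by breadth-first search over an invented miniature thesaurus.

```python
# Sketch: conceptual distance as shortest-path length in a concept graph.
from collections import deque

hcg = {  # undirected broader/narrower links between index terms
    "knowledge base": ["expert system", "thesaurus"],
    "expert system": ["knowledge base", "inference engine"],
    "thesaurus": ["knowledge base", "index term"],
    "inference engine": ["expert system"],
    "index term": ["thesaurus"],
}

def conceptual_distance(a, b):
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == b:
            return d
        for nxt in hcg[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return None  # terms are unconnected

print(conceptual_distance("index term", "inference engine"))  # -> 4
```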
ERIC Educational Resources Information Center
Hsu, Chung-Yuan; Tsai, Meng-Jung; Chang, Yu-Hsuan; Liang, Jyh-Chong
2017-01-01
Using the Game-based-learning Teaching Belief Scale (GTBS) and the Technological Pedagogical Content Knowledge--Games questionnaire (TPACK-G), this study investigated 316 Taiwanese in-service teachers' teaching beliefs about game-based learning and their perceptions of game-based pedagogical content knowledge (GPCK). Both t-tests and ANOVA…
ERIC Educational Resources Information Center
Wu, Ji-Wei; Tseng, Judy C. R.; Hwang, Gwo-Jen
2015-01-01
Inquiry-Based Learning (IBL) is an effective approach for promoting active learning. When inquiry-based learning is incorporated into instruction, teachers provide guiding questions for students to actively explore the required knowledge in order to solve the problems. Although the World Wide Web (WWW) is a rich knowledge resource for students to…
Integration of object-oriented knowledge representation with the CLIPS rule based system
NASA Technical Reports Server (NTRS)
Logie, David S.; Kamil, Hasan
1990-01-01
The paper describes a portion of the work aimed at developing an integrated, knowledge-based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ and is used to build and modify an object-oriented knowledge base. The ORL was designed to be easily integrated with other representation schemes that can effectively reason with the object base. Specifically, the integration of the ORL with the rule-based system C Language Integrated Production System (CLIPS), developed at the NASA Johnson Space Center, is discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data are inherited through an object network via the relationship links. Together, the two schemes complement each other: the object-oriented approach efficiently handles problem data, while the rule-based knowledge is used to simulate the reasoning process. Alone, the object-based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.
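A small sketch may clarify how the two schemes divide the work: objects hold the problem data and relationship links, while a rule (standing in for the CLIPS side) reasons over them. All names and the rule itself are invented for the example.

```python
# Invented object base: objects carry properties and relationship links, and
# a forward-chaining rule (standing in for CLIPS) reasons over them.

class Obj:
    def __init__(self, name, **props):
        self.name = name
        self.props = props
        self.relations = {}

pump = Obj("pump1", status="on")
valve = Obj("valve1", status="closed")
pump.relations["feeds"] = valve   # relationship link, as in the ORL

facts = set()

def rule_blocked_flow(objects):
    # Rule: a running component feeding a closed valve implies blocked flow.
    for obj in objects:
        fed = obj.relations.get("feeds")
        if obj.props.get("status") == "on" and fed and fed.props.get("status") == "closed":
            facts.add(("blocked-flow", obj.name, fed.name))

rule_blocked_flow([pump, valve])
print(facts)  # {('blocked-flow', 'pump1', 'valve1')}
```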
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes were found that would be enhanced by the use of knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts and knowledge-based system requirements. A quick-prototype knowledge-based system environment is researched and developed.
ERIC Educational Resources Information Center
Paisey, Catriona; Paisey, Nicholas J.
2007-01-01
Knowledge is a defining characteristic of professions but, given the ever-increasing knowledge base, curricula in professional disciplines are becoming overcrowded. This paper discusses propositional/declarative knowledge in relation to the accounting knowledge base. Historically, professional areas such as medicine, law and accountancy have…
A Comparison of Books and Hypermedia for Knowledge-based Sports Coaching.
ERIC Educational Resources Information Center
Vickers, Joan N.; Gaines, Brian R.
1988-01-01
Summarizes and illustrates the knowledge-based approach to instructional material design. A series of sports coaching handbooks and hypermedia presentations of the same material are described and the different instantiations of the knowledge and training structures are compared. Figures show knowledge structures for badminton and the architecture…
1989-02-01
which capture the knowledge of such experts. These Expert Systems, or Knowledge-Based Systems, differ from the usual computer programming techniques ... their applications in the fields of structural design and welding are reviewed. 5.1 Introduction: Expert Systems, or KBES, are computer programs using AI ... procedurally constructed as conventional computer programs usually are; the knowledge base of such systems is executable, unlike databases
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experience, theories, typical examples, and other related knowledge used in the FEA pre-processing stage were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool's column is presented to demonstrate the validity of the system.
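The integrated reasoning loop described here (retrieve a similar case, then adapt it with rules) can be sketched as follows. The case records, the similarity measure, and the adaptation rule are all invented placeholders, not the paper's actual knowledge model.

```python
# Hedged sketch of CBR plus rule-based reasoning for FEA pre-processing:
# retrieve the most similar past case, then let a rule adapt its parameters.
# All data and the scaling rule are invented for illustration.

CASES = [
    {"part": "column", "length_mm": 2000, "mesh_mm": 25},
    {"part": "bracket", "length_mm": 150, "mesh_mm": 5},
]

def retrieve(query):
    # Case-based step: nearest past case by part length.
    return min(CASES, key=lambda c: abs(c["length_mm"] - query["length_mm"]))

def adapt(case, query):
    # Rule-based step: scale mesh size proportionally to part length.
    mesh = case["mesh_mm"] * query["length_mm"] / case["length_mm"]
    return max(1.0, round(mesh, 1))

query = {"part": "machine-tool column", "length_mm": 1800}
base = retrieve(query)
print(f"reuse case '{base['part']}', adapted mesh size: {adapt(base, query)} mm")
```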
Is Probabilistic Evidence a Source of Knowledge?
ERIC Educational Resources Information Center
Friedman, Ori; Turri, John
2015-01-01
We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…
The Assessment of Athletics "Knowledge" with Written and Video Tests
ERIC Educational Resources Information Center
van Vuuren-Cassar, Gemma; Lamprianou, Iasonas
2006-01-01
Background: Athletics programmes for secondary schools include a variety of skills, knowledge and cognitive abilities, which are currently assessed through written, practical, oral and/or video-based tests. Skills are traditionally taught in practice-based sessions, while the knowledge aspect is often reinforced in class-based sessions with…
Mapping and Managing Knowledge and Information in Resource-Based Learning
ERIC Educational Resources Information Center
Tergan, Sigmar-Olaf; Graber, Wolfgang; Neumann, Anja
2006-01-01
In resource-based learning scenarios, students are often overwhelmed by the complexity of task-relevant knowledge and information. Techniques for the external interactive representation of individual knowledge in graphical format may help them to cope with complex problem situations. Advanced computer-based concept-mapping tools have the potential…
Development of a Spacecraft Materials Selector Expert System
NASA Technical Reports Server (NTRS)
Pippin, G.; Kauffman, W. (Technical Monitor)
2002-01-01
This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.
A NASA/RAE cooperation in the development of a real-time knowledge-based autopilot
NASA Technical Reports Server (NTRS)
Daysh, Colin; Corbin, Malcolm; Butler, Geoff; Duke, Eugene L.; Belle, Steven D.; Brumbaugh, Randal W.
1991-01-01
As part of a US/UK cooperative aeronautical research program, a joint activity between the NASA Dryden Flight Research Facility and the Royal Aerospace Establishment on knowledge-based systems was established. This joint activity is concerned with tools and techniques for the implementation and validation of real-time knowledge-based systems. The proposed next stage of this research is described, in which some of the problems of implementing and validating a knowledge-based autopilot for a generic high-performance aircraft are investigated.
Trust from the past: Bayesian Personalized Ranking based Link Prediction in Knowledge Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Baichuan; Choudhury, Sutanay; Al-Hasan, Mohammad
2016-02-01
Estimating the confidence for a link is a critical task for Knowledge Graph construction. Link prediction, or predicting the likelihood of a link in a knowledge graph based on prior state, is a key research direction within this area. We propose a Latent Feature Embedding based link recommendation model for the prediction task and utilize a Bayesian Personalized Ranking based optimization technique for learning models for each predicate. Experimental results on large-scale knowledge bases such as YAGO2 show that our approach achieves substantially higher performance than several state-of-the-art approaches. Furthermore, we also study the performance of the link prediction algorithm in terms of topological properties of the Knowledge Graph and present a linear regression model to reason about its expected level of accuracy.
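The following is a hedged sketch of BPR-style training for link prediction, not the paper's exact model: entities receive latent feature vectors, one predicate gets its own scoring weights, and stochastic updates push an observed triple to score above a randomly corrupted one.

```python
# Sketch under assumptions: a bilinear-style score for one predicate, and
# BPR updates that maximize log sigmoid(score_pos - score_neg).
import numpy as np

rng = np.random.default_rng(0)
n_entities, dim = 100, 16
E = rng.normal(scale=0.1, size=(n_entities, dim))  # latent entity features
W = rng.normal(scale=0.1, size=dim)                # weights for one predicate

def score(h, t):
    # Plausibility of the triple (h, predicate, t).
    return E[h] @ (W * E[t])

def bpr_update(h, t_pos, t_neg, lr=0.05):
    # One stochastic ascent step on log sigmoid(score_pos - score_neg).
    g = 1.0 / (1.0 + np.exp(score(h, t_pos) - score(h, t_neg)))
    gh = W * (E[t_pos] - E[t_neg])
    gp, gn, gw = W * E[h], -(W * E[h]), E[h] * (E[t_pos] - E[t_neg])
    E[h] += lr * g * gh
    E[t_pos] += lr * g * gp
    E[t_neg] += lr * g * gn
    W += lr * g * gw

for _ in range(1000):
    # Observed fact (0, predicate, 1); corrupt the tail with a random entity.
    bpr_update(0, 1, int(rng.integers(2, n_entities)))
print(score(0, 1) > score(0, 42))  # the observed link should now rank higher
```

The design choice BPR makes is to optimize a pairwise ranking objective rather than absolute scores, which suits knowledge graphs where only positive triples are observed.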
Computational neuroanatomy: ontology-based representation of neural components and connectivity.
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-02-05
A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
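The kind of symbolic lookup such an ontology enables can be illustrated with a toy connectivity model; the structures and assertions below are a small invented subset, not the authors' motor-system model.

```python
# Invented subset of motor-system connectivity; a real model would be an
# ontology with typed relations (part-of, connects-to) and richer inference.

CONNECTS_TO = {
    "primary motor cortex": ["internal capsule"],
    "internal capsule": ["cerebral peduncle"],
    "cerebral peduncle": ["corticospinal tract"],
}

def downstream(structure, found=None):
    """Transitive closure over connects-to: everything reachable downstream."""
    found = [] if found is None else found
    for nxt in CONNECTS_TO.get(structure, []):
        if nxt not in found:
            found.append(nxt)
            downstream(nxt, found)
    return found

# A symbolic query of the kind the abstract mentions, e.g. for surgical
# planning: which structures could a lesion here affect downstream?
print(downstream("primary motor cortex"))
```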
Yao, Bin; Kang, Hong; Miao, Qi; Zhou, Sicheng; Liang, Chen; Gong, Yang
2017-01-01
Patient falls are a common safety event type that impairs healthcare quality. Strategies for preventing patient falls, including solution tools and reporting systems, have been developed and implemented in the U.S. However, current strategies do not include timely knowledge support, which is greatly needed to bridge the gap between reporting and learning. In this study, we constructed a knowledge base of fall events by combining expert-reviewed fall prevention solutions and then integrating them into a reporting system. The knowledge base enables timely and tailored knowledge support and thus can serve as a practical fall prevention tool. This effort holds promise in making knowledge acquisition and management a routine process for enhancing the reporting and understanding of patient safety events.
ERIC Educational Resources Information Center
Lowe, Joel Courtney
2013-01-01
This study explores teachers' reactions to a knowledge- and skills-based pay (KSBP) system implemented in a large international school. Such systems are designed to set teacher compensation based on demonstrated professional knowledge and skills as opposed to the traditional scale based on years of experience and degrees attained. This study fills…
Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.
Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie
2016-06-01
Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.
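One plausible reading of the first method (formal transition knowledge constraining statistical classifiers) is sketched below. The phase names, transition table, and per-frame probabilities are invented; the paper's actual random-forest composition is more involved.

```python
# Formal knowledge (allowed phase transitions) masks per-frame probabilities
# from any experience-based classifier. All names and numbers are invented.

TRANSITIONS = {  # phase -> phases reachable next
    "preparation": {"preparation", "dissection"},
    "dissection": {"dissection", "resection"},
    "resection": {"resection", "closure"},
    "closure": {"closure"},
}

def recognize(prob_sequence, start="preparation"):
    current, recognized = start, []
    for probs in prob_sequence:  # probs: phase -> classifier confidence
        allowed = TRANSITIONS[current]
        current = max(allowed, key=lambda p: probs.get(p, 0.0))
        recognized.append(current)
    return recognized

frames = [
    {"preparation": 0.7, "resection": 0.2},
    {"dissection": 0.6, "closure": 0.3},   # 'closure' ruled out formally
    {"resection": 0.8, "preparation": 0.1},
]
print(recognize(frames))  # ['preparation', 'dissection', 'resection']
```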
Zhang, Yi-Fan; Tian, Yu; Zhou, Tian-Shu; Araki, Kenji; Li, Jing-Song
2016-01-01
The broad adoption of clinical decision support systems within clinical practice has been hampered mainly by the difficulty in expressing domain knowledge and patient data in a unified formalism. This paper presents a semantic-based approach to the unified representation of healthcare domain knowledge and patient data for practical clinical decision making applications. A four-phase knowledge engineering cycle is implemented to develop a semantic healthcare knowledge base based on an HL7 reference information model, including an ontology to model domain knowledge and patient data and an expression repository to encode clinical decision making rules and queries. A semantic clinical decision support system is designed to provide patient-specific healthcare recommendations based on the knowledge base and patient data. The proposed solution is evaluated in the case study of type 2 diabetes mellitus inpatient management. The knowledge base is successfully instantiated with relevant domain knowledge and testing patient data. Ontology-level evaluation confirms model validity. Application-level evaluation of diagnostic accuracy reaches a sensitivity of 97.5%, a specificity of 100%, and a precision of 98%; an acceptance rate of 97.3% is given by domain experts for the recommended care plan orders. The proposed solution has been successfully validated in the case study as providing clinical decision support at a high accuracy and acceptance rate. The evaluation results demonstrate the technical feasibility and application prospect of our approach.
Hosseini, Seyed Kianoosh; Ghalamkari, Marziyeh; Yousefshahi, Fardin; Mireskandari, Seyed Mohammad; Rezaei Hamami, Mohsen
2013-10-28
Cardiopulmonary-cerebral resuscitation (CPCR) training is essential for all hospital workers, especially junior residents who might become the manager of the resuscitation team. In our center, the traditional CPCR knowledge training curriculum for junior residents up to 5 years ago was lecture-based and had some shortcomings. This study aimed to evaluate the effect of a problem-based method on residents' CPCR knowledge and skills as well as their evaluation of their CPCR trainers. This study, conducted at Tehran University of Medical Sciences, included 290 first-year residents in 2009-2010, who were trained via a problem-based method (the problem-based group), and 160 first-year residents in 2003-2004, who were trained via a lecture-based method (the lecture-based group). Other educational techniques and facilities were similar. The participants self-evaluated their own CPCR knowledge and skills before and after the workshop and also assessed their trainers' efficacy after the workshop by completing special questionnaires. The problem-based group had higher self-assessment scores of CPCR knowledge and skills after the workshop: the difference in mean scores between the problem-based and lecture-based groups was 32.36 ± 19.23 vs. 22.33 ± 20.35 for knowledge (p value = 0.003) and 10.13 ± 7.17 vs. 8.19 ± 8.45 for skills (p value = 0.043). The residents' evaluation of their trainers was similar between the two study groups (p value = 0.193), with mean scores of 15.90 ± 2.59 and 15.46 ± 2.90 in the problem-based and lecture-based groups, respectively. The problem-based method increased our residents' self-evaluation scores of their own CPCR knowledge and skills.
Incorporating World Knowledge to Document Clustering via Heterogeneous Information Networks.
Wang, Chenguang; Song, Yangqiu; El-Kishky, Ahmed; Roth, Dan; Zhang, Ming; Han, Jiawei
2015-08-01
One of the key obstacles in making learning protocols realistic in applications is the need to supervise them, a costly process that often requires hiring domain experts. We consider a framework that uses world knowledge as indirect supervision. World knowledge is general-purpose knowledge that is not designed for any specific domain. The key challenges are then how to adapt world knowledge to domains and how to represent it for learning. In this paper, we provide an example of using world knowledge for domain-dependent document clustering. We provide three ways to adapt world knowledge to domains by resolving the ambiguity of entities and their types, and we represent the data with world knowledge as a heterogeneous information network. We then propose a clustering algorithm that can cluster multiple types and incorporate sub-type information as constraints. In the experiments, we use two existing knowledge bases as our sources of world knowledge. One is Freebase, which is collaboratively collected knowledge about entities and their organizations. The other is YAGO2, a knowledge base automatically extracted from Wikipedia that maps knowledge to the linguistic knowledge base WordNet. Experimental results on two text benchmark datasets (20newsgroups and RCV1) show that incorporating world knowledge as indirect supervision can significantly outperform state-of-the-art clustering algorithms as well as clustering algorithms enhanced with world knowledge features.
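A much-simplified sketch of the core idea follows, using entity overlap from a knowledge base as indirect supervision for grouping documents; the documents, entity links, and single-pass grouping rule are invented and far simpler than the paper's HIN clustering algorithm.

```python
# Invented example: documents linked to knowledge-base entities; grouping is
# driven by entity overlap (indirect supervision), not by labeled examples.

DOC_ENTITIES = {
    "doc1": {"NASA", "Space_Shuttle"},
    "doc2": {"NASA", "Hubble_Space_Telescope"},
    "doc3": {"Wall_Street", "NASDAQ"},
    "doc4": {"NASDAQ", "Dow_Jones"},
}

def jaccard(a, b):
    return len(a & b) / len(a | b)

def cluster(docs, threshold=0.2):
    clusters = []
    for doc, entities in docs.items():
        for group in clusters:
            # Join the first group sharing enough entities with this document.
            if any(jaccard(entities, docs[d]) >= threshold for d in group):
                group.append(doc)
                break
        else:
            clusters.append([doc])
    return clusters

print(cluster(DOC_ENTITIES))  # [['doc1', 'doc2'], ['doc3', 'doc4']]
```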
Study of Design Knowledge Capture (DKC) schemes implemented in magnetic bearing applications
NASA Technical Reports Server (NTRS)
1990-01-01
A design knowledge capture (DKC) scheme was implemented using frame-based techniques. The objective of such a system is to capture not only the knowledge that describes a design, but also that which explains how the design decisions were reached. These knowledge types were labelled definitive and explanatory, respectively. Examination of the design process helped determine what knowledge to retain and at what stage that knowledge is used. A discussion of frames resulted in the recognition of their value for knowledge representation and organization. The FORMS frame system was used as a basis for further development and for examples using magnetic bearing design. The specific contributions made by this research include: determination that frame-based systems provide a useful methodology for the management and application of design knowledge; definition of specific user interface requirements (a window-based browser); specification of syntax for DKC commands; and demonstration of the feasibility of DKC by application to existing designs. It was determined that design knowledge capture could become an extremely valuable engineering tool for complicated, long-life systems, but that further work is needed, particularly the development of a graphical, window-based interface.
From Data to Knowledge through Concept-oriented Terminologies
Cimino, James J.
2000-01-01
Knowledge representation involves enumeration of conceptual symbols and arrangement of these symbols into some meaningful structure. Medical knowledge representation has traditionally focused more on the structure than the symbols. Several significant efforts are under way, at local, national, and international levels, to address the representation of the symbols through the creation of high-quality terminologies that are themselves knowledge based. This paper reviews these efforts, including the Medical Entities Dictionary (MED) in use at Columbia University and the New York Presbyterian Hospital. A decade's experience with the MED is summarized to serve as a proof-of-concept that knowledge-based terminologies can support the use of coded patient data for a variety of knowledge-based activities, including the improved understanding of patient data, the access of information sources relevant to specific patient care problems, the application of expert systems directly to the care of patients, and the discovery of new medical knowledge. The terminological knowledge in the MED has also been used successfully to support clinical application development and maintenance, including that of the MED itself. On the basis of this experience, current efforts to create standard knowledge-based terminologies appear to be justified. PMID:10833166
Knowledge-Based Aid: A Four Agency Comparative Study
ERIC Educational Resources Information Center
McGrath, Simon; King, Kenneth
2004-01-01
Part of the response of many development cooperation agencies to the challenges of globalisation, ICTs and the knowledge economy is to emphasise the importance of knowledge for development. This paper looks at the discourses and practices of "knowledge-based aid" through an exploration of four agencies: the World Bank, DFID, Sida and JICA. It…
Study of Sharing Knowledge Resources in Business Schools
ERIC Educational Resources Information Center
Ranjan, Jayanthi
2011-01-01
Purpose: The purpose of this paper is to propose a common business school framework based on knowledge resources that are available in business schools. To support the arguments based on the reviewed literature, the paper presents a holistic framework of knowledge resources in a business school and also provides a knowledge value chain in sharing…
Knowledge Sharing in an American Multinational Company Based in Malaysia
ERIC Educational Resources Information Center
Ling, Chen Wai; Sandhu, Manjit S.; Jain, Kamal Kishore
2009-01-01
Purpose: This paper seeks to examine the views of executives working in an American-based multinational company (MNC) about knowledge sharing, barriers to knowledge sharing, and strategies to promote knowledge sharing. Design/methodology/approach: This study was carried out in phases. In the first phase, a topology of organizational mechanisms for…
Achieving and Sustaining New Knowledge Development in High-Expectation Start-Ups
ERIC Educational Resources Information Center
Matricano, Diego
2010-01-01
In markets characterized by strong competition, new knowledge and new knowledge development are generally recognized as the key means for an enterprise to gain competitive advantage. This knowledge-based competitive advantage is critical for all commercial ventures, but is especially so for high-expectation start-ups (technology-based ventures…
ERIC Educational Resources Information Center
Mogharreban, Namdar
2004-01-01
A typical tutorial system functions by means of interaction between four components: the expert knowledge base component, the inference engine component, the learner's knowledge component and the user interface component. In typical tutorial systems the interaction and the sequence of presentation as well as the mode of evaluation are…
Arar, Nedal; Knight, Sara J; Modell, Stephen M; Issa, Amalia M
2011-03-01
The main mission of the Genomic Applications in Practice and Prevention Network™ is to advance collaborative efforts involving partners from across the public health sector to realize the promise of genomics in healthcare and disease prevention. We introduce a new framework that supports the Genomic Applications in Practice and Prevention Network mission and leverages the characteristics of the complex adaptive systems approach. We call this framework the Genome-based Knowledge Management in Cycles model (G-KNOMIC). G-KNOMIC proposes that the collaborative work of multidisciplinary teams utilizing genome-based applications will enhance translating evidence-based genomic findings by creating ongoing knowledge management cycles. Each cycle consists of knowledge synthesis, knowledge evaluation, knowledge implementation and knowledge utilization. Our framework acknowledges that all the elements in the knowledge translation process are interconnected and continuously changing. It also recognizes the importance of feedback loops, and the ability of teams to self-organize within a dynamic system. We demonstrate how this framework can be used to improve the adoption of genomic technologies into practice using two case studies of genomic uptake.
Issues on the use of meta-knowledge in expert systems
NASA Technical Reports Server (NTRS)
Facemire, Jon; Chen, Imao
1988-01-01
Meta-knowledge is knowledge about knowledge; knowledge that is not domain specific but is concerned instead with its own internal structure. Several past systems have used meta-knowledge to improve the nature of the user interface, to maintain the knowledge base, and to control the inference engine. More extensive use of meta-knowledge is probable in the future as larger-scale problems are considered. A proposed system architecture is presented and discussed in terms of meta-knowledge applications. The principal components of this system (the user support subsystem, the control structure, the knowledge base, the inference engine, and a learning facility) are all outlined and discussed in light of the use of meta-knowledge. Problems with meta-constructs are also mentioned, but it is concluded that the use of meta-knowledge is crucial for increasingly autonomous operations.
Bromme, Rainer; Beelmann, Andreas
2018-04-01
Many social science-based interventions entail the transfer of evidence-based knowledge to the "target population," because the acquisition and the acceptance of that knowledge are necessary for the intended improvement of behavior or development. Furthermore, the application of a certain prevention program is often legitimated by a reference to science-based reasons such as an evaluation according to scientific standards. Hence, any implementation of evidence-based knowledge and programs is embedded in the public understanding of (social) science. Based on recent research on such public understanding of science, we shall discuss transfer as a process of science communication.
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
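A rough sketch of the extraction step follows: turning label-and-connectivity records from a CAD database into frame-like structures. The record format and slot names are assumptions; the abstract does not specify AKG's internal representation.

```python
# Invented record format: (component label, component type, connected labels),
# the kind of label-and-connectivity data the abstract says CAD databases
# provide. Slot names ('is-a', 'ports') are assumptions.

CAD_RECORDS = [
    ("V1", "valve", ["P1"]),
    ("P1", "pipe", ["V1", "T1"]),
    ("T1", "tank", ["P1"]),
]

def to_frames(records):
    """Map each CAD record to a frame-like structure for model-based reasoning."""
    return {
        label: {"is-a": ctype, "ports": connections}
        for label, ctype, connections in records
    }

frames = to_frames(CAD_RECORDS)
print(frames["P1"])  # {'is-a': 'pipe', 'ports': ['V1', 'T1']}
```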
Acar, Oguz Ali; van den Ende, Jan
2015-01-01
Global prize-based science contests have great potential for tapping into diverse knowledge on a global scale and overcoming important scientific challenges. A necessary step for knowledge to be utilized in these contests is for that knowledge to be disclosed. Knowledge disclosure, however, is paradoxical in nature: in order for the value of knowledge to be assessed, inventors must disclose their knowledge, but then the person who receives that knowledge does so at no cost and may use it opportunistically. This risk of potential opportunistic behavior in turn makes the inventor fearful of disclosing knowledge, and this is a major psychological barrier to knowledge disclosure. In this project, we investigated this fear of opportunism in global prize-based science contests by surveying 630 contest participants in the InnoCentive online platform for science contests. We found that participants in these science contests experience fear of opportunism to varying degrees, and that women and older participants have significantly less fear of disclosing their scientific knowledge. Our findings highlight the importance of taking differences in such fears into account when designing global prize-based contests so that the potential of the contests for reaching solutions to important and challenging problems can be used more effectively. PMID:26230086
Hubble Space Telescope Design Engineering Knowledgebase (HSTDEK)
NASA Technical Reports Server (NTRS)
Johannes, James D.; Everetts, Clark
1989-01-01
The research covered here pays specific attention to the development of tools to assist knowledge engineers in acquiring knowledge and to assist other technical, engineering, and management personnel in automatically performing knowledge capture as part of their everyday work without adding any extra work to what they already do. Requirements for data products, the knowledge base, and methods for mapping knowledge in the documents onto the knowledge representations are discussed, as are some of the difficulties of capturing in the knowledge base the structure of the design process itself, along with a model of the system designed. The capture of knowledge describing the interactions of different components is also discussed briefly.
The Study on Collaborative Manufacturing Platform Based on Agent
NASA Astrophysics Data System (ADS)
Zhang, Xiao-yan; Qu, Zheng-geng
To address the knowledge-intensive trend in collaborative manufacturing development, we describe a multi-agent architecture supporting a knowledge-based collaborative manufacturing development platform. By virtue of the wrapper services and communication capabilities the agents provide, the proposed architecture facilitates the organization and collaboration of multi-disciplinary individuals and tools. By effectively supporting the formal representation, capture, retrieval, and reuse of manufacturing knowledge, the generalized knowledge repository based on an ontology library enables engineers to meaningfully exchange information and pass knowledge across boundaries. Intelligent agent technology increases the efficiency and interoperability of traditional KBE systems and provides comprehensive design environments for engineers.
A new intrusion prevention model using planning knowledge graph
NASA Astrophysics Data System (ADS)
Cai, Zengyu; Feng, Yuan; Liu, Shuru; Gan, Yong
2013-03-01
Intelligent planning is a very important research area in artificial intelligence that has been applied in network security. This paper proposes a new intrusion prevention model based on a planning knowledge graph and discusses the system architecture and characteristics of this model. Intrusion prevention is accomplished through plan recognition based on the planning knowledge graph, while intrusion response strategies and actions are produced by a hierarchical task network (HTN) planner. The intrusion prevention system thus gains the advantages of intelligent planning: knowledge sharing, focused response, learning autonomy, and protective ability.
System and method for knowledge based matching of users in a network
Verspoor, Cornelia Maria [Santa Fe, NM; Sims, Benjamin Hayden [Los Alamos, NM; Ambrosiano, John Joseph [Los Alamos, NM; Cleland, Timothy James [Los Alamos, NM
2011-04-26
A knowledge-based system and methods for matchmaking and social network extension are disclosed. The system is configured to allow users to specify knowledge profiles, which are collections of concepts, selected from an underlying knowledge model, that indicate a certain topic or area of interest. The system utilizes the knowledge model as the semantic space within which to compare similarities in user interests. The knowledge model is hierarchical, so that an indication of interest in a specific concept automatically implies interest in more general concepts. Similarity measures between profiles may then be calculated based on suitable distance formulas within this space.
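A minimal sketch of the hierarchical matching idea, with an invented concept hierarchy: each profile is expanded upward so that specific interests imply general ones, and profiles are compared on the expanded sets. The Jaccard measure is one plausible choice of the "suitable distance formulas" the abstract leaves unspecified.

```python
# Invented concept hierarchy: concept -> parent. Expanding a profile upward
# encodes "interest in specific concepts implies interest in general ones".

HIERARCHY = {
    "seismology": "geophysics",
    "infrasound": "geophysics",
    "geophysics": "earth science",
}

def expand(profile):
    expanded = set()
    for concept in profile:
        while concept:
            expanded.add(concept)
            concept = HIERARCHY.get(concept)
    return expanded

def similarity(p1, p2):
    e1, e2 = expand(p1), expand(p2)
    return len(e1 & e2) / len(e1 | e2)

# Two users with different specific interests still match via generalizations.
print(similarity({"seismology"}, {"infrasound"}))  # 0.5
```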
Iyama, Yuji; Nakaura, Takeshi; Kidoh, Masafumi; Oda, Seitaro; Utsunomiya, Daisuke; Sakaino, Naritsugu; Tokuyasu, Shinichi; Osakabe, Hirokazu; Harada, Kazunori; Yamashita, Yasuyuki
2016-11-01
The purpose of this study was to evaluate the noise and image quality of images reconstructed with knowledge-based iterative model reconstruction (knowledge-based IMR) in ultra-low dose cardiac computed tomography (CT). We performed submillisievert radiation dose coronary CT angiography on 43 patients. We also performed a phantom study to evaluate the influence of object size with the automatic exposure control phantom. We reconstructed the clinical and phantom studies with filtered back projection (FBP), hybrid iterative reconstruction (hybrid IR), and knowledge-based IMR. We measured the effective dose for each patient and compared the CT number, image noise, and contrast-to-noise ratio in the ascending aorta for each reconstruction technique. We examined the relationship between image noise and body mass index in the clinical study, and between image noise and object size in the phantom study. The mean effective dose was 0.98 ± 0.25 mSv. The image noise of knowledge-based IMR images was significantly lower than that of FBP and hybrid IR images (knowledge-based IMR: 19.4 ± 2.8; FBP: 126.7 ± 35.0; hybrid IR: 48.8 ± 12.8) (P < .01). The contrast-to-noise ratio of knowledge-based IMR images was significantly higher than that of FBP and hybrid IR images (knowledge-based IMR: 29.1 ± 5.4; FBP: 4.6 ± 1.3; hybrid IR: 13.1 ± 3.5) (P < .01). There were moderate correlations between image noise and body mass index with FBP (r = 0.57, P < .01) and hybrid IR (r = 0.42, P < .01); however, this correlation was weak with knowledge-based IMR (r = 0.27, P < .01). Compared to FBP and hybrid IR, knowledge-based IMR offers significant noise reduction and improved image quality in submillisievert radiation dose cardiac CT.
Towards a standardised representation of a knowledge base for adverse drug event prevention.
Koutkias, Vassilis; Lazou, Katerina; de Clercq, Paul; Maglaveras, Nicos
2011-01-01
Knowledge representation is an important part of knowledge engineering activities that is crucial for enabling knowledge sharing and reuse. In this regard, standardised formalisms and technologies play a significant role. Especially for the medical domain, where knowledge may be tacit, not articulated, and highly diverse, the development and adoption of standardised knowledge representations is highly challenging and of utmost importance to achieve knowledge interoperability. To this end, this paper presents a research effort towards the standardised representation of a Knowledge Base (KB) encapsulating rule-based signals and procedures for Adverse Drug Event (ADE) prevention. The KB constitutes an integral part of Clinical Decision Support Systems (CDSSs) to be used at the point of care. The paper highlights the requirements of the domain of discourse with respect to knowledge representation, according to which GELLO (an HL7 and ANSI standard) has been adopted. Results of our prototype implementation are presented along with the advantages and the limitations introduced by the employed approach.
Reducing the Conflict Factors Strategies in Question Answering System
NASA Astrophysics Data System (ADS)
Suwarningsih, W.; Purwarianti, A.; Supriana, I.
2017-03-01
A rule-based system is prone to conflict because new knowledge continually emerges and must be added to the knowledge base used by the system. A conflict between rules in the knowledge base can lead to reasoning errors or circular reasoning. Therefore, newly added rules must be checked for conflict with existing rules, and only conflict-free rules can actually be added to the knowledge base. Given these conditions, this paper proposes a conflict resolution strategy for a medical debriefing system by analyzing runtime scenarios in order to improve the efficiency and reliability of the system.
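A simple sketch of the admission check such a strategy implies, under an assumed rule format (condition set plus conclusion): a candidate rule is flagged when an existing rule fires on the same conditions but asserts a different conclusion. The paper's actual strategy analyzes runtime scenarios and is richer than this.

```python
# Assumed rule format: (frozenset of conditions, conclusion). A new rule is
# admitted only if no existing rule with identical conditions concludes
# something different.

def conflicts(new_rule, knowledge_base):
    new_conds, new_concl = new_rule
    return [
        (conds, concl)
        for conds, concl in knowledge_base
        if conds == new_conds and concl != new_concl
    ]

kb = [(frozenset({"fever", "cough"}), "suspect-flu")]
candidate = (frozenset({"fever", "cough"}), "rule-out-flu")

found = conflicts(candidate, kb)
if found:
    print("reject or resolve before adding:", found)
```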
A knowledge base architecture for distributed knowledge agents
NASA Technical Reports Server (NTRS)
Riedesel, Joel; Walls, Bryan
1990-01-01
A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed, depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms, as in the LINDA model, as well as using more familiar message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
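A bare-bones sketch of the LINDA-style coordination described above: agents interact only by writing tuples to a shared space and taking tuples that match a pattern. The tuple contents are invented, and a real implementation would add blocking reads and a distributed backing store.

```python
class TupleSpace:
    """Shared space; None in a pattern matches any field (LINDA-style 'in')."""

    def __init__(self):
        self.tuples = []

    def out(self, *tup):
        # Write a tuple into the space.
        self.tuples.append(tup)

    def inp(self, *pattern):
        # Take (remove and return) the first tuple matching the pattern.
        for tup in self.tuples:
            if len(tup) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, tup)
            ):
                self.tuples.remove(tup)
                return tup
        return None

space = TupleSpace()
space.out("fault", "bus-A", "overcurrent")   # a monitoring agent reports
task = space.inp("fault", None, None)        # a diagnosis agent consumes
print(task)  # ('fault', 'bus-A', 'overcurrent')
```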
Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation
1989-08-01
positive, as false positives generated by a medical program can often be caught by a physician upon further testing. False negatives, however, may be ... improvement over the knowledge base tested is obtained. Although our work is pretty much theoretical research oriented, one example of experiments is ... knowledge base, improves the performance by about 10% ... of tests. First, we divide the cases into a training set and a validation set with 70% vs. 30% each
Effect of web-based education on nursing students' urinary catheterization knowledge and skills.
Öztürk, Deniz; Dinç, Leyla
2014-05-01
Nursing is a practice-based discipline that requires the integration of theory and practice. Nurse educators must continuously revise educational curricula and incorporate information technology into the curriculum to provide students with the necessary knowledge and skills. The aim of this study was to assess the effect of web-based education on students' urinary catheterization knowledge and skills. A convenience sample of 111 first-year nursing students enrolled at two universities in Ankara during the 2011-2012 academic year participated in this quasi-experimental study. The experimental group (n=59) received a web-based and web-enhanced learning approach and learned and practiced the required material twice as much as the control group, whereas the control group (n=52) received traditional classroom instruction. A knowledge test of 20 multiple-choice questions and a skills checklist were used to assess student performance. There was no difference between the experimental group and the control group in knowledge scores; however, students in the web-based group had higher scores for urinary catheterization skills. The highest scores in knowledge and skills were obtained by students who experienced web-based education as a supplement to traditional instruction. Web-based education had positive effects on the urinary catheterization skills of nursing students, and its positive effect increased for both knowledge and skills when it supplemented classroom instruction. Based on these results, we suggest the use of web-based education as a supplement to traditional classroom instruction in nursing education.
XML-Based SHINE Knowledge Base Interchange Language
NASA Technical Reports Server (NTRS)
James, Mark; Mackey, Ryan; Tikidjian, Raffi
2008-01-01
The SHINE Knowledge Base Interchange Language software has been designed to more efficiently send new knowledge bases to spacecraft that have been embedded with the Spacecraft Health Inference Engine (SHINE) tool. The intention of the behavioral model is to capture most of the information generally associated with a spacecraft functional model, while specifically addressing the needs of execution within SHINE and Livingstone. As such, it has some constructs that are based on one or the other.
A knowledge base browser using hypermedia
NASA Technical Reports Server (NTRS)
Pocklington, Tony; Wang, Lui
1990-01-01
A hypermedia system is being developed to browse CLIPS (C Language Integrated Production System) knowledge bases. This system will be used to help train flight controllers for the Mission Control Center. Browsing a knowledge base will be accomplished either by navigating through the various collection nodes that have already been defined or through the query languages.
A Model to Assess the Behavioral Impacts of Consultative Knowledge Based Systems.
ERIC Educational Resources Information Center
Mak, Brenda; Lyytinen, Kalle
1997-01-01
This research model studies the behavioral impacts of consultative knowledge based systems (KBS). A study of graduate students explored to what extent their decisions were affected by user participation in updating the knowledge base; ambiguity of decision setting; routinization of usage; and source credibility of the expertise embedded in the…
A Discourse Based Approach to the Language Documentation of Local Ecological Knowledge
ERIC Educational Resources Information Center
Odango, Emerson Lopez
2016-01-01
This paper proposes a discourse-based approach to the language documentation of local ecological knowledge (LEK). The knowledge, skills, beliefs, cultural worldviews, and ideologies that shape the way a community interacts with its environment can be examined through the discourse in which LEK emerges. 'Discourse-based' refers to two components:…
The Knowledge Base as an Extension of Distance Learning Reference Service
ERIC Educational Resources Information Center
Casey, Anne Marie
2012-01-01
This study explores knowledge bases as an extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…
Feasibility of using a knowledge-based system concept for in-flight primary flight display research
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.
1991-01-01
A study was conducted to determine the feasibility of using knowledge-based system architectures for in-flight research of primary flight display information management issues. The feasibility relied on the ability to integrate knowledge-based systems with existing onboard aircraft systems and, given the hardware and software platforms available, on the ability to use interpreted LISP software with the real-time operation of the primary flight display. In addition to evaluating these feasibility issues, the study determined whether the software engineering advantages of knowledge-based systems found for this application in the earlier workstation study extended to the in-flight research environment. To study these issues, two integrated knowledge-based systems were designed to control the primary flight display according to pre-existing specifications of an ongoing primary flight display information management research effort. These two systems were implemented to assess the feasibility and software engineering issues listed. Flight test results succeeded in showing the feasibility of using knowledge-based systems in flight with actual aircraft data.
Using CLIPS in the domain of knowledge-based massively parallel programming
NASA Technical Reports Server (NTRS)
Dvorak, Jiri J.
1994-01-01
The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency in respect of parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS within the C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.
Radionuclide data analysis in connection of DPRK event in May 2009
NASA Astrophysics Data System (ADS)
Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei
2010-05-01
The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert, the PTS returned to its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds for the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for their sites owing to legitimate activities unrelated to nuclear testing. Therefore, a set of hypotheses was used to test whether a detection is consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation work. As a result, it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no problem detecting it. This case also showed the importance of on-site inspections for verifying the nuclear traces of possible tests.
Probing the Atmosphere in Antarctica using continuous microbarom recordings
NASA Astrophysics Data System (ADS)
Ceranna, L.; Le Pichon, A.; Blanc, E.
2009-12-01
Germany operates one of the four Antarctic infrasound stations to fulfill compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). I27DE is a nine-element array that has been in continuous operation since its deployment in January 2003. Using the PMCC detection algorithm, coherent signals are observed in the frequency range from 0.0002 to 4.0 Hz, covering a large variety of infrasound sources such as low-frequency mountain-associated waves and high-frequency ice-quakes. The most prominent signals are related to microbaroms (mb) generated by the strong peri-Antarctic ocean swells. These continuous signals, with a dominant period of 5 s, show a clear trend in their detection direction that is well correlated with the prevailing stratospheric winds. For mb signals, a strong increase in trace velocity along with a decrease in the number of detections was observed during the Austral summer 2006, indicating strong variations in the stratospheric duct. However, wind speed profiles at the station give no evidence for such an anomaly. Nevertheless, strong events of sudden stratospheric warming (SSW) at the latitude range of the peri-Antarctic belt occurring during Austral winter 2006, together with cooling in the upper stratosphere caused by the eruption of the Manam volcano in Indonesia, provide a potential explanation for the abnormal ducting conditions. This will be demonstrated by computing 2-D numerical simulations of sound propagation from the ocean swell to I27DE using appropriate horizontal wind speed and temperature profiles.
Auroral Infrasound Observed at I53US at Fairbanks, Alaska
NASA Astrophysics Data System (ADS)
Wilson, C. R.; Olson, J. V.
2003-12-01
In this presentation we will describe two different types of auroral infrasound recently observed at Fairbanks, Alaska in the pass band from 0.015 to 0.10 Hz. Infrasound signals associated with auroral activity (AIW) have been observed in Fairbanks over the past 30 years with infrasonic microphone arrays. The installation of the new CTBT/IMS infrasonic array, I53US, at Fairbanks has resulted in greatly increased quality of the infrasonic data with which to study natural sources of infrasound. In the historical data at Fairbanks, all the auroral infrasonic waves (AIW) detected were found to be the result of bow waves generated by the supersonic motion of auroral arcs that contain strong electrojet currents. This infrasound is highly anisotropic, moving in the same direction as the auroral arc. AIW bow waves observed in 2003 at I53US will be described. Recently at I53US we have observed many events of very high trace velocity that consist of continuous, highly coherent wave trains. These waves occur in the morning hours at times of strong auroral activity. This new type of very high trace velocity AIW appears to be associated with pulsating auroral displays. Pulsating auroras occur predominantly after magnetic midnight (10:00 UT at Fairbanks). They are a usual part of the recovery phase of auroral substorms and are produced by energetic electrons precipitating into the atmosphere. Given proper dark, cloudless sky conditions during the AIW events, bright pulsating auroral forms were sometimes visible overhead.
Infrasound workshop for CTBT monitoring: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christie, D.; Whitaker, R.
1998-11-01
It is expected that the establishment of new infrasound stations in the global IMS network by the Provisional Technical Secretariat of the CTBTO in Vienna will commence in the middle of 1998. Thus, decisions on the final operational design for IMS infrasound stations will have to be made within the next 12 months. Though many of the basic design problems have been resolved, it is clear that further work needs to be carried out during the coming year to ensure that IMS infrasound stations will operate with maximum capability in accord with the specifications determined during the May 1997 PrepCom Meeting. Some of the papers presented at the Workshop suggest that it may be difficult to design a four-element infrasound array station that will reliably detect and locate infrasound signals at all frequencies in the specified range from 0.02 to 4.0 Hz in all noise environments. Hence, if the basic design of an infrasound array is restricted to four array elements, the final optimized design may be suited only to the detection and location of signals in a more limited pass-band. Several participants have also noted that the reliable discrimination of infrasound signals could be quite difficult if the detection system leads to signal distortion. Thus, it has been emphasized that the detection system should not, if possible, compromise signal fidelity. This report contains the workshop agenda, a list of participants, and abstracts and viewgraphs from each presentation.
NASA Astrophysics Data System (ADS)
Wilkins, N.; Wookey, J. M.; Selby, N. D.
2017-12-01
Seismology is an important part of the International Monitoring System (IMS) installed to detect, identify, and locate nuclear detonations in breach of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), both before and after its entry into force. Seismic arrays in particular provide not only a means of detecting and locating underground nuclear explosions but also of discriminating them from naturally occurring earthquakes of similar magnitude. One potential discriminant is the amplitude ratio of high-frequency (> 2 Hz) P waves to S waves (P/S) measured at regional distances (3-17°). Accurate measurement of such discriminants, and the ability to detect low-magnitude seismicity from a suspicious event, rely on high signal-to-noise ratio (SNR) data. A correction to the slowness vector of the incident seismic wavefield, together with static corrections applied to the waveforms recorded at each receiver within the array, can be shown to improve the SNR. We apply codes we have developed to calculate slowness-azimuth station corrections (SASCs) and static corrections to the arrival time and amplitude of the seismic waveform to seismic arrays regional to the DPRK nuclear test site at Punggye-ri, North Korea. We use the F-statistic to demonstrate the SNR improvement for data from the nuclear tests and other seismic events in the vicinity of the test site. We also make new measurements of P/S with the corrected waveforms and compare these with existing measurements.
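The P/S measurement itself reduces to comparing high-frequency amplitudes in P and S time windows. The sketch below illustrates this on a synthetic trace; the window times, amplitudes, and sampling rate are invented, and real practice would use picked arrival times on filtered, instrument-corrected data.

```python
# Sketch of the P/S discriminant: RMS amplitude of high-frequency (>2 Hz)
# P and S energy measured in fixed windows. The trace is synthetic.
import numpy as np

def rms(x: np.ndarray) -> float:
    return float(np.sqrt(np.mean(x ** 2)))

fs = 40.0                                  # samples per second (assumed)
trace = np.random.randn(int(120 * fs))     # stand-in for a filtered (>2 Hz) record
trace[int(20 * fs):int(25 * fs)] *= 6.0    # synthetic P arrival at t = 20 s
trace[int(55 * fs):int(65 * fs)] *= 2.0    # synthetic S arrival at t = 55 s

p_window = trace[int(20 * fs):int(25 * fs)]
s_window = trace[int(55 * fs):int(65 * fs)]
ratio = rms(p_window) / rms(s_window)
# Explosions tend to show higher P/S than earthquakes at regional distances.
print(f"P/S amplitude ratio: {ratio:.2f}")
```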
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kriz, M.; Hunter, D.; Riley, T.
2015-10-02
Radioactive xenon isotopes are a critical part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) for the detection or confirmation of nuclear weapons tests as well as for on-site treaty verification monitoring. On-site monitoring is not currently conducted because there are no commercially available small, robust field detectors that can measure the radioactive xenon isotopes. Xenon is an ideal signature for detecting clandestine nuclear events because it is difficult to contain and, owing to its inert nature, can diffuse and migrate through soils. Four key radioxenon isotopes are used in monitoring: 135Xe (9 hour half-life), 133mXe (2 day half-life), 133Xe (5 day half-life), and 131mXe (12 day half-life), all of which decay through beta and gamma emission. Savannah River National Laboratory (SRNL) is a leader in the field of gas collection and has developed highly selective molecular sieves that allow the collection of xenon gas directly from air. Phase I assessed the development of a small, robust beta-gamma coincidence counting system that combines collection and in situ detection methodologies. Phase II of the project began development of the custom electronics enabling 2D beta-gamma coincidence analysis in a field-portable system. This will be a significant advancement for field detection and quantification of short-lived xenon isotopes that would not survive the transport time required for laboratory analysis.
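The quoted half-lives directly determine how much of a sample survives shipment, which is the motivation for field detection. A minimal sketch, using only the half-lives stated above and an assumed three-day transport time:

```python
# Sketch: fraction of each radioxenon isotope surviving a given transport
# time, using the half-lives quoted in the abstract.
import math

HALF_LIVES_H = {            # half-lives in hours
    "Xe-135":  9.0,         # ~9 hours
    "Xe-133m": 2.0 * 24,    # ~2 days
    "Xe-133":  5.0 * 24,    # ~5 days
    "Xe-131m": 12.0 * 24,   # ~12 days
}

def surviving_fraction(half_life_h: float, elapsed_h: float) -> float:
    return math.exp(-math.log(2) * elapsed_h / half_life_h)

transport_time_h = 72.0     # assumed three-day shipment to a laboratory
for isotope, t_half in HALF_LIVES_H.items():
    frac = surviving_fraction(t_half, transport_time_h)
    print(f"{isotope}: {frac:.1%} remains after {transport_time_h:.0f} h")
```

After 72 hours, less than one percent of the 135Xe remains, which is why in situ measurement matters most for the shortest-lived isotope.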
A Knowledge Portal and Collaboration Environment for the Earth Sciences
NASA Astrophysics Data System (ADS)
D'Agnese, F. A.
2008-12-01
Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry-standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water-resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.
Conceptual model of knowledge base system
NASA Astrophysics Data System (ADS)
Naykhanova, L. V.; Naykhanova, I. V.
2018-05-01
This article presents a conceptual model of a knowledge-based system of the production-system type. The production system is intended for automating problems whose solution is rigidly conditioned by legislation. The core component of the system is a knowledge base consisting of a set of facts, a set of rules, a cognitive map, and an ontology. The cognitive map implements the control strategy; the ontology implements the explanation mechanism. Representing situation-recognition knowledge in the form of rules makes it possible to describe knowledge of the pension legislation. This approach provides flexibility, originality, and scalability. When the legislation changes, only the rules set needs to be changed, so a change in the legislation is not a big problem. The main advantage of the system is that it can be adapted easily to changes in the legislation.
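A minimal sketch of such a production system: a facts set, a rules set, and forward chaining to a fixed point. The rule contents are hypothetical, chosen to echo the pension-legislation setting; changing the legislation corresponds to editing the rules list only.

```python
# Minimal sketch of the production-system core described above.
facts = {"age >= 63", "insurance_years >= 35"}

# Each rule: (set of required facts, fact to add). The rules are invented;
# a legislative change means editing this list only.
rules = [
    ({"age >= 63", "insurance_years >= 35"}, "eligible_for_pension"),
    ({"eligible_for_pension"}, "compute_pension_amount"),
]

changed = True
while changed:                      # fire rules until a fixed point
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)   # includes 'eligible_for_pension' and 'compute_pension_amount'
```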
A Natural Language Interface Concordant with a Knowledge Base.
Han, Yong-Jin; Park, Seong-Bae; Park, Se-Young
2016-01-01
The discordance between expressions interpretable by a natural language interface (NLI) system and those answerable by a knowledge base is a critical problem in the field of NLIs. In order to solve this discordance problem, this paper proposes a method to translate natural language questions into formal queries that can be generated from a graph-based knowledge base. The proposed method considers a subgraph of a knowledge base as a formal query. Thus, all formal queries corresponding to a concept or a predicate in the knowledge base can be generated prior to query time, and all possible natural language expressions corresponding to each formal query can also be collected in advance. A natural language expression has a one-to-one mapping with a formal query. Hence, a natural language question is translated into a formal query by matching the question with the most appropriate natural language expression. If the confidence of this matching is not sufficiently high, the proposed method rejects the question and does not answer it. Multi-predicate queries are processed by regarding them as a set of collected expressions. The experimental results show that the proposed method thoroughly handles answerable questions from the knowledge base and rejects unanswerable ones effectively.
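A minimal sketch of the matching-with-rejection step, assuming a toy expression-to-query table and a simple string-similarity score; the real system matches against expressions collected from the knowledge base, and the threshold here is invented.

```python
# Sketch: pre-collected natural language expressions, each mapped one-to-one
# to a formal query; a question is matched to the closest expression and
# rejected below a confidence threshold. All entries are illustrative.
from difflib import SequenceMatcher

EXPRESSION_TO_QUERY = {
    "which movies did x direct": "SELECT ?m WHERE { ?m directedBy X }",
    "who directed x":            "SELECT ?p WHERE { X directedBy ?p }",
}

def translate(question: str, threshold: float = 0.75):
    best_expr, best_score = None, 0.0
    for expr in EXPRESSION_TO_QUERY:
        score = SequenceMatcher(None, question.lower(), expr).ratio()
        if score > best_score:
            best_expr, best_score = expr, score
    if best_score < threshold:
        return None                      # reject unanswerable question
    return EXPRESSION_TO_QUERY[best_expr]

print(translate("Which movies did X direct"))    # matched -> formal query
print(translate("What is the meaning of life"))  # rejected -> None
```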
Psychological Literacy in Applied Psychology Disciplines: Back to, or Beyond, the Basics?
ERIC Educational Resources Information Center
Tomcho, Thomas J.; Foels, Rob
2017-01-01
Undergraduate psychology majors need a broad base of knowledge in order to be viewed as competent graduates. In addition to acquiring basic knowledge, the American Psychological Association (APA) has guidelines for applied knowledge as well. In order to ensure a broad base of knowledge, teachers therefore need to know what the important…
ERIC Educational Resources Information Center
Kaiser, Gabriele; Busse, Andreas; Hoth, Jessica; König, Johannes; Blömeke, Sigrid
2015-01-01
Research on the evaluation of the professional knowledge of mathematics teachers (comprising for example mathematical content knowledge, mathematics pedagogical content knowledge and general pedagogical knowledge) has become prominent in the last decade; however, the development of video-based assessment approaches is a more recent topic. This…
The Visual Representation and Acquisition of Driving Knowledge for Autonomous Vehicle
NASA Astrophysics Data System (ADS)
Zhang, Zhaoxia; Jiang, Qing; Li, Ping; Song, LiangTu; Wang, Rujing; Yu, Biao; Mei, Tao
2017-09-01
In this paper, the driving knowledge base of an autonomous vehicle is designed. Based on the driving knowledge modeling system, the driving knowledge of the autonomous vehicle is visually acquired, managed, stored, and maintained, which is of vital significance for creating a development platform for the intelligent decision-making systems of automated-driving expert systems.
Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills
ERIC Educational Resources Information Center
Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko
2012-01-01
Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…
Students' Refinement of Knowledge during the Development of Knowledge Bases for Expert Systems.
ERIC Educational Resources Information Center
Lippert, Renate; Finley, Fred
The refinement of the cognitive knowledge base was studied through exploration of the transition from novice to expert and the use of an instructional strategy called novice knowledge engineering. Six college freshmen, who were enrolled in an honors physics course, used an expert system to create questions, decisions, rules, and explanations…
Cui, Meng; Yang, Shuo; Yu, Tong; Yang, Ce; Gao, Yonghong; Zhu, Haiyan
2013-10-01
To design a model to capture information on the state and trends of knowledge creation, at both an individual and an organizational level, in order to enhance knowledge management. We designed a graph-theoretic knowledge model, the expert knowledge map (EKM), based on literature-based annotation. A case study in the domain of Traditional Chinese Medicine research was used to illustrate the usefulness of the model. The EKM successfully captured various aspects of knowledge and enhanced knowledge management within the case-study organization through the provision of knowledge graphs, expert graphs, and expert-knowledge biography. Our model could help to reveal the hot topics, trends, and products of the research done by an organization. It can potentially be used to facilitate knowledge learning, sharing and decision-making among researchers, academicians, students, and administrators of organizations.
Knowledge acquisition for temporal abstraction.
Stein, A; Musen, M A; Shahar, Y
1996-01-01
Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
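The core of temporal abstraction can be sketched as mapping time-stamped values to qualitative states and merging consecutive equal states into intervals. The thresholds and readings below are hypothetical stand-ins for the domain knowledge a physician would supply.

```python
# Sketch of knowledge-based temporal abstraction: raw time-stamped values
# are classified into states, and consecutive equal states become intervals.
def classify(glucose: float) -> str:
    # Hypothetical thresholds standing in for acquired domain knowledge.
    return "HIGH" if glucose > 180 else "NORMAL" if glucose >= 70 else "LOW"

readings = [(0, 150), (2, 200), (4, 220), (6, 190), (8, 120)]  # (hour, mg/dL)

intervals = []                       # (state, start_hour, end_hour)
for hour, value in readings:
    state = classify(value)
    if intervals and intervals[-1][0] == state:
        intervals[-1] = (state, intervals[-1][1], hour)   # extend interval
    else:
        intervals.append((state, hour, hour))

print(intervals)   # [('NORMAL', 0, 0), ('HIGH', 2, 6), ('NORMAL', 8, 8)]
```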
Boegl, Karl; Adlassnig, Klaus-Peter; Hayashi, Yoichi; Rothenfluh, Thomas E; Leitich, Harald
2004-01-01
This paper describes the fuzzy knowledge representation framework of the medical computer consultation system MedFrame/CADIAG-IV as well as the specific knowledge acquisition techniques that have been developed to support the definition of knowledge concepts and inference rules. As in its predecessor system CADIAG-II, fuzzy medical knowledge bases are used to model the uncertainty and the vagueness of medical concepts and fuzzy logic reasoning mechanisms provide the basic inference processes. The elicitation and acquisition of medical knowledge from domain experts has often been described as the most difficult and time-consuming task in knowledge-based system development in medicine. It comes as no surprise that this is even more so when unfamiliar representations like fuzzy membership functions are to be acquired. From previous projects we have learned that a user-centered approach is mandatory in complex and ill-defined knowledge domains such as internal medicine. This paper describes the knowledge acquisition framework that has been developed in order to make easier and more accessible the three main tasks of: (a) defining medical concepts; (b) providing appropriate interpretations for patient data; and (c) constructing inferential knowledge in a fuzzy knowledge representation framework. Special emphasis is laid on the motivations for some system design and data modeling decisions. The theoretical framework has been implemented in a software package, the Knowledge Base Builder Toolkit. The conception and the design of this system reflect the need for a user-centered, intuitive, and easy-to-handle tool. First results gained from pilot studies have shown that our approach can be successfully implemented in the context of a complex fuzzy theoretical framework. As a result, this critical aspect of knowledge-based system development can be accomplished more easily.
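A minimal sketch of the kind of fuzzy membership function and single-rule inference referred to above; the breakpoints and rule strength are invented and are not CADIAG-IV content.

```python
# Sketch of a fuzzy membership function and a one-rule inference step,
# in the spirit of fuzzy medical knowledge bases. Values are hypothetical.
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Degree of membership in a trapezoidal fuzzy set over [a, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def fever(temp_c: float) -> float:
    return trapezoid(temp_c, 37.0, 38.5, 42.0, 43.0)

# Rule: IF fever THEN infection, with rule strength 0.8 (min inference).
temp = 38.0
infection_degree = min(fever(temp), 0.8)
print(f"fever({temp}) = {fever(temp):.2f}, infection = {infection_degree:.2f}")
```

Eliciting the breakpoints of such membership functions from domain experts is exactly the acquisition task the paper describes as difficult and user-centered.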
Moskowitz, H R; German, J B; Saguy, I S
2005-01-01
This article presents an integrated analysis of three emerging knowledge bases in the nutrition and consumer products industries, and how they may effect the food industry. These knowledge bases produce new vistas for corporate product development, especially with respect to those foods that are positioned as 'good for you.' Couched within the current thinking of state-of-the-art knowledge and information, this article highlights how today's thinking about accelerated product development can be introduced into the food and health industries to complement these three research areas. The 3 knowledge bases are: the genomics revolution, which has opened new insights into understanding the interactions of personal needs of individual consumers with nutritionally relevant components of the foods; the investigation of food choice by scientific studies; the development of large scale databases (mega-studies) about the consumer mind. These knowledge bases, combined with new methods to understand the consumer through research, make possible a more focused development. The confluence of trends outlined in this article provides the corporation with the beginnings of a new path to a knowledge-based, principles-grounded product-development system. The approaches hold the potential to create foods based upon people's nutritional requirements combined with their individual preferences. Integrating these emerging knowledge areas with new consumer research techniques may well reshape how the food industry develops new products to satisfy consumer needs and wants.
Algorithm Optimally Orders Forward-Chaining Inference Rules
NASA Technical Reports Server (NTRS)
James, Mark
2008-01-01
People typically develop knowledge bases in a somewhat ad hoc manner, incrementally adding rules with no specific organization. This often results in very inefficient execution of those rules, since they are so often order-sensitive. This is relevant to tasks like those of the Deep Space Network in that it allows a knowledge base to be developed incrementally and then ordered automatically for efficiency. Although data-flow analysis was first developed for use in compilers for producing optimal code sequences, its usefulness is now recognized in many software systems, including knowledge-based systems. However, this approach to exhaustively computing data-flow information cannot be applied directly to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequent clause in a knowledge base in order to order the rules optimally and minimize inference cycles. The algorithm orders a knowledge base composed of forward-chaining inference rules such that independent inference-cycle executions are minimized, resulting in significantly faster execution. It was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification, where it produced a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a completely unordered knowledge base, the improvement is much greater.
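The producer/consumer idea can be sketched as a dependency graph over rules, ordered topologically: if rule A produces a fact that rule B consumes, A is placed before B. The rules below are hypothetical.

```python
# Sketch of producer/consumer rule ordering via topological sort.
from graphlib import TopologicalSorter   # Python 3.9+

RULES = {                      # name: (consumed facts, produced facts)
    "raise_alarm":   ({"fault_confirmed"}, {"alarm"}),
    "confirm_fault": ({"sensor_anomaly"}, {"fault_confirmed"}),
    "read_sensor":   (set(), {"sensor_anomaly"}),
}

# Edge producer -> consumer whenever a produced fact is consumed elsewhere.
graph = {name: set() for name in RULES}
for producer, (_, produced) in RULES.items():
    for consumer, (consumed, _) in RULES.items():
        if producer != consumer and produced & consumed:
            graph[consumer].add(producer)    # consumer depends on producer

order = list(TopologicalSorter(graph).static_order())
print(order)   # ['read_sensor', 'confirm_fault', 'raise_alarm']
```

With this ordering, each inference cycle fires rules whose inputs are already available, which is the source of the cycle reduction the abstract reports.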
Neuro-Fuzzy Support of Knowledge Management in Social Regulation
NASA Astrophysics Data System (ADS)
Petrovic-Lazarevic, Sonja; Coghill, Ken; Abraham, Ajith
2002-09-01
The aim of this paper is to demonstrate neuro-fuzzy support of knowledge management in social regulation. For social regulation purposes, knowledge can be understood as explicit or tacit. Explicit knowledge relates to the community culture, indicating how things work in the community based on social policies and procedures. Tacit knowledge comprises the ethics and norms of the community. The former can be codified, stored, and transferred in order to support decision making, while the latter, being based on personal knowledge, experience, and judgment, is difficult to codify and store. Tacit knowledge expressed through linguistic information can nevertheless be stored and used to support knowledge management in social regulation through the application of fuzzy and neuro-fuzzy logic.
Computational neuroanatomy: ontology-based representation of neural components and connectivity
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-01-01
Background A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. Results We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Conclusion Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future. PMID:19208191
ERIC Educational Resources Information Center
Al-Edwan, Zaid Suleiman; Hamaidi, Diala Abdul Hadi
2011-01-01
Knowledge-based economy is a new implemented trend in the field of education in Jordan. The ministry of education in Jordan attempts to implement this trend's philosophy in its textbooks. This study examined the extent to which the (1st-3rd grade) social and national textbooks reflect knowledge-based economy criteria from the perspective of…
Mobilio, Dominick; Walker, Gary; Brooijmans, Natasja; Nilakantan, Ramaswamy; Denny, R Aldrin; Dejoannis, Jason; Feyfant, Eric; Kowticwar, Rupesh K; Mankala, Jyoti; Palli, Satish; Punyamantula, Sairam; Tatipally, Maneesh; John, Reji K; Humblet, Christine
2010-08-01
The Protein Data Bank is the most comprehensive source of experimental macromolecular structures. It can, however, be difficult at times to locate relevant structures with the Protein Data Bank search interface. This is particularly true when searching for complexes containing specific interactions between protein and ligand atoms. Moreover, searching within a family of proteins can be tedious; for example, one cannot search for a specific conserved residue, since residue numbers vary across structures. We describe herein three databases, Protein Relational Database, Kinase Knowledge Base, and Matrix Metalloproteinase Knowledge Base, containing protein structures from the Protein Data Bank. In Protein Relational Database, atom-atom distances between protein and ligand have been precalculated, allowing millisecond retrieval based on atom identity and distance constraints. Ring centroids, centroid-centroid and centroid-atom distances, and angles have also been included, permitting queries for pi-stacking interactions and other structural motifs involving rings. Other geometric features can be searched through the inclusion of residue pair and triplet distances. In Kinase Knowledge Base and Matrix Metalloproteinase Knowledge Base, the catalytic domains have been aligned into common residue numbering schemes. Thus, by searching across Protein Relational Database and Kinase Knowledge Base, one can easily retrieve structures wherein, for example, a ligand of interest is making contact with the gatekeeper residue.
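A sketch of the precalculated-distance idea using an in-memory SQLite table; the schema, column names, and rows are illustrative, not the actual Protein Relational Database layout.

```python
# Sketch: store protein-ligand atom pairs with precalculated distances,
# then retrieve contacts with a range query. Data are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE contact (
    pdb_id TEXT, protein_atom TEXT, ligand_atom TEXT, distance_ang REAL)""")
con.executemany("INSERT INTO contact VALUES (?, ?, ?, ?)", [
    ("1ABC", "THR766.OG1", "lig.N1", 2.9),   # hypothetical hydrogen bond
    ("1ABC", "MET769.SD",  "lig.C7", 4.6),
    ("2XYZ", "THR766.OG1", "lig.O2", 3.4),
])

# Find putative hydrogen-bond contacts: pairs closer than 3.5 angstroms.
rows = con.execute("""SELECT pdb_id, protein_atom, ligand_atom, distance_ang
                      FROM contact
                      WHERE distance_ang < 3.5""").fetchall()
print(rows)
```

Because the distances are computed once at load time, a query like this is a simple indexed range scan, which is what makes millisecond retrieval plausible.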
Zwarenstein, Merrick; Reeves, Scott
2006-01-01
Knowledge-translation interventions and interprofessional education and collaboration interventions all aim at improving health care processes and outcomes. Knowledge-translation interventions attempt to increase evidence-based practice by a single professional group and thus may fail to take into account barriers from difficulties in interprofessional relations. Interprofessional education and collaboration interventions aim to improve interprofessional relations, which may in turn facilitate the work of knowledge translation and thus evidence-based practice. We summarize systematic review work on the effects of interventions for interprofessional education and collaboration. The current evidence base contains mainly descriptive studies of these interventions. Knowledge is limited regarding the impact on care and outcomes and the extent to which the interventions increase the practice of evidence-based care. Rigorous multimethod research studies are needed to develop and strengthen the current evidence base in this field. We describe a Health Canada-funded randomized trial in which quantitative and qualitative data will be gathered in 20 general internal medicine units located at 5 Toronto, Ontario, teaching hospitals. The project examines the impact of interprofessional education and collaboration interventions on interprofessional relationships, health care processes (including evidence-based practice), and patient outcomes. Routes are suggested by which interprofessional education and collaboration interventions might affect knowledge translation and evidence-based practice.
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: components used to store, manage, and retrieve data; data management includes knowledge bases and database management. Related topics: application development tools and methods; X/Open and POSIX APIs; Integrated Design Support System (IDS); knowledge-based systems (KBSs); IDEF1x; Yourdon; Jackson System Design (JSD); Structured Systems Development (SSD); Semantic Unification Meta-Model.
Three Forms of the Knowledge Economy: Learning, Creativity and Openness
ERIC Educational Resources Information Center
Peters, Michael A.
2010-01-01
This paper outlines and reviews three forms and associated discourses of the "knowledge economy": the "learning economy", based on the work of Bengt-Ake Lundvall; the "creative economy" based on the work of Charles Landry, John Howkins and Richard Florida; and the "open knowledge economy" based on the work of Yochai Benkler and others. Arguably,…
What Do We Know and How Well Do We Know It? Identifying Practice-Based Insights in Education
ERIC Educational Resources Information Center
Miller, Barbara; Pasley, Joan
2012-01-01
Knowledge derived from practice forms a significant portion of the knowledge base in the education field, yet is not accessible using existing empirical research methods. This paper describes a systematic, rigorous, grounded approach to collecting and analysing practice-based knowledge using the authors' research in teacher leadership as an…
GUIDON-WATCH: A Graphic Interface for Viewing a Knowledge-Based System. Technical Report #14.
ERIC Educational Resources Information Center
Richer, Mark H.; Clancey, William J.
This paper describes GUIDON-WATCH, a graphic interface that uses multiple windows and a mouse to allow a student to browse a knowledge base and view reasoning processes during diagnostic problem solving. The GUIDON project at Stanford University is investigating how knowledge-based systems can provide the basis for teaching programs, and this…
Informed Ignorance and the Difficulty of Using Guidelines in Policy Processes
ERIC Educational Resources Information Center
Fernler, Karin
2015-01-01
Based on an ethnographic study, this article investigates an attempt by a multidisciplinary group to employ pre-developed guidelines for producing a knowledge base that was to be used in a policy decision. The article contributes to previous studies of the development and use of knowledge-based guidelines and knowledge syntheses in policy-research…
Image Understanding Architecture
1991-09-01
The goal is an architecture to support real-time, knowledge-based image understanding, and development of the software support environment needed to utilize it. Keywords: Image Understanding Architecture; knowledge-based vision; AI; real-time computer vision; software simulator; parallel processor. In addition to sensory and knowledge-based processing, it is useful to introduce a level of symbolic processing.
A Comparison of Functional Models for Use in the Function-Failure Design Method
NASA Technical Reports Server (NTRS)
Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.
2006-01-01
When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur in products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. Those other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base it draws from, and it is therefore of utmost importance to develop a knowledge base suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the detail necessary for an applicable knowledge base that can be used by designers in both new designs and redesigns. High-level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To record these data consistently, standardized functional and failure-mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly, however, when applying the EFDM, high-level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.
Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza
2017-01-01
Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% of missing values. This expansion in the KB coverage allowed solving complex disease diagnostic queries that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms yield a significant performance overhead. We observed that plausible reasoning approaches, by generating tentative inferences and leveraging domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.
Neuro-symbolic representation learning on biological knowledge graphs.
Alshahrani, Mona; Khan, Mohammad Asif; Maddouri, Omar; Kinjo, Akira R; Queralt-Rosinach, Núria; Hoehndorf, Robert
2017-09-01
Biological data and knowledge bases increasingly rely on Semantic Web technologies and the use of knowledge graphs for data integration, retrieval, and federated queries. In the past years, feature learning methods applicable to graph-structured data have become available, but have not yet been widely applied and evaluated on structured biological knowledge. Results: We develop a novel method for feature learning on biological knowledge graphs. Our method combines symbolic methods, in particular knowledge representation using symbolic logic and automated reasoning, with neural networks to generate embeddings of nodes that encode related information within knowledge graphs. Through the use of symbolic logic, these embeddings contain both explicit and implicit information. We apply these embeddings to the prediction of edges in the knowledge graph, representing problems of function prediction, finding candidate genes of diseases, protein-protein interactions, or drug-target relations, and demonstrate performance that matches and sometimes outperforms traditional approaches based on manually crafted features. Our method can be applied to any biological knowledge graph and thereby opens up the increasing number of Semantic Web based knowledge bases in biology to use in machine learning and data analytics. Availability: https://github.com/bio-ontology-research-group/walking-rdf-and-owl. Contact: robert.hoehndorf@kaust.edu.sa. Supplementary data are available at Bioinformatics online.
Caste- and ethnicity-based inequalities in HIV/AIDS-related knowledge gap: a case of Nepal.
Atteraya, Madhu; Kimm, HeeJin; Song, In Han
2015-05-01
Caste- and ethnicity-based inequalities are major obstacles to achieving health equity. The authors investigated whether there is any association between caste- and ethnicity-based inequalities and HIV-related knowledge within caste and ethnic populations. They used the 2011 Nepal Demographic and Health Survey, a nationally represented cross-sectional study data set. The study sample consisted of 11,273 women between 15 and 49 years of age. Univariate and logistic regression models were used to examine the relationship between caste- and ethnicity-based inequalities and HIV-related knowledge. The study sample was divided into high Hindu caste (47.9 percent), "untouchable" caste (18.4 percent), and indigenous populations (33.7 percent). Within the study sample, the high-caste population was found to have the greatest knowledge of the means by which HIV is prevented and transmitted. After controlling for socioeconomic and demographic characteristics, untouchables were the least knowledgeable. The odds ratio for incomplete knowledge about transmission among indigenous populations was 1.27 times higher than that for high Hindu castes, but there was no significant difference in knowledge of preventive measures. The findings suggest the existence of a prevailing HIV knowledge gap. This in turn suggests that appropriate steps need to be implemented to convey complete knowledge to underprivileged populations.
Haugstvedt, Anne; Aarflot, Morten; Igland, Jannicke; Landbakk, Tilla; Graue, Marit
2016-01-01
Providing high-quality diabetes care in nursing homes and home-based care facilities requires suitable instruments to evaluate the level of diabetes knowledge among the health-care providers. Thus, the aim of this study was to examine the psychometric properties of the Michigan Diabetes Knowledge Test adapted for use among nursing personnel. The study included 127 nursing personnel (32 registered nurses, 69 nursing aides and 26 nursing assistants) at three nursing homes and one home-based care facility in Norway. We examined the reliability and content and construct validity of the Michigan Diabetes Knowledge Test. The items in both the general diabetes subscale and the insulin-use subscale were considered relevant and appropriate. The instrument showed satisfactory properties for distinguishing between groups. Item response theory-based measurements and item information curves indicate maximum information at average or lower knowledge scores. Internal consistency and the item-total correlations were quite weak, indicating that the Michigan Diabetes Knowledge Test measures a set of items related to various relevant knowledge topics but not necessarily related to each other. The Michigan Diabetes Knowledge Test measures a broad range of topics relevant to diabetes care. It is an appropriate instrument for identifying individual and distinct needs for diabetes education among nursing personnel. The knowledge gaps identified by the Michigan Diabetes Knowledge Test could also provide useful input for the content of educational activities. However, some revision of the test should be considered.
KBGIS-II: A knowledge-based geographic information system
NASA Technical Reports Server (NTRS)
Smith, Terence; Peuquet, Donna; Menon, Sudhakar; Agarwal, Pankaj
1986-01-01
The architecture and working of a recently implemented Knowledge-Based Geographic Information System (KBGIS-II), designed to satisfy several general criteria for the GIS, is described. The system has four major functions including query-answering, learning and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial object language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is performing all its designated tasks successfully. Future reports will relate performance characteristics of the system.
ERIC Educational Resources Information Center
Bøllingtoft Knudsen, Søren
2018-01-01
Focus on evidence-based policymaking is greater than ever, and public spending on evaluations is rising. A primary merit of these expenditures is that politicians actually use new knowledge instrumentally--to influence and inform decision making. Nevertheless, we know surprisingly little about whether and how research-based knowledge is utilised.…
A Logical Framework for Service Migration Based Survivability
2016-06-24
Service Migration Strategy Fuzzy Inference System Knowledge Base: fuzzy rules representing domain-expert knowledge about the implications of a service migration strategy. Our approach uses expert knowledge in the form of linguistic reasoning rules and takes service-program damage assessment, service-program complexity, and available network capability as input. The fuzzy inference system includes four components, the first of which is a knowledge base.
ERIC Educational Resources Information Center
Clariana, Roy B.; Marker, Anthony W.
2007-01-01
This investigation considers the effects of learner-generated headings on memory. Participants (N = 63) completed a computer-based lesson with or without learner-generated text topic headings. Posttests included a cued recall test of factual knowledge and a sorting task measure of structural knowledge. A significant disordinal interaction was…
ERIC Educational Resources Information Center
Karaoglan Yilmaz, Fatma Gizem
2017-01-01
Today, the use of social network-based virtual learning communities is increasing rapidly in terms of knowledge management. An important dynamic of knowledge management processes is the knowledge sharing behaviors (KSB) in community. The purpose of this study is to examine the KSB of the students in a Facebook-based virtual community created…
ERIC Educational Resources Information Center
Golovachyova, Viktoriya N.; Menlibekova, Gulbakhyt Zh.; Abayeva, Nella F.; Ten, Tatyana L.; Kogaya, Galina D.
2016-01-01
Using computer-based monitoring systems that rely on tests could be the most effective way of knowledge evaluation. The problem of objective knowledge assessment by means of testing takes on a new dimension in the context of new paradigms in education. The analysis of the existing test methods enabled us to conclude that tests with selected…
Big data analytics in immunology: a knowledge-based approach.
Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir
2014-01-01
With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.
Enhancing acronym/abbreviation knowledge bases with semantic information.
Torii, Manabu; Liu, Hongfang
2007-10-11
In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted SFs) with their definitions (denoted LFs) is highly needed. For the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of (SF, LF) pairs derived from text, we i) assess the coverage of LFs and (SF, LF) pairs in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface that integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
Generating and Executing Complex Natural Language Queries across Linked Data.
Hamon, Thierry; Mougin, Fleur; Grabar, Natalia
2015-01-01
With the recent and intensive research in the biomedical area, accumulated knowledge is disseminated through various knowledge bases. Links between these knowledge bases are needed in order to use them jointly. Linked Data, the SPARQL language, and natural language question-answering interfaces provide interesting solutions for querying such knowledge bases. We propose a method for translating natural language questions into SPARQL queries. We use Natural Language Processing tools, semantic resources, and RDF triple descriptions. The method was designed on 50 questions over three biomedical knowledge bases and evaluated on 27 questions, on which it achieves an F-measure of 0.78. The method is implemented as a Perl module available at http://search.cpan.org/~thhamon/RDF-NLP-SPARQLQuery.
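A minimal sketch of the pattern-to-SPARQL translation idea (the published implementation is the cited Perl module); the question pattern and predicate URIs below are invented.

```python
# Sketch: a recognized question pattern is filled into a SPARQL template.
# The pattern and the example.org URIs are hypothetical.
SPARQL_TEMPLATE = """
SELECT ?drug WHERE {{
  ?drug <http://example.org/treats> <http://example.org/disease/{disease}> .
}}
"""

def question_to_sparql(question: str):
    prefix = "which drugs treat "
    q = question.lower().rstrip("?")
    if q.startswith(prefix):
        disease = q[len(prefix):].strip().replace(" ", "_")
        return SPARQL_TEMPLATE.format(disease=disease)
    return None   # pattern not recognized

print(question_to_sparql("Which drugs treat hepatitis B?"))
```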
NASA Astrophysics Data System (ADS)
Wu, Jiangning; Wang, Xiaohuan
The rapidly increasing number of mobile phone users and types of services has led to a great accumulation of complaint information. How to use this information to enhance the quality of customer service is a major current issue. To handle this kind of problem, the paper presents an approach to constructing a domain knowledge map for navigating explicit and tacit knowledge in two ways: building a Topic Map-based explicit knowledge navigation model, which includes domain TM construction, a semantic topic expansion algorithm, and VSM-based similarity calculation; and building a Social Network Analysis-based tacit knowledge navigation model, which includes a multi-relational expert navigation algorithm and criteria to evaluate the performance of expert networks. In doing so, both customer managers and operators in call centers can find the appropriate knowledge and experts quickly and exactly. The experimental results show that the above method is very powerful for knowledge navigation.
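A minimal sketch of the VSM-based similarity step: complaint texts and a topic are represented as bag-of-words vectors and compared by cosine similarity. Tokenization and weighting are simplified assumptions.

```python
# Sketch: cosine similarity between bag-of-words vectors, as in a simple
# vector space model (VSM). Texts are invented examples.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = (sqrt(sum(v * v for v in a.values()))
            * sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

topic = Counter("billing overcharge refund dispute".split())
complaints = [
    "I was overcharged on my billing statement and want a refund",
    "my phone signal drops in the subway every morning",
]
for text in complaints:
    print(f"{cosine(topic, Counter(text.lower().split())):.2f}  {text}")
```

The first complaint scores well above the second against the billing topic, which is the mechanism by which complaints are routed to the right knowledge and experts.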
KAT: A Flexible XML-based Knowledge Authoring Environment
Hulse, Nathan C.; Rocha, Roberto A.; Del Fiol, Guilherme; Bradshaw, Richard L.; Hanna, Timothy P.; Roemer, Lorrie K.
2005-01-01
As part of an enterprise effort to develop new clinical information systems at Intermountain Health Care, the authors have built a knowledge authoring tool that facilitates the development and refinement of medical knowledge content. At present, users of the application can compose order sets and an assortment of other structured clinical knowledge documents based on XML schemas. The flexible nature of the application allows the immediate authoring of new types of documents once an appropriate XML schema and accompanying Web form have been developed and stored in a shared repository. The need for a knowledge acquisition tool stems largely from the desire for medical practitioners to be able to write their own content for use within clinical applications. We hypothesize that medical knowledge content for clinical use can be successfully created and maintained through XML-based document frameworks containing structured and coded knowledge. PMID:15802477
Effective domain-dependent reuse in medical knowledge bases.
Dojat, M; Pachet, F
1995-12-01
Knowledge reuse is now a critical issue for most developers of medical knowledge-based systems. As a rule, reuse is addressed from an ambitious knowledge-engineering perspective that focuses on reusable general-purpose knowledge modules, concepts, and methods. However, such a general goal fails to take into account the specific aspects of medical practice. From the point of view of the knowledge engineer, whose goal is to capture the specific features and intricacies of a given domain, this approach addresses the wrong level of generality. In this paper, we adopt a more pragmatic viewpoint, introducing the less ambitious goal of "domain-dependent limited reuse" and suggesting effective means of achieving it in practice. In a knowledge representation framework combining objects and production rules, we propose three mechanisms emerging from the combination of object-oriented programming and rule-based programming. We show that these mechanisms contribute to achieving limited reuse and to introducing useful limited variations in medical expertise.
People Use their Knowledge of Common Events to Understand Language, and Do So as Quickly as Possible
McRae, Ken; Matsuki, Kazunaga
2011-01-01
People possess a great deal of knowledge about how the world works, and it is undoubtedly true that adults use this knowledge when understanding and producing language. However, psycholinguistic theories differ regarding whether this extra-linguistic pragmatic knowledge can be activated and used immediately, or only after a delay. The authors present research that investigates whether people immediately use their generalized knowledge of common events when understanding language. This research demonstrates that (i) individual isolated words immediately activate event-based knowledge; (ii) combinations of words in sentences immediately constrain people’s event-based expectations for concepts that are upcoming in language; (iii) syntax modulates people’s expectations for ensuing concepts; and (iv) event-based knowledge can produce expectations for ensuing syntactic structures. It is concluded that theories of sentence comprehension must allow for the rapid dynamic interplay among these sources of information. PMID:22125574
Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training
NASA Astrophysics Data System (ADS)
Macris, A.; Malamateniou, F.; Vassilacopoulos, G.
Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile, and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic-web enabling) and the user (semantic search, knowledge navigation, and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.
The National Mechatronic Platform. The basis of the educational programs in the knowledge society
NASA Astrophysics Data System (ADS)
Maties, V.
2016-08-01
The shift from the information society to the knowledge-based society, driven by the mechatronics revolution of the 1980s, has posed many challenges for education and research as well. The development of knowledge production calls for new educational technologies that stimulate initiative and creativity, as a basis for increasing productivity in knowledge production. The paper presents details of the innovative potential of mechatronics as an educational environment for transdisciplinary learning and integral education. The basic infrastructure of that environment is built on mechatronic platforms. In order to develop knowledge production at the national level, specific structures must be created. The paper presents details of the structure of the National Mechatronic Platform as a true knowledge factory, and outlines the benefits of the effort to develop the specific infrastructure for knowledge production in the field of mechatronics.
Developing an ontological explosion knowledge base for business continuity planning purposes.
Mohammadfam, Iraj; Kalatpour, Omid; Golmohammadi, Rostam; Khotanlou, Hasan
2013-01-01
Industrial accidents are among the most known challenges to business continuity. Many organisations have lost their reputation following devastating accidents. To manage the risks of such accidents, it is necessary to accumulate sufficient knowledge regarding their roots, causes and preventive techniques. The required knowledge might be obtained through various approaches, including databases. Unfortunately, many databases are hampered by (among other things) static data presentations, a lack of semantic features, and the inability to present accident knowledge as discrete domains. This paper proposes the use of Protégé software to develop a knowledge base for the domain of explosion accidents. Such a structure has a higher capability to improve information retrieval compared with common accident databases. To accomplish this goal, a knowledge management process model was followed. The ontological explosion knowledge base (EKB) was built for further applications, including process accident knowledge retrieval and risk management. The paper will show how the EKB has a semantic feature that enables users to overcome some of the search constraints of existing accident databases.
Perspectives on knowledge in engineering design
NASA Technical Reports Server (NTRS)
Rasdorf, W. J.
1985-01-01
Various perspectives are given on the knowledge currently used in engineering design, specifically in knowledge-based expert systems (KBES). Constructing an expert system often reveals inconsistencies in domain knowledge while formalizing it. The types of domain knowledge (facts, procedures, judgments, and control) differ from the classes of that knowledge (creative, innovative, and routine). The feasible tasks for expert systems can be determined based on these types and classes of knowledge. Interpretive tasks require reasoning about a task in light of the knowledge available, whereas generative tasks create potential solutions to be tested against constraints. Only after classifying the domain by type and level can the engineer select a knowledge-engineering tool for the domain being considered. The critical features to be weighed after classification are knowledge representation techniques, control strategies, interface requirements, compatibility with traditional systems, and economic considerations.
Pieterman, Elise D; Budde, Ricardo P J; Robbers-Visser, Daniëlle; van Domburg, Ron T; Helbing, Willem A
2017-09-01
Follow-up of right ventricular performance is important for patients with congenital heart disease. Cardiac magnetic resonance imaging is optimal for this purpose; however, the observer-dependency of manual analysis of right ventricular volumes limits its use. Knowledge-based reconstruction is a new semiautomatic analysis tool that uses a database including knowledge of right ventricular shape in various congenital heart diseases. We evaluated whether knowledge-based reconstruction is a good alternative to conventional analysis. To assess the inter- and intra-observer variability and agreement of knowledge-based versus conventional analysis of magnetic resonance right ventricular volumes, analysis was done by two observers in a mixed group of 22 patients with congenital heart disease affecting right ventricular loading conditions (dextro-transposition of the great arteries and right ventricle to pulmonary artery conduit) and a group of 17 healthy children. We used Bland-Altman analysis and the coefficient of variation. Comparison between the conventional and knowledge-based methods showed systematically higher volumes for the latter. We found an overestimation of end-diastolic volume (bias -40 ± 24 mL, r = .956), end-systolic volume (bias -34 ± 24 mL, r = .943), and stroke volume (bias -6 ± 17 mL, r = .735), and an underestimation of ejection fraction (bias 7 ± 7%, r = .671) by knowledge-based reconstruction. The intra-observer variability of knowledge-based reconstruction varied, with a coefficient of variation of 9% for end-diastolic volume and 22% for stroke volume. The same trend was noted for inter-observer variability. A systematic overestimation was thus noted for right ventricular size as assessed with knowledge-based reconstruction compared with conventional methods of analysis. Observer variability for the new method was comparable to what has been reported for the right ventricle in children and congenital heart disease with conventional analysis.
Improved knowledge diffusion model based on the collaboration hypernetwork
NASA Astrophysics Data System (ADS)
Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo
2015-06-01
The process of absorbing knowledge is an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge will spread from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the variance coefficient c(t) to evaluate the performance of knowledge diffusion. By analyzing how the diffusion mechanism, the way highly knowledgeable "expert" nodes are selected, and the hypernetwork size and structure affect diffusion performance, we show that the diffusion speed of the IKDH model is 3.64 times that of the traditional knowledge diffusion (TKDH) model. Moreover, knowledge diffuses three times faster when "expert" nodes are selected at random than when large-hyperdegree nodes are chosen as "expert" nodes. Furthermore, a tighter network structure or a smaller network size results in faster knowledge diffusion.
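The following toy Python simulation illustrates hyperedge-based diffusion of knowledge stock in the spirit of the IKDH model; the hyperedges, the update rule, and the absorption coefficient alpha are assumptions for illustration, not the authors' exact formulation.

```python
# Toy hyperedge diffusion: a random target node spreads knowledge stock to
# every neighbor sharing a hyperedge with it.
import random

hyperedges = [{0, 1, 2}, {2, 3, 4}, {4, 5, 0}]           # collaboration groups
nodes = sorted(set().union(*hyperedges))
stock = {n: random.uniform(0, 1) for n in nodes}          # knowledge stock

def step(alpha=0.5):
    target = random.choice(nodes)
    for edge in hyperedges:
        if target in edge:
            for n in edge - {target}:                     # hyperedge neighbors
                if stock[target] > stock[n]:              # knowledge flows downhill
                    stock[n] += alpha * (stock[target] - stock[n])

for t in range(200):
    step()

mean = sum(stock.values()) / len(stock)                              # V(t)
var = sum((v - mean) ** 2 for v in stock.values()) / len(stock)      # sigma^2(t)
print(f"V(t)={mean:.3f}, sigma^2(t)={var:.4f}, c(t)={var**0.5/mean:.3f}")
```

The three printed quantities correspond to the evaluation measures named in the abstract: the average knowledge stock, its variance, and the variance coefficient.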
NASA Astrophysics Data System (ADS)
Okuzawa, Yuki; Kato, Shohei; Kanoh, Masayoshi; Itoh, Hidenori
A knowledge-based approach to imitation learning of motion generation for humanoid robots and an imitative motion generation system based on motion knowledge learning and modification are described. The system has three parts: recognizing, learning, and modifying. The first part recognizes an instructed motion by matching it against the motion knowledge database using a continuous hidden Markov model. When the motion is recognized as unfamiliar, the second part learns it using locally weighted regression and acquires knowledge of the motion. When the robot recognizes the instructed motion as familiar, or judges that its acquired knowledge is applicable to generating the motion, the third part imitates the instructed motion by modifying a learned motion. This paper reports performance results for the imitation of several radio gymnastics motions.
Ji, Shanmei; Matsumura, Yasushi; Kuwata, Shigeki; Nakano, Hirohiko; Chen, Yufeng; Teratani, Tadamasa; Zhang, Qiyan; Mineno, Takahiro; Takeda, Hiroshi
2004-12-01
To develop a system for checking indications and contraindications of medicines in a prescription order entry system, a master table of the disease names corresponding to the medicines adopted in a hospital is needed. Creating this table requires considerable manpower. We developed a Web-based system for constructing a medicine/disease thesaurus and a knowledge base. Through authority management of users, this system enables many specialists to create the thesaurus collaboratively without confusion. It supports the creation of a knowledge base using concept names by referring to the thesaurus, which is automatically converted to the check master table. When a disease name or medicine name was added to the thesaurus, the check table was automatically updated. We constructed a thesaurus and a knowledge base in the field of circulatory system disease. The knowledge base linked with the thesaurus proved to be efficient for making the check master table for indications/contraindications of medicines.
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-05-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results are consistent with the predictions of the attribute substitution framework. Issues concerning the use of simple heuristics and their underlying psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.
NASA Technical Reports Server (NTRS)
Palumbo, David B.
1990-01-01
Relationships between human memory systems and hypermedia systems are discussed, with particular emphasis on the underlying importance of associational memory. The distinctions between knowledge presentation, knowledge representation, and knowledge construction are addressed. Issues involved in actually developing individualizable hypermedia-based knowledge construction tools are presented.
1989-08-01
Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis. Final Technical Report. Keywords: pattern recognition, blackboard-oriented symbolic processing, knowledge-based image analysis, image understanding, aerial imagery, urban areas.
Public School Teachers' Knowledge, Perception, and Implementation of Brain-Based Learning Practices
ERIC Educational Resources Information Center
Wachob, David A.
2012-01-01
The purpose of this study was to determine K-12 teachers' knowledge, beliefs, and practices of brain-based learning strategies in western Pennsylvania schools. The following five research questions were explored: (a) What is the extent of knowledge K-12 public school teachers have about the indicators of brain-based learning and Brain Gym?; (b) To…
ERIC Educational Resources Information Center
Tramontana, G. Michael; Blood, Ingrid M.; Blood, Gordon W.
2013-01-01
The purpose of this study was to determine (a) the general knowledge bases demonstrated by school-based speech-language pathologists (SLPs) in the area of genetics, (b) the confidence levels of SLPs in providing services to children and their families with genetic disorders/syndromes, (c) the attitudes of SLPs regarding genetics and communication…
Incorporating Resilience into Dynamic Social Models
2016-07-20
solved by simply using the information provided by the scenario. Instead, additional knowledge is required from relevant fields that study these...resilience function by leveraging Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network framework[5],[6]. BKBs allow for inferencing...reasoning network framework based on Bayesian Knowledge Bases (BKBs). BKBs are central to our social resilience framework as they are used to
ERIC Educational Resources Information Center
Alhossein, Abdulkarim
2016-01-01
During the last decade, scholars and policymakers have emphasized the importance of using evidence-based practices in teaching students with disabilities. One barrier to using these practices might be teachers' lack of knowledge about them. This study investigated teachers' knowledge and use of evidence-based teaching practices (EBTPs) for…
ERIC Educational Resources Information Center
Nielsen, Dorte Guldbrand; Gotzsche, Ole; Sonne, Ole; Eika, Berit
2012-01-01
Two major views on the relationship between basic science knowledge and clinical knowledge stand out: the two-world view, which sees basic science and clinical science as two separate knowledge bases, and the encapsulated knowledge view, which holds that basic science knowledge plays an overt role, being encapsulated in clinical knowledge. However, recent…
Huser, Vojtech; Sincan, Murat; Cimino, James J
2014-01-01
Personalized medicine, the ability to tailor diagnostic and treatment decisions for individual patients, is seen as the evolution of modern medicine. We characterize here the informatics resources available today or envisioned in the near future that can support clinical interpretation of genomic test results. We assume a clinical sequencing scenario (germline whole-exome sequencing) in which a clinical specialist, such as an endocrinologist, needs to tailor patient management decisions within his or her specialty (targeted findings) but relies on a genetic counselor to interpret off-target incidental findings. We characterize the genomic input data and list various types of knowledge bases that provide genomic knowledge for generating clinical decision support. We highlight the need for patient-level databases with detailed lifelong phenotype content in addition to genotype data and provide a list of recommendations for personalized medicine knowledge bases and databases. We conclude that no single knowledge base can currently support all aspects of personalized recommendations and that consolidation of several current resources into larger, more dynamic and collaborative knowledge bases may offer a future path forward. PMID:25276091
15 CFR 4.33 - General exemptions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... knowledge of criminal activity and the evidentiary bases of possible enforcement actions, to maintain the integrity of the law enforcement process, and to avoid premature disclosure of the knowledge of...
A knowledge base for tracking the impact of genomics on population health.
Yu, Wei; Gwinn, Marta; Dotson, W David; Green, Ridgely Fisk; Clyne, Mindy; Wulf, Anja; Bowen, Scott; Kolor, Katherine; Khoury, Muin J
2016-12-01
We created an online knowledge base (the Public Health Genomics Knowledge Base (PHGKB)) to provide systematically curated and updated information that bridges population-based research on genomics with clinical and public health applications. Weekly horizon scanning of a wide variety of online resources is used to retrieve relevant scientific publications, guidelines, and commentaries. After curation by domain experts, links are deposited into Web-based databases. PHGKB currently consists of nine component databases. Users can search the entire knowledge base or search one or more component databases directly and choose options for customizing the display of their search results. PHGKB offers researchers, policy makers, practitioners, and the general public a way to find the information they need to understand the complicated landscape of genomics and population health. Genet Med 18(12), 1312-1314.
Taglieri, Catherine; Schnee, David; Dvorkin Camiel, Lana; Zaiken, Kathy; Mistry, Amee; Nigro, Stefanie; Tataronis, Gary; Patel, Dhiren; Jacobson, Susan; Goldman, Jennifer
2017-05-01
To determine whether team-based learning (TBL) is superior to traditional lecture-based learning for confidence and knowledge retention one year later. A survey was administered 17 months after completion of a required over-the-counter/self-care (OTC) course to two different cohorts of students. The survey assessed confidence and knowledge related to OTC topics. The lecture group had a traditional lecture-based classroom experience; the intervention group experienced a TBL format throughout the entire course. One hundred forty-seven of 283 enrolled students (51.9%) in the lecture group and 222 of 305 students (72.8%) in the TBL group participated in the knowledge assessment and survey. Demographic data, including student grade point averages (GPA), and confidence were similar in both groups. Mean assessment scores (±SD) on OTC knowledge were significantly higher in the traditional lecture-based group than in the TBL group: 62.9±19.3 vs. 54.9±15.7 (p=0.001). Although TBL is thought to improve student engagement and mastery of material, after an initial implementation of TBL, long-term knowledge retention appears to be lower than with lecture-based learning. Copyright © 2017 Elsevier Inc. All rights reserved.
Sticky knowledge: A possible model for investigating implementation in healthcare contexts
Elwyn, Glyn; Taubert, Mark; Kowalczuk, Jenny
2007-01-01
Background In health care, a well-recognized gap exists between what we know should be done based on accumulated evidence and what we actually do in practice. A body of empirical literature shows that organizations, like individuals, are difficult to change. In the business literature, knowledge management and transfer have become an established area of theory and practice, whilst in healthcare they are only starting to establish a firm footing. Knowledge has become a business resource, and knowledge management theorists and practitioners have examined how knowledge moves in organisations, how it is shared, and how the return on knowledge capital can be maximised to create competitive advantage. New models are being considered, and we wanted to explore the applicability of one of these conceptual models to the implementation of evidence-based practice in healthcare systems. Methods The application of a conceptual model called sticky knowledge, based on an integration of communication theory and knowledge transfer milestones, to a scenario of attempted knowledge transfer in primary care. Results We describe Szulanski's model and the empirical work he conducted, and illustrate its potential applicability with a hypothetical healthcare example based on improving palliative care services. We follow a doctor through two different posts and analyse aspects of knowledge transfer in different primary care settings. The factors included in the sticky knowledge model are: causal ambiguity, unproven knowledge, motivation of source, credibility of source, recipient motivation, recipient absorptive capacity, recipient retentive capacity, barren organisational context, and arduous relationship between source and recipient. We found that we could apply all these factors to the difficulty of implementing new knowledge into practice in primary care settings. Discussion Szulanski argues that knowledge factors play a greater role in the success or failure of a knowledge transfer than has been suspected, and we consider that this conjecture requires further empirical work in healthcare settings. PMID:18096040
NASA Technical Reports Server (NTRS)
Pippin, H. G.; Woll, S. L. B.
2000-01-01
Institutions need ways to retain valuable information even as experienced individuals leave an organization. Modern electronic systems have enough capacity to retain large quantities of information that can mitigate the loss of experience. Performance information for long-term space applications is relatively scarce, and specific information (typically held by a few individuals within a single project) is often rather narrowly distributed. Spacecraft operate under severe conditions, and the consequences of hardware and/or system failures, in terms of cost, loss of information, and time required to replace the loss, are extreme. These risk factors place a premium on the appropriate choice of materials and components for space applications. An expert system is a very cost-effective method for sharing valuable and scarce information about spacecraft performance. Boeing has an artificial intelligence software package, called the Boeing Expert System Tool (BEST), for constructing and operating knowledge bases that selectively recall and distribute information about specific subjects. A specific knowledge base to evaluate the on-orbit performance of selected materials on spacecraft has been developed under contract to the NASA SEE program. The performance capabilities of the Spacecraft Materials Selector (SMS) knowledge base are described. The knowledge base is a backward-chaining, rule-based system. The user answers a sequence of questions, and the expert system provides estimates of the optical and mechanical performance of selected materials under specific environmental conditions. The initial operating capability of the system includes data for Kapton, silverized Teflon, selected paints, silicone-based materials, and certain metals. For situations where a mission profile (launch date, orbital parameters, mission duration, spacecraft orientation) is not precisely defined, the knowledge base still attempts to provide qualitative observations about materials performance and likely exposures. Prior to the NASA contract, a knowledge base, the Spacecraft Environments Assistant (SEA), was initially developed by Boeing to estimate the environmental factors important for a specific spacecraft mission profile. The NASA SEE program has funded specific enhancements to the capability of this knowledge base. The SEA qualitatively identifies over 25 environmental factors that may influence the performance of a spacecraft during its operational lifetime. For cases where sufficiently detailed answers are provided to the questions asked by the knowledge base, atomic oxygen fluence levels, proton and/or electron fluence and dose levels, and solar exposure hours are calculated. The SMS knowledge base incorporates the previously developed SEA knowledge base. A case history from a previous flight experiment is shown as an example, and the capabilities and limitations of the system are discussed.
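The abstract describes the SMS as a backward-chaining, rule-based system driven by user answers. A minimal Python sketch of that inference style follows; the rules, the fact names, and the Kapton recommendation are invented placeholders, not Boeing's actual BEST rules.

```python
# Backward-chaining sketch: a goal holds if it is a known fact, or if some
# rule concludes it and all of that rule's premises can themselves be proven.
# Rule and fact names are hypothetical, for illustration only.
RULES = {
    "avoid_kapton": ["low_earth_orbit", "long_duration"],  # atomic-oxygen erosion
    "low_earth_orbit": ["altitude_below_1000km"],
    "long_duration": ["mission_over_3_years"],
}

def prove(goal, facts):
    if goal in facts:
        return True
    premises = RULES.get(goal)
    if premises is None:
        # Not a fact and no rule concludes it; a real interactive system
        # would ask the user a question here. This sketch just fails.
        return False
    return all(prove(p, facts) for p in premises)

user_answers = {"altitude_below_1000km", "mission_over_3_years"}
print(prove("avoid_kapton", user_answers))   # True: recommend against Kapton
```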
Wright, Adam; Laxmisan, Archana; Ottosen, Madelene J; McCoy, Jacob A; Butten, David; Sittig, Dean F
2012-01-01
Objective We describe a novel, crowdsourcing method for generating a knowledge base of problem–medication pairs that takes advantage of manually asserted links between medications and problems. Methods Through iterative review, we developed metrics to estimate the appropriateness of manually entered problem–medication links for inclusion in a knowledge base that can be used to infer previously unasserted links between problems and medications. Results Clinicians manually linked 231 223 medications (55.30% of prescribed medications) to problems within the electronic health record, generating 41 203 distinct problem–medication pairs, although not all were accurate. We developed methods to evaluate the accuracy of the pairs, and after limiting the pairs to those meeting an estimated 95% appropriateness threshold, 11 166 pairs remained. The pairs in the knowledge base accounted for 183 127 total links asserted (76.47% of all links). Retrospective application of the knowledge base linked 68 316 medications not previously linked by a clinician to an indicated problem (36.53% of unlinked medications). Expert review of the combined knowledge base, including inferred and manually linked problem–medication pairs, found a sensitivity of 65.8% and a specificity of 97.9%. Conclusion Crowdsourcing is an effective, inexpensive method for generating a knowledge base of problem–medication pairs that is automatically mapped to local terminologies, up-to-date, and reflective of local prescribing practices and trends. PMID:22582202
Knowledge Discovery from Posts in Online Health Communities Using Unified Medical Language System.
Chen, Donghua; Zhang, Runtong; Liu, Kecheng; Hou, Lei
2018-06-19
Patient-reported posts in Online Health Communities (OHCs) contain a wealth of valuable information that can help establish knowledge-based online support for patients. However, utilizing these reports to improve online patient services in the absence of appropriate medical and healthcare expert knowledge is difficult. Thus, we propose a comprehensive knowledge discovery method based on the Unified Medical Language System for the analysis of narrative posts in OHCs. First, we propose a domain-knowledge support framework for OHCs to provide a basis for post analysis. Second, we develop a Knowledge-Involved Topic Modeling (KI-TM) method to extract and expand explicit knowledge within the text. We propose four metrics, namely, explicit knowledge rate, latent knowledge rate, knowledge correlation rate, and perplexity, for the evaluation of the KI-TM method. Our experimental results indicate that our proposed method outperforms existing methods in terms of providing knowledge support. Our method enhances knowledge support for online patients and can help develop intelligent OHCs in the future.
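As a point of reference for the topic-modeling component, the sketch below runs a plain LDA baseline over a few invented OHC-style posts using scikit-learn; the authors' KI-TM method additionally injects UMLS domain knowledge into the model, which this baseline does not attempt.

```python
# Plain LDA baseline over toy health-community posts (invented examples).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "metformin side effects nausea dosage",
    "insulin pump sensor glucose readings",
    "nausea after metformin dose change",
    "glucose monitor sensor calibration",
]
vec = CountVectorizer()
X = vec.fit_transform(posts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)

# Top words per topic; a knowledge-involved variant would bias these toward
# UMLS concepts (the paper's "explicit knowledge").
terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = comp.argsort()[-4:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])
```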
A Knowledge-Based Approach to Information Fusion for the Support of Military Intelligence
2004-03-01
and most reliable an appropriate picture of the battlespace. The presented approach of knowledge based information fusion is focussing on the...incomplete and imperfect information of military reports and background knowledge can be supported substantially in an automated system. Keywords
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products has yet to be fully realized, partly because of inharmonious communication among collaborators. One challenge in human-machine integration is therefore how to establish an appropriate knowledge management (KM) model to support the integration and sharing of heterogeneous product knowledge. To address the diversity of design knowledge, this article proposes an ontology-based model that provides an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described, and the corresponding ontologies and sub-ontologies are established according to their different purposes and scopes. Second, a similarity-calculation-based ontology integration method, composed of ontology mapping and ontology merging, is introduced. An ontology-search-based knowledge sharing method is then developed. Finally, a case of human-machine integrated design of a large ultra-precision grinding machine is used to demonstrate the effectiveness of the method.
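To illustrate the similarity-calculation step of ontology mapping, here is a small Python sketch that scores candidate concept pairs by combining label similarity with the Jaccard overlap of their neighboring concepts; the weights, the 0.7 merge threshold, and the grinding-machine concept names are assumptions, not the article's published formulas.

```python
# Similarity-based ontology mapping sketch: score candidate concept pairs by
# name similarity plus structural (neighborhood) similarity.
from difflib import SequenceMatcher

def name_sim(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def neighbor_sim(na, nb):
    na, nb = set(na), set(nb)
    return len(na & nb) / len(na | nb) if na | nb else 0.0

def concept_sim(a, b, w_name=0.6, w_struct=0.4):   # weights are assumed
    return (w_name * name_sim(a["label"], b["label"])
            + w_struct * neighbor_sim(a["neighbors"], b["neighbors"]))

c1 = {"label": "GrindingWheel", "neighbors": {"Spindle", "Abrasive"}}
c2 = {"label": "grinding_wheel", "neighbors": {"Spindle", "Coolant"}}
score = concept_sim(c1, c2)
print(score, "merge" if score > 0.7 else "keep separate")
```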
Building a Knowledge to Action Program in Stroke Rehabilitation.
Janzen, Shannon; McIntyre, Amanda; Richardson, Marina; Britt, Eileen; Teasell, Robert
2016-09-01
The knowledge to action (KTA) process proposed by Graham et al. (2006) is a framework to facilitate the development and application of research evidence in clinical practice. The KTA process consists of the knowledge creation cycle and the action cycle. The Evidence Based Review of Stroke Rehabilitation is a foundational part of the knowledge creation cycle and has helped guide the development of best practice recommendations in stroke. The Rehabilitation Knowledge to Action Project is an audit-feedback process for the clinical implementation of best practice guidelines, which follows the action cycle. The objective of this review was to: (1) contextualize the Evidence Based Review of Stroke Rehabilitation and the Rehabilitation Knowledge to Action Project within the KTA model and (2) show how this process led to improved evidence-based practice in stroke rehabilitation. Through this process, a single centre was able to change clinical practice and promote a culture that supports the use of evidence-based practices in stroke rehabilitation.
Advanced software development workstation project ACCESS user's guide
NASA Technical Reports Server (NTRS)
1990-01-01
ACCESS is a knowledge-based software information system designed to assist the user in modifying retrieved software to satisfy user specifications. A user's guide is presented for the knowledge engineer who wishes to create for ACCESS a knowledge base consisting of representations of objects in some software system. This knowledge is accessible to an end user who wishes to use the catalogued software objects to create a new application program or an input stream for an existing system. The application-specific portion of an ACCESS knowledge base consists of a taxonomy of object classes, as well as instances of these classes. All objects in the knowledge base are stored in an associative memory. ACCESS provides a standard interface for the end user to browse and modify objects. In addition, the interface can be customized by the addition of application-specific data entry forms and by specification of the display order for the taxonomy and object attributes. These customization options are described.
Development of Korean Rare Disease Knowledge Base
Seo, Heewon; Kim, Dokyoon; Chae, Jong-Hee; Kang, Hee Gyung; Lim, Byung Chan; Cheong, Hae Il
2012-01-01
Objectives Rare disease research requires a broad range of disease-related information for discovering the causes of genetic disorders, maladies caused by abnormalities in genes or chromosomes. The rarity of cases makes it difficult for researchers to establish a definite etiology. This knowledge base will be a major resource not only for clinicians, but also for the general public, who are otherwise unable to find consistent information on rare diseases in a single location. Methods We designed a compact database schema for faster querying; its structure is optimized to store heterogeneous data sources. Clinicians at Seoul National University Hospital (SNUH) then reviewed and revised these resources. Additionally, we integrated other sources to capture genomic resources and clinical trials in detail in the Korean Rare Disease Knowledge base (KRDK). Results As a result, we have developed a Web-based knowledge base, KRDK, suitable for the study of Mendelian diseases that commonly occur among Koreans. The knowledge base comprises disease summaries and reviews, a causal gene list, a laboratory and clinic directory, a patient registry, and so on. Furthermore, a database for analyzing and providing access to human biological information and a clinical trial management system are integrated into KRDK. Conclusions We expect that KRDK, the first rare disease knowledge base in Korea, may contribute to collaborative research and be a reliable reference for application to clinical trials. Additionally, the knowledge base supports querying of drug information, so that visitors can search the list of rare diseases related to specific drugs. Visitors can access KRDK via http://www.snubi.org/software/raredisease/. PMID:23346478
EXPECT: Explicit Representations for Flexible Acquisition
NASA Technical Reports Server (NTRS)
Swartout, Bill; Gil, Yolanda
1995-01-01
To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we also need to change the architecture of the knowledge-based systems we create so that their structure provides better support for acquisition. Current acquisition tools permit users to modify factual knowledge, but they provide limited support for modifying problem-solving knowledge. In this paper, the authors argue that this limitation (and others) stems from the use of incomplete models of problem-solving knowledge and from inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture, which addresses these problems by providing an explicit representation of problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself, EXPECT supports more flexible and powerful knowledge acquisition.
Halatchliyski, Iassen; Cress, Ulrike
2014-01-01
Using a longitudinal network analysis approach, we investigate the structural development of the knowledge base of Wikipedia in order to explain the appearance of new knowledge. The data consists of the articles in two adjacent knowledge domains: psychology and education. We analyze the development of networks of knowledge consisting of interlinked articles at seven snapshots from 2006 to 2012 with an interval of one year between them. Longitudinal data on the topological position of each article in the networks is used to model the appearance of new knowledge over time. Thus, the structural dimension of knowledge is related to its dynamics. Using multilevel modeling as well as eigenvector and betweenness measures, we explain the significance of pivotal articles that are either central within one of the knowledge domains or boundary-crossing between the two domains at a given point in time for the future development of new knowledge in the knowledge base. PMID:25365319
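A small sketch of the two centrality measures named above, computed with networkx on a toy article-link graph standing in for one yearly snapshot; the article names and links are invented for illustration.

```python
# Eigenvector centrality flags pivotal hub articles within a domain;
# betweenness flags boundary-crossing articles between the two domains.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("memory", "learning"), ("memory", "attention"),
    ("attention", "learning"),                 # psychology cluster
    ("curriculum", "assessment"),              # education cluster
    ("learning", "curriculum"),                # boundary-crossing link
])

eig = nx.eigenvector_centrality(G)   # high for well-connected hubs
btw = nx.betweenness_centrality(G)   # high for bridges between domains

for node in sorted(G):
    print(f"{node:12s} eig={eig[node]:.2f} btw={btw[node]:.2f}")
```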
Knowledge modeling of coal mining equipments based on ontology
NASA Astrophysics Data System (ADS)
Zhang, Baolong; Wang, Xiangqian; Li, Huizong; Jiang, Miaomiao
2017-06-01
Problems of information redundancy and sharing are ubiquitous in coal mining equipment management. In order to improve the efficiency with which knowledge of coal mining equipment is used, this paper proposes a new method of knowledge modeling based on ontology. On the basis of an analysis of the structure and internal relations of coal mining equipment knowledge, and using OWL as the ontology construction language, an ontology model of coal mining equipment knowledge is built with the help of the Protégé 4.3 software tools. This knowledge description method lays the foundation for highly effective knowledge management and sharing, which is significant for improving the production management of coal mining enterprises.
PVDaCS - A prototype knowledge-based expert system for certification of spacecraft data
NASA Technical Reports Server (NTRS)
Wharton, Cathleen; Shiroma, Patricia J.; Simmons, Karen E.
1989-01-01
On-line data management techniques to certify spacecraft information are mandated by increasing telemetry rates. Knowledge-based expert systems offer the ability to certify data electronically without the need for time-consuming human interaction. Issues of automatic certification are explored by designing a knowledge-based expert system to certify data from a scientific instrument, the Orbiter Ultraviolet Spectrometer, on an operating NASA planetary spacecraft, Pioneer Venus. The resulting rule-based system, called PVDaCS (Pioneer Venus Data Certification System), is a functional prototype demonstrating the concepts of a larger system design. A key element of the system design is the representation of an expert's knowledge through the use of well-ordered sequences. PVDaCS produces a certification value derived from expert knowledge and an analysis of the instrument's operation. Results of system performance are presented.
The importance of knowledge-based technology.
Cipriano, Pamela F
2012-01-01
Nurse executives are responsible for a workforce that can provide safer and more efficient care in a complex sociotechnical environment. National quality priorities rely on technologies to provide data collection, share information, and leverage analytic capabilities to interpret findings and inform approaches to care that will achieve better outcomes. As a key steward for quality, the nurse executive exercises leadership to provide the infrastructure to build and manage nursing knowledge and instill accountability for following evidence-based practices. These actions contribute to a learning health system where new knowledge is captured as a by-product of care delivery enabled by knowledge-based electronic systems. The learning health system also relies on rigorous scientific evidence embedded into practice at the point of care. The nurse executive optimizes use of knowledge-based technologies, integrated throughout the organization, that have the capacity to help transform health care.
Finding gene regulatory network candidates using the gene expression knowledge base.
Venkatesan, Aravind; Tripathi, Sushil; Sanz de Galdeano, Alejandro; Blondé, Ward; Lægreid, Astrid; Mironov, Vladimir; Kuiper, Martin
2014-12-10
Network-based approaches for the analysis of large-scale genomics data have become well established. Biological networks provide a knowledge scaffold against which the patterns and dynamics of 'omics' data can be interpreted. The background information required for the construction of such networks is often dispersed across a multitude of knowledge bases in a variety of formats. The seamless integration of this information is one of the main challenges in bioinformatics. The Semantic Web offers powerful technologies for the assembly of integrated knowledge bases that are computationally comprehensible, thereby providing a potentially powerful resource for constructing biological networks and network-based analysis. We have developed the Gene eXpression Knowledge Base (GeXKB), a semantic web technology based resource that contains integrated knowledge about gene expression regulation. To affirm the utility of GeXKB we demonstrate how this resource can be exploited for the identification of candidate regulatory network proteins. We present four use cases that were designed from a biological perspective in order to find candidate members relevant for the gastrin hormone signaling network model. We show how a combination of specific query definitions and additional selection criteria derived from gene expression data and prior knowledge concerning candidate proteins can be used to retrieve a set of proteins that constitute valid candidates for regulatory network extensions. Semantic web technologies provide the means for processing and integrating various heterogeneous information sources. The GeXKB offers biologists such an integrated knowledge resource, allowing them to address complex biological questions pertaining to gene expression. This work illustrates how GeXKB can be used in combination with gene expression results and literature information to identify new potential candidates that may be considered for extending a gene regulatory network.
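To make the "specific query definitions" concrete, the sketch below builds a tiny RDF graph and runs a SPARQL query with Python's rdflib; the gx: namespace, the regulates predicate, and the gene names are invented placeholders rather than GeXKB's actual vocabulary.

```python
# Toy semantic-web query: find transcription factors asserted to regulate
# a target gene. Names and predicates are hypothetical stand-ins.
from rdflib import Graph, Namespace, RDF

GX = Namespace("http://example.org/gexkb#")
g = Graph()
g.add((GX.TF1, RDF.type, GX.TranscriptionFactor))
g.add((GX.TF1, GX.regulates, GX.GastrinTarget))
g.add((GX.TF2, RDF.type, GX.TranscriptionFactor))   # no regulates assertion

q = """
PREFIX gx: <http://example.org/gexkb#>
SELECT ?tf WHERE {
    ?tf a gx:TranscriptionFactor .
    ?tf gx:regulates gx:GastrinTarget .
}
"""
for row in g.query(q):
    print(row.tf)    # only TF1 matches
```

In a real workflow the result set would then be filtered against gene expression data and prior knowledge, as the abstract describes, before proposing network extensions.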
Akese, M I; Adejumo, P O; Ilesanmi, R E; Obilor, H N
2014-09-01
The increase in the prevalence of pressure ulcers among patients with impaired physical mobility has been associated with nurses' inadequate knowledge of preventive interventions. To assess nurses' knowledge of pressure ulcer identification/staging, risk factors and evidence-based preventive practices. This descriptive study was carried out at the University of Maiduguri Teaching Hospital (UMTH), Borno State, Nigeria. A total sampling technique was used to recruit the study participants. An adapted 75-item pressure ulcer questionnaire was used for data collection. The data were analyzed using SPSS version 16. The hypotheses on nurses' knowledge were tested at the 0.05 level of significance using the Chi-square test. A total of 219 nurses participated in this study, with a response rate of 68.0%. The nurses' years of professional practice ranged from 1 to 35 years, with a mean of 11.7 (± 7.8) years. Approximately 73% of the nurses demonstrated a low level of knowledge of pressure ulcer identification/staging, 69.4% demonstrated an average level of knowledge of risk factors, and 79.9% demonstrated a high level of knowledge of preventive practices. The relationship between nurses' knowledge of risk factors and knowledge of preventive practices (p = 0.37) was not significant. Nurses demonstrated a knowledge deficit in core areas of pressure ulcer identification/staging, risk factor assessment and evidence-based preventive practices. To address this deficit, there is a need to institute an education-based practice guideline on pressure ulcer prevention for nurses.
Identified research directions for using manufacturing knowledge earlier in the product lifecycle
Hedberg, Thomas D.; Hartman, Nathan W.; Rosche, Phil; Fischer, Kevin
2016-01-01
Design for Manufacturing (DFM), especially the use of manufacturing knowledge to support design decisions, has received attention in the academic domain. However, industry practice has not been studied enough to provide solutions that are mature for industry. The current state of the art for DFM is often rule-based functionality within Computer-Aided Design (CAD) systems that enforce specific design requirements. That rule-based functionality may or may not dynamically affect geometry definition. And, if rule-based functionality exists in the CAD system, it is typically a customization on a case-by-case basis. Manufacturing knowledge is a phrase with vast meanings, which may include knowledge on the effects of material properties decisions, machine and process capabilities, or understanding the unintended consequences of design decisions on manufacturing. One of the DFM questions to answer is how can manufacturing knowledge, depending on its definition, be used earlier in the product lifecycle to enable a more collaborative development environment? This paper will discuss the results of a workshop on manufacturing knowledge that highlights several research questions needing more study. This paper proposes recommendations for investigating the relationship of manufacturing knowledge with shape, behavior, and context characteristics of product to produce a better understanding of what knowledge is most important. In addition, the proposal includes recommendations for investigating the system-level barriers to reusing manufacturing knowledge and how model-based manufacturing may ease the burden of knowledge sharing. Lastly, the proposal addresses the direction of future research for holistic solutions of using manufacturing knowledge earlier in the product lifecycle. PMID:27990027
A theoretical framework for measuring knowledge in screening decision aid trials.
Smith, Sian K; Barratt, Alexandra; Trevena, Lyndal; Simpson, Judy M; Jansen, Jesse; McCaffery, Kirsten J
2012-11-01
To describe a theoretical framework for assessing knowledge about the possible outcomes of participating in bowel cancer screening with the faecal occult blood test. The content of the knowledge measure was based on the UK General Medical Council's screening guidelines and a theory-based approach to assessing gist knowledge (Fuzzy Trace Theory). It comprised conceptual and numeric questions to assess knowledge of the underlying construct (e.g. the false positive concept) and the approximate numbers affected (e.g. the likelihood of a false positive). The measure was used in a randomised controlled trial involving 530 adults with low education to compare the impact of a bowel screening decision aid with a screening information booklet developed for the Australian Government National Bowel Cancer Screening Program. The numeric knowledge scale was particularly responsive to the effects of the decision aid; at follow-up, decision aid participants' numeric knowledge was significantly greater than that of the controls (P<0.001). This contrasts with the conceptual knowledge scale, which improved significantly in both groups from baseline to follow-up (P<0.001). Our theory-based knowledge measure was responsive to change in conceptual knowledge and to the effect of a decision aid on numeric knowledge. This theoretical framework has the potential to guide the development of knowledge measures in other screening settings. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Comparison of LISP and MUMPS as implementation languages for knowledge-based systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, A.C.
1984-01-01
Major components of knowledge-based systems are summarized, along with the programming language features generally useful in their implementation. LISP and MUMPS are briefly described and compared as vehicles for building knowledge-based systems. The paper concludes with suggestions for extensions to MUMPS which might increase its usefulness in artificial intelligence applications without affecting the essential nature of the language.
ERIC Educational Resources Information Center
Clarke, James B.; Coyle, James R.
2011-01-01
This article reports the results of a case study in which an experimental wiki knowledge base was designed, developed, and tested by the Brill Science Library at Miami University for an undergraduate engineering senior capstone project. The wiki knowledge base was created to determine if the science library could enhance the engineering literature…
ERIC Educational Resources Information Center
Faez, Farahnaz
2011-01-01
In this paper I examine similarities and differences between the required knowledge base of teachers of English as a second language (ESL) and French as a second language (FSL) for teaching in Kindergarten through Grade 12 programs in Canada. Drawing on knowledge base frameworks in language teacher education (Freeman and Johnson, 1998; Richards,…
NASA Technical Reports Server (NTRS)
Liebowitz, J.
1986-01-01
The development of an expert system prototype for software functional requirement determination for NASA Goddard's Command Management System, as part of its process of transforming general requests into specific near-earth satellite commands, is described. The present knowledge base was formulated through interactions with domain experts and was then linked to the existing Knowledge Engineering System (KES) expert system application generator. Steps in the knowledge-base development included problem-oriented attribute hierarchy development, knowledge management approach determination, and knowledge-base encoding. The KES Parser and Inspector, in addition to backcasting and analogical mapping, were used to validate the expert-system-derived requirements for one of the major functions of a spacecraft, the Solar Maximum Mission. Knowledge refinement, evaluation, and implementation procedures for the expert system were then accomplished.
Paloma-Castro, Olga; Romero-Sánchez, José Manuel; Paramio-Cuevas, Juan Carlos; Pastor-Montero, Sonia María; Del Carmen Sánchez-Dalda, María; Rozadillas-Sanmiguel, Elena; Moreno-Corral, Luis Javier
2017-04-01
To develop and psychometrically evaluate a questionnaire based on the outcome "Knowledge: Breast-feeding" of the Nursing Outcomes Classification (NOC) to determine parents' knowledge of breast-feeding. The NOC outcome "Knowledge: Breast-feeding" allows nurses/midwives to assess the efficacy of interventions aimed at improving parents' knowledge of breast-feeding through the clinical interview/observation. However, the use of self-administered questionnaires by patients could facilitate this evaluation. A two-phase study: (1) development of the questionnaire based on experts' opinions; (2) a methodological design to assess its psychometric properties. The availability of tools that enable the determination of patients' knowledge would help nurses/midwives to set objectives, individualize interventions, and measure their effectiveness. © 2015 NANDA International, Inc.
Approaching Etuaptmumk – introducing a consensus-based mixed method for health services research
Chatwood, Susan; Paulette, Francois; Baker, Ross; Eriksen, Astrid; Hansen, Ketil Lenert; Eriksen, Heidi; Hiratsuka, Vanessa; Lavoie, Josée; Lou, Wendy; Mauro, Ian; Orbinski, James; Pabrum, Nathalie; Retallack, Hanna; Brown, Adalsteinn
2015-01-01
With the recognized need for health systems’ improvements in the circumpolar and indigenous context, there has been a call to expand the research agenda across all sectors influencing wellness and to recognize academic and indigenous knowledge through the research process. Despite being recognized as a distinct body of knowledge in international forums and across indigenous groups, examples of methods and theories based on indigenous knowledge are not well documented in academic texts or peer-reviewed literature on health systems. This paper describes the use of a consensus-based, mixed method with indigenous knowledge by an experienced group of researchers and indigenous knowledge holders who collaborated on a study that explored indigenous values underlying health systems stewardship. The method is built on the principles of Etuaptmumk or two-eyed seeing, which aim to respond to and resolve the inherent conflicts between indigenous ways of knowing and the scientific inquiry that informs the evidence base in health care. Mixed methods’ frameworks appear to provide a framing suitable for research questions that require data from indigenous knowledge sources and western knowledge. The nominal consensus method, as a western paradigm, was found to be responsive to embedding of indigenous knowledge and allowed space to express multiple perspectives and reach consensus on the question at hand. Further utilization and critical evaluation of this mixed methodology with indigenous knowledge are required. PMID:26004427
Scotland's Knowledge Network: translating knowledge into action to improve quality of care.
Wales, A; Graham, S; Rooney, K; Crawford, A
2012-11-01
The Knowledge Network (www.knowledge.scot.nhs.uk) is Scotland's online knowledge service for health and social care. It is designed to support practitioners to apply knowledge in frontline delivery of care, helping to translate knowledge into better health-care outcomes through safe, effective, person-centred care. The Knowledge Network helps to combine the worlds of evidence-based practice and quality improvement by providing access to knowledge about the effectiveness of clinical interventions ('know-what') and knowledge about how to implement this knowledge to support individual patients in working health-care environments ('know-how'). An 'evidence and guidance' search enables clinicians to quickly access quality-assured evidence and best practice, while point of care and mobile solutions provide knowledge in actionable formats to embed in clinical workflow. This research-based knowledge is complemented by social networking services and improvement tools which support the capture and exchange of knowledge from experience, facilitating practice change and systems improvement. In these cases, the Knowledge Network supports key components of the knowledge-to-action cycle--acquiring, creating, sharing and disseminating knowledge to improve performance and innovate. It provides a vehicle for implementing the recommendations of the national Knowledge into Action review, which outlines a new national approach to embedding knowledge in frontline practice and systems improvement.
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Collins, Donald J.; Doyle, Richard J.; Jacobson, Allan S.
1991-01-01
Viewgraphs are presented on DataHub, knowledge-based assistance for science visualization and analysis using large distributed databases. Topics covered include: DataHub functional architecture; data representation; logical access methods; preliminary software architecture; LinkWinds; data knowledge issues; expert systems; and data management.
Developing Learning Progression-Based Teacher Knowledge Measures
ERIC Educational Resources Information Center
Jin, Hui; Shin, HyoJeong; Johnson, Michele E.; Kim, JinHo; Anderson, Charles W.
2015-01-01
This study developed learning progression-based measures of science teachers' content knowledge (CK) and pedagogical content knowledge (PCK). The measures focus on an important topic in secondary science curriculum using scientific reasoning (i.e., tracing matter, tracing energy, and connecting scales) to explain plants gaining weight and…
ERIC Educational Resources Information Center
Gowda, Veena Bhaskar S.; Nagaiah, Bhaskar Hebbani; Sengodan, Bharathi
2016-01-01
Medical students build clinical knowledge on the grounds of previously obtained basic knowledge. The study aimed to evaluate the competency of third year medical students to interpret biochemically based clinical scenarios using knowledge and skills gained during year 1 and 2 of undergraduate medical training. Study was conducted on year 3 MBBS…
ADEpedia 2.0: Integration of Normalized Adverse Drug Events (ADEs) Knowledge from the UMLS.
Jiang, Guoqian; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G
2013-01-01
A standardized Adverse Drug Events (ADEs) knowledge base that encodes known ADE knowledge can be very useful in improving ADE detection for drug safety surveillance. In our previous study, we developed ADEpedia, a standardized knowledge base of ADEs based on drug product labels. The objectives of the present study are 1) to integrate normalized ADE knowledge from the Unified Medical Language System (UMLS) into ADEpedia; and 2) to enrich the knowledge base with drug-disorder co-occurrence data from a 51-million-document electronic medical records (EMRs) system. We extracted 266,832 drug-disorder concept pairs from the UMLS, covering 14,256 (1.69%) distinct drug concepts and 19,006 (3.53%) distinct disorder concepts. Of these, 71,626 (26.8%) concept pairs from the UMLS co-occurred in the EMRs. We performed a preliminary evaluation of the utility of the UMLS ADE data. In conclusion, we have built an ADEpedia 2.0 framework that integrates known ADE knowledge from disparate sources. The UMLS is a useful source of standardized ADE knowledge relevant to indications, contraindications and adverse effects, and is complementary to the ADE data from drug product labels. The statistics from EMRs would enable the meaningful use of ADE data for drug safety surveillance.
Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba
2013-02-01
Human musculoskeletal system resources (HMSR) are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot meet the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine is based on a client-server, multi-layer, multi-agent architecture and the principle of semantic web services, and acquires accurate and reliable HMSR information dynamically through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score, with the related mathematical formulas, was also defined and implemented. As results, semantic web service descriptions were presented in OWL, WSDL and OWL-S formats. Operational scenarios, with related web-based interfaces for personal computers and mobile devices, were presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine demonstrated the originality and robustness of our knowledge-based personalized search engine. In fact, our knowledge-based personalized search engine allows different users, such as orthopedic patients and experts, healthcare system managers or medical students, to access remotely useful, accurate, reliable and good-quality HMSR information for their learning and medical purposes. Copyright © 2012 Elsevier Inc. All rights reserved.
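The following sketch shows one way a semantic-based PageRank could work: standard power iteration over a link graph whose edge weights carry semantic relevance scores. The weighting scheme and scores below are assumptions for illustration, not the authors' published formula.

```python
# PageRank over a semantically weighted link graph (toy 3-page example).
import numpy as np

# w[i, j] = semantic relevance of the link from page i to page j (assumed).
w = np.array([[0.0, 0.9, 0.1],
              [0.5, 0.0, 0.5],
              [0.8, 0.2, 0.0]])
out = w.sum(axis=1, keepdims=True)
M = (w / out).T                      # column-stochastic transition matrix
n, d = len(w), 0.85                  # damping factor, as in classic PageRank

r = np.full(n, 1.0 / n)
for _ in range(100):                 # power iteration to the fixed point
    r = (1 - d) / n + d * M @ r
print(r / r.sum())                   # semantic PageRank scores
```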
NASA Astrophysics Data System (ADS)
Kitov, I.; Bobrov, D.; Rozhkov, M.
2016-12-01
Aftershocks of larger earthquakes represent an important source of information on the distribution and evolution of stresses and deformations in the pre-seismic, co-seismic and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The highest rate of events recorded by two or more seismic stations of the International Monitoring System (IMS) from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site-specific, as the first P-wave results in the creation of numerous wrong event hypotheses in the automatic IDC pipeline. In turn, interactive review of such wrong hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool for separating coda phases from actual P-wave arrivals and for fully utilizing the repetitive character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of WCC in two important aspects: they reduce the detection threshold and effectively suppress arrivals from all sources except master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on the CC traces into event hypotheses; building events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis. Since global monitoring of underground nuclear tests relies on historical and synthetic data, each aftershock sequence can also be examined for a potential CTBT violation, with a large earthquake serving as an evasion scenario.
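A minimal sketch of the core WCC operation described above, master-event template matching via normalized cross correlation, follows in Python with numpy; the synthetic traces and the 0.7 detection threshold are assumptions for illustration, not IDC parameters.

```python
# Slide a master-event template over a continuous trace and flag windows
# whose normalized correlation exceeds a detection threshold.
import numpy as np

def normalized_cc(template, trace):
    """Return the normalized cross-correlation trace (values in [-1, 1])."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    cc = np.empty(len(trace) - nt + 1)
    for i in range(len(cc)):
        win = trace[i:i + nt]
        cc[i] = np.sum(t * (win - win.mean())) / (win.std() + 1e-12)
    return cc

rng = np.random.default_rng(0)
master = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)  # template
stream = rng.normal(0, 0.3, 2000)
stream[900:1100] += master          # a buried repeat of the master event

cc = normalized_cc(master, stream)
peaks = np.where(cc > 0.7)[0]       # candidate detections (threshold assumed)
print(peaks[:5], cc.max())
```

Because the correlation is computed against a specific master waveform, arrivals from other sources correlate poorly and are suppressed, which is the property the abstract exploits to screen coda phases.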
Infrasound analysis of I18DK, northwest Greenland
NASA Astrophysics Data System (ADS)
Evers, L. G.; Weemstra, C.
2010-12-01
Within the scope of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), four technologies are used to verify the treaty. One of these is based on the detection of infrasound waves generated by a nuclear explosion; seismological, hydroacoustic and radionuclide measurements are also applied. The International Monitoring System (IMS) will consist of 60 infrasound stations, of which 35 are currently operational. Data obtained from an infrasound station situated on the northwestern shoreline of Greenland are analyzed. This station is operated by Denmark and labeled I18DK. I18DK is situated in an area that receives ever-increasing attention from a geophysical perspective, and has been continuously operational since April 2003. The IMS station is an infrasound array with an aperture of about 1200 meters, where air-pressure fluctuations are recorded by eight microbarometers at a sample rate of 20 Hz. The infrasonic recordings are filtered in two bands, 0.1-1.0 Hz and 1.0-6.0 Hz. In the higher frequency band, the slowness grid is searched for two different configurations: once using all eight array elements and once using only the five center elements. Several source types are known to generate infrasound, for example calving of icebergs and glaciers, explosions, earthquakes, oceanic wave-wave interaction, volcanic eruptions and aurora. The challenge is to distinguish between these source types and to use the outcome of the array analysis to better understand these phenomena. The rate of occurrence of icequakes, the calving of glaciers and the variation in the extent of the sea ice in this area are of interest in relation to global warming. The processing results in the 1 to 6 Hz band seem to show dominant back-azimuths related to these sources. The glaciers south of I18DK produce significant infrasound during summertime. In addition, a direct link can be found between the number of warm days in a year and the number of infrasound detections from a north-northeast direction; these signals seem to be generated by run-off of water from the local ice cap north of I18DK.
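The slowness-grid search described above is essentially delay-and-sum beamforming: for each trial slowness vector the channels are time-shifted and stacked, and the slowness maximizing beam power is kept. The Python sketch below illustrates this on a synthetic eight-element array; the element coordinates, grid bounds, and signal are invented, and the station's actual detection processing is not reproduced.

    import numpy as np

    def best_slowness(waveforms, coords, fs, s_max=0.004, n_grid=41):
        # Frequency-domain delay-and-sum over a grid of horizontal
        # slownesses (s/m); returns the slowness with maximum beam power.
        n_ch, n_samp = waveforms.shape
        spec = np.fft.rfft(waveforms, axis=1)
        freqs = np.fft.rfftfreq(n_samp, d=1.0 / fs)
        best_s, best_p = None, -np.inf
        for sx in np.linspace(-s_max, s_max, n_grid):
            for sy in np.linspace(-s_max, s_max, n_grid):
                delays = coords @ np.array([sx, sy])      # seconds per channel
                shifts = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
                beam = np.fft.irfft((spec * shifts).mean(axis=0), n=n_samp)
                power = np.mean(beam ** 2)
                if power > best_p:
                    best_s, best_p = (sx, sy), power
        return best_s

    # Synthetic plane wave crossing an 8-element, ~1200 m aperture array.
    rng = np.random.default_rng(1)
    coords = rng.uniform(-600, 600, size=(8, 2))          # metres
    fs = 20.0
    t = np.arange(0, 60, 1 / fs)
    src = np.sin(2 * np.pi * 2.0 * t) * np.exp(-((t - 30) ** 2) / 20)
    true_s = np.array([0.002, 0.001])                     # slowness (s/m)
    waves = np.array([np.interp(t - coords[i] @ true_s, t, src) for i in range(8)])
    waves += rng.normal(0, 0.1, waves.shape)
    print(best_slowness(waves, coords, fs))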
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural-hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Machine learning techniques can therefore be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique, the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of using an SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As authorized users, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical signal patterns of these events. Moreover, comparing the performance of the support-vector network with various classical learning algorithms previously used in seismic detection and classification is an essential final step in analyzing the advantages and disadvantages of the model.
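For readers unfamiliar with the classifier, the sketch below shows a minimal two-class SVM pipeline in Python with scikit-learn. The feature matrix is randomly generated stand-in data (the paper's actual feature set is not specified here), with labels 0 = earthquake and 1 = quarry blast.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Stand-in features: each row would hold spectral/temporal measurements
    # extracted from one detection; here they are synthetic.
    rng = np.random.default_rng(42)
    X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(1.5, 1, (100, 6))])
    y = np.repeat([0, 1], 100)          # 0 = earthquake, 1 = quarry blast

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    model.fit(X_tr, y_tr)
    print("held-out accuracy:", model.score(X_te, y_te))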
NASA Astrophysics Data System (ADS)
Esterhazy, Sofi; Schneider, Felix; Schöberl, Joachim; Perugia, Ilaria; Bokelmann, Götz
2016-04-01
Research on purely numerical methods for modeling seismic waves has intensified over recent decades. This development is mainly driven by the fact that, on the one hand, exact analytic solutions do not exist for the subsurface models of interest in exploration and global seismology, while, on the other hand, retrieving full seismic waveforms is important for gaining insight into spectral characteristics and for the interpretation of seismic phases and amplitudes. Furthermore, computational power has increased dramatically in the recent past, making it worthwhile to perform computations for large-scale problems such as those arising in the field of computational seismology. Algorithms based on the Finite Element Method (FEM) are becoming increasingly popular for the propagation of acoustic and elastic waves in geophysical models, as they provide more geometrical flexibility in terms of complexity as well as heterogeneity of the materials. In particular, we want to demonstrate the benefit of high-order FEMs, as they also provide better control over accuracy. Our computations are done with the parallel finite element library NGSOLVE on top of the automatic 2D/3D mesh generator NETGEN (http://sourceforge.net/projects/ngsolve/). Further, we are interested in the generation of synthetic seismograms, including direct, refracted and converted waves, in relation to the presence of an underground cavity, and in the detailed simulation of the comprehensive wave field inside and around such a cavity as would be created by a nuclear explosion. The motivation for this application comes from the need to find evidence of nuclear tests, which are forbidden by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). With this approach it is possible to investigate the wave field over a large bandwidth of wave numbers. This in turn will help to provide a better understanding of the characteristic signatures of an underground cavity, improve the protocols for OSI field deployment and support solid observational strategies for detecting the presence of an underground (nuclear) cavity.
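The paper's computations use high-order 3D elements in NGSolve; purely as a minimal illustration of the finite element idea, the Python sketch below solves the 1-D acoustic wave equation with linear elements, a lumped mass matrix, and explicit central-difference time stepping. The domain size, wave speed, and initial pulse are arbitrary assumptions.

    import numpy as np

    L, c, n = 1000.0, 2000.0, 400        # domain (m), wave speed (m/s), elements
    h = L / n
    x = np.linspace(0, L, n + 1)

    # Assemble the global stiffness matrix from (c^2/h)[[1,-1],[-1,1]] blocks.
    K = np.zeros((n + 1, n + 1))
    for e in range(n):
        K[e:e + 2, e:e + 2] += (c**2 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    M = np.full(n + 1, h)                # lumped (diagonal) mass matrix
    M[0] = M[-1] = h / 2

    dt = 0.8 * h / c                     # CFL-limited time step
    u = np.exp(-((x - L / 2) ** 2) / (2 * 20.0**2))   # Gaussian initial pulse
    u_prev = u.copy()                    # zero initial velocity
    for _ in range(300):
        accel = -(K @ u) / M
        u_next = 2 * u - u_prev + dt**2 * accel
        u_next[0] = u_next[-1] = 0.0     # fixed (Dirichlet) boundaries
        u_prev, u = u, u_next
    print("peak amplitude after stepping:", round(np.abs(u).max(), 3))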
NASA Technical Reports Server (NTRS)
Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. The emphasis is therefore on the artificial intelligence aspects of conceptual design rather than on structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user, integrating a knowledge base interface and inference engine, a data base interface, and graphics while keeping the knowledge base and data base files separate. The system writes a file which can be input into a structural synthesis system that combines structural analysis and optimization.
An object-relational model for structured representation of medical knowledge.
Koch, S; Risch, T; Schneider, W; Wagner, I V
2006-07-01
Domain-specific knowledge is often not static but continuously evolving. This is especially true for the medical domain. Furthermore, the lack of standardized structures for presenting knowledge makes it difficult or often impossible to assess new knowledge in the context of existing knowledge, and possibilities to compare knowledge easily and directly are often lacking. It is therefore of utmost importance to create a model that allows for comparability, consistency and quality assurance of medical knowledge in specific work situations. For this purpose, we have designed an object-relational model based on structured knowledge elements that are dynamically reusable by different multimedia-based tools for case-based documentation, disease course simulation, and decision support. With this model, high-level components, such as patient case reports or simulations of the course of a disease, and low-level components (e.g., diagnoses, symptoms or treatments), as well as the relationships between these components, are modeled. The resulting schema has been implemented in AMOS II, an object-relational multi-database system supporting different views with regard to search and analysis depending on different work situations.
Crowley, D Max; Greenberg, Mark T; Feinberg, Mark E; Spoth, Richard L; Redmond, Cleve R
2012-02-01
A substantial challenge in improving public health is how to facilitate the local adoption of evidence-based interventions (EBIs). To do so, an important step is to build local stakeholders' knowledge and decision-making skills regarding the adoption and implementation of EBIs. One EBI delivery system, called PROSPER (PROmoting School-community-university Partnerships to Enhance Resilience), has effectively mobilized community prevention efforts, implemented prevention programming with quality, and consequently decreased youth substance abuse. While these results are encouraging, another objective is to increase local stakeholder knowledge of best practices for adoption, implementation and evaluation of EBIs. Using a mixed methods approach, we assessed local stakeholder knowledge of these best practices over 5 years, in 28 intervention and control communities. Results indicated that the PROSPER partnership model led to significant increases in expert knowledge regarding the selection, implementation, and evaluation of evidence-based interventions. Findings illustrate the limited programming knowledge possessed by members of local prevention efforts, the difficulty of complete knowledge transfer, and highlight one method for cultivating that knowledge.
Enhancing Knowledge Sharing Management Using BIM Technology in Construction
Ho, Shih-Ping; Tserng, Hui-Ping
2013-01-01
Construction knowledge can be communicated and reused among project managers and jobsite engineers to alleviate problems on a construction jobsite and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for the sharing of construction knowledge by using Building Information Modeling (BIM) technology. The main characteristics of BIM include illustrating 3D CAD-based presentations and keeping information in a digital format and facilitation of easy updating and transfer of information in the BIM environment. Using the BIM technology, project managers and engineers can gain knowledge related to BIM and obtain feedback provided by jobsite engineers for future reference. This study addresses the application of knowledge sharing management using BIM technology and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to demonstrate the effectiveness of sharing knowledge in the BIM environment. The results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing the BIM technology. PMID:24723790
A Different Approach to the Generation of Patient Management Problems from a Knowledge-Based System
Barriga, Rosa Maria
1988-01-01
Several strategies are proposed for the generation of Patient Management Problems from a Knowledge Base while avoiding inconsistencies in the results. These strategies are based on a different Knowledge Base structure and on the use of case introductions that describe the patient attributes which are not disease-dependent. This methodology has proven effective in a recent pilot test and is on its way to implementation as part of an educational program at the CWRU School of Medicine.
An Interactive Medical Knowledge Assistant
NASA Astrophysics Data System (ADS)
Czejdo, Bogdan D.; Baszun, Mikolaj
This paper describes an interactive medical knowledge assistant that can help a doctor or a patient make important health-related decisions. The system is Web based and consists of several modules, including a medical knowledge base, a doctor interface module, a patient interface module and the main module of the medical knowledge assistant. The medical assistant is designed to help interpret fuzzy data using a rough-set approach. The patient interface includes a sub-system for real-time monitoring of patients' health parameters, which are sent to the main module of the medical knowledge assistant.
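The rough-set machinery mentioned above reduces, at its core, to lower and upper approximations of a concept. The Python sketch below shows that mechanism on invented patient data; the attribute grouping and target set are illustrative, not taken from the paper.

    def rough_approximations(equiv_classes, target):
        # Lower approximation: classes certainly inside the concept.
        # Upper approximation: classes that possibly overlap the concept.
        lower, upper = set(), set()
        for cls in equiv_classes:
            if cls <= target:
                lower |= cls
            if cls & target:
                upper |= cls
        return lower, upper

    # Patients grouped by indiscernible (symptom, reading) combinations;
    # target = patients actually diagnosed with condition X (invented data).
    classes = [{"p1", "p2"}, {"p3"}, {"p4", "p5"}]
    diagnosed = {"p1", "p2", "p4"}
    low, up = rough_approximations(classes, diagnosed)
    print("certainly X:", low, "| possibly X:", up)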
English Learners' Knowledge of Prepositions: Collocational Knowledge or Knowledge Based on Meaning?
ERIC Educational Resources Information Center
Mueller, Charles M.
2011-01-01
Second language (L2) learners' successful performance in an L2 can be partly attributed to their knowledge of collocations. In some cases, this knowledge is accompanied by knowledge of the semantic and/or grammatical patterns that motivate the collocation. At other times, collocational knowledge may serve a compensatory role. To determine the…
Examining Collaborative Knowledge Construction in Microblogging-Based Learning Environments
ERIC Educational Resources Information Center
Luo, Tian; Clifton, Lacey
2017-01-01
Aim/Purpose: The purpose of the study is to provide foundational research to exemplify how knowledge construction takes place in microblogging-based learning environments, to understand learner interaction representing the knowledge construction process, and to analyze learner perception, thereby suggesting a model of delivery for microblogging.…
Improving Collaborative Learning in the Classroom: Text Mining Based Grouping and Representing
ERIC Educational Resources Information Center
Erkens, Melanie; Bodemer, Daniel; Hoppe, H. Ulrich
2016-01-01
Orchestrating collaborative learning in the classroom involves tasks such as forming learning groups with heterogeneous knowledge and making learners aware of the knowledge differences. However, gathering information on which the formation of appropriate groups and the creation of graphical knowledge representations can be based is very effortful…
Information technology: building nursing intellectual capital for the information age.
Simpson, Roy L
2007-01-01
Healthcare is evolving from a task-based industry to a knowledge-based one. To gain and retain value as intellectual capital, nursing likewise must evolve from a vocation of task performers to a profession of knowledge-workers. Information technology can transform nursing tasks into nursing knowledge.
ERIC Educational Resources Information Center
Vokatis, Barbara; Zhang, Jianwei
2016-01-01
Diffusing inquiry-based pedagogy in schools for deep and lasting change requires teacher transformation and capacity building. This study characterizes the professional identity of three elementary school teachers who have productively engaged in inquiry-based classroom practice using knowledge building pedagogy and Knowledge Forum, a…
Uncertainty and Clinical Psychology: Therapists' Responses.
ERIC Educational Resources Information Center
Bienenfeld, Sheila
Three sources of professional uncertainty have been described: uncertainty about the practitioner's mastery of knowledge; uncertainty due to gaps in the knowledge base itself; and uncertainty about the source of the uncertainty, i.e., the practitioner does not know whether his uncertainty is due to gaps in the knowledge base or to personal…
Athletic Training Educators' Knowledge, Comfort, and Perceived Importance of Evidence-Based Practice
ERIC Educational Resources Information Center
Welch, Cailee E.; Van Lunen, Bonnie L.; Walker, Stacy E.; Manspeaker, Sarah A.; Hankemeier, Dorice A.; Brown, Sara D.; Laursen, R. Mark; Onate, James A.
2011-01-01
Context: Before new strategies and effective techniques for implementation of evidence-based practice (EBP) into athletic training curricula can occur, it is crucial to recognize the current knowledge and understanding of EBP concepts among athletic training educators. Objective: To assess athletic training educators' current knowledge, comfort,…
Learning Science-Based Fitness Knowledge in Constructivist Physical Education
ERIC Educational Resources Information Center
Sun, Haichun; Chen, Ang; Zhu, Xihe; Ennis, Catherine D.
2012-01-01
Teaching fitness-related knowledge has become critical in developing children's healthful living behavior. The purpose of this study was to examine the effects of a science-based, constructivist physical education curriculum on learning fitness knowledge critical to healthful living in elementary school students. The schools (N = 30) were randomly…
A Text Knowledge Base from the AI Handbook.
ERIC Educational Resources Information Center
Simmons, Robert F.
1987-01-01
Describes a prototype natural language text knowledge system (TKS) that was used to organize 50 pages of a handbook on artificial intelligence as an inferential knowledge base with natural language query and command capabilities. Representation of text, database navigation, query systems, discourse structuring, and future research needs are…
Weaver, D; Sorrells-Jones, J
1999-09-01
Our economy is shifting from a hard goods and material products base to one in which knowledge is the primary mode of production. Organizations are experimenting with designs that support knowledge work by clustering individuals with different but complementary skills in focused teams. The goal is to increase applied knowledge that furthers the organization's strategic intent. The team-based knowledge work model holds promise for healthcare organizations that are under pressure to use knowledge to improve clinical care, integrate care across disciplines and settings, and accept accountability for costs. However, the shift from the traditional bureaucratic model to the flexible team-based design mandates changes in the design of the organization, the role of leadership, and the attributes of the teams and team members. In Part 2 of this three-part series, the authors explore the necessary design changes and the new roles for leadership, teams, and their members. Additionally, implications for healthcare clinicians, particularly nurses, are discussed.
Online Knowledge-Based Model for Big Data Topic Extraction.
Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan
2016-01-01
Lifelong machine learning (LML) models learn from experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while halving the processing cost.
Knowledge-based nursing diagnosis
NASA Astrophysics Data System (ADS)
Roy, Claudette; Hay, D. Robert
1991-03-01
Nursing diagnosis is an integral part of the nursing process and determines the interventions leading to outcomes for which the nurse is accountable. Diagnosis under the time constraints of modern nursing can benefit from computer assistance. A knowledge-based engineering approach was developed to address these problems. A number of problems that extended beyond the capture of knowledge were addressed during system design to make the system practical. The issues involved in implementing a professional knowledge base in a clinical setting are discussed, including system functions, structure, interfaces, the health care environment, and terminology and taxonomy. An integrated system concept from assessment through intervention and evaluation is outlined.
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
1992-01-01
The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions support the development of a prototype for evaluating the capabilities of INTUIT for such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts end-users for input through a series of panels and then generates the MEDs associated with the initialization and update of hazardous command tables for a POCC Processing TLIB.
Using background knowledge for picture organization and retrieval
NASA Astrophysics Data System (ADS)
Quintana, Yuri
1997-01-01
A picture knowledge base management system is described that is used to represent, organize and retrieve pictures from a frame knowledge base. Experiments with human test subjects were conducted to obtain further descriptions of pictures from news magazines. These descriptions were used to represent the semantic content of the pictures in frame representations. A conceptual clustering algorithm is described which organizes pictures not only by their observable features, but also by implicit properties derived from the frame representations. The algorithm uses inheritance reasoning to take background knowledge into account in the clustering. It creates clusters of pictures using a group similarity function that is based on the gestalt theory of picture perception. For each cluster created, a frame is generated which describes the semantic content of the pictures in the cluster. Clustering and retrieval experiments were conducted with and without background knowledge. The paper shows how the use of background knowledge and semantic similarity heuristics improves the speed, precision, and recall of the queries processed. The paper concludes with a discussion of how natural language processing can be used to assist in the development of knowledge bases and the processing of user queries.
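To make the role of inheritance reasoning concrete, the Python sketch below scores picture similarity over feature sets expanded through an is-a hierarchy, so two pictures can match on an inherited concept (e.g. both contain a 'bird'). The hierarchy, the Jaccard-style similarity, the threshold, and the single-pass grouping are all illustrative assumptions, not the paper's gestalt-based function.

    ISA = {"sparrow": "bird", "eagle": "bird", "bird": "animal", "car": "vehicle"}

    def expand(features):
        # Add every ancestor from the is-a hierarchy to the feature set.
        out = set()
        for f in features:
            while f is not None:
                out.add(f)
                f = ISA.get(f)
        return out

    def group_similarity(a, b):
        ea, eb = expand(a), expand(b)
        return len(ea & eb) / len(ea | eb)

    def cluster(pictures, threshold=0.3):
        clusters = []
        for name, feats in pictures.items():
            for c in clusters:
                if all(group_similarity(feats, pictures[m]) >= threshold for m in c):
                    c.append(name)
                    break
            else:
                clusters.append([name])
        return clusters

    pics = {"pic1": {"sparrow", "tree"}, "pic2": {"eagle", "sky"}, "pic3": {"car", "road"}}
    print(cluster(pics))   # pic1 and pic2 group via the inherited 'bird'/'animal'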
Knowledge-driven genomic interactions: an application in ovarian cancer.
Kim, Dokyoon; Li, Ruowang; Dudek, Scott M; Frase, Alex T; Pendergrass, Sarah A; Ritchie, Marylyn D
2014-01-01
Effective prediction of cancer clinical outcomes, for understanding the mechanisms of various types of cancer, has been pursued using molecular-based data such as gene expression profiles, an approach that has promise for providing better diagnostics and supporting further therapies. However, clinical outcome prediction based on gene expression profiles varies between independent data sets. Further, single-gene expression outcome prediction is limited for cancer evaluation, since genes do not act in isolation but rather interact with other genes in complex signaling or regulatory networks. In addition, since pathways are more likely to co-operate together, it would be desirable to incorporate expert knowledge to combine pathways in a useful and informative manner. Thus, we propose a novel approach for identifying knowledge-driven genomic interactions and apply it to discover models associated with cancer clinical phenotypes using grammatical evolution neural networks (GENN). To demonstrate the utility of the proposed approach, ovarian cancer data from The Cancer Genome Atlas (TCGA) were used to predict clinical stage as a pilot project. We identified knowledge-driven genomic interactions associated with cancer stage not only from single knowledge bases, such as sources of pathway-pathway interactions, but also across different sets of knowledge bases, such as pathway-protein family interactions, by integrating different types of information. Notably, an integration model from different sources of biological knowledge achieved 78.82% balanced accuracy and outperformed the top models based on gene expression or single knowledge-based data types alone. Furthermore, the results from the models are more interpretable because they are framed in the context of specific biological pathways or other expert knowledge. The success of the pilot study presented herein will allow us to pursue further identification of models predictive of clinical cancer survival and recurrence. Understanding the underlying tumorigenesis and progression of ovarian cancer through a global view of interactions within and between different biological knowledge sources has the potential to provide more effective screening strategies and therapeutic targets for many types of cancer.
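The 78.82% figure above is balanced accuracy, i.e. the mean of per-class recalls, which is informative when clinical-stage classes are imbalanced. A short Python sketch with invented labels:

    import numpy as np

    def balanced_accuracy(y_true, y_pred):
        # Mean of per-class recalls; robust to class imbalance.
        classes = np.unique(y_true)
        return np.mean([np.mean(y_pred[y_true == c] == c) for c in classes])

    # Illustrative labels: 0 = early stage, 1 = late stage.
    y_true = np.array([0] * 80 + [1] * 20)
    y_pred = np.array([0] * 70 + [1] * 10 + [1] * 15 + [0] * 5)
    print(balanced_accuracy(y_true, y_pred))   # (70/80 + 15/20) / 2 = 0.8125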
Romero, Daniela C; Sauris, Aileen; Rodriguez, Fátima; Delgado, Daniela; Reddy, Ankita; Foody, JoAnne M
2016-03-01
Hispanic women suffer from high rates of cardiometabolic risk factors and an increasingly disproportionate burden of cardiovascular disease (CVD). In particular, Hispanic women with limited English proficiency suffer from low levels of CVD knowledge, which is associated with adverse CVD health outcomes. Thirty-two predominantly Spanish-speaking Hispanic women completed Vivir Con un Corazón Saludable (VCUCS), a culturally tailored, Spanish-language-based, 6-week intensive community program targeting CVD health knowledge through weekly interactive health sessions. A 30-question CVD knowledge questionnaire was used to assess mean changes in CVD knowledge at baseline and postintervention across five major knowledge domains: CVD epidemiology, dietary knowledge, medical information, risk factors, and heart attack symptoms. Completion of the program was associated with a statistically significant (p < 0.001) increase in total mean CVD knowledge scores from 39% (mean 11.7/30.0) to 66% (mean 19.8/30.0) postintervention, consistent with a 68% increase in overall mean CVD scores. There was a statistically significant (p < 0.001) increase in mean knowledge scores across all five CVD domains. A culturally tailored, Spanish-language-based health program is effective in increasing CVD awareness among high-CVD-risk Hispanic women with low English proficiency and low baseline CVD knowledge.
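The reported percentages follow directly from the published means, as this quick Python check shows; the small gap between 69% and the reported 68% is presumably rounding of the underlying means.

    pre, post, total = 11.7, 19.8, 30.0
    print(round(100 * pre / total))            # 39 (baseline score, %)
    print(round(100 * post / total))           # 66 (postintervention score, %)
    print(round(100 * (post - pre) / pre))     # 69 (relative increase, %)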
Intrusive and Non-Intrusive Instruction in Dynamic Skill Training.
1981-10-01
less sensitive to the processing load imposed by the dynamic task together with instructional feedback processing than were the decision-making and...between computer-based instruction of knowledge systems and computer-based instruction of dynamic skills. There is reason to expect that the findings of research on knowledge systems...
Introduction to Radar Signal and Data Processing: The Opportunity
2006-09-01
SpA), Director of Analysis of Integrated Systems Group, Via Tiburtina Km. 12.400, 00131 Rome, ITALY. E-mail: afarina@selex-si.com. Keywords: radar signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. 1. SUMMARY. This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. Knowledge-based expert systems (KBS) are in the realm of
Seok, Junhee; Kaushal, Amit; Davis, Ronald W; Xiao, Wenzhong
2010-01-01
Background The large amount of high-throughput genomic data has facilitated the discovery of the regulatory relationships between transcription factors and their target genes. While early methods for discovery of transcriptional regulation relationships from microarray data often focused on the high-throughput experimental data alone, more recent approaches have explored the integration of external knowledge bases of gene interactions. Results In this work, we develop an algorithm that provides improved performance in the prediction of transcriptional regulatory relationships by supplementing the analysis of microarray data with a new method of integrating information from an existing knowledge base. Using a well-known dataset of yeast microarrays and the Yeast Proteome Database, a comprehensive collection of known information of yeast genes, we show that knowledge-based predictions demonstrate better sensitivity and specificity in inferring new transcriptional interactions than predictions from microarray data alone. We also show that comprehensive, direct and high-quality knowledge bases provide better prediction performance. Comparison of our results with ChIP-chip data and growth fitness data suggests that our predicted genome-wide regulatory pairs in yeast are reasonable candidates for follow-up biological verification. Conclusion High quality, comprehensive, and direct knowledge bases, when combined with appropriate bioinformatic algorithms, can significantly improve the discovery of gene regulatory relationships from high throughput gene expression data. PMID:20122245
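The abstract does not spell out its integration algorithm, so the Python sketch below shows only the generic idea of blending an expression-based signal with a knowledge-base prior when ranking candidate TF-target pairs; the blending weight, prior, and data are invented for illustration.

    import numpy as np

    def rank_pairs(expr, tf_idx, kb_pairs, alpha=0.5):
        # Score TF -> target candidates by mixing |correlation| from the
        # expression matrix with a 0/1 knowledge-base prior (assumption).
        corr = np.corrcoef(expr)
        scores = []
        for tf in tf_idx:
            for tgt in range(expr.shape[0]):
                if tgt == tf:
                    continue
                prior = 1.0 if (tf, tgt) in kb_pairs else 0.0
                score = alpha * abs(corr[tf, tgt]) + (1 - alpha) * prior
                scores.append(((tf, tgt), score))
        return sorted(scores, key=lambda kv: -kv[1])

    rng = np.random.default_rng(3)
    expr = rng.normal(size=(5, 40))                       # 5 genes, 40 arrays
    expr[2] = 0.8 * expr[0] + 0.2 * rng.normal(size=40)   # gene 2 tracks TF 0
    print(rank_pairs(expr, tf_idx=[0], kb_pairs={(0, 2)})[:2])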
Awareness, knowledge, and attitude of dentistry students in Kerman towards evidence-based dentistry
Sarani, Arezoo; Sarani, Melika; Abdar, Mohammad Esmaeli; Abdar, Zahra Esmaeili
2016-01-01
Introduction: Evidence-based care helps dentists provide quality dental services to patients; such care is based on the use of reliable information about treatment and patient care drawn from a large number of papers, books, and published textbooks. This study aimed to determine the knowledge, awareness, and attitude of dentistry students towards evidence-based dentistry. Methods: In this cross-sectional study, all dentistry students who were studying in their sixth semester or higher in the Kerman School of Dentistry (n = 73) were studied. The data were analyzed using SPSS version 17, independent-samples t-tests, and the ANOVA test. Results: The means of the students' knowledge, awareness, and attitude scores were 29.2 ± 10.8, 29.9 ± 8.12 and 44.5 ± 5.3, respectively. Among the demographic variables, only the number of semesters completed showed a significant association with the knowledge, awareness, and attitude of dentistry students toward evidence-based dentistry (p = 0.001). Conclusion: According to the results of this study, the knowledge and awareness of dentistry students at Kerman University of Medical Sciences towards evidence-based dentistry were average, and their attitude was neutral. Providing the necessary training in this regard will therefore promote the knowledge, awareness, and attitudes of dentistry students. PMID:27382446
ERIC Educational Resources Information Center
Steele, Annfrid R.
2017-01-01
There is an increased focus in teacher education on research-based teaching as a means to develop a more research-based professional knowledge. However, research from several Western countries shows that neither school-based nor university-based teachers are familiar with how to integrate research-based knowledge in professional teacher practice.…
A knowledge-based flight status monitor for real-time application in digital avionics systems
NASA Technical Reports Server (NTRS)
Duke, E. L.; Disbrow, J. D.; Butler, G. F.
1989-01-01
The Dryden Flight Research Facility of the National Aeronautics and Space Administration (NASA) Ames Research Center (Ames-Dryden) is the principal NASA facility for the flight testing and evaluation of new and complex avionics systems. To aid in the interpretation of system health and status data, a knowledge-based flight status monitor was designed. The monitor was designed to use fault indicators from the onboard system which are telemetered to the ground and processed by a rule-based model of the aircraft failure management system to give timely advice and recommendations in the mission control room. One of the important constraints on the flight status monitor is the need to operate in real time, and to pursue this aspect, a joint research activity between NASA Ames-Dryden and the Royal Aerospace Establishment (RAE) on real-time knowledge-based systems was established. Under this agreement, the original LISP knowledge base for the flight status monitor was reimplemented using the intelligent knowledge-based system toolkit, MUSE, which was developed under RAE sponsorship. Details of the flight status monitor and the MUSE implementation are presented.
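The monitor's core pattern, telemetered fault indicators matched against rules that emit control-room advice, can be sketched in a few lines of Python. The indicator names, rules, and advice strings below are invented; the actual monitor modeled the aircraft failure management system, first in LISP and then in the MUSE toolkit.

    # Each rule pairs a condition over fault flags with advisory text.
    RULES = [
        (lambda f: f.get("hyd_press_low") and f.get("hyd_backup_on"),
         "Hydraulic leak suspected: backup engaged, recommend early recovery."),
        (lambda f: f.get("air_data_disagree"),
         "Air-data channels disagree: advise pilot to cross-check airspeed."),
    ]

    def advise(fault_flags):
        fired = [msg for cond, msg in RULES if cond(fault_flags)]
        return fired or ["All systems nominal."]

    print(advise({"hyd_press_low": True, "hyd_backup_on": True}))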
The Knowledge Gap Versus the Belief Gap and Abstinence-Only Sex Education.
Hindman, Douglas Blanks; Yan, Changmin
2015-08-01
The knowledge gap hypothesis predicts widening disparities in knowledge of heavily publicized public affairs issues among socioeconomic status groups. The belief gap hypothesis extends the knowledge gap hypothesis to account for knowledge and beliefs about politically contested issues based on empirically verifiable information. This analysis of 3 national surveys shows belief gaps developed between liberals and conservatives regarding abstinence-only sex education; socioeconomic status-based knowledge gaps did not widen. The findings partially support both belief gap and knowledge gap hypotheses. In addition, the unique contributions of exposure to Fox News, CNN, and MSNBC in this process were investigated. Only exposure to Fox News was linked to beliefs about abstinence-only sex education directly and indirectly through the cultivation of conservative ideology.
Segmentation of medical images using explicit anatomical knowledge
NASA Astrophysics Data System (ADS)
Wilson, Laurie S.; Brown, Stephen; Brown, Matthew S.; Young, Jeanne; Li, Rongxin; Luo, Suhuai; Brandt, Lee
1999-07-01
Knowledge-based image segmentation is defined in terms of the separation of image analysis procedures from the representation of knowledge. Such an architecture is particularly suitable for medical image segmentation because of the large amount of structured domain knowledge. A general methodology for the application of knowledge-based methods to medical image segmentation is described. This includes frames for knowledge representation, fuzzy logic for anatomical variations, and a strategy for determining the order of segmentation from the model specification. The method has been applied to three separate problems: 3D thoracic CT, chest X-rays and CT angiography. The application of the same methodology to such a range of applications suggests a major role in medical imaging for segmentation methods that incorporate the representation of anatomical knowledge.
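One way to encode 'fuzzy logic for anatomical variations' is a soft membership function for a spatial relation, as in the Python sketch below; the relation, sigmoid form, and margin are illustrative assumptions rather than the paper's rules.

    import numpy as np

    def fuzzy_below(y_candidate, y_reference, soft=15.0):
        # Membership for 'candidate lies below the reference structure'
        # (image rows grow downward); the soft margin absorbs normal
        # anatomical variation instead of imposing a hard threshold.
        d = y_candidate - y_reference
        return 1.0 / (1.0 + np.exp(-d / soft))

    # A candidate region 40 pixels below the diaphragm scores near 1.
    print(round(fuzzy_below(240.0, 200.0), 2))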
Ambulatory Morning Report: A Case-Based Method of Teaching EBM Through Experiential Learning.
Luciano, Gina L; Visintainer, Paul F; Kleppel, Reva; Rothberg, Michael B
2016-02-01
Evidence-based medicine (EBM) skills are important to daily practice, but residents generally feel unskilled at incorporating EBM into practice. The Kolb experiential learning theory, as applied to curricular planning, offers a unique methodology to help learners build an EBM skill set based on clinical experiences. We sought to blend the learner-centered, case-based merits of the morning report with an experientially based EBM curriculum. We describe and evaluate a patient-centered ambulatory morning report combining the Users' Guides to the Medical Literature approach to EBM with experiential learning theory in the internal medicine department at Baystate Medical Center. The Kolb experiential learning theory postulates that experience transforms knowledge; within that premise, we designed a curriculum to build EBM skills from residents' patient encounters. By developing structured clinical questions based on recent clinical problems, residents activate prior knowledge. Residents acquire new knowledge through the selection and evaluation of an article that addresses the structured clinical questions. Residents then apply and use the new knowledge in future patient encounters. To assess the curriculum, we designed an 18-question EBM test addressing applied knowledge and EBM skills based on the Users' Guides approach. Of the 66 residents who could participate in the curriculum, 61 (92%) completed the test. There was a modest improvement in EBM knowledge, primarily during the first year of training. Our experiential curriculum teaches EBM skills essential to clinical practice. The curriculum differs from traditional EBM curricula in that ours blends experiential learning with an EBM skill set; learners use new knowledge in real time.
Curran, Janet A; Murphy, Andrea L; Sinclair, Douglas; McGrath, Patrick
2013-01-01
Rural emergency departments (EDs) generally have limited access to continuing education and are typically staffed by clinicians without pediatric emergency specialty training. Emergency care of children is complex, and the majority of children receive emergency care in non-pediatric tertiary care centers. In recent decades there has been a call to action to improve quality and safety in the emergency care of children. Of the one million ED visits by children in Ontario in 2005-2006, one in three visited more than once in a year and one in 15 returned to the ED within 72 hours of the index visit. This study explored factors influencing rural and urban ED clinicians' participation in a Web-based knowledge exchange intervention focused on best-practice knowledge about pediatric emergency care. The following questions guided the study: (i) What individual, context-of-practice, or knowledge factors impact a clinician's decision to participate in a Web-based knowledge exchange intervention? (ii) What are clinicians' perceptions of organizational expectations regarding the knowledge and information sources to be used in practice? (iii) What are the preferred knowledge sources of rural and urban emergency clinicians? A Web-based knowledge exchange intervention, the Pediatric Emergency Care Web Based Knowledge Exchange Project, was developed for rural and urban ED clinicians. The website contained 12 pediatric emergency practice learning modules with linked asynchronous discussion forums. The topics for the modules were determined through a needs assessment, and the module content was developed by known experts in the field. A follow-up survey was sent to a convenience sample of 187 clinicians from nine rural and two urban Canadian EDs participating in the pediatric emergency Web-based knowledge exchange intervention study. The survey response rate was 56% (105/187). Participation in the knowledge exchange intervention was related to individual involvement in research activities (χ² = 5.23, p = 0.019), consultation with colleagues from other EDs (χ² = 6.37, p = 0.01) and the perception of an organizational expectation to use research evidence to guide practice (χ² = 5.52, p = 0.015). Most clinicians (95/105 or 92%) reported relying on colleagues from their own ED as a primary knowledge source. Urban clinicians were more likely than their rural counterparts to perceive that the use of research evidence to guide practice was an expectation. Rural clinicians were more likely to rely on physicians from their own ED as a preferred knowledge source. The decision made by emergency clinicians to participate in a Web-based knowledge exchange intervention was influenced by a number of individual and contextual factors. Differences in these factors and in preferences for knowledge sources require further characterization to enhance engagement of rural ED clinicians in online knowledge exchange interventions.
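The reported associations are chi-square tests on survey cross-tabulations; the Python sketch below shows the computation for an invented 2x2 table (the study reports only the resulting statistics, not the underlying counts).

    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: participated / did not participate in the knowledge exchange.
    # Columns: involved / not involved in research activities. Counts invented.
    table = np.array([[30, 20],
                      [25, 30]])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}, dof = {dof}")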
Gustafsson, Markus; Borglin, Gunilla
2013-08-19
Registered Nurses (RNs) play an important role in caring for patients suffering from cancer pain. A lack of knowledge regarding pain management and the RNs' own perceptions of cancer pain can act as barriers to effective pain management. Educational interventions that target RNs' knowledge and attitudes have proved promising. However, an intervention implementing evidence-based practice is a multifaceted process and demands behavioural and cognitive changes to sustain its effects. Our study therefore aimed to investigate whether a theory-based educational intervention could change RNs' knowledge of and attitudes to cancer pain and pain management, both four and 12 weeks after the start of the intervention. A quasi-experimental design with non-equivalent control groups was used. The primary outcome was measured using a modified version of the instrument Nurses' Knowledge and Attitudes Survey Regarding Pain (NKAS) at baseline, four weeks and 12 weeks after the start of the intervention to evaluate its persistence. The intervention's educational curriculum was based on the principles of Ajzen's Theory of Planned Behaviour and consisted of interactive learning activities conducted in workshops founded on evidence-based knowledge. The RNs' own experiences of cancer pain management were used in the learning process. The theory-based educational intervention aimed at changing RNs' knowledge and attitudes regarding cancer pain management, measured by the primary outcome NKAS, resulted in a statistically significant (p < 0.05) improvement in the total mean score from baseline to four weeks at the intervention ward. The findings of this study suggest that a theory-based educational intervention focused on RNs can be effective in changing their knowledge and attitudes regarding cancer pain management. However, the high number of dropouts between baseline and four weeks needs to be taken into account when evaluating our findings. Finally, this kind of theory-based educational intervention with interactive learning activities has been sparsely researched and needs to be evaluated further in larger projects. ClinicalTrials.gov: NCT01313234.
Integrative pathway knowledge bases as a tool for systems molecular medicine.
Liang, Mingyu
2007-08-20
There exists a sense of urgency to begin to generate a cohesive assembly of biomedical knowledge as the pace of knowledge accumulation accelerates. The urgency is in part driven by the emergence of systems molecular medicine that emphasizes the combination of systems analysis and molecular dissection in the future of medical practice and research. A potentially powerful approach is to build integrative pathway knowledge bases that link organ systems function with molecules.
2003-03-01
information technologies that can: (a) represent knowledge and skills, (b) identify people with all or parts of the knowledge and task experience...needed but lacked: A might be at too advanced a level for the individual to understand given his or her previous knowledge, B might overlap too... SEMANTIC ANALYSIS-BASED TECHNOLOGY. Darrell Laham, Knowledge Analysis Technologies, 4940 Pearl East Circle #200, Boulder, CO 80301; Winston
Towards Semantic e-Science for Traditional Chinese Medicine
Chen, Huajun; Mao, Yuxin; Zheng, Xiaoqing; Cui, Meng; Feng, Yi; Deng, Shuiguang; Yin, Aining; Zhou, Chunying; Tang, Jinming; Jiang, Xiaohong; Wu, Zhaohui
2007-01-01
Background: Recent advances in Web and information technologies, together with the increasing decentralization of organizational structures, have resulted in massive amounts of information resources and domain-specific services in Traditional Chinese Medicine (TCM). The massive volume and diversity of the information and services available have made it difficult to achieve seamless and interoperable e-Science for knowledge-intensive disciplines like TCM. Therefore, information integration and service coordination are two major challenges in e-Science for TCM, and we still lack sophisticated approaches to integrate scientific data and services for TCM e-Science. Results: We present a comprehensive approach to building dynamic and extendable e-Science applications for knowledge-intensive disciplines like TCM based on semantic and knowledge-based techniques. The semantic e-Science infrastructure for TCM supports large-scale database integration and service coordination in a virtual organization. We use domain ontologies to integrate TCM database resources and services in a semantic cyberspace and deliver a semantically superior experience, including browsing, searching, querying and knowledge discovery, to users. We have developed a collection of semantic-based toolkits to facilitate information sharing and collaborative research among TCM scientists and researchers. Conclusion: Semantic and knowledge-based techniques are suitable for knowledge-intensive disciplines like TCM, and it is possible to build an on-demand e-Science system for TCM based on existing semantic and knowledge-based techniques. The approach presented in this paper integrates heterogeneous, distributed TCM databases and services, and provides scientists with a semantically superior experience to support collaborative research in the TCM discipline. PMID:17493289